The corpus callosum (CC) is the largest commissural pathway in the brain. The splenium of the corpus callosum (SCC) is its thickest and most posterior portion. Transient signal alterations in the SCC on magnetic resonance imaging (MRI) have been observed in various neurological and non-neurological conditions and are termed cytotoxic lesions of the corpus callosum (CLOCCs). We aimed to study the different MRI brain patterns in patients with CLOCCs and correlate probable etiologies with clinical and radiological presentations. During the 8-year study period, 10,000 MRI brain scans were reviewed. A total of 127 (1.27%) patients had splenial involvement. Cases of splenial involvement due to stroke and tumors were excluded. Ultimately, 35 patients with CLOCCs were enrolled in the study. After analyzing splenial lesions with respect to size, shape, involvement of the genu, and involvement of extracallosal sites, CLOCCs were further classified into the dot sign, boomerang sign, double boomerang sign, and boomerang plus sign. The dot sign was observed in 19 (54.28%) patients, the boomerang sign in 3 (8.57%), the double boomerang sign in 3 (8.57%), and the boomerang plus sign in 10 (28.57%). The majority of patients had seizures. Other probable etiologies included infections, hypernatremia, hypoglycemia, and headache. CLOCCs constitute a clinicoradiological syndrome resulting from a wide variety of causes. MRI brain patterns range from isolated involvement of the splenium to involvement of extracallosal sites, and these patterns may help identify the probable etiologies.
The editors of the Annals of Indian Academy of Neurology have accepted an unusual paper for publication in this issue, titled "Scheme for the Conduct of National Exit Test in India: Suggestions by Stakeholders."[1] The authors suggest a three-step scheme for the conduct of a single nationwide National Exit Test (NEXT) to bring uniformity to medical examinations. The ideas discussed and suggestions made are sensible, thoughtful, and interesting. Obviously, the paper is not related to neurology. However, it gives us an opportunity to reflect on the current state of neurology education and assessment (NEA) in India. We, the neurology teachers, must realize that we too are stakeholders in the NEXT. Neurology is an important component of teaching and assessment at the undergraduate and postgraduate (PG) levels. In many medical colleges, it is taught and assessed by non-neurologists. Ideally, that should not be the case. It is worth inquiring how input from neurologists would make crucial and meaningful differences. Neurology teachers must contribute to the creation of valid, reliable, and feasible methods and instruments to evaluate students and residents. Many of us are great teachers and examiners, but our skills, time, and attention are restricted to Doctorate of Medicine (DM)/Diplomate of National Board (DNB) residents. We need to expand our gaze to the Doctor of Medicine (MD) and Bachelor of Medicine, Bachelor of Surgery (MBBS) levels too. Not only neurologists but, more importantly, general practitioners (GPs) and internists must be proficient in the clinical neurological examination and the Bayesian decision-making process for diagnosis and management.
We also need to devise, improve, and implement evidence-based teaching courses for non-neurologists in practice.[2] As a teacher in a non-DM medical college, the present author has strongly felt the need and scope for improving neurology education at the undergraduate (UG), PG, and GP levels.[3] Neurology teachers need to be trained and oriented toward evidence-based, objective assessment of students in theory and practice. The proposed establishment of the Central Board of Medical Education for the NEXT should encourage the neurology fraternity to develop something similar to the American Board of Psychiatry and Neurology (ABPN) for NEA. Currently, NEA at the DM/DNB level is marred by wide variations in quality. The opening of new residency programs and a rapidly increasing number of seats pose challenges in ensuring quality and uniformity. Sadly, clinical skills are being lost as bedside teaching is neglected in favor of rote-memory-based, multiple-choice examinations. Research on the NEA of examinees is neglected in India. We need to think about relevant and important research questions, experimental design, data collection, and rigorous statistical analyses. There are different theories about learning. Generalizability (G) theory, for example, addresses the variance in scores associated with many facets, such as the student's true ability, the types of cases chosen, the domains and items to be probed, and the skills of the raters.[4] The traditional oral examination for DM/DNB neurology is at high risk of subjective bias. The ABPN long ago replaced the locally administered oral examination with a standardized neurology clinical skills examination (NEX).[5] Videotaped NEA events and encounters may be used for research purposes. Research is also needed to assess the competence of teachers and examiners. The validity and standardization of videotapes of NEA have not been probed in India.
The Objective Structured Clinical Examination (OSCE) in neurology at the UG, PG, and DM levels is another potential area for research in India to improve the validity and reliability of performance-based assessment, wherein examinees rotate through a wide variety of standardized patients (SPs) or partial task trainers (PTTs).[6] These methods have not caught the imagination of neurology faculty in India. Data from SPs using OSCEs can be utilized to answer many more ingenious research questions. For example, one study confirmed the intuitive knowledge that one specific component of clinical reasoning (long-term supportive management of neurological diseases) may not be as good as others (acute diagnosis and therapy).[7] Whether we keep real patients or SPs in the practical examination, the selection should include both appropriate positive and negative findings. The emphasis should be on classical examples rather than infrequent, atypical, or esoteric ones. A lot of work is needed to develop high-quality scenarios, demonstrate the reproducibility and reliability of OSCE scores, and teach a precise, evidence-based neurological examination.[8] We would do well to establish clinical skills laboratories to teach the neurological examination and reduce neurophobia among students.[9] "Scheduled bedside skills modeling" is another approach, wherein students observe a clinical encounter for a comprehensive history and neurological examination, followed by questions and debriefing. An observation guide is supplied to residents, with a checklist for many elements of symptoms and signs and spaces for notes and questions. Faculty are provided a preceptor guide. Many learning themes and subthemes can be identified.
The effect of such modeling experiences on acquiring procedural and cognitive skills has been documented.[10] Direct observation of residents performing the neurological examination, recorded on video and assessed offline by blinded faculty in quantitative and qualitative formats, can be another research protocol. It would inform teachers of the common difficulties students face during the neurological physical examination.[11] The softer, but no less important, skills of empathetic and truthful communication between clinicians, patients, and caregivers are even harder to assess, more so in neurology, which often involves the delivery of complex and difficult news and decision-making. "Cross-sectional concurrent nested mixed methods" studies have utilized data collection by an electronic communication tracker along with qualitative survey questionnaires.[12] The imperative for research in NEA is very strong. There must be comprehensive quantitative and qualitative studies of hypothesis-driven observations or interventions on the acquisition of knowledge and skills, with lessons in pedagogy and assessment. Research in NEA will promote career development for neurology teachers. We will need to identify many diverse outcome measures, including the impact of educational programs not only on knowledge but also on long-term physician behavior, patient safety, professionalism, and long-term career success.[13] A comprehensive review and discussion of many more research questions and methods in NEA, in the global as well as the Indian context, would be worthy of a lengthy monograph. It is high time that the executive committee and member community of the Indian Academy of Neurology (IAN) appreciate the importance of NEA and take the lead in improving the neurology component of the NEET and NEXT. They should also become organized with respect to residency curricula, practices, methods, and assessments at the DM/DNB level.
Many activities may help in these tasks, such as an expert group meeting, a satellite symposium, a midterm conference, a supplement of the Annals of Indian Academy of Neurology, and a dedicated subgroup.
Spinal muscular atrophy (SMA) is the most common neurodegenerative disorder in children, with an incidence of 1 in 6000–10,000 live births. It is characterized by progressive lower motor neuron weakness. Studies on the natural history and long-term follow-up of Indian cohorts are sparse. We aimed to study the clinical course, systemic complications, and treatment outcomes of children with genetically confirmed SMA over a 2-year follow-up. A prospective observational study was conducted at Aster MIMS Hospital, Calicut, Kerala, including patients with genetically confirmed SMA aged 1 month to 18 years attending the neuromuscular clinic of the pediatric neurology department from 1st April 2022 to 31st March 2024. Data were collected using a standardized neuromuscular clinical proforma, and patients were followed up biannually. Of the 80 enrolled patients, 33 were boys and 47 were girls. Our study included three types of SMA: 6 patients with SMA type 1, 55 with SMA type 2, and 19 with SMA type 3. A total of 22/80 had a positive family history of SMA. Among 46 patients on disease-modifying therapies, 24 showed improvement in functional scores. Bilevel positive airway pressure (BIPAP) ventilation was initiated in 21/25 patients with abnormal sleep studies. Scoliosis surgery was performed in 10/54 children with scoliosis. A multidisciplinary approach and early initiation of disease-modifying drugs significantly improved quality of life and functional scores in patients with SMA. Early BIPAP initiation and timely management of scoliosis were crucial.
Central nervous system tuberculosis (CNS TB) is a major cause of neurological morbidity in developing countries, with diverse clinical presentations. This study aimed to characterize the demographic and clinical profile of patients with CNS TB at a tertiary neurology center in India and to identify key clinical, laboratory, and radiological predictors of prognosis. A prospective observational study was conducted between August 2023 and January 2025 on 110 patients with CNS TB at a tertiary care center. Demographic, clinical, neuroimaging, cerebrospinal fluid (CSF), systemic laboratory, and cartridge-based nucleic acid amplification test (CBNAAT) findings were analyzed. Functional outcome was assessed using the modified Rankin Scale at the 9-month follow-up. Of the 110 patients, 56.4% were female; the mean age was 29.86 ± 11.23 years, with 84% younger than 40 years. Common features included meningeal signs (92.7%), headache (90.9%), fever (88.2%), altered sensorium (83.6%), and seizures (33.6%). Cranial nerve palsies occurred in 28.2% of patients, most commonly the sixth nerve. Pulmonary TB was present in 23.6% of patients. Neuroimaging showed tuberculomas (37.3%), hydrocephalus (27.3%), infarcts (20%), and basal exudates (11.8%). CSF analysis revealed lymphocytic pleocytosis (mean 173 cells/mm³), elevated protein (182.7 mg/dL), and low glucose (mean CSF/plasma ratio 0.31). CBNAAT was positive in 18.2% of subjects; 16.36% were rifampicin-sensitive, and rifampicin resistance was observed in 0.9%. Poor outcome was associated with increasing age, longer duration of illness, cranial neuropathy, hydrocephalus, and paradoxical reaction. CNS TB in our cohort predominantly affected young adults and presented with classical meningeal features and diverse neuroimaging abnormalities. Microbiological confirmation remained limited, underscoring reliance on clinical-radiological diagnosis.
Cerebrovascular disease is the fifth leading cause of death in the United States. Ischemic stroke, often due to atherosclerotic carotid artery stenosis, carries a high risk of recurrence. Preventive options for stroke include medical management, carotid endarterectomy (CEA), and carotid artery stenting (CAS). CAS offers comparable safety and efficacy to CEA, but the role of embolic protection devices (EPDs) is still unclear. The study aimed to evaluate the safety and efficacy of CAS without EPDs in symptomatic carotid artery stenosis, focusing on periprocedural complications and long-term outcomes. A retrospective analysis included patients who underwent CAS for symptomatic carotid artery stenosis at a tertiary care center. The primary outcome was 30-day major complications (stroke, myocardial infarction, death). Secondary outcomes were restenosis and modified Rankin Scale (mRS) scores during follow-up. Data were recorded after review of medical records and imaging studies. A P value of less than 0.05 was considered significant. A total of 146 patients underwent 156 CAS procedures without EPDs. The median duration from symptom onset to CAS was 37 days (interquartile range [IQR] = 14 days). Ischemic stroke was the most common clinical presentation (56.8%). Bradycardia occurred in 24 (16.44%) patients and periprocedural stroke in 6 (4.11%): 4 (2.7%) ischemic and 2 (1.3%) hemorrhagic. Three of the ischemic stroke patients became clinically asymptomatic and one progressed from mRS 1 to 2, but both hemorrhagic stroke patients died. Restenosis (>50%) occurred in 8 (5.12%) stents over the 30-month follow-up. CAS without EPDs is safe and effective for symptomatic carotid artery stenosis, with low complication and restenosis rates comparable to CEA and to CAS with EPDs. Interval CAS without EPDs (delay >2 weeks) with plaque stabilization can be safe and cost-effective, making it a viable option in resource-limited settings with experienced operators.
Cerebral venous thrombosis (CVT) accounts for 0.5%–3% of all strokes but is increasingly identified with the availability of advanced imaging. Standard anticoagulation therapy is effective in most cases; however, a subset of patients experience clinical deterioration despite optimal medical management. Endovascular treatment (EVT) has emerged as a potential rescue therapy, though evidence remains limited due to the absence of randomized controlled trials. We aimed to evaluate the efficacy and safety of EVT in severe, medically refractory CVT cases at a tertiary care center in central India. This prospective study included 36 patients with severe CVT who underwent EVT over 30 months. Procedures included catheter-directed thrombolysis, mechanical thrombectomy, and balloon venoplasty. Clinical features, procedural details, and outcomes were recorded. Post-procedure outcomes were assessed using the modified Rankin Scale (mRS) at 3 months. During the study period, 22.7% of all CVT cases presenting to our center underwent endovascular thrombectomy. Of the 36 patients, 89.3% were male, and the majority were younger than 30 years. Headache was a universal presenting symptom, followed by seizures (75%) and focal neurological deficits (58.3%). All patients underwent balloon venoplasty; 35% received adjunctive thrombolysis, and 21% underwent aspiration thrombectomy. Technical success was achieved in all cases. The median mRS improved significantly from 3.0 at baseline to 1.0 at 3 months (P < 0.001). No major complications occurred. Right sigmoid sinus thrombosis was significantly associated with complications (P = 0.041). EVT is a safe and effective rescue option for patients with CVT refractory to standard anticoagulation, leading to substantial neurological recovery. Larger multicenter studies are needed to refine patient selection and standardize treatment protocols.
Epilepsy, with its complex interplay of neurobiological, genetic, and pharmacological factors, continues to challenge neurologists worldwide, particularly in managing drug resistance. This study aimed to evaluate the frequency of SCN1A and SCN2A gene polymorphisms among patients with epilepsy (PWE) from Eastern India and to assess their relationship with clinical profiles, electroencephalography (EEG) findings, and response to antiseizure medication. In this case-control genetic association study, a total of 110 PWE were assessed for demographic and clinical profiles, EEG and neuroimaging findings, and treatment response. SCN1A and SCN2A gene polymorphisms (five selected intronic single nucleotide polymorphisms [SNPs]: four in SCN1A [rs2298771, rs6730344, rs10167228, rs6732655] and one in SCN2A [rs17183814]) were assessed through polymerase chain reaction and restriction fragment length polymorphism. In the present cohort, we observed that generalized seizure type (73.6%) and early-onset seizures (mean 7.4 years in drug-resistant epilepsy [DRE]) were significantly more frequent in drug-resistant patients. The SCN2A rs17183814 AG genotype frequencies were 36/92 (39.1%) in DRE, 10/18 (55.6%) in responders, and 46/110 (41.8%) in controls. Statistical comparison showed no significant association between the AG genotype and drug resistance (DRE vs. responders: odds ratio [OR] 0.51, 95% confidence interval [CI] 0.19–1.43, P = 0.303; DRE vs. controls: OR 0.89, 95% CI 0.51–1.57, P = 0.808). On genetic analysis, the AG genotype of SCN2A rs17183814 was numerically more frequent among responders than in DRE, but the difference was not significant. None of the SCN1A SNPs studied (rs2298771, rs6730344, rs10167228, rs6732655) showed significant associations with treatment response, consistent with previous Indian reports.
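The odds ratio reported above can be reproduced directly from the genotype counts in the abstract. A minimal sketch using the standard Woolf (logit) method for the 95% confidence interval (the counts are those given in the abstract; the method choice is an assumption, as the paper does not state which CI formula it used):

```python
import math

# SCN2A rs17183814 AG genotype counts from the abstract
ag_dre, n_dre = 36, 92      # drug-resistant epilepsy (DRE)
ag_resp, n_resp = 10, 18    # drug responders

# 2x2 table with "exposure" = carrying the AG genotype
a, b = ag_dre, n_dre - ag_dre        # DRE: AG vs non-AG
c, d = ag_resp, n_resp - ag_resp     # responders: AG vs non-AG

odds_ratio = (a * d) / (b * c)

# Woolf (logit) 95% confidence interval
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)

print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
# matches the reported OR 0.51 (95% CI 0.19-1.43)
```

The wide interval straddling 1.0 is why the numerically higher AG frequency among responders does not reach significance in such small groups.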
Spinal muscular atrophy (SMA) is an inherited neuromuscular disorder with a grave prognosis. Gene replacement therapy has significantly altered the disease trajectory. This study presents real-world evidence of the efficacy and safety of onasemnogene abeparvovec (OA) in children below 2 years of age from India. This single-centre, prospective cohort study includes longitudinal data of patients treated with OA over more than 18 months. All patients diagnosed with SMA under 2 years of age were enrolled for OA through either the Global Managed Access Program or via commercial procurement. Once selected, patients were admitted for OA infusion, with close monitoring for adverse effects and blood parameters. Patients were followed prospectively to assess motor milestones and functional motor scores, ventilator and feeding support requirements, and hospitalization frequency. Thirteen children received OA over the study period. Three died within a month: two likely due to the severity of the underlying disease, and one from an unknown cause. Transaminitis and transient thrombocytopenia were frequently observed but managed successfully. Most patients showed gains in motor milestones during follow-up; the remainder maintained their baseline. No new ventilatory or feeding support was required. One child was weaned from permanent ventilation and gastrostomy to predominantly oral feeding, and another transitioned from nasogastric to oral feeds. Caregivers reported clinically meaningful improvement in 50% of patients. OA has transformed the landscape for children with early-onset SMA, enabling the achievement of motor milestones previously considered unattainable. However, the chronic nature of the disease necessitates long-term multidisciplinary care.
Sleep-related breathing disorders (SRBDs) in children with epilepsy (CWE) are often under-recognized despite their potential impact on seizure control and quality of life. This study aimed to determine the prevalence of SRBDs in CWE using the Pediatric Sleep Questionnaire (PSQ) and to identify clinical factors associated with SRBDs. This observational study included 238 CWE attending the pediatric epilepsy clinic of a tertiary care hospital in Mumbai, India, between June 2020 and May 2021. The PSQ was administered to all participants, with a score >8/22 suggesting SRBDs. Categorical variables were compared using the Chi-square test. SRBDs were present in 44.5% of CWE according to the PSQ. Age, gender, age at epilepsy onset, family history of seizures, seizure type, body mass index, and neck-circumference-to-height ratio did not differ significantly between children with and without SRBDs. SRBDs were more frequent in children on multiple anti-seizure medications (ASMs), those with a higher seizure burden, and those with drug-resistant epilepsy. SRBDs were present in 36.7% of children on <2 ASMs versus 50.7% of those on ≥2 ASMs (P = 0.004). SRBDs affected 31.1% of drug-responsive children compared with 58.0% of those with drug-resistant epilepsy (P = 0.0001). SRBDs are a common comorbidity, affecting nearly half of CWE. Routine screening for SRBDs in CWE is recommended, especially in drug-refractory cases, followed by confirmation with polysomnography. Early detection of SRBDs offers a modifiable target that can enhance seizure control and neurobehavioral outcomes in CWE.
Ischemic heart disease (IHD) remains a leading cause of death worldwide, with dietary risks being its most significant modifiable factor. Here, using the Global Burden of Diseases, Injuries and Risk Factors Study 2023, we estimated the mortality and disability-adjusted life years from diet-related IHD across 204 countries. In 2023, a suboptimal diet was responsible for 4.06 million (95% uncertainty interval (UI) 0.74-6.22) IHD deaths and 96.84 million (18.82-142.52) IHD disability-adjusted life years. The global age-standardized death rate of IHD attributable to suboptimal diet decreased by 43.92% (95% UI 34.44-53.23) from 1990 to 2023. Among dietary factors, low intake of nuts and seeds (9.87, 95% UI 2.84-17.12 deaths per 100,000 population), low whole grains (9.22, 4.73-13.67), low fruits (7.25, 1.54-13.34) and high sodium (7.15, 0.92-17.97) were primary contributors to IHD deaths. The burden was particularly pronounced in low- and middle-sociodemographic index countries. By disentangling dietary risk factors, we identified the portion of IHD burden directly modifiable through food interventions.
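The age-standardized rates cited above are obtained by direct standardization: age-specific death rates are weighted by a fixed standard population's age structure, so populations with different age profiles become comparable. A minimal sketch of the arithmetic (the age bands, rates, and weights below are illustrative assumptions, not GBD values):

```python
# Direct age-standardization: weight age-specific death rates by a
# standard population's age structure. All numbers are illustrative.
age_specific_rates = {   # deaths per 100,000 in each age band
    "0-39": 5.0,
    "40-59": 80.0,
    "60+": 600.0,
}
standard_weights = {     # standard population's share in each band (sums to 1)
    "0-39": 0.55,
    "40-59": 0.30,
    "60+": 0.15,
}

# Weighted sum of the age-specific rates
asr = sum(age_specific_rates[band] * standard_weights[band]
          for band in age_specific_rates)
print(f"age-standardized rate: {asr:.1f} per 100,000")
```

Because the weights are fixed over time, a decline in this quantity (such as the 43.92% fall reported above) reflects falling age-specific risk rather than a shifting age distribution.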
Spontaneous intracranial hypotension (SIH) is a rare, often underdiagnosed cause of secondary headache resulting from spontaneous cerebrospinal fluid leakage. Indian data on its clinical profile and treatment outcomes remain limited. We aimed to describe the clinical, radiological, and treatment characteristics of patients with SIH at a tertiary neurology centre in South India and to compare outcomes between those treated with epidural blood patch (EBP) and conservative management. This retrospective observational study included 33 patients fulfilling the International Classification of Headache Disorders, 3rd edition (ICHD-3), diagnostic criteria for SIH between 2004 and 2024. Clinical features, magnetic resonance imaging (MRI) findings, treatment modalities, and outcomes were analyzed. Statistical comparisons were performed using Chi-square or t tests, with P < 0.05 considered significant. Of the 33 patients (73% female; mean age, 42 ± 11 years), orthostatic headache was present in 91%. Pachymeningeal enhancement was the most common MRI finding (90%), followed by pituitary engorgement (59%) and spinal epidural fluid collections (52%). Twenty patients (61%) received EBP, and thirteen (39%) were managed conservatively. EBP led to significantly higher rates of headache resolution at discharge (80% vs. 31%, P = 0.005), although long-term outcomes at 3 months were comparable (75% vs. 62%, P = 0.5). Procedure-related complications were mild and transient. This study represents the largest Indian SIH series to date, highlighting the predominance of orthostatic headache and pachymeningeal enhancement. EBP offers rapid symptomatic relief and remains the preferred initial therapy. Early recognition and standardized imaging protocols are vital for improved outcomes.
Diagnosing myopathy subtypes is challenging due to clinical and genetic heterogeneity. While muscle magnetic resonance imaging (MRI) enables pattern recognition, standardized imaging data from India are lacking. This study aimed to define MRI patterns in myopathies, compare semi-quantitative scores with fat fraction (FF) analysis, and derive a diagnostic algorithm. In this study, a total of 102 patients with confirmed dystrophic or inflammatory myopathies underwent 3T MRI of the pelvic girdle and lower limbs, combining conventional sequences with Dixon-based fat quantification. Analysis used a modified Mercuri T1 scale and Stramare T2 edema scoring, along with FF measurements across 6,180 muscles. Disease patterns, correlations between imaging and clinical variables, and associations between qualitative and quantitative metrics were analyzed. MRI patterns were distinct for each myopathy. Facioscapulohumeral dystrophy showed hamstring involvement with low asymmetry (5%). Dystrophinopathies exhibited a "trefoil with single fruit" sign (68%). Calpainopathy showed symmetrical end-stage involvement of the gluteal, adductor, and hamstring muscles, while dysferlinopathy affected the gluteus minimus and posterior compartment. GNE (UDP-N-acetylglucosamine 2-epimerase/N-acetylmannosamine kinase) myopathy showed severe involvement of the gluteus minimus, sartorius, gracilis, and tibialis anterior muscles. In inflammatory myopathies, dermatomyositis showed edema (62%) without a fixed pattern, whereas inclusion body myositis affected the gastrocnemius and gluteal muscles. Disease duration correlated with T1 scores (r = 0.3, P = 0.001) and FF (r = 0.4, P = 0.003). A significant association (P < 0.001) was observed between T1 scores and FF categories. Combined semi-quantitative scoring and FF MRI distinguished myopathy subtypes, correlated with disease duration, and supported the use of MRI as a biomarker.
Our muscle atlas defines disease-specific imaging phenotypes and serves as a reference for diagnostic evaluation in diverse populations.
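The Dixon-based fat quantification mentioned above derives a fat-fraction map from separately reconstructed fat and water images; the standard voxelwise formula is FF = F / (F + W). A minimal sketch with synthetic arrays (the array sizes and signal ranges are illustrative assumptions, not the study's acquisition parameters):

```python
import numpy as np

# Synthetic Dixon fat (F) and water (W) signal maps for one muscle ROI
rng = np.random.default_rng(0)
fat = rng.uniform(10, 60, size=(4, 4))     # arbitrary signal units
water = rng.uniform(40, 90, size=(4, 4))

# Voxelwise fat fraction: FF = F / (F + W), in [0, 1]
ff = fat / (fat + water)

# Mean percent fat fraction for the ROI, as tabulated per muscle
mean_ff = float(ff.mean()) * 100
print(f"mean fat fraction: {mean_ff:.1f}%")
```

Per-muscle means of such maps are what a semi-quantitative Mercuri-type score can then be compared against, as the abstract's T1-score-versus-FF association does.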
BACKGROUND: Regular, detailed reporting on population health by underlying cause of death is fundamental for public health decision making. Cause-specific estimates of mortality and the subsequent effects on life expectancy worldwide are valuable metrics to gauge progress in reducing mortality rates. These estimates are particularly important following large-scale mortality spikes, such as the COVID-19 pandemic. When systematically analysed, mortality rates and life expectancy allow comparisons of the consequences of causes of death globally and over time, providing a nuanced understanding of the effect of these causes on global populations. METHODS: The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2021 cause-of-death analysis estimated mortality and years of life lost (YLLs) from 288 causes of death by age-sex-location-year in 204 countries and territories and 811 subnational locations for each year from 1990 until 2021. The analysis used 56 604 data sources, including data from vital registration and verbal autopsy as well as surveys, censuses, surveillance systems, and cancer registries, among others. As with previous GBD rounds, cause-specific death rates for most causes were estimated using the Cause of Death Ensemble model (a modelling tool developed for GBD to assess the out-of-sample predictive validity of different statistical models and covariate permutations and combine those results to produce cause-specific mortality estimates), with alternative strategies adapted to model causes with insufficient data, substantial changes in reporting over the study period, or unusual epidemiology. YLLs were computed as the product of the number of deaths for each cause-age-sex-location-year and the standard life expectancy at each age. As part of the modelling process, uncertainty intervals (UIs) were generated using the 2·5th and 97·5th percentiles from a 1000-draw distribution for each metric.
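The YLL and uncertainty-interval computations described above can be sketched directly: YLLs are deaths multiplied by the standard life expectancy at the age of death, summed over ages, and the 95% UI is the 2.5th and 97.5th percentile across draws. A minimal sketch with illustrative numbers (the age groups, death counts, life-table values, and the normal draw distribution are all assumptions for demonstration, not GBD data):

```python
import numpy as np

# Illustrative deaths and standard life expectancy by age group
deaths = np.array([120.0, 300.0, 950.0])      # deaths in each age group
std_life_exp = np.array([61.0, 35.0, 12.0])   # standard LE at each age

# YLLs = deaths x standard life expectancy at age of death, summed over ages
yll = float(np.sum(deaths * std_life_exp))

# UI: 2.5th and 97.5th percentiles of a 1000-draw distribution
# (here: illustrative normal draws around the point estimate)
rng = np.random.default_rng(42)
draws = rng.normal(loc=yll, scale=0.05 * yll, size=1000)
ui_low, ui_high = np.percentile(draws, [2.5, 97.5])

print(f"YLLs = {yll:.0f} (95% UI {ui_low:.0f}-{ui_high:.0f})")
```

In the actual GBD pipeline the 1000 draws propagate uncertainty from the upstream cause-of-death models rather than being sampled from an assumed distribution, but the percentile step is the same.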
We decomposed life expectancy by cause of death, location, and year to show cause-specific effects on life expectancy from 1990 to 2021. We also used the coefficient of variation and the fraction of population affected by 90% of deaths to highlight concentrations of mortality. Findings are reported in counts and age-standardised rates. Methodological improvements for cause-of-death estimates in GBD 2021 include the expansion of the under-5-years age group into four new age groups, enhanced methods to account for stochastic variation in sparse data, and the inclusion of COVID-19 and other pandemic-related mortality, the latter comprising excess mortality associated with the pandemic excluding deaths from COVID-19, lower respiratory infections, measles, malaria, and pertussis. For this analysis, 199 new country-years of vital registration cause-of-death data, 5 country-years of surveillance data, 21 country-years of verbal autopsy data, and 94 country-years of other data types were added to those used in previous GBD rounds. FINDINGS: The leading causes of age-standardised deaths globally were the same in 2019 as they were in 1990; in descending order, these were ischaemic heart disease, stroke, chronic obstructive pulmonary disease, and lower respiratory infections. In 2021, however, COVID-19 replaced stroke as the second-leading age-standardised cause of death, with 94·0 deaths (95% UI 89·2-100·0) per 100 000 population. The COVID-19 pandemic shifted the rankings of the leading five causes, lowering stroke to the third-leading and chronic obstructive pulmonary disease to the fourth-leading position. In 2021, the highest age-standardised death rates from COVID-19 occurred in sub-Saharan Africa (271·0 deaths [250·1-290·7] per 100 000 population) and Latin America and the Caribbean (195·4 deaths [182·1-211·4] per 100 000 population).
The lowest age-standardised death rates from COVID-19 were in the high-income super-region (48·1 deaths [47·4-48·8] per 100 000 population) and southeast Asia, east Asia, and Oceania (23·2 deaths [16·3-37·2] per 100 000 population). Globally, life expectancy steadily improved between 1990 and 2019 for 18 of the 22 investigated causes. Decomposition of global and regional life expectancy showed the positive effect that reductions in deaths from enteric infections, lower respiratory infections, stroke, and neonatal deaths, among others, have had on survival over the study period. However, a net reduction of 1·6 years occurred in global life expectancy between 2019 and 2021, primarily due to increased death rates from COVID-19 and other pandemic-related mortality. Life expectancy was highly variable between super-regions over the study period, with southeast Asia, east Asia, and Oceania gaining 8·3 years (6·7-9·9) overall, while having the smallest reduction in life expectancy due to COVID-19 (0·4 years). The largest reduction in life expectancy due to COVID-19 occurred in Latin America and the Caribbean (3·6 years). Additionally, 53 of the 288 causes of death were highly concentrated in locations with less than 50% of the global population as of 2021, and these causes of death have become progressively more concentrated since 1990, when only 44 causes showed this pattern. The concentration phenomenon is discussed heuristically with respect to enteric and lower respiratory infections, malaria, HIV/AIDS, neonatal disorders, tuberculosis, and measles. INTERPRETATION: Long-standing gains in life expectancy and reductions in many of the leading causes of death have been disrupted by the COVID-19 pandemic, the adverse effects of which were spread unevenly among populations. Despite the pandemic, there has been continued progress in combatting several notable causes of death, leading to improved global life expectancy over the study period.
Each of the seven GBD super-regions showed an overall improvement from 1990 to 2021, obscuring the negative effect of the pandemic years. Additionally, our findings regarding regional variation in the causes of death driving increases in life expectancy hold clear policy utility. Analyses of shifting mortality trends reveal that several causes, once widespread globally, are now increasingly concentrated geographically. These changes in mortality concentration, alongside further investigation of changing risks, interventions, and relevant policy, present an important opportunity to deepen our understanding of mortality-reduction strategies. Examining patterns in mortality concentration might reveal areas where successful public health interventions have been implemented. Translating these successes to locations where certain causes of death remain entrenched can inform policies that work to improve life expectancy for people everywhere. FUNDING: Bill & Melinda Gates Foundation.