Adolescent tobacco use remains a major public health concern (World Health Organization, 2024). In recent years, electronic cigarettes have become a frequent pathway to nicotine initiation, posing new challenges for traditional school-based tobacco prevention strategies. A repeated cross-sectional observational study was conducted with 5,658 students aged 11 to 18 years from secondary schools in Burgos (Spain), corresponding to six academic cohorts between 2010 and 2023. The study was carried out within the framework of a universal school-based prevention program targeting tobacco and electronic cigarette use, delivered during the first year of compulsory secondary education. Following the intervention, students voluntarily and anonymously completed an evaluation questionnaire. Trends in experimentation across cohorts and associations between self-reported program participation and psychosocial and program evaluation variables were examined using logistic regression models and interpreted as observational associations without causal attribution. Experimentation with conventional cigarettes showed a marked decline in the earlier cohorts and a subsequent stabilization in more recent years. Electronic cigarette use was assessed from 2021 onwards, with relevant prevalence levels observed in the most recent cohorts. In the subsample with detailed consumption data (n = 2,206), electronic cigarettes were the most commonly used product in the past 30 days (10.4%), followed by conventional cigarettes (7.2%). Social exposure to tobacco use was high, with more than 80% of students reporting at least one smoker in their immediate environment. In bivariate regression analyses, self-reported participation in the program was inversely associated with experimentation with electronic cigarettes (OR = 0.66; p = 0.006), although this association did not remain significant after adjustment for age, gender, school type, and social exposure.
By contrast, program participation showed consistent associations with indicators of program evaluation, including positive appraisal of the sessions (AOR = 5.25; 95% CI: 3.85-7.16) and discussion of program content within the family (OR = 1.92; p < 0.001). In this repeated cross-sectional series, adolescent use patterns were characterized by the increasing prominence of electronic cigarettes. Self-reported program participation was associated with indicators of engagement and appraisal, which should be interpreted within the limits of the observational design. Overall, the findings highlight the relevance of prevention approaches that explicitly address vaping and integrate school-based actions with adolescents' social and family contexts.
Donor-specific HLA antibodies (DSA) occurring prior to or appearing de novo after transplantation play a major role in kidney graft loss, mainly through complement activation by DSA. As potential therapeutics, immunologically inert human monoclonal antibodies (mAbs) preventing binding of DSA to HLA-A1 molecules would limit their detrimental effects. In this study, the complement-activating potential of an anti-HLA-A1 mAb (WIM8E5), modified by single amino-acid changes inhibiting C1q binding, Fc:Fc interactions, or FcγR binding, was assessed in the classical crossmatch and by detection of C3d fixation on single antigen Luminex beads. Subsequently, the efficacy of these modified human mAbs was assessed in combination with patient sera to determine whether they could prevent complement-mediated cell lysis and HLA antibody-induced C3d fixation. Point mutations in the anti-HLA-A1-WIM8E5 mAb preventing C1q binding resulted in reduced to absent complement-mediated cell lysis and C3d fixation. Patient sera containing anti-HLA-A1 antibodies were able to activate complement, and the WIM8E5-K322A-P329R-S440K mAb (preventing C1q binding, Fc:Fc interaction, and FcγR binding) could prevent cell lysis induced by patient antibodies and inhibit C3d binding to HLA-A1-coated beads. Furthermore, the 109F epitope recognized by the mAb differed from those recognized by patient antibodies, indicating that a single modified mAb can block multiple HLA-A1 recognition sites. In conclusion, single amino-acid mutations in anti-HLA-A1-WIM8E5 directed at inhibiting C1q binding, Fc:Fc interactions, or FcγR binding could be used to prevent complement-mediated cell lysis and C3d formation in kidney transplant recipients. Further research is needed to investigate the applicability of modified mAbs as potential therapeutics in the clinic.
Adherence to secondary prevention therapies after ischemic stroke or transient ischemic attack is often suboptimal, and the clinical implications of adherence patterns remain unclear. This study identified adherence trajectories to statins and antithrombotic agents and evaluated their associations with lipid control and cardiovascular outcomes. Using the Chang Gung Research Database linked to Taiwan's population claims data, we identified patients with first-ever acute ischemic stroke/transient ischemic attack between 2012 and 2018 who survived ≥1 year and initiated secondary prevention. Monthly adherence was measured by proportion of days covered and classified using group-based multitrajectory modeling. The primary outcome was a composite of recurrent ischemic stroke, systemic embolism, and myocardial infarction; secondary outcomes included individual components and all-cause death. Among 13 299 eligible patients (mean age, 65.6 years; 60.3% men), 4 adherence trajectories were identified: dual high adherence (46.7%), antithrombotic-only adherence (35.1%), early dual discontinuation (12.2%), and gradual dual decline (6.0%). Dual high adherence achieved the greatest low-density lipoprotein cholesterol reduction (-24.3%) and lowest mortality rate. Early dual discontinuation was associated with higher risks of the composite outcome (adjusted hazard ratio [aHR], 1.57 [95% CI, 1.31-1.88]), recurrent ischemic stroke/systemic embolism (aHR, 1.60 [95% CI, 1.31-1.95]), myocardial infarction (aHR, 1.48 [95% CI, 1.02-2.14]), and all-cause death (aHR, 1.56 [95% CI, 1.36-1.79]). Poor adherence had a greater adverse impact among patients with baseline low-density lipoprotein cholesterol ≥100 mg/dL, younger adults, and men. Adherence trajectories were strongly associated with low-density lipoprotein cholesterol and major cardiovascular outcomes.
Sustained dual adherence conferred substantial protection, whereas early discontinuation markedly increased the risks of recurrent ischemic events and death.
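Monthly adherence in the study above is measured by proportion of days covered (PDC): the share of days in a window on which the patient held a dispensed supply. A minimal Python sketch of the standard calculation; the fill records and day offsets are illustrative, and the study's exact handling of overlaps and stockpiling is not specified here:

```python
def proportion_days_covered(fills, period_start, period_end):
    """Fraction of days in [period_start, period_end] covered by at
    least one dispensed supply. `fills` is a list of
    (fill_day, days_supplied) tuples; days are integer offsets.
    Overlapping fills are not double-counted."""
    covered = set()
    for fill_day, days_supplied in fills:
        for day in range(fill_day, fill_day + days_supplied):
            if period_start <= day <= period_end:
                covered.add(day)
    period_days = period_end - period_start + 1
    return len(covered) / period_days

# Example: a 30-day month, two 10-day fills with a 4-day gap
pdc = proportion_days_covered([(1, 10), (15, 10)], 1, 30)  # 20/30
```

Trajectory modeling then classifies each patient's sequence of monthly PDC values rather than a single overall average, which is what distinguishes "early discontinuation" from "gradual decline".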
Insulin resistance (IR) is a central metabolic disturbance implicated in a broad range of cardiometabolic and neurological disorders. Increasing evidence suggests that IR plays a pivotal role in the initiation and progression of cerebrovascular diseases (CeVD) and contributes to neurodegeneration through shared molecular and vascular mechanisms. In the brain and its vasculature, impaired insulin signaling, particularly involving the IRS-PI3K-Akt-GSK-3β axis, interacts with inflammation, oxidative stress, endothelial dysfunction, and neurovascular unit impairment, thereby linking CeVD with Alzheimer's disease, Parkinson's disease, and cerebral small-vessel disease. Advances in the assessment of IR, ranging from traditional indices such as HOMA-IR to surrogate markers including the triglyceride-glucose (TyG) index and METS-IR, have enabled large-scale population studies and improved risk stratification for cardiometabolic disorders, with emerging relevance to cerebrovascular and cognitive outcomes. In parallel, genetic and Mendelian randomization studies support a causal contribution of IR-related traits to hypertension, atherosclerosis, and stroke risk, highlighting shared susceptibility pathways across metabolic and neurological phenotypes. This review synthesizes current evidence on the mechanisms, measurement, genetic architecture, and clinical implications of IR across cerebrovascular and related disorders. By integrating mechanistic insights with epidemiological and translational data, we propose IR as a tractable, cross-cutting therapeutic target. Future priorities include standardizing IR assessment, validating brain-relevant biomarkers, and linking mechanistic pathways to longitudinal cerebrovascular and neurodegenerative outcomes to advance precision prevention and individualized care.
This study aimed to examine how migration background is associated with current smoking and whether this relationship varies by sex, age, and education. We analysed data from 19,441 participants of the 2022 Swiss Health Survey, an official, national cross-sectional dataset. Current smoking (yes/no) was the outcome, and migration background (none, 1st generation, 2nd or higher generation) the primary exposure. Multivariable logistic regression models, adjusted for sociodemographic and behavioural covariates, were followed by stratified analyses by sex, age, and education. Compared to people without a migration background, the odds of current smoking were elevated among those with a 1st-generation background (OR = 1.42, 95% CI: 1.28-1.49) and 2nd or higher generation (OR = 1.75, 95% CI: 1.48-2.06). Stratified analyses showed that the higher odds of current smoking among people with a migration background were particularly pronounced among younger adults (15-24, 25-34 years) and among people with lower educational attainment. Migration background contributes to smoking inequalities in Switzerland and intersects with other social factors, such as age and education. Elevated risks among people with 2nd or higher-generation migration backgrounds in younger age groups and those with lower educational attainment demonstrate the need for culturally adapted, equity-oriented prevention strategies and stronger national tobacco control policies.
Inconsistent links between arterial stiffness and cognition may reflect limited cognitive tests and unaccounted diurnal pulse wave velocity variation. To bridge this knowledge gap, we investigated 24-hour ambulatory estimated pulse wave velocity (ePWV) and its association with dementia-related neuroimaging and cognitive function in hypertension. We assessed 893 patients with hypertension aged ≥50 years (mean age, 67.2 years; 52.3% women), including brain magnetic resonance imaging (n=545), global cognitive testing (n=623), and ambulatory ePWV measurements. White matter hyperintensity and hippocampal volumes were quantified via Computational Anatomy Toolbox 12 and Statistical Parametric Mapping 12. Cognition was assessed via the Mini-Mental State Examination and Montreal Cognitive Assessment. Among 623 tested participants, the prevalence of mild cognitive impairment was 10% (Mini-Mental State Examination, n=62) and 18.5% (Montreal Cognitive Assessment, n=115). Cognitive scores declined with higher white matter hyperintensity burden and lower hippocampal volume (P≤0.024). Higher 24-hour ePWV quartiles showed graded associations with higher white matter hyperintensity volume and lower hippocampal volume (both P<0.001) and lower cognitive scores (P≤0.037). Multivariable models showed each 1-SD (+1.2 m/s) increment in 24-hour ePWV was associated with 2.00±1.74 mL greater white matter hyperintensity volume (P=0.004) and 0.54±0.14 mL smaller hippocampal volume (P<0.001), independent of age, systolic blood pressure, and other confounders. These associations persisted after further adjustment for carotid-femoral PWV, which itself showed no independent association (P≥0.18). Results were consistent for daytime and nighttime ePWV and across key subgroups. Ambulatory ePWV is an independent risk factor for dementia-related brain pathology. Targeting arterial stiffness represents a promising strategy for dementia prevention.
Ischemic stroke remains a leading cause of morbidity and mortality worldwide, with substantial heterogeneity across regions, sexes, age groups, and levels of socioeconomic development. Previous studies have described global trends; however, integrated analyses combining long-term spatiotemporal patterns, sociodemographic stratification, and future projections remain limited. Data on ischemic stroke from 1990 to 2021 were extracted from the Global Burden of Disease (GBD) 2021 database, covering 204 countries and territories. Age-standardized prevalence, incidence, mortality, and disability-adjusted life years (DALYs) were analyzed by age, sex, region, and sociodemographic index (SDI). Temporal trends were assessed using estimated annual percentage change (EAPC). Future disease burden from 2022 to 2041 was projected using autoregressive integrated moving average (ARIMA) models implemented with the R package forecast (version 8.24.0). Globally, from 1990 to 2021, age-standardized incidence declined from 109.79 to 92.39 per 100,000 (EAPC = -0.67), mortality from 73.15 to 44.18 per 100,000 (EAPC = -1.83), and DALYs from 1286.31 to 837.36 per 100,000 (EAPC = -1.59), whereas prevalence showed only a modest decrease (EAPC = -0.18). Marked disparities were observed across SDI levels: middle SDI regions experienced increasing prevalence (EAPC = 0.27) and incidence (EAPC = 0.12), while high SDI regions demonstrated the most pronounced reductions in mortality and DALYs. Eastern Europe, East Asia, and Southern Sub-Saharan Africa remained high-burden regions. Pronounced sex- and age-specific patterns were identified, with men exhibiting earlier peak incidence (65-79 years) and women showing later and shifting peak prevalence toward older age groups. 
Projections indicate that by 2041, global age-standardized prevalence will increase by approximately 15.3%, driven primarily by a substantial rise among females (from 769.40 to 1005.77 per 100,000), despite continued declines in mortality and DALYs. This comprehensive spatiotemporal analysis reveals a paradoxical pattern of increasing ischemic stroke prevalence amid declining mortality and disability, with widening sex- and SDI-related disparities. By integrating long-term trends with sex-specific future projections, this study provides novel evidence to inform targeted prevention strategies, gender-sensitive interventions, and context-specific health policy planning to mitigate the evolving global burden of ischemic stroke.
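The temporal trends above are summarized with the estimated annual percentage change (EAPC), conventionally obtained by regressing the natural log of the age-standardized rate on calendar year and transforming the slope: EAPC = 100 × (exp(β) − 1). A minimal sketch of that convention with illustrative data (not GBD values):

```python
import math

def eapc(years, rates):
    """Estimated annual percentage change: fit ln(rate) = a + b*year by
    ordinary least squares, then EAPC = 100 * (exp(b) - 1)."""
    logs = [math.log(r) for r in rates]
    n = len(years)
    mean_y = sum(years) / n
    mean_l = sum(logs) / n
    slope = (sum((y - mean_y) * (l - mean_l) for y, l in zip(years, logs))
             / sum((y - mean_y) ** 2 for y in years))
    return 100.0 * (math.exp(slope) - 1.0)

# A rate falling by exactly 1% per year yields an EAPC of -1
years = list(range(1990, 2022))
rates = [100.0 * 0.99 ** (y - 1990) for y in years]
change = eapc(years, rates)  # ≈ -1.0
```

On real data the log-rates are not exactly linear, so the EAPC is a smoothed summary of the whole 1990-2021 trend rather than an endpoint-to-endpoint change.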
Stroke remains a leading cause of mortality and disability worldwide, making the identification of modifiable risk factors essential for prevention. Oral health, particularly posterior occlusal contact that supports mastication, may influence systemic vascular outcomes; however, its role in stroke remains unclear. This study investigated whether reduced posterior occlusal contact independently predicts stroke in Japanese adults. We conducted a retrospective cohort study using data collected from April 2016 to March 2022 from a nationwide health insurance database in Japan. Adults aged 40 to 74 years without a history of stroke were included. Posterior occlusal status was categorized as Eichner A (full posterior occlusal contact), B (partial contact), or C (no contact). The primary outcome was stroke incidence, identified using validated insurance claims. Cox proportional hazards models were used to estimate hazard ratios (HRs), adjusting for demographics, health behaviors, comorbidities, and tooth counts. Among 981 543 participants (mean age 49.6±7.0 years; 57.5% male) followed for 2 712 815 person-years, 7086 strokes occurred. In adjusted Cox models, Eichner B was significantly associated with higher stroke risk than Eichner A (HR, 1.26 [95% CI, 1.10-1.44], P<0.001). Eichner C showed elevated risk, although not statistically significant (HR, 1.17 [95% CI, 0.91-1.50], P=0.229). Associations were observed in both sexes and across age groups. Reduced posterior occlusal support was associated with higher stroke risk independent of tooth count and conventional vascular risk factors. These findings suggest that posterior occlusal support may represent a potential marker of cerebrovascular risk and warrants further investigation into its clinical relevance.
Psychological distress, such as stress, depression or anxiety, is a prevalent mental health concern during pregnancy. However, data on the association between prenatal maternal psychological distress and the risk of autism or autism spectrum disorders (ASD) in offspring have not been synthesized systematically. We performed a meta-analysis to explore this issue and provide evidence regarding maternal mental health screening and ASD prevention. Six electronic databases were systematically searched up to June 2025. English-language full-text observational studies were included, with no geographic or race restrictions. Studies that quantitatively assessed the association between maternal psychological distress during pregnancy and the risk of ASD in offspring were eligible for inclusion. Pooled odds ratios (ORs) were calculated. Heterogeneity, publication bias, and sensitivity analyses were assessed. Among 484 full-text records screened, 22 studies were eligible. Data analysis demonstrated that offspring of mothers with prenatal psychological distress had a 72% higher likelihood of being diagnosed with ASD or autism after the age of two (OR = 1.72, 95% CI 1.50-1.97, p < 0.01) compared to those of mothers without distress. This association was observed across different study designs and ASD diagnostic ascertainment methods, although effect estimates varied. Substantial between-study heterogeneity was observed (I² = 87.90%), largely attributable to differences in study design, ASD ascertainment and distress assessment rather than psychological distress subtype. In this meta-analysis, prenatal maternal psychological distress was associated with an increased likelihood of an ASD diagnosis in offspring. Across the included studies, effect estimates were generally similar for stress, depression, and anxiety, despite substantial heterogeneity in study design and exposure assessment.
This consistency suggests that elevated ASD risk is not confined to a single diagnostic category of maternal distress. At the same time, the findings should be interpreted considering the variability in how psychological distress was measured and controlled for across studies. Taken together, the results indicate that maternal psychological distress during pregnancy warrants attention in epidemiological research and routine antenatal care, without implying that specific psychiatric subtypes can be clearly distinguished in terms of offspring ASD risk. PROSPERO registration: CRD420251119825 (https://www.crd.york.ac.uk/PROSPERO/view/CRD420251119825).
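The pooled OR above is an inverse-variance summary; with I² near 88%, a random-effects model is the usual choice. A minimal DerSimonian-Laird sketch, back-calculating each study's standard error from its reported 95% CI; the three input studies are hypothetical, not the studies included in this meta-analysis:

```python
import math

def pool_random_effects(ors, ci_los, ci_his):
    """DerSimonian-Laird random-effects pooling of odds ratios.
    Standard errors are back-calculated from 95% CIs:
    se = (ln(hi) - ln(lo)) / (2 * 1.96)."""
    y = [math.log(o) for o in ors]
    se = [(math.log(h) - math.log(l)) / (2 * 1.96)
          for l, h in zip(ci_los, ci_his)]
    w = [1 / s ** 2 for s in se]                 # fixed-effect weights
    y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)      # between-study variance
    w_re = [1 / (s ** 2 + tau2) for s in se]
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_re = math.sqrt(1 / sum(w_re))
    return (math.exp(y_re),
            (math.exp(y_re - 1.96 * se_re), math.exp(y_re + 1.96 * se_re)))

# Three hypothetical studies
pooled, (lo, hi) = pool_random_effects(
    [1.5, 1.9, 1.6], [1.2, 1.4, 1.1], [1.9, 2.6, 2.3])
```

The between-study variance tau² widens the pooled CI relative to a fixed-effect analysis, which is why high heterogeneity matters for the precision claimed.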
To identify the risk factors for severe primary dysmenorrhea in adolescent girls and provide evidence for clinical prevention and intervention, this retrospective study enrolled adolescent girls with primary dysmenorrhea from Jingzhou Central Hospital between January 2022 and December 2023. Participants were divided into a severe group and a mild-to-moderate group according to VAS score (VAS 0-10: 0 = no pain, 1-3 = mild, 4-6 = moderate, 7-10 = severe). Demographic, lifestyle, menstrual, and laboratory parameters were collected via questionnaires and medical records. Bivariate analysis and multivariate logistic regression were used to screen independent risk factors, and a ROC curve was used to assess model performance. A total of 426 participants were included (158 in the severe group, 268 in the mild-to-moderate group). Early menarche (≤12 years), frequent cold food intake (≥3 times/week), high serum PGF2α level, and low physical activity level were independent risk factors (all P<0.05). The predictive model yielded an AUC of 0.826 (95% CI: 0.783-0.869), showing good predictive efficiency. Early menarche, frequent cold food intake, high serum PGF2α, and low physical activity are independent risk factors for severe primary dysmenorrhea in adolescent girls. Targeted interventions on these factors may help reduce the occurrence of severe dysmenorrhea.
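The AUC reported above summarizes how well the model's predicted risk separates severe from mild-to-moderate cases: it equals the probability that a randomly chosen severe case receives a higher predicted risk than a randomly chosen non-severe case. A minimal rank-based sketch of that equivalence; the scores are illustrative, not the study's model output:

```python
def auc_mann_whitney(case_scores, control_scores):
    """AUC as the Mann-Whitney probability: the fraction of
    (case, control) pairs in which the case's predicted risk is
    higher, counting ties as half."""
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Hypothetical predicted risks for severe vs mild-to-moderate cases
severe = [0.9, 0.8, 0.6, 0.7]
mild = [0.3, 0.5, 0.6, 0.2]
auc = auc_mann_whitney(severe, mild)  # 15.5 / 16 = 0.96875
```

An AUC of 0.826 therefore means roughly an 83% chance that the model ranks a severe case above a non-severe one.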
Introduction: Epistaxis is a common complaint seen by otolaryngologists and one of the most common otolaryngological emergencies. The aim of our study is to identify educational gaps in home epistaxis management and determine the efficacy of an educational video in increasing layperson knowledge of epistaxis treatment. Methods: Adults aged 18 years and older were recruited in pediatric otolaryngology clinics and online. The survey consisted of pre- and post-test questions, with a locally produced educational video on epistaxis management presented between assessments. Demographics collected included age, gender, race, and parental status. Experience with nosebleeds and nasal cautery was queried. The assessment included 10 multiple-choice and true/false questions regarding active nosebleed management and prevention techniques. Results: A total of 145 participants completed both assessments. The majority were female (N = 87, 60.0%), younger than 30 years (N = 120, 82.8%), and of Caucasian/White race (N = 103, 71.0%). Nineteen participants (13.1%) identified as parents, and 109 (75.2%) reported personal or child experience with epistaxis. Post-test scores were significantly higher than pre-test scores (mean difference = 2.21, t(144) = 13.10, p < 0.001). Parents had significantly higher post-test scores compared to non-parents (p = 0.025). There was no statistically significant difference in post-test scores based on gender, age, race, history of epistaxis, or history of nasal cautery. Conclusion: The educational video increased layperson knowledge about epistaxis. Future research is needed to understand how to best use educational videos in the clinical setting.
Heart failure (HF) is increasingly driven by cardiometabolic risk factors such as obesity and insulin resistance. The estimated glucose disposal rate (eGDR) is a validated surrogate marker of insulin resistance. Reduced eGDR, reflecting higher insulin resistance, has been linked to cardiovascular disease, but its associations with myocardial fibrosis and HF remain unclear. The study included 6025 participants in MESA (Multi-Ethnic Study of Atherosclerosis) free of HF at exam 2 (2002-2004). eGDR was calculated using body mass index, hypertension, and hemoglobin A1c. Cardiac magnetic resonance imaging (2010-2012) assessed left ventricular ejection fraction and myocardial fibrosis by late gadolinium enhancement. Associations of baseline eGDR with incident HF were evaluated using Cox models, stratified by diabetes. Over a mean follow-up of 14±5 years, 404 participants developed HF. eGDR was inversely associated with incident HF (adjusted hazard ratio [HR] per unit decrease, 1.28 [95% CI, 1.22-1.35]), a relationship that persisted regardless of diabetes status and was more pronounced for HF with preserved ejection fraction (n=200; adjusted HR per unit decrease, 1.36 [95% CI, 1.27-1.45]) than HF with reduced ejection fraction (n=168; HR, 1.19 [95% CI, 1.11-1.27]). Lower eGDR was also associated with higher left ventricular ejection fraction (n=2899; β=0.16 [95% CI, 0.03-0.28]) and higher odds of myocardial fibrosis (n=1780; adjusted odds ratio, 1.27 [95% CI, 1.16-1.41]). Lower eGDR is independently associated with subclinical myocardial damage and incident HF, highlighting insulin resistance as a key driver of adverse remodeling and a potential marker for early HF risk stratification and prevention.
Pronounced variations in suicide mortality persist across Europe. Understanding long-term temporal patterns through age, period and cohort (APC) effects, alongside suicide means, is essential for tailored prevention. This study aims to determine how suicide mortality rates in Europe have changed across APC dimensions at national and subregional levels. Our analysis was restricted to European countries with complete age- and sex-specific suicide mortality data from 1990 to 2019 within the World Health Organization mortality database. The analysis comprised two components. The first component disentangled long-term suicide mortality trends (1990-2019) into APC dimensions using an age-period-cohort model via the National Cancer Institute's APC Web Tool. The second component involved an assessment of suicide means, restricted to 2010-2019 and to countries with detailed International Classification of Diseases, 10th Revision (ICD-10) cause-of-death data. In 2019, Europe recorded 47,793 male and 13,111 female suicide deaths. Overall suicide mortality rates declined in most subregions from 1990 to 2019, with the largest reductions among Eastern European men, from 77.81 (95% CI: 77.17-78.45) per 100,000 in the mid-1990s to 22.93 (95% CI: 22.58-23.28) per 100,000 by 2019, although this region retained the highest male suicide burden. Age-specific risk patterns differed markedly: among men, risk peaked in early adulthood and then declined in Eastern Europe, while in Western and Southern Europe, it was lower and more stable but rose after age 60; for women, risk was generally lower, with peaks in early adulthood in Eastern Europe and in midlife elsewhere. Period effects reflected continued improvement, especially in Eastern Europe, where the period risk in 2015-2019 was approximately 60% lower than in 2000-2004. Cohort effects similarly showed progressive declines. However, upward trends emerged among younger generations.
In Northern Europe, the cohort relative risk for females increased from 0.73 (95% CI: 0.68-0.78) in the 1980 cohort to 0.90 (95% CI: 0.70-1.04) in the 2000 cohort. While the completeness of suicide means analysis varied by subregion, the primary data indicated that hanging was the predominant means for both sexes during 2010-2019. Despite an overall decline, suicide mortality in Europe exhibits persistent regional and demographic differences. This study reveals emerging risks among younger cohorts, specifically Northern European women and Southern European men, signalling shifting patterns that are not apparent from overall temporal trends alone. This evolving risk profile calls for sustained surveillance and research to investigate the drivers of these population-specific vulnerabilities.
Hyperuricemia is an emerging public health concern associated with gout and metabolic diseases. While numerous studies have examined associations between individual dietary factors and hyperuricemia, comprehensive assessments of overall diet quality and its impact on hyperuricemia are lacking. This study aimed to investigate the association between overall diet quality, as measured by the Korean Healthy Eating Index (KHEI), and the prevalence of hyperuricemia among Korean adults using nationally representative data. We analyzed data from 27,149 Korean adults aged ≥ 20 years from the Korea National Health and Nutrition Examination Survey (2016-2021). The KHEI assesses dietary quality based on adherence to Korean dietary guidelines, encompassing adequacy, moderation, and balance of intake. Hyperuricemia was defined as serum uric acid ≥ 7.0 mg/dL in men and ≥ 6.0 mg/dL in women. Logistic regression was performed, adjusting for sociodemographic, lifestyle, and health-related factors. Higher total KHEI scores were significantly associated with a lower prevalence of hyperuricemia (10.6% vs. 15.5%; P < 0.001). After adjustment, participants with high KHEI scores exhibited 22% lower odds of hyperuricemia compared to those with low scores (odds ratio, 0.78; 95% confidence interval, 0.72-0.84). Specifically, regular breakfast consumption, higher intake of whole grains, fruits, and vegetables, and balanced energy intake were inversely associated with hyperuricemia risk. However, higher intake of purine-rich foods (meat, fish, eggs, beans) was positively associated with hyperuricemia. An unexpected finding was observed regarding sodium: lower sodium intake correlated with increased hyperuricemia risk, possibly related to renal uric acid reabsorption mechanisms or confounding by medications. Our findings demonstrate a significant inverse relationship between overall dietary quality, assessed using the KHEI, and hyperuricemia risk among Korean adults.
The results emphasize the protective effects of balanced dietary habits, highlighting the benefits of regular breakfast consumption, increased fruit and vegetable intake, and balanced energy distribution. These insights underscore the importance of adopting holistic dietary strategies for the prevention and management of hyperuricemia, moving beyond traditional recommendations focused on individual nutrients. Further longitudinal studies are required to clarify causality and mechanisms underlying these associations.
Epidemiological research on self-harm in older adults (aged 65+ years) remains scarce despite its growing public health significance amid global ageing. This study aimed to analyse the global burden, risk factors and projections of self-harm mortality in adults aged 65+ years from 1990 to 2050. Utilising data from the Global Burden of Disease Study 2021, this research examined the spatiotemporal patterns of self-harm mortality and years of life lost by age, gender and socio-demographic index (SDI) in adults aged 65+ across 204 countries and territories (grouped into 21 regions) from 1990 to 2021. It also explored the changes during the coronavirus disease 2019 pandemic, identified key risk factors and projected the future burden of self-harm mortality through 2050. Global self-harm deaths among older adults increased from 116 642 in 1990 to 167 920 in 2021, a rise of 43.96%. However, the age-standardised mortality rate (ASMR) decreased by 39.56%, from 36.83 to 22.26 per 100 000. In 2021, central sub-Saharan Africa had the highest ASMR at 61.35 per 100 000, while North Africa and the Middle East recorded the lowest at 4.88 per 100 000. Male ASMR was 2.5 times as high as that of females (33.61 vs. 13.58 per 100 000), and adults aged 85 years and older were at particularly elevated risk. High alcohol use was identified as a major risk factor, especially for males. A U-shaped relationship between ASMR and the SDI was observed, with the lowest point at an SDI of approximately 0.70. Projections indicate a further 46.05% decline in ASMR to 12.01 per 100 000 by 2050. These results highlight complex global trends in self-harm mortality and associated risk factors among older adults, emphasising the urgent need for sex-, age-, and region-specific interventions, enhanced social support and systematic risk monitoring to inform age-friendly self-harm prevention policies and support for the sustainable development goals.
Early onset scoliosis (EOS) is defined as scoliosis in children under the age of 10 years, irrespective of etiology. Rapid curve progression leading to poor cardiopulmonary function is a common feature in all types of EOS. There are four distinct etiologies, namely idiopathic, congenital, neuromuscular, and syndromic. To understand the current principles of management, it is essential to understand the natural history of these curves, including which curves progress, which do not, and how their growth can be manipulated to ensure appropriate spinal growth, sagittal balance, and prevention of cardiopulmonary complications. A detailed review was done specifically looking at the terminologies used in early onset scoliosis, the development of thoracic insufficiency during the natural course of these curves, the natural history of curve progression, the outcome in each category of EOS, and how cast/brace treatment can affect the natural history; the findings of the literature were summarized. The evidence from the literature shows that early onset scoliosis is a rapidly progressive disorder leading to thoracic insufficiency, except for a few cases of the resolving type of idiopathic infantile scoliosis and some cases of congenital scoliosis that are non-progressive. The natural history of idiopathic and congenital scoliosis supports the theory that the use of a cast/brace can lead to resolution of curves in some cases, while in others it may help delay surgery until skeletal maturity, allowing better growth of the thorax and better cardiopulmonary development. In contrast, the natural history of neuromuscular and syndromic scoliosis shows rapid deterioration, and the effectiveness of braces and casts in halting curve progression is not clearly observed.
In congenital hemivertebra, it is essential to understand the exact nature of the deformity and the type of vertebral malformation before appropriate treatment is planned, as the natural history varies depending on the type of abnormality. Finally, there is scope for further research into how thoracic insufficiency can be minimized, the rate of progression in the progressive variant of idiopathic scoliosis and in non-idiopathic scoliosis, and the genetic factors associated with different types of scoliosis, so that early detection of these cases can be achieved.
We aimed to determine the one-year recurrence rate of apical prolapse following sacrospinous ligament fixation and to identify clinical and surgical risk factors associated with recurrence in a low-resource setting. This was a prospective cohort study of women who underwent sacrospinous ligament fixation as part of surgery for apical prolapse at the urogynecology unit of Mbarara Regional Referral Hospital (MRRH) in Uganda. The surgeries performed included vaginal hysterectomy with sacrospinous ligament vault fixation, sacrospinous ligament hysteropexy, and sacrospinous ligament vault fixation for those with post-hysterectomy vaginal vault prolapse. Concomitant procedures, such as anterior or posterior repair or both, were performed for women with prolapse in other compartments. The women were followed up for one year post-surgery. Recurrence was assessed with the women in the lithotomy position under maximum strain using the Pelvic Organ Prolapse Quantification (POP-Q) system. Recurrence was defined as apical prolapse of ≥POP-Q stage II. Multivariable log-binomial regression was performed to identify risk factors for recurrence. A total of 123 participants were enrolled, of whom 111 (90.2%) completed follow-up. The mean age was 53.1 (SD ±13.6) years. The majority of apical prolapse was uterine (94.6%) and classified as POP-Q stage III (58.6%). The recurrence rate was 19.8% (22/111, 95% CI: 13.4-28.4). Risk factors for apical prolapse recurrence included body mass index (BMI) >25 kg/m² (relative risk (RR) = 6.02; 95% confidence interval (CI): 2.05-17.67; p = 0.001) and post-operative complications (RR = 19.87; 95% CI: 5.77-68.47; p < 0.001). Undergoing vaginal hysterectomy as part of the prolapse surgery was protective (RR = 0.09; 95% CI: 0.03-0.25; p < 0.001). Apical prolapse recurrence after sacrospinous ligament fixation is common in this setting.
To reduce the risk of recurrence, management protocols should prioritize prevention and timely management of postoperative complications, as well as counseling for weight optimization. Furthermore, vaginal hysterectomy should be considered in uterine prolapse where uterine-sparing surgery is not required.
This study aimed to investigate the role of lactate in the progression of cardiac dysfunction after myocardial infarction (MI) and to clarify the effect of aerobic exercise (AE) on improving post-infarction cardiac function by regulating lactate metabolism, thereby providing experimental evidence for the clinical improvement of cardiac function after MI. NIH 3T3 mouse embryonic fibroblasts and C57BL/6J mice were used as research subjects. In vitro, fibroblasts were treated with different concentrations of lactate, and their activation state was evaluated by detecting the expression levels of Collagen I (COL I) and α-Smooth Muscle Actin (α-SMA). In vivo, four mouse models (SED-SHAM, AE-SHAM, SED-MI, and AE-MI) were established. Cardiac function was assessed by echocardiography using left ventricular ejection fraction (LVEF) and left ventricular fractional shortening (LVFS). Lactate levels in cardiac tissue and serum were measured with a lactate assay kit, and lactate-mediated changes in serum metabolites were analyzed via metabolomics. In vitro experiments confirmed that lactate significantly induced fibroblast activation. Metabolomics results showed that elevated lactate levels after MI led to abnormal accumulation of arachidonoyl carnitine. In vivo experiments revealed that lactate levels in cardiac tissue and serum were significantly increased, while LVEF and LVFS were decreased, in the SED-MI group. In contrast, lactate levels in cardiac tissue and serum were reduced, and LVEF and LVFS were elevated, in the AE-SHAM group. Lactate levels in cardiac tissue and serum of the AE-MI group were lower than those of the SED-MI group but higher than those of the AE-SHAM group, and the LVEF and LVFS of the AE-MI group were significantly higher than those of the SED-MI group.
This study confirmed that lactate accumulation is involved in the pathological process of cardiac dysfunction after myocardial infarction, and AE can improve cardiac function after MI by reducing lactate levels in cardiac tissue and serum. These findings provide new experimental evidence for the prevention and treatment of post-infarction cardiac dysfunction by targeting lactate metabolism, and also offer theoretical support for the application of AE in cardiac rehabilitation.
Timely detection of left atrial appendage thrombus (LAAT) is critical for stroke prevention in atrial fibrillation. Current diagnostic approaches, such as traditional models and physician visual analysis, face challenges in comprehensively capturing image information. This study aimed to develop and validate a multimodal model integrating coronary computed tomography angiography-based radiomics features with clinical parameters for noninvasive LAAT detection in patients with atrial fibrillation. This diagnostic study retrospectively enrolled 670 patients with nonvalvular atrial fibrillation who underwent coronary computed tomography angiography and transesophageal echocardiography from May 2015 to May 2023; they were stratified into training and internal validation sets. An independent prospective cohort (n=114) enrolled from May 2023 to May 2025 served for external validation. Semiautomated LAA segmentation extracted 1231 radiomics features, of which 25 were selected by random forest. A multimodal LAAT detection model was developed, evaluated using receiver operating characteristic, calibration curve, and decision curve analyses, and compared against traditional models and physician visual analysis. The Radiomics-LAAT model achieved significantly superior discrimination in both internal validation (area under the curve, 0.963 [95% CI, 0.945-0.980]; accuracy, 0.929) and external validation (area under the curve, 0.920 [95% CI, 0.886-0.953]; accuracy, 0.807), outperforming traditional models and physician visual analysis, with optimal calibration (Brier score: 0.067) and clinical net benefit. The model also achieved high sensitivity (0.953), specificity (0.905), negative predictive value (0.950), and positive predictive value (0.910). The Radiomics-LAAT model significantly enhances noninvasive LAAT detection compared with traditional models and physician visual analysis, demonstrating its potential for stroke risk stratification in patients with atrial fibrillation.