Prostate, breast, stomach, colorectal, lung, and hematologic cancers represent a regional public health burden among older adults in the Caribbean. We described 15-year trends in cancer incidence and mortality among older adults in Martinique. We conducted a population-based cohort study (2008-2022) using the population-based cancer registry of Martinique. Age-standardized incidence and mortality rates (ASIR and ASMR) were estimated using the Segi/Doll world standard population, and the average annual percent change (AAPC) with 95% confidence interval was computed in patients ≥ 65 years, by tumor site and sex. We recorded 15,400 cancer cases over the entire study period (2008-2022), with 64.4% diagnosed in males. Overall ASIR remained broadly stable: in men, 2,257.8 per 100,000 (95% CI: 2,172.6-2,342.9) in 2008-2012 and 2,053.1 (95% CI: 1,982.8-2,230.4) in 2018-2022; in women, 847.8 (95% CI: 801.8-893.8) in 2008-2012 and 862.1 (95% CI: 821.2-902.9) in 2018-2022. Prostate cancer ASIR decreased from 1,355.5 per 100,000 (95% CI: 1,288.6-1,422.4) in 2008-2012 to 1,303.7 (95% CI: 1,242.0-1,365.5) in 2013-2017, and then significantly to 1,102.2 (95% CI: 1,049.6-1,154.9) in 2018-2022. Prostate cancer ASMR also declined, from 293.50 per 100,000 (95% CI: 263.30-323.70) in 2008-2012 to 225.80 (95% CI: 205.00-246.50) in 2018-2022. Breast cancer ASIR in women increased significantly, from 177.3 per 100,000 (95% CI: 155.4-199.2) in 2008-2012 to 198.0 (95% CI: 176.5-219.6) in 2013-2017 and 230.7 (95% CI: 208.9-252.5) in 2018-2022. Male lung and bronchial cancer incidence showed a significant AAPC decrease of -12.5% (95% CI: -23.3 to -0.13) in 2018-2022. Colorectal cancer ASMR in women remained stable: 68.2 per 100,000 (95% CI: 55.1-81.3) in 2008-2012, 63.8 (95% CI: 53.0-74.6) in 2013-2017, and 71.4 (95% CI: 60.3-82.5) in 2018-2022. Male stomach cancer AAPC decreased significantly during 2018-2022: -11.5% (95% CI: -20.1 to -1.94).
Between 2008 and 2022, cancer incidence among older adults in Martinique showed decreasing prostate and lung cancer incidence but increasing breast cancer incidence, consistent with Caribbean and global trends. These patterns reflect screening practices, treatment advances, and COVID-19 disruptions. Future research should prioritize age-tailored cancer management and uninterrupted treatment access.
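The direct age-standardization behind the ASIR/ASMR figures above can be sketched as follows. This is illustrative Python with toy case counts, not study data; the weights are the published Segi/Doll world standard values for the 65+ age bands.

```python
# Direct age-standardization sketch (toy numbers, not study data).
# Segi/Doll world standard population weights for ages 65 and over.
SEGI_65PLUS = {"65-69": 3000, "70-74": 2000, "75-79": 1000, "80-84": 500, "85+": 500}

def asir(cases, person_years, weights=SEGI_65PLUS):
    """Age-standardized rate per 100,000 via direct standardization."""
    total_w = sum(weights.values())
    rate = 0.0
    for band, w in weights.items():
        age_specific = cases[band] / person_years[band]  # crude rate in the band
        rate += age_specific * w / total_w               # weight by standard population
    return rate * 100_000

cases = {"65-69": 120, "70-74": 150, "75-79": 140, "80-84": 90, "85+": 60}
py = {"65-69": 20_000, "70-74": 15_000, "75-79": 10_000, "80-84": 5_000, "85+": 3_000}
print(round(asir(cases, py), 1))  # -> 1014.3
```

The AAPC reported in the abstract is then obtained by regressing the log of such yearly rates on calendar year, which is a separate modeling step.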
To investigate risk factors for early childhood caries (ECC) among 1-2-year-old children in Beijing and to assess the effect modification of baseline caries experience on the associations between other risk factors and subsequent caries outcomes. A 12- to 24-month prospective study was conducted from 2021 to 2023; a total of 919 participants with valid data were included, with a follow-up rate of 76.9%. Oral health information and related factors were collected through parent-completed questionnaires and clinical dental examinations. Univariate analyses and a zero-inflated negative binomial (ZINB) regression model were used to identify risk factors. Caries incidence during follow-up was 29.8%, and the mean increase in decayed, missing, and filled primary teeth (Δdmft) was 0.94 ± 1.94. In the full-sample ZINB model, significant predictors of caries risk were frequency of snack intake, frequency of candy consumption, frequency of bedtime tooth brushing, frequency of bottle use at bedtime, and baseline caries status. In the baseline caries-free group, additional significant predictors of incident caries risk were parents' caries status, frequency of saliva-sharing behavior, frequency of post-meal mouth rinsing, and the presence of additives in daily drinking water. Interaction analyses showed that the effects of parents' caries status, saliva-sharing frequency, post-meal mouth rinsing frequency, presence of additives in daily drinking water, bedtime brushing frequency, and history of dental attendance on caries risk differed significantly between children with and without baseline caries. The risk of ECC among 1-2-year-old children in Beijing was associated with multiple oral health-related factors. Baseline caries experience was a strong predictor of subsequent caries risk and significantly modified the effects of other risk factors.
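The ZINB model used above mixes a structural-zero (always caries-free) component with a negative binomial count process for Δdmft. A minimal hand-rolled sketch of that mixture follows, for illustration only; a real analysis would fit covariates for both components with a statistics package.

```python
from math import lgamma, exp, log

def nb_pmf(k, mu, alpha):
    """Negative binomial (NB2) pmf with mean mu and dispersion alpha (size r = 1/alpha)."""
    r = 1.0 / alpha
    logp = (lgamma(k + r) - lgamma(r) - lgamma(k + 1)
            + r * log(r / (r + mu)) + k * log(mu / (r + mu)))
    return exp(logp)

def zinb_pmf(k, pi, mu, alpha):
    """Zero-inflated NB: structural-zero mass pi mixed with an NB count process."""
    base = nb_pmf(k, mu, alpha)
    return pi + (1 - pi) * base if k == 0 else (1 - pi) * base

# Sanity check: the mixture is a proper distribution (mass sums to ~1).
# mu = 0.94 echoes the mean delta-dmft in the abstract; pi and alpha are invented.
total = sum(zinb_pmf(k, pi=0.4, mu=0.94, alpha=1.5) for k in range(200))
print(round(total, 6))  # -> 1.0
```

In the fitted model, covariates such as snacking frequency enter mu (the count part) and pi (the zero-inflation part) through log and logit links, respectively.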
Acetabular fractures are uncommon but serious injuries. Demographic changes may have a significant impact on planning healthcare structures to improve treatment outcomes. The aim of this nationwide, registry-based retrospective controlled study was to identify incidence trends, demographic characteristics, and care structures of patients with acetabular fractures in Germany. We analyzed inpatient data from the Institute for the Hospital Remuneration System (InEK). Based on 52 095 patients with a primary diagnosis of an acetabular fracture between 2019 and 2024, we calculated incidence rates for different age groups and put a spotlight on geriatric acetabular fractures (> 65 years of age). Incidence rates in patients under 65 years remained stable, whereas patients over 65 years showed a significant age-dependent increase, with an exponential rise in men aged 80 +; the highest incidence was 122.4/100 000 inhabitants annually. We recorded high levels of co-morbidity and nursing care dependency for elderly patients after acetabular fracture. Although 43% of patients were treated in hospitals with > 500 beds, acetabular fractures were managed across all hospital sizes. There is a rapidly increasing incidence of geriatric acetabular fractures, predominantly driven by elderly male patients over 80 years. Patients over 65 years have high rates of co-morbidities and nursing care levels.
A history of prior cardiac surgery (PCS) determines treatment decisions and long-term outcomes in patients requiring aortic valve replacement. This study examined patient profiles, treatment decisions, and long-term outcomes of patients under 75 years with PCS undergoing transcatheter and surgical aortic valve implantation/replacement (TAVI, SAVR) in the Netherlands. Data from 1,284 patients (ages 50-75 years) with PCS undergoing TAVI or SAVR between 2015 and 2020 were analyzed using the Netherlands Heart Registration. Logistic and Cox regression identified determinants of treatment selection and long-term mortality. Determinants were considered impactful if they had an odds ratio (OR) or hazard ratio (HR) of ≥ 1.5 or ≤ 0.7 and a prevalence of ≥ 5%. Of 1,284 patients, 690 underwent TAVI (54%) and 594 SAVR (46%). Prior index surgery most frequently involved coronary artery bypass grafting (CABG) (57% in the TAVI group vs 40% in the SAVR group; p < 0.001) and previous aortic valve surgery (25% vs 51%; p < 0.001). TAVI patients were significantly older (median 71 vs. 67 years, p < 0.001) and had a higher EuroSCORE II (median 5.7 vs. 4.4, p = 0.003) than SAVR patients. SAVR was the preferred strategy for intermediate-risk patients (62%), while TAVI was favored in high- and prohibitive-risk patients (62% and 94%, respectively). In descending order of odds ratio, the strongest independent determinants of TAVI selection were left ventricular ejection fraction ≤ 30% (OR: 4.8; 95% CI: 2.6-8.8), poor mobility (OR: 3.4; 95% CI: 1.6-7.0), and obesity/cachexia (OR: 2.7; 95% CI: 1.6-4.4); the key determinants of SAVR selection were pure native aortic regurgitation (OR: 0.1; 95% CI: 0.1-0.3) and failing surgical bioprosthesis (OR: 0.7; 95% CI: 0.5-1.0). Thirty-day, 1-, and 5-year survival after TAVI and SAVR was 97% and 96%, 83% and 91%, and 56% and 83%, respectively (p < 0.001).
Left ventricular ejection fraction ≤ 30% and chronic lung disease were important mortality determinants for both procedures, with higher odds ratios for mortality in SAVR than in TAVI patients. In the Netherlands, TAVI and SAVR rates were comparable among patients < 75 years with PCS. Higher-risk patients were directed toward TAVI, except for those presenting with pure native aortic regurgitation and bioprosthesis failure, who mainly received SAVR. Severe left ventricular dysfunction and chronic lung disease were key mortality predictors for both procedures.
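The study's "impactful determinant" rule (OR or HR ≥ 1.5 or ≤ 0.7, and prevalence ≥ 5%) is straightforward to encode. The prevalence values in the examples below are illustrative, not taken from the paper.

```python
def is_impactful(odds_ratio, prevalence):
    """Flag a determinant as impactful per the study's rule:
    OR (or HR) >= 1.5 or <= 0.7, and prevalence >= 5%."""
    effect_large_enough = odds_ratio >= 1.5 or odds_ratio <= 0.7
    common_enough = prevalence >= 0.05
    return effect_large_enough and common_enough

# ORs from the abstract; prevalences below are hypothetical placeholders.
print(is_impactful(4.8, 0.10))  # LVEF <= 30%, strong TAVI determinant -> True
print(is_impactful(0.7, 0.20))  # failing surgical bioprosthesis -> True
print(is_impactful(1.2, 0.30))  # effect too small -> False
```

Requiring both a large effect and a minimum prevalence filters out determinants that are statistically strong but too rare to matter at the population level.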
Early life exposure to common pathogens and a high pathogen burden during childhood can have long-term effects on immune development and overall health. These infections can trigger molecular changes, including alterations in gene expression and DNA methylation (DNAm), which regulate immune and metabolic pathways. Our aim was to identify biological processes underlying differential patterns of DNAm and gene expression in whole blood by infection status in European children. In the Rhea (Greece) and INMA (Spain) cohorts, serum/plasma samples collected at mean ages of 4 and 8 years were analyzed by multiplex serology to measure IgG against 14 antigens from 9 pathogens, and blood collected at a mean age of 8 years was used for DNAm and gene expression profiling. Epigenome- and transcriptome-wide analyses were conducted to assess association with childhood infections. A total of 290 unique CpGs were significantly associated with pathogen outcomes: 265 with seropositivity, 111 with first exposure timing, and one with viral burden. Cytomegalovirus (CMV) exposure accounted for the largest number of both epigenetic (n = 325) and transcriptomic (n = 8) associations. A total of 89 CMV-related CpGs had been described before in adults, and among novel ones, 54 showed consistent effects in adults. CMV-related CpGs were enriched for SUZ12 targets linked to morphogenesis, oxidative stress, and cognition. A previously developed CMV episcore in adults predicted serologically assessed CMV infection at 4 and 8 years of age, with area under the curve values ranging from 0.74 to 0.78 (95% CI 0.68-0.83). We identified novel DNAm and gene expression signatures of common childhood infections, particularly CMV, implicating immune and morphogenesis pathways. A subset of CMV-related DNAm signals showed consistent associations with those reported previously in adults, suggesting similar molecular effects across ages.
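An episcore of the kind evaluated above is typically a weighted sum of CpG methylation beta values, and its predictive performance is summarized by the AUC. The sketch below is a toy illustration: the CpG names and weights are invented, and the AUC is computed with the rank-based (Mann-Whitney) formulation.

```python
def auc_rank(scores_pos, scores_neg):
    """AUC as the Mann-Whitney probability that a positive outranks a negative."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            wins += 1.0 if p > n else 0.5 if p == n else 0.0
    return wins / (len(scores_pos) * len(scores_neg))

# A methylation episcore is a weighted sum of CpG beta values;
# these CpG identifiers and weights are purely illustrative.
weights = {"cg_a": 2.0, "cg_b": -1.5}

def episcore(betas):
    return sum(weights[cpg] * b for cpg, b in betas.items())

pos = [episcore({"cg_a": 0.8, "cg_b": 0.2}), episcore({"cg_a": 0.7, "cg_b": 0.1})]
neg = [episcore({"cg_a": 0.3, "cg_b": 0.6}), episcore({"cg_a": 0.4, "cg_b": 0.5})]
print(auc_rank(pos, neg))  # -> 1.0 (perfect separation in this toy example)
```

The reported AUCs of 0.74-0.78 correspond to partial but useful separation between seropositive and seronegative children on this scale, where 0.5 is chance and 1.0 is perfect discrimination.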
Rheumatoid arthritis (RA) frequently affects the forefoot, causing pain and deformity even in remission. Subclinical inflammation may persist and vary with disease duration. This study aimed to compare clinical, ultrasound, and radiographic features of the forefoot in RA remission with metatarsalgia according to disease duration (< 10 vs. ≥ 10 years). Cross-sectional study including 84 RA patients in remission (DAS28 < 2.6) with metatarsalgia: 33 with < 10 and 51 with ≥ 10 years of disease duration. Clinical, biomechanical, ultrasound (B-mode and Power Doppler [PD]), and radiographic variables were assessed by blinded evaluators. The ≥ 10-year group showed greater structural burden. Total synovitis was similar between groups, more frequent at central metatarsophalangeal joints (MTPs) (particularly MTP2 in < 10 years), and PD was rare (6%). The overall pattern was consistent with early MTP1/5 involvement and progressive structural accumulation at MTP2-4, with higher odds of erosions (OR 3.5-5.0), joint space narrowing (JSN; OR 2.8-4.4), and subluxations (MTP2 OR 5.31; MTP3 OR 2.50) in the ≥ 10-year group. In RA remission with metatarsalgia, our findings are consistent with early structural involvement at MTP1/MTP5 and more frequent B-mode synovitis at central joints, with progressive structural burden at MTP2-4 as disease duration increases. PD positivity was uncommon (6%), whereas ultrasound remained essential to detect synovitis. Our findings support systematic assessment of all MTP joints (MTP1-5), with particular attention to MTP1/MTP5 in earlier disease and careful structural evaluation of central rays in longer disease duration. As a cross-sectional study, temporal or causal sequences cannot be inferred.
Pyogenic liver abscess (PLA) is a life-threatening infection rising in East Asia, especially among patients with type 2 diabetes mellitus (T2DM). Although sodium-glucose cotransporter-2 inhibitors (SGLT2i) improve glycemic control and offer extraglycemic benefits, their effect on PLA risk is unknown. Using Taiwan's National Health Insurance Research Database, we conducted a nationwide retrospective cohort study of adults with T2DM. After 1:1 propensity score matching, 258,800 SGLT2i users and 258,800 non-users were included. The primary outcome was incident PLA. Incidence rates were calculated per 1,000 person-years, and adjusted hazard ratios (aHRs) with 95% confidence intervals (CIs) were estimated using multivariable Cox proportional hazards models. Additional analyses included subgroup analyses with interaction testing, a time-dependent Cox model, a competing-risks model, and a negative-control outcome analysis using fracture. During follow-up, 1,275 PLA events were identified. The incidence rate of PLA was 0.75 per 1,000 person-years in SGLT2i users and 0.83 per 1,000 person-years in non-users. In the primary multivariable Cox model, SGLT2i use was associated with a lower risk of PLA compared with nonuse (aHR, 0.88; 95% CI, 0.79-0.99). This inverse association was generally consistent across most subgroups. In the time-dependent analysis, SGLT2i use remained associated with a lower PLA risk (aHR, 0.72; 95% CI, 0.64-0.81). SGLT2i therapy was independently associated with reduced PLA risk in T2DM patients, particularly with prolonged exposure. These findings suggest an inverse association between SGLT2i use and the risk of pyogenic liver abscess in patients with T2DM.
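The 1:1 propensity score matching step can be sketched as a greedy nearest-neighbour pairing on the estimated score. This is a simplified illustration; the study's actual matching algorithm and caliper are not specified in the abstract, so the caliper value here is an assumption.

```python
# Greedy 1:1 nearest-neighbour matching on propensity scores (simplified sketch;
# the caliper of 0.05 is an assumed value, not taken from the study).
def greedy_match(treated_ps, control_ps, caliper=0.05):
    """Return (treated_idx, control_idx) pairs; each control is used at most once."""
    available = dict(enumerate(control_ps))   # control index -> propensity score
    pairs = []
    for t_idx, t_ps in sorted(enumerate(treated_ps), key=lambda x: x[1]):
        if not available:
            break
        c_idx = min(available, key=lambda i: abs(available[i] - t_ps))
        if abs(available[c_idx] - t_ps) <= caliper:
            pairs.append((t_idx, c_idx))
            del available[c_idx]              # matched without replacement
    return pairs

pairs = greedy_match([0.30, 0.52, 0.90], [0.28, 0.55, 0.10])
print(pairs)  # -> [(0, 0), (1, 1)]; the 0.90 treated unit finds no match in caliper
```

Matching without replacement within a caliper, as sketched here, is what yields the balanced 258,800-vs-258,800 cohorts described above before the Cox models are fit.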
Benign breast disease (BBD) is common and confers heterogeneous increases in breast cancer risk; however, risk prediction relies mainly on histopathology and clinical factors. Sclerosing adenosis (SA) is a proliferative BBD lesion associated with an approximately two-fold increase in risk, yet most women with SA never develop breast cancer. We hypothesize that the immune-stromal microenvironment of SA and its surrounding lobular field relates to subsequent invasive breast cancer. In a nested case-control study within a BBD cohort, we profiled 24 SA biopsies (9 from women who developed invasive breast cancer within 15 years [cases] and 15 who remained cancer-free at ≥ 15 years [controls]). We integrated whole-tissue NanoString gene-expression profiling with multiplex immunofluorescence (MxIF) imaging of SA lesions and surrounding morphologically normal lobules. We measured immune and stromal biomarkers in SA lesions and adjacent lobules, with image analysis masked to case-control status, integrated these data with whole-tissue gene expression, and summarized both microenvironment patterns and the proximity of immune cells to proliferating epithelium. SA biopsies from women who later developed cancer showed a low-immune, high-stromal gene-expression program, whereas controls were enriched for immune signatures. Stromal densities of CD8⁺, CD68⁺, and RUNX3⁺ cells in both lobular stroma and SA lesions mirrored this axis and were markedly lower in cases than controls. Unsupervised clustering identified immune-cold and immune-hot lobule types and four SA lesion field archetypes; immune-hot lobules and immune-hot/epithelium-proliferative lesion fields were enriched in controls.
Spatial analyses further showed that immune-hot lobules have stromal immune cells positioned closer to proliferating epithelium and enriched CD27-CD8 microclusters, whereas SA lesions from cases exhibit greater immune-to-Ki67 distances, fewer boundary-proximal CD8⁺ sentinels, and depletion of CD27-RUNX3 and RUNX3-CD8 microclusters. These findings support an association of an immune-cold SA lesion embedded within an immune-cold lobular field phenotype with subsequent invasive breast cancer risk in women with SA, and suggest that spatially organized, RUNX3-rich immune microenvironments may contribute to epithelial surveillance. Validation in larger cohorts will be needed to confirm generalizability and clarify lesion-specific versus field-wide contributions.
Wilson disease (WD) is a rare autosomal recessive disorder of copper metabolism presenting with acute liver failure, cirrhosis, or neurologic involvement. Liver transplantation (LT) is the definitive treatment; however, data remain limited, particularly from regions reliant on living donor LT (LDLT). We retrospectively analyzed a prospectively collected transplant database, identifying all patients (≥ 14 years) who underwent LT for WD between January 2001 and December 2023. Data on demographics, LT indications, disease characteristics, pre-transplant therapy, complications, and outcomes were collected. Survival was assessed using Kaplan-Meier methods, and neurologic outcomes were drawn from clinical documentation. Forty-one patients underwent LT for WD (median age: 23 years; 51.2% female). Ascites was present in 68.4%, encephalopathy in 32.4%, and hepatocellular carcinoma in 5.1%. Acute liver failure was the initial presentation in 17.9%. LDLT comprised 53.7%. Acute cellular rejection occurred in 29.7% but was manageable; no patient required re-transplantation. Neurologic involvement was present in 17.1%, with 71% improving post-LT. One-, five-, and ten-year survival rates were 94%, 94%, and 82%. LT for WD yields excellent long-term survival. Neurologic improvement occurred in most Neuro-Wilson patients, supporting LT even in neurologically affected cases. LDLT plays a crucial role in regions with limited deceased donors.
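Survival above was assessed with the Kaplan-Meier method; a compact sketch of the estimator on toy follow-up data (not study data) shows how the survival curve steps down only at event times while censored patients leave the risk set silently.

```python
def km_survival(times, events):
    """Kaplan-Meier estimator: returns (time, S(t)) steps at each event time.
    times: follow-up times; events: 1 = death observed, 0 = censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)  # deaths at t
        c = sum(1 for tt, e in data if tt == t)             # all leaving risk set at t
        if d > 0:
            s *= (1 - d / n_at_risk)                        # product-limit update
            curve.append((t, round(s, 3)))
        n_at_risk -= c
        i += c
    return curve

# Toy follow-up times in years (1 = death, 0 = censored); not study data.
print(km_survival([1, 2, 3, 4, 5, 6], [1, 0, 1, 0, 0, 0]))  # -> [(1, 0.833), (3, 0.625)]
```

With heavier censoring late in follow-up, as in a 23-year transplant series, the product-limit construction is what keeps the 1-, 5-, and 10-year estimates valid despite unequal observation times.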
The goal of this study was to identify symptoms that occur in children post-SARS-CoV-2 infection, their trajectory over the first year post-enrollment, and their relationship to age. Longitudinal comparison of infected and uninfected cohorts. Participants (0-21 years) with laboratory-confirmed SARS-CoV-2 infection were enrolled as infected. The uninfected cohort comprised individuals without laboratory evidence of SARS-CoV-2 infection. The primary outcome was presence or absence of symptoms. 852 participants (705 infected, 147 uninfected) completed baseline visits. Of those, 558 infected subjects completed a 12-month post-enrollment visit. Twenty symptoms were identified as more common in infected participants compared to uninfected, at either baseline or 12 months, with symptoms varying by age. Some symptoms in the infected were more frequent at baseline (e.g. fever, weight loss), whereas many symptoms persisted through 12 months. Several symptoms were more frequent at 12 months (e.g. dysmenorrhea, persistent headache). Presence of symptoms at 12 months was not significantly associated with the wave of circulating virus at original infection. Interim analysis at one year post-enrollment identifies 20 symptoms that infected participants were more likely to report post SARS-CoV-2 infection compared to uninfected, at either visit. Type of symptoms varies by age. Ongoing longitudinal data up to 3 years post-enrollment will increase understanding of long-term symptoms of SARS-CoV-2 infection in children and their trajectory. NCT04830852. Although most children recover fully from SARS-CoV-2 infection, some experience a variety of prolonged symptoms following infection. Many studies attempting to characterize these symptoms and their trajectory are not prospective or longitudinal and lack comparison to uninfected controls.
This longitudinal analysis identifies and characterizes post-COVID symptoms in children and adolescents and their trajectory through the first year post-enrollment compared to an uninfected cohort. Twenty post-infection symptoms were identified as occurring more frequently in the infected compared with the uninfected cohort. Age played a critical role in the type and frequency of symptoms after SARS-CoV-2 infection. Gastrointestinal symptoms were prominent.
To develop a semi-automated method to segment "black hole" lesions on post-gadolinium 2D T1-weighted images (GdT1) in multiple sclerosis (MS) that follows radiological intensity rules and perform multi-center validation. Multi-center spin-echo GdT1 images and accompanying proton-density (PD)/T2-weighted images and manual T2 lesion masks of the REFLEXION study (NCT00813709) of suspected/early MS were used. Briefly, the proposed method segments cortical gray matter (GM) to derive a T1-weighted intensity threshold, which is applied inside co-registered T2 lesion masks to segment black hole lesion voxels. It was optimized on a training set (N = 40, 57.5% female, mean age 31.4 ± 8.7 [standard deviation] years), and 274 patients formed the test set (61.3% female, age 31.8 ± 8.4 years). Performance was quantified by the Dice similarity coefficient (DSC) and the intraclass correlation coefficient (ICC) for absolute agreement with manual segmentations. Lesion-wise sensitivity and specificity were calculated. Optimization resulted in: (1) GM selection as minimally 0.8 total white matter (WM) plus GM partial volume, masked by MNI cortex; (2) normalized mutual information-driven linear co-registration of T2 to GdT1 images, interpolating T2 lesion masks using trilinear interpolation and a 0.6 threshold; (3) mean intensity inside the GM mask used as upper intensity threshold. The optimized method had acceptable spatial accuracy (DSC: 0.39 ± 0.26) and good volumetric accuracy (ICC: 0.84, 95% CI [0.72, 0.90]). Lesion-wise sensitivity was 0.91 ± 0.19, and lesion-wise specificity was 0.62 ± 0.22. The proposed method to semi-automatically segment black holes from post-gadolinium T1-weighted images shows acceptable performance. As a potential aid to radiologists, the method is not recommended to be used entirely without human intervention. Question: T1-hypointense "black hole" lesions reflect disease severity in multiple sclerosis but are not routinely quantified due to a lack of reliable analysis methods.
Findings: A rule-based semi-automated method for GdT1 "black hole" lesion segmentation was developed and optimized, and then validated in a large unseen multi-center test set. Clinical relevance: This method adds quantitative information about GdT1 "black hole" lesions to the radiological assessment of multiple sclerosis disease severity, when false positives are manually removed. This can enhance the characterization of individual patients and advance the understanding of the disease.
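The Dice similarity coefficient used to quantify spatial accuracy is simply the normalized overlap of two binary masks; a minimal sketch on flattened toy masks:

```python
def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks (flat sequences):
    DSC = 2|A intersect B| / (|A| + |B|)."""
    inter = sum(a and b for a, b in zip(mask_a, mask_b))
    total = sum(mask_a) + sum(mask_b)
    return 2 * inter / total if total else 1.0  # convention: two empty masks agree

# Toy voxel masks: automated vs. manual segmentation of the same slice.
auto   = [1, 1, 1, 0, 0, 1, 0, 0]
manual = [1, 1, 0, 0, 0, 1, 1, 0]
print(dice(auto, manual))  # -> 0.75
```

A mean DSC of 0.39 for small, low-contrast lesions is plausible because DSC penalizes boundary disagreement heavily on small structures, which is why the authors also report volumetric agreement (ICC) and lesion-wise detection metrics.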
A well-functioning public health system relies on a robust workforce. Comprehensive data on the workforce, such as number, distribution, and key characteristics, are crucial for evidence-based workforce planning and development. However, few comprehensive public health workforce assessments exist, especially in low- and middle-income countries. Public health reforms over the years and needs identified during the COVID-19 pandemic prompted this assessment in Georgia. A survey of the core public health workforce, including employees at central and regional units of the National Center for Disease Control and Public Health (NCDC) and Municipal Public Health Centers (MPHC), was conducted online between June and September 2023. The survey collected data on workforce demographics, education, on-the-job training, and time spent across different program areas and job functions, along with questions on career progression, job satisfaction, and motivation. The response rate was 81.3%. Findings showed that the median age was 48 for NCDC and 56 for MPHC employees. Over 80% of NCDC and 90% of MPHC employees are women. More than 50% of the workforce hold a master's degree or higher, and over half of degree-holders specialized in public health or medicine. Mean years of service are 14.9 (NCDC) and 18.0 (MPHC), but career mobility is limited: only 33.3% of NCDC and 10.5% of MPHC staff have ever been promoted. NCDC employees spend most time on administration and surveillance/response, while MPHC staff focus on communicable disease management, administration, and immunization. Training participation is limited, with employees in key positions having better access. Despite limited advancement and relatively low pay, the workforce reported high job satisfaction and strong intrinsic motivation. These findings are pivotal in identifying workforce planning and development bottlenecks and developing targeted strategies.
Key interventions include addressing an aging workforce through targeted recruitment and succession planning, providing competitive salaries to attract a younger workforce, and strengthening training offerings. This effort to profile the public health workforce could guide similar assessments in the future and in other countries to prevent, detect, and respond to public health threats.
The atherogenic index of plasma (AIP) and estimated glucose disposal rate (eGDR) are two composite indices derived from routine metabolic measurements and are associated with cardiocerebrovascular disease risk. In individuals with Cardiovascular-Kidney-Metabolic (CKM) syndrome stages 0-3, however, it remains unclear whether joint stratification by these markers helps summarize gradients of cardiovascular disease (CVD), heart disease, and stroke risk beyond single-marker assessment. Using data from the China Health and Retirement Longitudinal Study (CHARLS), 5,925 participants without CVD at baseline and in CKM stages 0-3 were analyzed. Participants were grouped by median AIP and/or eGDR values. Kaplan-Meier curves and Cox models assessed the link between these indicators and incident CVD, heart disease, and stroke. Furthermore, both multiplicative and additive interactions between AIP and eGDR were assessed. The predictive value was assessed using the time-dependent Harrell's C index, integrated discrimination improvement (IDI), and net reclassification improvement (NRI). A cohort of 5,925 participants aged 45 years and older (mean age: 57.92 ± 8.52 years) was analyzed, with 54.65% of the cohort being female. During the nine-year follow-up period, 1,467 (24.76%) participants developed incident CVD, including 1,106 (18.67%) with heart disease and 525 (8.86%) with stroke. The high AIP and low eGDR group had the highest risk, with CVD hazard ratios (HRs) of 1.35 (95% CI 1.14-1.59), heart disease HRs of 1.32 (95% CI 1.08-1.62), and stroke HRs of 1.59 (95% CI 1.19-2.12), using the low AIP and high eGDR group as the reference. Neither multiplicative nor additive interaction was statistically significant. The combined application of AIP and eGDR provided a modest improvement in predictive capability for cardiovascular disease, heart disease, and stroke. In individuals with CKM stages 0-3, combined AIP and eGDR stratification captured gradients of cardiovascular risk.
The combined application of these indicators may provide modest incremental value for risk stratification within CKM stages, thereby aiding in the identification of high-risk individuals during the early stages of CKM.
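The two indices above are computed from routine measurements: AIP as log₁₀ of the TG/HDL-C molar ratio, and eGDR from waist circumference, hypertension status, and HbA1c. The sketch below uses the commonly published eGDR coefficients, which are assumed (not confirmed from this abstract) to match the study's definition.

```python
from math import log10

def aip(tg_mmol_l, hdl_mmol_l):
    """Atherogenic index of plasma: log10 of the TG/HDL-C molar ratio."""
    return log10(tg_mmol_l / hdl_mmol_l)

def egdr(waist_cm, hypertension, hba1c_pct):
    """Estimated glucose disposal rate (mg/kg/min), widely used formula:
    21.158 - 0.09*WC - 3.407*HTN - 0.551*HbA1c (coefficients assumed)."""
    return 21.158 - 0.09 * waist_cm - 3.407 * int(hypertension) - 0.551 * hba1c_pct

print(round(aip(1.7, 1.0), 3))        # -> 0.23 (higher AIP = more atherogenic)
print(round(egdr(90, True, 6.5), 2))  # lower eGDR = more insulin resistant
```

Joint stratification then simply cross-classifies participants by the cohort medians of the two scores, yielding the four groups compared in the Cox models.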
To understand circumstances surrounding transitions between initial drug use, non-prescribed opioid use, and injection drug use (IDU) among people who use non-prescribed opioids in the rural U.S. We interviewed adults who use non-prescribed opioids in 10 states. We coded transcript data regarding age, drug, modality, circumstances for each "first use" event and transitions to illicit opioid use/IDU. Transition-related themes were categorized using the Social Ecological Model (SEM) through iterative qualitative memoing. We calculated frequencies and measures of central tendency. Participants (n = 304, mean age [MA] = 36 years, 56% male) reported marijuana or opioid pills as first drug used (60%, reported MA = 13, and 16%, MA = 19, respectively). First opioid use (MA = 20) was typically pills (74%); 43% were initially prescribed opioids for pain. Of participants reporting IDU (92%, MA = 24), first drugs injected were methamphetamine (26%), opioid pills (22%), and heroin (22%). Reasons for first injection of these three substances were recreational (65%, 84%, 66%, respectively) followed by mental health coping (35%, 16%, 33%). Interviews revealed factors influencing transition to illicit opioid use/IDU at environmental, interpersonal, and intrapersonal levels of the SEM, including early household exposure to drugs, and family/peer encouragement to use opioids for coping. Participants described initial use of prescribed opioids following surgery or workplace injury, followed by illicit use of opioids and later IDU for recreational as well as practical goals. IDU was facilitated by intimate partners and friends in the context of diminished local access to illicit opioid pills. Among people who use opioids and inject drugs in the rural U.S., drug use typically started in pre-adolescence, first opioid use in early adulthood, and IDU a few years later, highlighting the need for early intervention. 
Methamphetamine comprised a substantial proportion of initial IDU raising concerns for polysubstance use. Drug use for coping highlights the need for increased mental health resources in rural areas.
Daihai Lake, a typical closed inland lake in the arid and semi-arid region of Inner Mongolia, has been subject to two consecutive years of ecological water replenishment to mitigate its severe ecological degradation. While the lake water level has risen and wetland ecosystems have been gradually restored, existing studies have predominantly focused on changes in lake water quality, leaving a critical research gap regarding the hydrochemical evolution, formation mechanisms, and drinking water safety risks of groundwater in the plain area of the Daihai Lake Basin under the dynamic conditions of ongoing water replenishment. To fill this gap, this study systematically analyzed the hydrochemical characteristics and formation mechanisms of groundwater in the study area and conducted a comprehensive groundwater quality assessment. A suite of representative groundwater samples was collected from the study area after two years of ecological water replenishment and analyzed using an integrated set of methods, including Self-Organizing Map (SOM) clustering, hydrochemical graphical analysis, ion ratio analysis, multivariate statistical analysis, and the Entropy-weighted Water Quality Index (EWQI) method. The results show that: (1) Groundwater is divided into three clusters via SOM, with distinct ion sources from carbonate/silicate weathering and halite dissolution across clusters; some samples have excess SO₄²⁻ and HCO₃⁻, requiring additional cations for charge balance. (2) Groundwater evolution is jointly controlled by water-rock interaction and evaporative concentration, with limited influence from atmospheric precipitation. (3) Three high-fluoride enrichment mechanisms are identified: mineral dissolution under weakly alkaline conditions, evaporative concentration-driven F⁻ enrichment, and accelerated dissolution of fluorine-bearing minerals induced by acidic mining wastewater.
(4) 89% of Cluster-3 groundwater meets drinking standards (EWQI < 50), while 70% of groundwater samples from Cluster 1 and Cluster 2 are of poor quality, mainly distributed in the southwestern lakeshore. This study systematically elucidates the hydrochemical characteristics and formation mechanisms of groundwater in the Daihai Lake Basin under continuous ecological water replenishment, identifies key risk zones for groundwater quality, and provides a solid scientific basis for the protection and sustainable utilization of regional groundwater resources, as well as the optimization of ecological water replenishment strategies in similar arid and semi-arid inland lake basins.
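The EWQI combines per-indicator quality ratings q_j = 100·C_j/S_j (concentration over its standard limit) with entropy-derived weights, so indicators that vary more across samples carry more weight. Below is a simplified sketch using a proportion-based entropy weighting variant; normalization details vary across studies, and the concentrations and standards are toy values.

```python
from math import log

def entropy_weights(matrix):
    """Entropy weights for columns (indicators) of a samples-x-indicators matrix:
    indicators with more dispersion across samples get larger weights."""
    m, n = len(matrix), len(matrix[0])
    raw = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]                       # share of each sample
        e = -sum(pi * log(pi) for pi in p if pi > 0) / log(m)  # normalized entropy
        raw.append(1 - e)                                  # divergence degree
    s = sum(raw)
    return [w / s for w in raw]

def ewqi(sample, standards, weights):
    """EWQI = sum_j w_j * q_j with quality rating q_j = 100 * C_j / S_j."""
    return sum(w * 100 * c / s for w, c, s in zip(weights, sample, standards))

# Toy data: two indicators (e.g. F- and SO4^2-, mg/L) over three samples.
conc = [[0.8, 40.0], [1.6, 55.0], [0.4, 35.0]]
w = entropy_weights(conc)
print(round(ewqi(conc[1], standards=[1.0, 250.0], weights=w), 1))  # -> 144.0
```

On the usual EWQI scale, values below 50 indicate water suitable for drinking, which is the threshold cited for the 89% of Cluster-3 samples above; the toy sample here exceeds its fluoride standard and scores far worse.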
Mental health conditions account for 18% of years lived with disability worldwide. One in six adults is affected in England, with most mental health conditions beginning in childhood and adolescence. Mental distress and ill health are unequally distributed in the UK, with strong associations with wider determinants of health, and higher prevalence among systemically disadvantaged groups. Currently, there is a lack of evidence to inform effective and timely policymaking for primary prevention in the UK. In recognition of these challenges, a national Population Mental Health (PMH) Consortium was established, as part of Population Health Improvement UK (PHIUK). PHIUK is a national research network which works to transform health and reduce inequalities through change at the population level. Our aim is to establish an interdisciplinary PMH Consortium, focussing on upstream determinants and the prevention of risks and onset of mental health conditions through interdisciplinary stakeholder engagement, to create new opportunities for population-based improvement of mental health in the UK. The PMH Consortium brings together leading interdisciplinary representation in population mental health, spanning from sciences to the arts, across the UK. Membership includes six academic institutions, third sector organisations, lived experience expertise, and strong links with national bodies to ensure integrated cross-national and regional policy impact. The PMH Consortium comprises four cross-cutting platforms (Partners in policy, implementation, and lived experience; Data, linkages, and causal inference; Narrowing inequalities; Training and capacity building) and three challenge areas (Children and young people's mental health; Prevention of suicide and self-harm; Multiple long-term conditions) which are highly integrated and interdependent. The work will be underpinned by a Theory of Change across an initial four-year life cycle.
This paper describes the aim, objectives, and approach of the PMH Consortium, as well as anticipated challenges and strengths. The goal of the PMH Consortium is to develop a model for population mental health research and policy translation that is both scalable and sustainable. It is critical to ensure continued impact and viability beyond the initial four years, contributing to the prevention of mental health conditions in the UK, with personal, economic, social, and health benefits.
Imaging of the female adnexal region is complex owing to physiological changes across the menstrual cycle, and accurate diagnosis is crucial for effective tumor treatment. This study assessed the clinical utility of abnormal adnexal uptake on 68Ga-FAPI PET/CT for early and precise lesion characterization. We retrospectively analyzed all female patients with abnormal adnexal uptake on 68Ga-FAPI PET/CT at our institution from November 2021 to June 2024. Semiquantitative analysis of PET/CT imaging parameters was performed, combined with serum tumor markers and immunohistochemical markers. Pathological findings or imaging follow-up of ≥ 6 months served as the reference standard for evaluating the diagnostic performance of 68Ga-FAPI PET/CT. The study included 121 female patients with a mean age of 53.8 ± 12.2 years (range 18-80 years). A total of 184 adnexal lesions with abnormal uptake were identified. Pathology or follow-up confirmed 82.6% as malignancies, comprising 84 primary and 16 metastatic cases; 2 borderline tumors and 19 benign lesions were also detected. The positive predictive value was 83.5%. SUVmax differed significantly among primary malignant, metastatic, and benign lesions (12.52 ± 5.41 vs. 9.78 ± 3.39 vs. 5.52 ± 4.17). In the subset of patients with available pathological specimens, SUVmax showed weak-to-moderate positive correlations with Ki-67 (r = 0.361, p < 0.001) and p53 (r = 0.419, p < 0.001). Notably, the mean SUVmax in the Ki-67 > 20% group was significantly higher than in the Ki-67 ≤ 20% group (p < 0.001). ROC analysis yielded an AUC of 0.85 for SUVmax alone in diagnosing malignant adnexal lesions, increasing to 0.89 when combined with tumor markers. 68Ga-FAPI PET/CT demonstrates high diagnostic performance for ovarian lesions. Among pathologically confirmed cases, SUVmax correlates with proliferative activity and malignant potential, supporting its role in diagnostic optimization.
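The AUC reported for SUVmax has a direct rank-based interpretation: it is the probability that a randomly chosen malignant lesion scores higher than a randomly chosen benign one (the Mann-Whitney formulation). A minimal sketch, with hypothetical SUVmax values rather than the study data:

```python
# Sketch: rank-based AUC for a continuous marker such as SUVmax.
# AUC = fraction of (malignant, benign) pairs in which the malignant
# lesion has the higher value; ties count as 0.5. Values below are
# hypothetical illustrations, not the study's measurements.

def auc(pos, neg):
    """Probability a positive case outranks a negative case (ties = 0.5)."""
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

if __name__ == "__main__":
    suv_malignant = [12.5, 9.8, 15.1, 7.2]   # hypothetical SUVmax values
    suv_benign = [5.5, 3.1, 6.0, 8.9]
    print(f"AUC = {auc(suv_malignant, suv_benign):.2f}")
```

Combining SUVmax with tumor markers, as in the reported AUC of 0.89, would replace the raw marker with a multivariable risk score before ranking.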
Peptic ulcer disease (PUD) remains a significant public health concern globally, particularly in regions with a high prevalence of risk factors such as Helicobacter pylori infection and non-steroidal anti-inflammatory drug (NSAID) use. This study aimed to investigate the prevalence of PUD and its associated risk factors among hospitalized patients in a tertiary hospital in southwest Iran. A mixed-methods design was employed, including a comprehensive literature review, checklist development and validation via a modified Delphi process, a cross-sectional prevalence study, and a hospital-based case-control study. Data were collected from 43,324 patient records (2019-2023) at Abadan University Teaching Hospital. Risk factors were assessed using a validated 21-item checklist, and multivariate logistic regression was used to identify independent predictors of PUD. Among all admissions, 6,874 cases of PUD were identified, indicating a point prevalence of 15.9% (95% CI: 15.6-16.2). NSAID use (75.3%), H. pylori infection (70.1%), smoking (46.9%), and corticosteroid use (30.2%) were highly prevalent among PUD patients. Significant independent risk factors included age ≥ 60 years (AOR: 1.65), NSAID use (AOR: 2.58), H. pylori positivity (AOR: 2.41), smoking (AOR: 1.45), and ulcer size ≥ 5 mm (AOR: 2.19). Despite the high rate of NSAID use, only 26.4% received gastroprotective therapy. The findings underscore the high burden of PUD in hospitalized patients in southwest Iran, with modifiable risk factors such as NSAID use, H. pylori infection, and smoking playing a critical role. Targeted interventions, including proton pump inhibitor (PPI) co-prescription, H. pylori eradication, and lifestyle modifications, are essential to reduce PUD incidence and its complications.
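The reported point prevalence follows directly from the stated counts (6,874 cases among 43,324 admissions). A minimal sketch using a normal-approximation (Wald) confidence interval — the study does not state which CI method it used, so the bounds may differ slightly from the published 15.6-16.2:

```python
import math

# Sketch: point prevalence with a normal-approximation (Wald) 95% CI,
# computed from the abstract's reported counts. The study's exact CI
# method is unstated, so bounds may differ slightly from the published ones.

def prevalence_ci(cases, total, z=1.96):
    """Return (prevalence, lower, upper) as proportions."""
    p = cases / total
    se = math.sqrt(p * (1 - p) / total)
    return p, p - z * se, p + z * se

p, lo, hi = prevalence_ci(6874, 43324)
print(f"{100*p:.1f}% (95% CI: {100*lo:.1f}-{100*hi:.1f})")
```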
The role of diet in the development of multiple sclerosis (MS) is still debated, and its impact on MS course is not well understood. We investigated the possible role of adolescent diet in the odds of late-onset multiple sclerosis (LOMS). LOMS patients were identified from the National MS Registry of Iran; controls were age- and sex-matched individuals with no history of neurological disorders. Dietary factors were assessed using a questionnaire based on multinational studies, and consumption of each food item was classified as low, medium, or high. Logistic regression models were used to evaluate the association between diet and LOMS odds. We included 83 LOMS cases and 207 matched healthy controls; the mean age was 61.14 years for LOMS patients and 61.51 years for controls. Dairy consumption in the highest tertile was associated with a 79% reduction in LOMS odds (AOR: 0.21; 95% CI: 0.09-0.47). Seafood consumption in the highest tertile was likewise associated with reduced odds of LOMS (AOR: 0.32; 95% CI: 0.14-0.72), as was nut consumption in the highest tertile (AOR: 0.37; 95% CI: 0.18-0.77). Higher consumption of fruits (AOR: 0.22; 95% CI: 0.07-0.63) and vegetables (AOR: 0.26; 95% CI: 0.12-0.55) was also linked to reduced odds of LOMS. These findings highlight the potential protective role of dairy products, seafood, nuts, fruits, and vegetables in lowering LOMS odds; promoting the role of nutrition could therefore represent a preventive measure for people susceptible to MS.
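The AORs above come from multivariable logistic regression on matched data. As a simpler illustration of the underlying quantity, the crude (unadjusted) odds ratio with a 95% CI can be computed from a 2x2 exposure table; the counts below are hypothetical, not the study's data:

```python
import math

# Sketch: unadjusted odds ratio with a Woolf (log-scale) 95% CI from a
# 2x2 table. The study's AORs come from multivariable logistic regression
# with matching; this crude analogue only illustrates the arithmetic.
# The counts below are hypothetical, not taken from the study.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a=exposed cases, b=exposed controls, c=unexposed cases, d=unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical: 20 of 83 cases vs. 100 of 207 controls in the top dairy tertile
or_, lo, hi = odds_ratio_ci(20, 100, 63, 107)
print(f"OR = {or_:.2f} (95% CI: {lo:.2f}-{hi:.2f})")
```

An OR below 1 indicates lower odds of disease among the exposed, the direction reported for all five food groups.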
Off-pump coronary artery bypass grafting (OPCAB) minimizes systemic inflammation associated with cardiopulmonary bypass, yet postoperative atrial fibrillation (POAF) incidence remains substantial. This suggests that local pericardial factors may contribute significantly to POAF in this setting. We evaluated the impact of left posterior pericardiotomy (LPP) on POAF and drainage patterns in isolated OPCAB to determine whether enhancing local drainage effectively mitigates POAF. We retrospectively analyzed 283 patients (mean age 65.1 ± 10.2 years; 18.7% female) undergoing elective isolated multivessel OPCAB by a single surgeon (2021-2025). Patients were categorized into LPP (n = 122, routinely performed since April 2024) and no-LPP (n = 161) groups. Inverse probability of treatment weighting (IPTW) was applied to adjust for baseline differences. The primary endpoint was in-hospital POAF. Postoperative chest tube drainage distribution and the incidence of postoperative pericardial effusion were analyzed to assess the mechanistic efficacy of LPP. After IPTW adjustment, the LPP group showed a significantly lower incidence of POAF (15.1% vs. 30.7%; p = 0.003) and shorter hospital length of stay (p = 0.032). Operative time was significantly shorter in the LPP group (334 ± 44 vs. 349 ± 48 min; p = 0.006), indicating that LPP did not prolong the procedure. Operative mortality and LPP-related complications were absent. Postoperative pericardial effusion tended to be less frequent in the LPP group (3.2% vs. 8.4%; p = 0.084). Mechanistically, the LPP group demonstrated a significantly higher left-to-total drainage ratio in the early postoperative period, indicating effective diversion of pericardial fluid to the left pleural space during this critical window. In elective isolated OPCAB, concomitant LPP was associated with a lower incidence of POAF and shorter hospital length of stay, without increasing operative time or complications.
The observed shift in drainage distribution supports the mechanism that LPP reduces POAF by effectively mitigating local pericardial fluid retention.
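The IPTW adjustment used above reweights each patient by the inverse probability of the treatment actually received (1/ps for LPP patients, 1/(1 - ps) for no-LPP patients), so that the weighted groups resemble each other on baseline covariates before event rates are compared. A minimal sketch with hypothetical propensity scores and outcomes:

```python
# Sketch: inverse probability of treatment weighting (IPTW). Each patient
# is weighted by the inverse probability of the treatment received, then
# weighted event rates are compared across arms. Propensity scores and
# outcomes below are hypothetical; in practice the scores come from a
# propensity model fit on baseline covariates.

def iptw_rate(events, ps, treated):
    """Weighted event rate in one arm; ps = propensity of receiving treatment."""
    weights = [1 / p if treated else 1 / (1 - p) for p in ps]
    return sum(w * e for w, e in zip(weights, events)) / sum(weights)

# Hypothetical data: event 1 = in-hospital POAF
lpp_events, lpp_ps = [0, 1, 0, 0], [0.4, 0.5, 0.6, 0.5]
no_lpp_events, no_lpp_ps = [1, 0, 1, 0], [0.3, 0.5, 0.4, 0.6]

rate_lpp = iptw_rate(lpp_events, lpp_ps, treated=True)
rate_no_lpp = iptw_rate(no_lpp_events, no_lpp_ps, treated=False)
print(f"weighted POAF: LPP {rate_lpp:.1%} vs. no-LPP {rate_no_lpp:.1%}")
```

Extreme propensity scores inflate weights, so applied analyses typically also check weight distributions or use trimming/stabilization.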