Salmonella is a major cause of foodborne illness worldwide, posing significant risks to public health and food safety. This study investigated the prevalence, serovar distribution, genotypic characteristics, and antimicrobial resistance (AMR) profiles of Salmonella. A total of 2515 food samples were collected from retail markets, supermarkets, and food processing facilities, and 13,670 stool samples were obtained from sentinel hospitals across 14 cities in Liaoning. The Kruskal-Wallis test was used to compare genetic features among serovars, followed by Dunn's post hoc test for pairwise comparisons. A total of 314 Salmonella strains were identified, with raw poultry showing the highest detection rate (28.88%) among food sources and children aged 0-6 years the highest rate (3.47%) among the clinical age groups. S. Enteritidis was the most prevalent serovar in both food samples (42.6%) and clinical samples (35.8%); in contrast, S. 4,[5],12:i:- was dominant in pediatric clinical cases. AMR analysis showed that 90.13% of strains were resistant to at least one antibiotic and 67.83% were multidrug-resistant (MDR), with the highest resistance to ampicillin (68.47%). Analysis revealed that S. 4,[5],12:i:- harbored the ASSuT resistance module (blaTEM-1B, aph(3″)-Ib/aph(6)-Id, sul2, tet(B)). Extensive MDR phenotypes were observed in S. Indiana and S. Kentucky, associated with abundant insertion sequences (ISs) and antimicrobial resistance genes (ARGs), including clinically critical determinants (blaNDM-9, mcr-1.1, rmtB). The highest mean virulence factor (VF) count (111.17) was observed in S. Enteritidis, likely contributing to its epidemiological success. Conversely, S. Indiana and S. Kentucky, predominantly food-associated serovars, exhibited reduced virulence but served as critical AMR reservoirs.
These findings highlight the epidemiological characteristics and AMR risks of Salmonella in food and clinical settings, providing critical data for food safety and clinical antimicrobial stewardship.
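The serovar comparison described above (Kruskal-Wallis followed by pairwise post hoc testing) can be sketched in a few lines. This is an illustrative sketch only: the gene counts below are synthetic, and because SciPy does not ship Dunn's test, pairwise Mann-Whitney U tests with a Bonferroni correction stand in for the Dunn's post hoc procedure used in the study.

```python
# Sketch of the serovar comparison workflow: a global Kruskal-Wallis test,
# then pairwise post hoc comparisons with a Bonferroni adjustment.
# All counts are hypothetical illustration data, not study results.
from itertools import combinations
from scipy.stats import kruskal, mannwhitneyu

arg_counts = {  # hypothetical ARG counts per strain, grouped by serovar
    "Enteritidis": [3, 4, 2, 5, 3, 4],
    "Indiana":     [12, 15, 11, 14, 13, 16],
    "Kentucky":    [10, 13, 12, 11, 14, 12],
}

h_stat, p_global = kruskal(*arg_counts.values())
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_global:.4f}")

if p_global < 0.05:
    pairs = list(combinations(arg_counts, 2))
    for a, b in pairs:
        _, p = mannwhitneyu(arg_counts[a], arg_counts[b])
        p_adj = min(p * len(pairs), 1.0)  # Bonferroni adjustment
        print(f"{a} vs {b}: adjusted p = {p_adj:.4f}")
```

For an actual Dunn's test, the third-party scikit-posthocs package provides `posthoc_dunn`, which operates on the Kruskal-Wallis rank sums rather than re-ranking each pair.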
Health-related social needs (HRSN) data are used in referral and treatment decisions, in population health management strategies, and in health services research. However, evidence suggests HRSN data are at risk for bias. This study aimed to identify and classify sources of bias in HRSN data and the implications for their use in patient care and population health. In this qualitative study, key informant interviews with patients and health care professionals in Indiana and Florida (recruited using multiple recruitment methods and snowball sampling) were conducted from January to May 2025. Key informants in Indiana were primarily associated with either a public safety-net system that includes federally qualified health centers or a multihospital system with statewide services. In Florida, key informants were associated with a large academic medical center, with some dual-affiliated with a US Department of Veterans Affairs hospital. Health care professionals held titles such as physician, social worker, and community health worker. Data collection occurred via video or telephone call. Interviews followed a semistructured interview guide grounded in a framework describing sources of potential bias in health data. Participants were asked about HRSN data collection practices and experiences, documentation practices, responses to HRSN data collection, and how, in their own words, they defined food insecurity, housing instability, financial strain, and transportation barriers. Thematic analysis followed a consensus coding approach. A total of 20 patients and 20 health care professionals were recruited (40 informants total; 22 aged 40-64 years [42.5%]; 27 female [67.5%]). Participants described variation in HRSN data collection and differing availability of organizational resources that contributed to sampling bias. Patients and professionals reported that detection bias was possible because HRSNs could be intentionally sought during visits or not collected at all.
Concerns about stigma or embarrassment, power distance, and privacy could result in nonresponse bias. Health care professionals and patients offered slightly different, or nuanced, definitions of the various HRSNs; these more expansive or restrictive definitions could lead to misclassification bias. In this qualitative study, both patients and health care professionals described opportunities for bias in HRSN data collection and documentation. These findings suggest that, while HRSN data are potentially valuable to patient care, their fitness for use in organizational decision-making, research, and health policy may need improvement.
To investigate the prevalence of infection in diabetic foot ulcers (DFUs) and its association with lower limb major and minor amputations, examining demographic and socioeconomic factors influencing DFU outcomes. This retrospective study analyzed data from the Indiana Network for Patient Care (INPC) from January 2019 to May 2024. A total of 27,078 patients with DFUs were included, aged 9 to 103 years. Data included demographics, clinical encounters, ICD-10 and CPT codes, laboratory values, and microbiological assessments. Analyses included Pearson correlation, logistic regression, and one-sample proportion tests. Patients were predominantly male (64.3% male, 35.7% female), with a mean age of 63.8 years. DFU prevalence showed a significant inverse correlation with median income (r=0.2108, p=0.0016); lower-income areas (median income < $64,200) had DFU rates > 5.0 per 1,000 population. Of cultured specimens, 33.79% showed infections, primarily Staphylococcus spp. (22.52%) and Streptococcus spp. (11.27%). In major and minor amputation cases with microbial data available, the Staphylococcus (22.75%), Enterococcus (11.86%), and Streptococcus (10.42%) genera were most common. Osteomyelitis increased amputation odds by 7.83 times (p<0.001). Bacterial infections, particularly those involving the Staphylococcus, Enterococcus, and Streptococcus genera, were strongly associated with amputation risk. These findings have important clinical implications: early identification of these specific bacterial pathogens could guide targeted antibiotic therapy and inform risk stratification for amputation prevention. Improved bacterial identification could enhance DFU management and reduce complications such as amputation.
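The reported 7.83-fold increase in amputation odds is an odds ratio from the logistic regression. The arithmetic connecting an odds ratio to the underlying model coefficient and to a predicted probability can be sketched as follows; the baseline odds used here are hypothetical, not taken from the study.

```python
# How a logistic-regression odds ratio relates to the model: OR = exp(beta).
# Only the OR of 7.83 comes from the abstract; the baseline odds are invented.
import math

or_osteomyelitis = 7.83            # reported odds ratio for osteomyelitis
beta = math.log(or_osteomyelitis)  # the corresponding regression coefficient

baseline_odds = 0.10               # hypothetical odds without osteomyelitis
odds_with_osteo = baseline_odds * or_osteomyelitis
prob_with_osteo = odds_with_osteo / (1 + odds_with_osteo)
print(f"beta = {beta:.3f}, P(amputation | osteomyelitis) = {prob_with_osteo:.3f}")
```

The same conversion works in reverse: exponentiating any fitted coefficient recovers the odds ratio reported in tables like this one.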
Type 1 diabetes (T1D) is an immune-mediated disease characterized by progressive autoimmune destruction of pancreatic β cells. Although traditionally viewed as primarily T-cell-driven, B cells play essential roles in disease pathogenesis. In addition to producing islet autoantibodies, B cells contribute to immune activation through antigen presentation and cytokine secretion, thereby shaping autoreactive T-cell responses. The earliest clinical predictor of T1D is the appearance of islet autoantibodies in the blood, reflecting a breach in B-cell tolerance well before symptomatic disease onset. In individuals at high genetic risk, type I interferon (IFN) signatures are detectable in peripheral blood prior to seroconversion, suggesting that type I IFNs may act as upstream regulators of B-cell tolerance. Peripheral tolerance is enforced through layered checkpoints including transitional selection, maintenance of anergy, germinal center regulation, and regulatory B-cell differentiation. Studies in systemic autoimmunity demonstrate that type I IFN signaling lowers B-cell activation thresholds, enhances BCR and TLR responsiveness, promotes survival of autoreactive transitional clones via BAFF induction, destabilizes anergy, and skews differentiation toward inflammatory phenotypes such as T-bet+ age-associated B cells. Consistent with this model, single-cell transcriptomic and BCR repertoire analyses in T1D reveal clonal expansion and proinflammatory signatures in islet-reactive B cells during the preclinical stage. Together, these findings implicate the IFN-B-cell axis as a potential target for early disease modification.
The objective of this study was to determine the time frame of clinical improvement in patient-reported outcomes (PROs) following surgical decompression for cervical spondylotic myelopathy (CSM). Based on previously published 12-month data from this group, the authors hypothesized that the average time to minimal clinically important difference (MCID) improvement would primarily occur by 3 months postoperatively regardless of preoperative myelopathy severity. They also hypothesized that there would be minimal additional improvement between 3 months and 5 years after surgery. This was a post hoc analysis of prospectively collected data from the 14-site Spine CORe™ study group of the Quality Outcomes Database (QOD). Patients were stratified according to myelopathy severity using the modified Japanese Orthopaedic Association (mJOA) myelopathy scale into mild (mJOA score 15-17), moderate (mJOA score 12-14) or severe (mJOA score < 12). PRO measures included the Neck Disability Index (NDI), numeric rating scale (NRS) for neck and arm pain, and EQ-5D for quality-adjusted life years. PROs were recorded at baseline, 3-month, 12-month, 2-year, and 5-year intervals. MCID thresholds were calculated using previously validated methods in this cohort. Time to meet the MCID cutoff and the proportion of patients achieving MCID at each time point were determined. A total of 1085 patients (with ≥ 80% follow-up at 60 months for all PRO measures [PROMs]) were enrolled. Patients with more severe myelopathy had worse baseline comorbidities (e.g., BMI, American Society of Anesthesiology class, ambulation dependence) and lower PRO scores. Average PROs met the MCID threshold in each category at 3 months postoperatively, regardless of baseline myelopathy severity. Of the patients with complete 5-year follow-up data, the majority achieved the MCID cutoff threshold for PROMs at 3 months (50%-73%, depending on the PROM). 
A minority of patients went on to meet the MCID for PROMs at 12 months (12%-21%), 2 years (4%-8%), and 5 years (1%-6%). Between 4% and 25% of patients never achieved MCID cutoffs at any time point. On average, patients achieved clinically meaningful improvement in PROs at 3 months postoperatively, regardless of preoperative severity. While the majority (50%-73%, depending on the PROM) reached MCID within 3 months, an additional 12%-21% improved by 12 months, 4%-8% by 2 years, and only 1%-6% by 5 years; 4%-25% never reached the MCID. This 5-year follow-up study clarifies the timeline of clinical improvement after surgery for CSM and provides a useful tool for both surgeon planning and patient counseling.
The authors aimed to evaluate the prevalence of sleep disturbance in patients with grade 2 lumbar spondylolisthesis and assess postoperative trajectories and predictors of improvement at 5 years. They hypothesized that surgical treatment of grade 2 spondylolisthesis would result in high rates of long-term improvement in sleep disturbance. Patients with grade 2 lumbar spondylolisthesis were identified from the 14-site Spine CORe™ study group within the Quality Outcomes Database (QOD). Sleep disturbance was measured using the sleep item of the Oswestry Disability Index (ODI) at baseline and 3, 12, 24, and 60 months postoperatively. The prevalence of baseline sleep disturbance was determined. Clinically meaningful improvement was defined using minimal clinically important difference thresholds. Predictors of improvement were analyzed using multivariate Firth's logistic regression, and associations with pain, disability, quality of life, and satisfaction were assessed. A total of 328 patients underwent surgery for grade 2 spondylolisthesis. At baseline, 300 of 328 patients (91.5%) reported sleep disturbance. The 60-month follow-up rate in this subgroup was 81% (21 patients died of unrelated causes within 5 years of surgery, and 223 of the 300 patients completed the 5-year follow-up). Improvement in sleep disturbance was observed in 165 patients (74.0%), while 58 patients (26.0%) continued to report sleep disturbance at 60 months. Those with improved sleep were more likely to achieve clinically meaningful gains in back pain (80.5% vs 50.0%, p < 0.001), leg pain (83.6% vs 53.4%, p < 0.001), EQ-5D (31.1% vs 7.3%, p < 0.001), and ODI (62.8% vs 5.3%, p < 0.001) scores, with a trend toward higher satisfaction in patients with improved sleep (88.5% vs 77.8%; risk difference 10.7%, 95% CI -0.01 to 0.23). On multivariate analysis, only private insurance (OR 2.20, 95% CI 1.03-4.78; p = 0.041) was associated with greater odds of 60-month sleep improvement.
Sleep disturbance was highly prevalent, affecting 91.5% of patients with grade 2 spondylolisthesis, and 74.0% experienced meaningful improvement by 3 months that was sustained through 5 years following surgery. Sleep recovery was closely tied to gains in pain, disability, and quality of life. These results demonstrate that surgery for grade 2 spondylolisthesis not only improves mechanical symptoms but also substantially alleviates sleep disturbance.
Prolonged hospital length of stay (LOS) is an increasingly important quality metric among regulators and payers that has been associated with worse patient outcomes and decreased patient satisfaction. The aim of this study was to identify predictors of prolonged hospital LOS after surgery for Meyerding grade 2 spondylolisthesis using a multicenter prospectively collected registry. The prospectively collected Spine CORe™ Quality Outcomes Database (QOD) study group cohort, which consisted of 328 patients from 14 sites, was used to identify all patients who underwent single-stage lumbar fusion for Meyerding grade 2 lumbar spondylolisthesis. Prolonged LOS was defined as ≥ 4 days (75th percentile). An array of demographic, comorbidity, and perioperative factors known to impact LOS was collected for each patient. Bivariate tests, including the chi-square goodness of fit and independent t-test, were used to identify variables associated with prolonged LOS. Multivariable logistic regression analysis was conducted to determine independent predictors of prolonged LOS. The QOD cohort comprised 328 patients with a follow-up rate of > 80%. After excluding patients with an anterior or lateral surgical approach and missing LOS data, the final cohort included 268 patients, of whom 52 (19.4%) experienced a prolonged LOS. In the univariate analysis, older age, dependent ambulation, insurance status, depression, greater estimated blood loss, longer operative duration, multilevel fusion (2 or more levels), perioperative complications (e.g., incidental durotomy and urinary tract infection), and nonhome discharge were associated with prolonged LOS. In the adjusted model, multilevel arthrodesis independently increased the odds of prolonged LOS (OR 2.11, 95% CI 1.07-4.18; p = 0.03), whereas private insurance (vs Medicare/Medicaid/government) was associated with lower odds (OR 0.42, 95% CI 0.20-0.87; p = 0.02).
Patient-reported outcomes at 60 months did not differ between the groups with and without prolonged LOS. In this multicenter Spine CORe™ QOD study, multilevel lumbar fusion and noncommercial insurance were the principal independent predictors of prolonged LOS after surgery for grade 2 spondylolisthesis. These findings are valuable for patient informed consent, as well as to identify higher-risk patients who could benefit from earlier inpatient resource allocation (social work and counseling) to facilitate timely discharge.
Understanding the heterogeneity of neurodegeneration in Alzheimer's disease (AD) and identifying distinct progression pathways is critical for improving diagnosis, treatment, prognosis, and prevention. Motivated by this need, this study aimed to identify disease progression subphenotypes among patients with mild cognitive impairment (MCI) and AD using electronic health records (EHRs). We developed a novel approach that combines a graph neural network (GNN)-based framework with time series clustering to characterize progression subphenotypes from MCI to AD. We applied the proposed framework to a real-world cohort of 2,525 patients (61.66% female; mean age 76 years), of whom 64.83% were Non-Hispanic White, 16.48% Non-Hispanic Black, 2.53% were of other races, and 10.85% were Hispanic. Our model identified four distinct progression subphenotypes, each exhibiting characteristic clinical patterns, with average MCI-to-AD progression times ranging from 805 to 1,236 days. These findings indicate that AD does not follow a uniform progression trajectory but instead manifests heterogeneous pathways, and the proposed framework provides an explainable, data-driven approach for delineating AD progression subphenotypes, offering actionable insights for healthcare informatics research and the clinical management of patients at risk for AD.
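The clustering step of a pipeline like the one described above can be illustrated with a minimal k-means over patient embeddings. The code below uses random vectors as stand-ins for GNN-derived trajectory representations; it does not reproduce the study's model, features, or data.

```python
# Toy sketch: grouping patient trajectory embeddings into k subphenotypes
# with Lloyd's k-means. Embeddings are synthetic stand-ins for GNN outputs.
import numpy as np

rng = np.random.default_rng(0)
# 200 hypothetical patients: 16-dim embeddings drawn around 4 latent centers
centers = rng.normal(size=(4, 16)) * 5
X = np.vstack([c + rng.normal(size=(50, 16)) for c in centers])

k = 4
centroids = X[rng.choice(len(X), size=k, replace=False)]  # random init
for _ in range(20):  # Lloyd's iterations
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)  # assign each patient to nearest centroid
    centroids = np.array([
        X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
        for j in range(k)
    ])

print("cluster sizes:", np.bincount(labels, minlength=k))
```

In a real pipeline the cluster assignments would then be summarized clinically, e.g., by comparing MCI-to-AD progression times across clusters as the study does.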
Surgical intervention is a standard treatment for severe cervical spondylotic myelopathy (CSM), but postoperative outcomes can vary based on socioeconomic characteristics such as insurance status. The aim of this study was to investigate the influence of insurance on patient-reported outcomes (PROs) at 60 months postoperatively. In this prospective cohort study, the Spine CORe™ study group analyzed data from the Quality Outcomes Database (QOD). Chi-square and Kruskal-Wallis tests were performed to identify the associations between sociodemographic and clinical variables and insurance type. The chi-square test was also used to examine the influence of insurance type on the achievement of minimal clinically important difference (MCID) for each outcome measure. Statistically significant covariates (p < 0.001) were used in a multivariate linear regression model measuring the influence of insurance type on 60-month changes in scores for neck and arm pain numeric rating scale (NRS), Neck Disability Index (NDI), EQ-5D, and modified Japanese Orthopaedic Association (mJOA) scores. From a dataset of 1085 patients who underwent CSM surgery, 106 patients died during the 5-year follow-up period and 793 had an NDI score at the 5-year follow-up. The follow-up rate was 83% ([793 with NDI + 106 died]/1085 patients). Of the 1085 patients, the authors excluded patients with Veterans Affairs insurance, no insurance, or who were missing baseline PROs, which left 1030 patients with Medicare (n = 408), Medicaid (n = 75), and private (n = 547) insurance with 60-month PROs. Insurance status varied based on demographics and medical comorbidities (each p < 0.05). Medicaid patients had significantly worse scores at baseline and 60 months for arm and neck NRS, NDI, EQ-5D, and mJOA (each p < 0.05).
In multivariate analysis after adjustment for relevant covariates, compared with private insurance, only Medicare insurance was associated with lower 60-month EQ-5D scores (β -0.05, 95% CI -0.09 to -0.01; p < 0.05). Otherwise, there was no significant difference in PROs. Medicaid insurance was not significantly associated with differences in any of the outcomes after covariate adjustment compared to private insurance. Despite having worse baseline scores, patients with Medicaid insurance coverage had similar rates of achievement of MCID compared with those with private insurance. These results suggest that patients with CSM who underwent surgery had improvement in PROs for all insurance types.
Unemployment following surgery incurs significant societal costs. The authors aimed to identify predictors of return to work (RTW) following surgery for patients with grade 1 lumbar spondylolisthesis. This Spine CORe™ study is a post hoc analysis of prospectively collected data from the Quality Outcomes Database (QOD) grade 1 lumbar spondylolisthesis module. Patients were divided into 2 groups: employed preoperatively and unemployed preoperatively. Univariate and multivariate analyses were used to identify predictors of RTW/employment within 5 years postoperatively. Across the 12 highest enrolling QOD sites (Spine CORe™ group), 608 patients were enrolled with 81% having Oswestry Disability Index (ODI) follow-up data. Of these 608 patients, 604 patients had baseline employment status recorded. Of 275 patients who were employed preoperatively, 249 had RTW follow-up data. Of the 329 patients unemployed preoperatively, 218 had RTW follow-up data. The study cohort follow-up for RTW was 77%. By 5 years postoperatively, 87.1% (n = 217) of those employed preoperatively and 22.0% (n = 48) of those unemployed preoperatively returned to work. In each cohort, there were no differences in age, sex, BMI, or American Society of Anesthesiologists class between those who did and those who did not RTW. These results remained consistent in the subgroup analysis of patients younger than 65 years at baseline. However, the only difference observed in this age group was within the preoperatively unemployed cohort, where the RTW group had a lower BMI (28.4 ± 5.5 vs 32.8 ± 9.0, p = 0.001). On multivariate analysis for the preoperatively employed cohort, college degree (OR 3.6, 95% CI 1.3-12.2) and active employment (OR 6.0, 95% CI 1.9-19.8) remained independent predictors of returning to work. For those preoperatively unemployed, a college degree (OR 2.2, 95% CI 1.1-4.4) independently predicted RTW.
Approximately 87% of patients employed preoperatively RTW, and 22% of patients unemployed preoperatively returned to the workforce within 60 months after surgery for grade 1 spondylolisthesis. College-level education independently predicted RTW for both preoperatively employed and preoperatively unemployed patients.
Intervertebral disc degeneration is the most recognized cause of low back pain, characterized by decline in tissue structure and mechanics. Image-based mechanical parameters (e.g., strain, stiffness) may provide an ideal assessment of the disc function that is lost with degeneration, but these measures remain underdeveloped. Moreover, it is unknown whether disc strain or stiffness can be predicted by MRI relaxometry (e.g., T1 or T2), an increasingly accepted quantitative measure of disc structure. In this study, we quantified T1 and T2 relaxation times and compared them to in-plane strains measured with displacement-encoded MRI within human cadaveric discs under physiological levels of compression and bending. Using a novel inverse approach, we then estimated shear modulus in orthogonal image planes and regionally compared these values to relaxation times and 2D strains. Intratissue strain depended on the loading mode, and shear modulus in the nucleus pulposus was typically an order of magnitude lower than that in the annulus fibrosus. Relative shear moduli estimated from strain data derived under compression generally did not correspond with those from bending experiments. Only one anatomical region showed a significant correlation between relative shear modulus and relaxometry (T1 vs. µrel, coronal plane under bending). Together, these results suggest that future inverse analyses may be improved by incorporating multiple loading conditions into the same model and that image-based elastography and relaxometry should be viewed as complementary measures of disc structure and function for assessing degeneration in future studies.
Cervical spondylotic myelopathy (CSM) is a common cause of spinal cord dysfunction, and anterior cervical discectomy and fusion (ACDF) is the gold standard treatment. Cervical disc arthroplasty (CDA) is a relatively novel, motion-preserving alternative to ACDF. The aim of this study was to assess CDA versus ACDF in the surgical treatment of CSM at a 5-year follow-up. This study used the 14-site Spine CORe™ study group cervical module of the Quality Outcomes Database (QOD), which included 1085 patients. Baseline demographics, clinical variables, and surgical parameters were collected. Patient-reported outcome measures (PROMs) included the EQ-5D, Neck Disability Index (NDI), and numeric rating scale (NRS) for neck pain and arm pain. Of the 1085 patients, 22 who underwent CDA, had baseline and 5-year follow-up PROMs data, and met the inclusion/exclusion criteria were selected. Nearest-neighbor propensity score matching was performed using a 4:1 matching ratio. Five-year PROMs were compared between the CDA and ACDF groups using the 2-sample t-test for continuous variables. Multivariable linear regression was performed to identify predictors of 5-year myelopathy severity. There were 1085 patients in the 14-site Spine CORe™ study group's QOD cervical module; 110 matched patients were analyzed, including 22 who underwent CDA (mean age 47.73 years) and 88 who underwent ACDF (mean age 48.89 years). The subcohort had 100% of PROMs data (NDI, NRS, EQ-5D, and mJOA) at the 5-year follow-up. There were no significant differences for 1- and 2-level operations between the CDA and ACDF groups (p = 0.34). There were no significant differences in 5-year PROMs between the two groups. Patients improved in each PROM category in both treatment groups when comparing baseline with 5-year PROMs. While the rate of reoperation at 5 years was higher in the ACDF group compared with the CDA group, there was no statistically significant difference (17.0% vs 9.1%, p = 0.52).
In appropriately selected patients with CSM, CDA can provide comparable outcomes to ACDF while preserving cervical motion.
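The 4:1 nearest-neighbor propensity matching used in the comparison above can be sketched as a greedy match without replacement on estimated propensity scores. The scores below are synthetic, and real analyses typically also apply a caliper on score distance, which is omitted here for brevity.

```python
# Sketch of 4:1 nearest-neighbor propensity matching: each treated patient
# is paired with the 4 unmatched controls closest in propensity score.
# Scores are synthetic; only the 22-patient CDA group size echoes the study.
import numpy as np

rng = np.random.default_rng(1)
treated_ps = rng.uniform(0.2, 0.8, size=22)   # hypothetical CDA scores
control_ps = rng.uniform(0.0, 1.0, size=300)  # hypothetical ACDF candidates

available = set(range(len(control_ps)))
matches = {}
for i, ps in enumerate(treated_ps):
    # rank remaining controls by propensity-score distance to this patient
    nearest = sorted(available, key=lambda j: abs(control_ps[j] - ps))[:4]
    matches[i] = nearest
    available -= set(nearest)  # match without replacement

n_controls = sum(len(v) for v in matches.values())
print(f"{len(matches)} treated patients matched to {n_controls} controls")
```

Matching without replacement, as here, yields the 22-vs-88 analysis cohort structure reported in the abstract; matching with replacement would instead reuse the closest controls.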
Background/Objectives: Epigenetic modifications, particularly DNA methylation, contribute to tumor progression and therapy resistance. Guadecitabine, a hypomethylating agent (HMA), has shown promising clinical activity when combined with carboplatin in preclinical models. We evaluated the combination of guadecitabine with carboplatin as a second-line treatment for extensive-stage small-cell lung cancer (SCLC; NCT03913455), one of the deadliest malignancies. Here, we report methylome changes in peripheral blood mononuclear cells (PBMCs) collected at baseline and during treatment from patients on the trial. Methods: PBMC DNA was analyzed using Infinium HumanMethylationEPIC v1.0 BeadChips. Data were processed; differentially methylated positions (DMPs) were identified and analyzed for pathway enrichment using bioinformatic approaches; and immune deconvolution analyses were conducted to investigate the impact on immune cell composition. Results: Direct comparison of PBMCs between cycle 2 day 5 (C2D5; post-treatment) and cycle 1 day 1 (C1D1; pre-treatment) revealed a greater number of hypomethylated DMPs (380 DMPs in C2D5 vs. C1D1 PBMCs; p < 0.05, |Δβ| > 20%). Moreover, when each was first compared with normal PBMCs from cancer-free controls, the number of hypomethylated DMPs was even greater in C2D5 than in C1D1 (1771 vs. 237 DMPs, respectively; p < 0.05, |Δβ| > 20%). Long interspersed nuclear element-1 (LINE-1) sequences were significantly hypomethylated in PBMCs after HMA treatment (C2D5 vs. C1D1). Pathway analysis of hypomethylated DMPs revealed significant alterations in key signaling pathways, including NF-κB, Rho GTPase, and pulmonary fibrosis pathways, in C1D1 vs. C2D5. Comparison of normal PBMCs with C1D1 PBMCs revealed changes in IL-3 signaling, Fcγ receptor-mediated phagocytosis, and molecular mechanisms of cancer. Deconvolution analysis revealed a greater percentage of monocytes in C1D1 vs.
normal PBMCs; after HMA treatment, percentages of monocytes and B cells decreased, while the eosinophil percentage increased in C1D1 vs. C2D5. Conclusions: HMA treatment has a global impact on PBMC methylomes in cancer patients. DNA methylation changes were associated with biological pathways related to PBMC function, and shifts in distinct immune cell populations were observed.
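The DMP criterion reported above (p < 0.05 with a beta-value difference above 20%) can be illustrated on synthetic data. This toy filter uses a simple per-site t-test; the actual EPIC-array pipeline involves normalization and moderated statistics not shown here, and all values below are invented.

```python
# Toy DMP filter: flag CpG sites where |delta-beta| > 0.20 and p < 0.05
# between post- and pre-treatment samples. Synthetic data only.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)
n_sites = 1000
pre = rng.beta(2, 2, size=(20, n_sites))    # 20 pre-treatment samples
post = rng.beta(2, 2, size=(20, n_sites))   # 20 post-treatment samples
post[:, :50] = np.clip(post[:, :50] - 0.35, 0, 1)  # hypomethylate 50 sites

delta_beta = post.mean(axis=0) - pre.mean(axis=0)
pvals = ttest_ind(post, pre, axis=0).pvalue
dmps = np.where((np.abs(delta_beta) > 0.20) & (pvals < 0.05))[0]
n_hypo = int(np.sum(delta_beta[dmps] < 0))
print(f"{len(dmps)} DMPs detected, {n_hypo} hypomethylated")
```

Counting how many flagged DMPs have negative delta-beta, as in the last line, mirrors how the abstract tallies hypomethylated positions after HMA treatment.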
Astrocytes undergo pronounced reactivity during traumatic brain injury (TBI); however, the temporal dynamics of this response and the signaling mechanisms regulating astrocyte proliferation remain incompletely defined. In this study, we characterized the spatiotemporal profile of astrocyte reactivity and proliferation in the hippocampus during TBI and investigated the involvement of mammalian target of rapamycin complex 1 (mTORC1) signaling in these processes. Using a mouse model of TBI, we found that injury triggered a rapid astrocytic response in the hippocampus, characterized by increased glial fibrillary acidic protein (GFAP) expression and morphological hypertrophy as early as 4 h post-injury. Astrocyte proliferation emerged subsequently, peaked during the acute phase (48 and 72 h), and declined to baseline levels at 7 days post-trauma, indicating a transient proliferative response during TBI. Concurrently, mTORC1 signaling was robustly activated in reactive astrocytes in the hippocampus and was specifically associated with proliferative reactive astrocytes during injury. Pharmacological inhibition of mTORC1 signaling with rapamycin significantly reduced reactive astrocyte proliferation during TBI without altering astrocytic hypertrophy. Together, these findings demonstrate that TBI induces a rapid but transient astrocyte activation and proliferation response in the hippocampus and that mTORC1 activation is required for the proliferation, but not the hypertrophic activation, of reactive astrocytes during traumatic brain injury.
Surgical intervention for grade 2 lumbar spondylolisthesis is routinely performed, but outcomes in older patients, in whom the disease is most prevalent, remain poorly understood. The aim of this study was to compare patient-reported surgical outcomes between age groups (< 65 years of age vs ≥ 65 years of age) with 5 years of follow-up. The authors hypothesized that patients would have sustained improvement in outcomes in response to surgical treatment for grade 2 spondylolisthesis regardless of age. The multicenter prospectively collected Quality Outcomes Database by the Spine CORe™ study group was retrospectively analyzed for patients who underwent arthrodesis for grade 2 lumbar spondylolisthesis. Across 14 high-enrolling sites, 328 patients with 81% follow-up were identified. Baseline and postoperative 3-month, 1-year, 2-year, and 5-year outcomes including numeric rating scale (NRS) back pain (score 0-10), NRS leg pain (score 0-10), Oswestry Disability Index (ODI), EQ-5D scores, and patient satisfaction using the North American Spine Society (NASS) index, were evaluated. These outcomes were compared between younger (< 65 years of age, n = 188) and older (≥ 65 years of age, n = 140) age groups using Wilcoxon rank-sum tests. EQ-5D and ODI scores were significantly improved postoperatively in the younger and older age groups (p < 0.001). ODI scores were not significantly different between the age groups at baseline (p = 0.37) or postoperatively at any time point (p > 0.05). EQ-5D scores were not significantly different between the younger and older patient groups at baseline (p = 0.47) or postoperatively at any time point (p > 0.05). NRS leg pain (p = 0.68) and back pain (p = 0.45) scores were not significantly different at baseline across age groups. NRS leg pain was not significantly different postoperatively (p > 0.05). Older patients had lower back pain scores (p = 0.03) at 3 months postoperatively, but not at any other time points (p > 0.05). 
Leg and back pain scores improved postoperatively up to 5 years of follow-up in all patients (p < 0.0001). A majority of the younger (83.5%) and older (89.5%) patients reported satisfaction with their surgical outcome up to 5 years after surgery, and postoperative NASS satisfaction scores were not significantly different between the younger and older age groups at any time point (p > 0.05). In response to surgical treatment, patients over 65 years of age have significant improvements similar to those of younger patients. Surgical treatment is a viable option for improvement regardless of age.
Belapectin reduced variceal development in a subgroup of patients with MASH cirrhosis. We report the efficacy and safety of belapectin in patients with MASH cirrhosis and portal hypertension without varices at baseline. NAVIGATE was a global Phase 2b trial. Patients were randomized to intravenous (IV) belapectin 2 or 4 mg/kg lean body weight or placebo for 18 months, stratified by type 2 diabetes status. The primary endpoint was the incidence of varices on esophagogastroduodenoscopy (EGD) or a composite endpoint (new varices, intercurrent events, or discontinuation) at 18 months with belapectin versus placebo in the Full Analysis Set (FAS). In a pre-specified analysis, new varices were evaluated in all patients who underwent EGD at baseline and 18 months. The per-protocol (PP) population comprised patients treated for 18 months with EGD at 18 months. Of 357 randomized patients, 291 completed treatment. Baseline characteristics were comparable across cohorts. In the FAS, 17.8% with placebo versus 10.1% with belapectin 2 mg/kg developed varices, a 43.2% reduction (p = 0.13). In the PP population, varices occurred in 22.3% with placebo versus 11.3% with belapectin 2 mg/kg, a 50% reduction (unadjusted p = 0.04). Belapectin 4 mg/kg had no significant effect on variceal development. For the composite endpoint at 18 months, no significant difference was observed between belapectin 2 mg/kg (p = 0.14) or 4 mg/kg (p = 0.261) and placebo in the FAS. Belapectin was well tolerated with no safety signals. Belapectin 2 mg/kg lowered development of new varices in MASH cirrhosis and portal hypertension. NCT04365868.
Studies have investigated the potential applications of Large Language Models (LLMs) in dental education. As LLMs evolve, it is necessary to determine the extent to which their accuracy improves before they can be integrated into dental curricula and clinical practice. This meta-analysis investigates the performance of LLM versions on text-based single-best-answer multiple-choice questions (MCQs) in dentistry over time and assesses the impact of question type and source on LLM accuracy. Standard guidelines for reporting systematic reviews and meta-analyses were followed. Study eligibility followed the PICO (Population, Intervention, Comparison, Outcomes) model. Included records were published between January 2022 and August 2025. Screening and data extraction were performed by two reviewers, with disagreements resolved by a third reviewer. Proportional meta-analysis was used to determine the pooled accuracy of LLM versions across studies. Meta-regression explored sources of heterogeneity. Twenty-seven articles met the inclusion criteria. Over time, LLM accuracy in answering dentistry MCQs has significantly improved (p.
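A proportional meta-analysis pools per-study proportions (here, MCQ accuracies) into a single weighted estimate. A minimal fixed-effect sketch using the logit transform with inverse-variance weights — the per-study counts below are hypothetical, not data from this review:

```python
import math

# Hypothetical (correct answers, total questions) per study
studies = [(42, 50), (88, 100), (150, 200)]

weights, logits = [], []
for k, n in studies:
    p = k / n
    logits.append(math.log(p / (1 - p)))   # logit of the accuracy
    weights.append(n * p * (1 - p))        # inverse of the logit's variance

# Inverse-variance weighted mean on the logit scale, back-transformed
pooled_logit = sum(w * l for w, l in zip(weights, logits)) / sum(weights)
pooled_acc = 1 / (1 + math.exp(-pooled_logit))
print(f"pooled accuracy ~ {pooled_acc:.3f}")
```

Published proportional meta-analyses typically use a random-effects model and report heterogeneity statistics as well; this fixed-effect version only illustrates the core pooling step.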
In reviewing the nomenclature of genus-group names of the Chrysomelinae of the Americas, we uncovered six issues that we resolve here: (1) We explain priority between Stilodes and Leptinotarsa. Both were described in the same work: Chevrolat, 1836. In synonymizing the two genera, Flowers (2004) became the First Reviser and gave priority to Stilodes. We restore genus Leptinotarsa Chevrolat, 1836 as distinct from Stilodes Chevrolat, 1836. (2) The work in which Phaedon Megerle von Mühlfeld, 1823 was described is suppressed for the purposes of zoological nomenclature, making the next person to use the name, Latreille, 1829, the valid authority for Phaedon. (3) Euparocha Dejean, 1836 is a nomen nudum, and while Motschulsky likely intended to keep Dejean's spelling, Euparochia Motschulsky, 1860 is the correct original spelling. (4) Lioplacis Agassiz, 1846 is an unjustified emendation of Leioplacis Chevrolat, 1843, and Lioplacis Chevrolat is a subsequent usage of the unjustified emendation. Lioplacis is not in prevailing usage, therefore Leioplacis Chevrolat, 1843 remains the correct original spelling. (5) The accent on Dr Juan Brèthes' name is often incorrect or missing. Brèthes described Henicotherus in volume 32 of the Revista Chilena de Historia Natural. Volume 32 claims it was published in 1928 but was published in 1929. The authority for Henicotherus is therefore Brèthes, 1929. (6) The authority previously known only as "Demay", 1838 refers to Dr Aloysius François De Mey (1793-1870); we provide a short biography, portrait, and updated list of the species described by De Mey, 1838.
Background/Objectives: Despite the decades-old ban on lead in fuel, plumbing, consumer goods, industrial processes, and various materials, it remains a public health threat due to its persistent nature. Zebrafish (Danio rerio) are highly effective for modeling several disorders, including those affecting neurological and behavioral functions, and are well-suited for assessing the impact of environmental toxins like lead. This study aimed to investigate the neurodevelopmental effects of embryonic lead exposure using the zebrafish model system. Methods: Embryos were exposed to lead acetate (PbAc) at concentrations ranging from 0.3 to 0.7 µg/mL using an exposure window of 6 to 48 h post-fertilization (hpf). Results: PbAc exposure produced sublethal teratogenic effects in a subset of larvae across concentrations, including tail and spinal deformities, craniofacial abnormalities, and uninflated swim bladder observed at 7 days post-fertilization (dpf). At 3 dpf, spontaneous circle swimming behavior suspected to be seizure-like was observed in the lead-exposed larvae and was more pronounced under light conditions in a dose-dependent manner. Electrophysiological recordings confirmed that larvae exhibiting circle swimming behavior had heightened neural activity, indicating a potential seizure-like phenotype driven by lead exposure. Conclusions: Our findings suggest that embryonic lead exposure leads to morphological defects and seizure susceptibility, demonstrating lead's neurotoxic potential during early development. Seizure-like behaviors occurred in a non-linear concentration-dependent manner with a photosensitive component, and elevated baseline neural excitability was confirmed by local field potential (LFP) recordings.