Structural and social determinants operating across chronosystems critically shape educational transitions and cognitive aging in India by influencing cognitive reserve across the life course. Drawing on Bronfenbrenner's Ecological Systems Theory and the Cumulative Disadvantage/Advantage framework, we conceptualize genderism as a structural and social determinant of cognitive health operating across multiple ecological levels - national (macrosystem), regional/state (mesosystem), and household/individual (microsystem) - within India's chronosystem. Using Wave 1 of the Longitudinal Aging Study in India (LASI, 2017-2018), we assessed cognitive health across five domains (memory, orientation, attention, object naming, and executive function) among older adults in 36 states and Union Territories of India. Three-level regression models examined how microsystem factors at the individual level (e.g., age, sex, education, childhood SES, widowhood, health behaviors) and household level (monthly per capita expenditure, indoor air quality, urban/rural residence) interact with mesosystem factors at the state level (e.g., women's education and employment rates) to shape cognitive health outcomes. Cognitive health outcomes showed wide gender gaps and regional sex disparities, with the largest gap among those with less than primary education (men: 24; women: 21), which closed among those with post-secondary education (women: 33; men: 33). Widowed women and those with poor health indicators had significantly lower cognitive health scores. At the microsystem household level, urban residence (β = 2.64) and cleaner cooking fuel use (β = 1.61), and at the mesosystem state level, the percentage of women with at least primary education (β = 1.82) and the percentage of women employed (β = 0.62), were associated with better cognitive health outcomes.
The findings reveal how genderism across chronosystems accumulates over the life course, highlighting the need for national- and state-level policies that enhance women's access to education, employment, health, and nutrition to mitigate gendered disparities in late-life cognitive health among Indian adults.
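As a rough illustration of the three-level structure described above (individuals nested in households nested in states), the sketch below simulates nested data and computes a crude state-level intraclass correlation. All effect sizes and group counts are hypothetical, and a real analysis would use a dedicated mixed-model estimator rather than this variance-of-means shortcut.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a three-level structure: individuals nested in households nested in states.
# All variance components below are invented for illustration.
n_states, hh_per_state, ind_per_hh = 20, 30, 2
state_eff = rng.normal(0, 2.0, n_states)               # mesosystem (state) effects
hh_eff = rng.normal(0, 1.5, (n_states, hh_per_state))  # microsystem (household) effects
scores = (state_eff[:, None, None]
          + hh_eff[:, :, None]
          + rng.normal(25, 4.0, (n_states, hh_per_state, ind_per_hh)))  # individual level

# Crude variance decomposition: share of total variance attributable to states.
state_means = scores.mean(axis=(1, 2))
icc_state = state_means.var() / scores.var()
print(round(icc_state, 3))
```

The state-level ICC is the quantity that motivates fitting a multilevel model in the first place: if it were near zero, a single-level regression would suffice.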
Competency-based medical education, structured around the 6 core competencies of the Accreditation Council for Graduate Medical Education (ACGME), underpins postgraduate training in ACGME-International (ACGME-I)-accredited institutions. Despite its central role, limited evidence exists regarding trainees' awareness, confidence, and educational preferences related to these competencies, particularly in the Middle East. This study aimed to assess postgraduate trainees' awareness and confidence across the 6 ACGME competencies and to explore their orientation preferences at Hamad Medical Corporation, Qatar's largest ACGME-I-accredited academic center. A cross-sectional electronic survey was distributed to all postgraduate trainees at Hamad Medical Corporation. The questionnaire assessed self-reported awareness and confidence across the 6 ACGME Core Competencies, perceived emphasis during training, and preferred formats and frequency of competency-based orientation. A total of 255 trainees completed the survey. Overall confidence was high across most competency domains, particularly in medical knowledge, interpersonal and communication skills, and professionalism. Systems-based practice and practice-based learning and improvement were the least familiar and least emphasized domains. Trainees who previously received structured competency orientation reported significantly higher confidence across all domains (P < .01). Patient care (66.7%) and medical knowledge (63.9%) were perceived as highly emphasized competencies, whereas systems-based practice was least emphasized (39.2%). Most trainees (76.5%) expressed interest in competency-focused orientation, favoring annual sessions delivered through online or blended learning formats. Postgraduate trainees demonstrate strong overall awareness of ACGME-I competencies; however, notable gaps persist in systems-based practice and practice-based learning and improvement.
Structured, recurring competency orientation is associated with higher trainee confidence.
Sleep quality declines with age and is a known contributor to multiple chronic health conditions, including Alzheimer disease. Emerging evidence suggests that certain electroencephalography (EEG) neural signatures measured during sleep may be predictive of cognitive decline in older adults. Sleep EEG signals are traditionally measured using bulky, rigid, and uncomfortable equipment in an unfamiliar laboratory setting, which can negatively impact sleep signals. Due to these limitations, sleep EEG data acquisition is typically limited to a single night. This study aimed to validate our recently developed portable, skin-like EEG monitoring patch for 7 nights in the home environment in a pilot sample of young and older adults by evaluating usability and acceptance, and replicating age-related differences in sleep architecture observed in the polysomnography literature. Eighteen young adults and 18 cognitively unimpaired older adults without sleep disorders were enrolled (data from 11 young adults and 12 older adults were included in the analyses) in a 7-night study during which they wore novel, gel-free, wireless, ultrathin, skin-conforming, fabric-based sleep-monitoring patches. These patches were self-applied to the forehead and face for optimal usability and comfort. The patches incorporate laser-cut mesh electrodes with low-profile electronics (including a rechargeable battery and amplifier) and transmit EEG signals to a participant-controlled, Bluetooth-enabled, tablet-based data acquisition app. An automated algorithm was used to stage sleep and to assess EEG sleep microarchitecture features commonly affected by aging for each participant. Averages across nights were computed for these sleep features for each participant. Young and older adults reported that the sleep patch was easy to use and comfortable to wear.
There was no loss of signal power over 7 nights of wear across participants (retained-data signal-to-noise ratio over the 7-d period: young adult, mean 20.69, SD 12.78, maximum 52.13, minimum 5.19; older adult, mean 22.10, SD 9.39, maximum 49.96, minimum 13.79). Most datasets not retained were lost due to poor reference electrode adhesion on the nose (75/101, 74% of lost datasets in young adults and 57/88, 65% in older adults). Trained sleep technologists verified that the retained datasets were of sufficient quality to be scored without difficulty. Expected age-group differences in sleep features were observed, including age-related reductions in stage N3 sleep (young adult, mean 18.55, SD 6.70; older adult, mean 10.40, SD 6.43; Mann-Whitney U=42.0; P=.01) and reduced sleep spindle density (young adult, mean 2.92, SD 2.24; older adult, mean 0.94, SD 1.33; Mann-Whitney U=45.0; P=.006). This study demonstrates that our novel, comfortable, wearable patch can reliably measure physiological sleep data over multiple nights at home in adults across the lifespan, thereby making multinight sleep assessment in cognitive aging studies and clinical research more accessible than traditional polysomnography. In future studies, the small, lightweight system, which is highly scalable, can be shipped inexpensively to participants' homes, making this technology and research accessible to individuals who may have difficulty traveling or who are hesitant to travel to a laboratory or clinic.
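The age-group comparisons above use the Mann-Whitney U test. A minimal sketch, using hypothetical per-participant N3 sleep percentages that loosely match the reported group means (the individual values themselves are invented):

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical N3 sleep percentages for 11 young and 12 older adults
# (values illustrative only, loosely matching the group means in the text).
young_n3 = np.array([18.2, 25.1, 12.4, 20.9, 16.7, 23.3, 11.8, 19.5, 27.0, 14.6, 17.9])
older_n3 = np.array([9.8, 4.2, 15.1, 11.3, 6.7, 13.0, 8.5, 12.2, 3.9, 10.6, 14.8, 7.4])

# Two-sided Mann-Whitney U test, the nonparametric comparison used for sleep features.
u_stat, p_value = mannwhitneyu(young_n3, older_n3, alternative="two-sided")
print(u_stat, p_value)
```

The Mann-Whitney test is a sensible default here because per-participant sleep-stage percentages in small samples need not be normally distributed.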
Brodalumab is a fully human monoclonal antibody targeting interleukin-17 receptor A (IL-17RA), providing a mechanistically distinct approach to inhibiting IL-17–mediated inflammation in plaque psoriasis and psoriatic arthritis (approved in Japan). Across pivotal randomized trials and extensive real-world studies, brodalumab demonstrates a rapid onset of action, high rates of complete skin clearance, durable efficacy, and consistent effectiveness across diverse patient subgroups, while randomized and real-world data also support meaningful improvements in joint outcomes among patients with concomitant psoriatic arthritis. This review synthesizes clinical trial and real-world evidence to contextualize the efficacy, safety, and clinical positioning of IL-17RA blockade with brodalumab across the spectrum of psoriatic disease.
Trophic connectivity, or trophic subsidies, refers to the transfer of matter, energy, and nutrients across both horizontal and vertical gradients in aquatic habitats. Seascape depth plays a critical role in regulating food webs and the coupling between benthic and pelagic pathways. On tropical shallow continental shelves (<30 m), benthic primary production often exceeds pelagic primary production due to the absence of light limitation. Conversely, mesophotic conditions on the outer continental shelf restrict light penetration, thereby limiting benthic production. Furthermore, oceanographic processes and local topography along the continental shelf and slope may enhance pelagic production near the shelf break and submarine canyons through upwelling. In this study, we investigated the role of the Lane snapper (Lutjanus synagris) in mediating benthic-pelagic coupling across a range of habitats, including coastal shallow seagrass meadows, mangroves, coral reefs, and a continental shelf gradient. We employed stable isotope analysis (δ¹³C, δ¹⁵N) of L. synagris, basal sources, and consumers from both benthic and pelagic pathways to model trophic position (TP) and assess trophic support. TP estimates were consistent regardless of whether basal sources or consumers served as baselines, although the model based on basal consumers produced narrower credible intervals. TP increased from inshore to continental shelf habitats, primarily reflecting increases in fish size; however, no significant differences were observed between shallow and mesophotic shelf areas. Mixing models based on basal consumer baselines further revealed that in seagrass meadows, coastal reefs, and mesophotic shelf habitats, the Lane snapper derives trophic support from both benthic and pelagic sources, whereas benthic support predominates in mangroves and shallow shelf habitats.
Pelagic support showed a positive correlation with depth across the continental shelf, which could be related to physical and geological processes, such as upwelling near the shelf break and through submarine canyons, that augment pelagic subsidies, particularly when benthic production is limited at greater depths. In summary, our findings underscore the significance of both physical-geological and biological drivers in shaping the trophic dynamics and ontogenetic habitat use of L. synagris. These insights emphasize the necessity of sustaining spatial connectivity among marine protected areas to maintain robust trophic interactions and overall ecosystem health.
Artificial intelligence (AI) has the potential to enhance patient safety, particularly in the prevention of in-hospital falls. Recent advances in sensor-based AI systems allow for the analysis of complex, multimodal data to generate real-time alerts, enabling health care professionals to intervene before a fall occurs. By shifting from reactive responses to proactive risk management, these technologies may enable reductions in fall incidence and improvements in care outcomes. As a result, hospitals across Europe are increasingly adopting such systems. Nevertheless, empirical evidence concerning their routine implementation remains limited, particularly concerning their impact on patient safety, clinical workflows, and the usage of health care resources. Addressing these gaps is essential for effective and sustainable integration into hospital care. This paper outlines the protocol for the multicenter, multimethod project SAFE (Safe AI-Assisted Fall Prevention Through Evidence), which investigates the implementation and impact of AI-assisted fall prevention in Swedish hospitals. The research project is a collaboration between Halmstad University and hospitals in the Västra Götaland Region (VGR) and will, during 2026-2028, investigate an ongoing large-scale AI system implementation in VGR hospitals, covering up to 2400 patient beds. Using surveys, interviews, observations, and a retrospective study, it will track the implementation and impact over time. Two learning laboratories involving patients, their relatives, and health care professionals will be conducted to codevelop strategies for the implementation of AI-assisted fall prevention. The project will provide evidence-based insights and practical guidance on AI-assisted fall prevention. The findings will be relevant not only to patients, health care professionals, and hospital organizations, but also to policymakers and stakeholders involved in the digital transformation of health care. 
Although VGR serves as the primary research setting, the project's results will inform future similar initiatives in Sweden and offer transferable lessons for other health care systems internationally. This study will contribute to the evidence base on AI-assisted fall prevention in health care, supporting the responsible and scalable integration of such systems across diverse health care environments.
The relationship between video game experience and cognitive plasticity remains a central focus of research, particularly given its potential applications in clinical rehabilitation. Although both first-person shooter (FPS) and multiplayer online battle arena (MOBA) games have been shown to enhance cognitive functions, the specific associations between the cognitive effects of different game genres and brain network structure remain unclear. This study aimed to examine whether long-term experience with FPS and MOBA games is associated with genre-specific patterns of cortical thickness covariation across brain regions. A total of 116 male participants (mean age 21.2, SD 1.9 y) were recruited via online advertisements for this cross-sectional study. On the basis of strict inclusion criteria (gaming experience >5 years, gaming frequency >5 hours per week, and ranking within the top 15%), participants were categorized into FPS players (n=39, 33.6%) and MOBA players (n=40, 34.5%). An additional group of healthy controls (n=37, 31.9%) with no gaming experience in the past 2 years was also included. High-resolution structural magnetic resonance imaging data were acquired using a 3-T scanner. Individualized differential structural covariance networks were constructed based on the cortical thickness values extracted from 68 brain regions using the Desikan-Killiany atlas. Statistical analysis included one-way ANOVA to identify significant structural covariance edges (SCEs), network-based statistic prediction analysis for weekly gaming hours, and support vector machine analysis for group classification. One-way ANOVA identified 30 significant SCEs across the 3 groups (P<.001, false discovery rate corrected). Post hoc analysis (P<.02, Bonferroni corrected) revealed that, compared to the MOBA and control groups, the FPS group exhibited 2 dominant networks: a temporo-fronto-parietal network anchored in auditory regions and a visuo-sensorimotor network. 
Both gaming groups showed enhanced SCEs in visual-attentional networks compared to the control group. The network-based statistic prediction analysis demonstrated that structural covariance matrices could effectively predict weekly gaming hours in FPS players (r=0.34, 95% CI 0.26-0.42). The positive edges primarily formed a temporo-fronto-parietal-occipital network, whereas the negative edges were centered on the entorhinal cortex. The support vector machine classifier successfully differentiated FPS players from controls (area under the curve=82.95%) and from MOBA players (area under the curve=72.37%). Long-term FPS and MOBA gaming experiences are associated with different brain structural network architectures. The uniqueness of FPS gaming lies in the extensive structural covariance between the primary auditory cortex and regions supporting visual attention and sensorimotor processing, which may reflect higher demands on cognitive skills. This suggests potential utility in auditory-visual rehabilitation and provides a theoretical basis for the assessment and selection of professional electronic sports players. However, the negative edges involving the entorhinal cortex in FPS players indicate that an overreliance on response learning strategies may come at the expense of the spatial memory system. Consequently, caution is warranted when applying such games to ameliorate age-related memory decline.
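The group-classification step can be sketched as a linear support vector machine scored by cross-validated AUC. The data below are simulated stand-ins for the structural covariance edge (SCE) features, not the study's data, and the group sizes and effect size are invented:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

# Toy stand-in for SCE features: 30 edges per subject, with a small mean shift
# injected for the "FPS" group so the classifier has signal to find.
n_per_group, n_edges = 40, 30
fps = rng.normal(0.3, 1.0, (n_per_group, n_edges))
controls = rng.normal(0.0, 1.0, (n_per_group, n_edges))
X = np.vstack([fps, controls])
y = np.array([1] * n_per_group + [0] * n_per_group)

# Linear SVM evaluated with 5-fold cross-validated AUC,
# analogous to the group-classification analysis described above.
clf = SVC(kernel="linear")
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
print(round(auc, 3))
```

Cross-validation matters here: an SVM scored on its own training data would report an optimistically inflated AUC.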
Patients with mucopolysaccharidosis (MPS) appear to have an increased risk of developing dental disease. This study aimed to evaluate the status of dental caries and dental anomalies among Chinese patients with different types of MPS. This retrospective study analyzed a consecutive cohort of 102 pediatric patients with MPS who visited the Department of Stomatology at the Capital Center for Children's Health between August 2010 and August 2025. Eligible patients were defined as those with a confirmed diagnosis of MPS who were aged ≤14 years at the time of their dental visit and had complete dental examination records available. Dental caries and anomalies were assessed through clinical records and radiographic data. Dental caries were observed in 55.9% (57/102) of patients, and no statistically significant difference was observed across the MPS subtypes (P=.72). Deep dentinal caries (d4-6mft) were observed in 40.2% (41/102) of the participants and contributed most to the total decayed, missing, and filled teeth index score. The overall prevalence of dental anomalies was 32.4% (33/102), with a statistically significant difference among MPS subtypes (P=.005). Patients with MPS type IV had a significantly higher risk of dental anomalies compared to those with MPS type II (odds ratio 6.32, 95% CI 1.55-28.28; P=.01), after adjusting for age and gender. The prevalence of dental anomalies differed significantly across MPS subtypes, while that of dental caries did not. These findings emphasize the need for early, targeted preventive care and tailored dental interventions to improve oral health outcomes in this population.
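The adjusted odds ratio reported above is the kind of quantity a logistic regression with covariates produces. A hedged sketch on simulated data (the subtype coding, effect sizes, and sample size are all invented; a very large C approximates an unpenalized maximum-likelihood fit):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Hypothetical data: anomaly (0/1) vs a subtype indicator, adjusting for age and sex.
n = 300
subtype_iv = rng.integers(0, 2, n)   # 1 = "type IV", 0 = "type II" (illustrative coding)
age = rng.uniform(2, 14, n)          # years, matching the pediatric age range
sex = rng.integers(0, 2, n)
logit = -2.0 + 1.8 * subtype_iv + 0.02 * age + 0.1 * sex   # invented true effects
anomaly = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([subtype_iv, age, sex])
model = LogisticRegression(C=1e9).fit(X, anomaly)          # near-unpenalized fit
adjusted_or = np.exp(model.coef_[0][0])  # OR for subtype, adjusted for age and sex
print(round(adjusted_or, 2))
```

Exponentiating the subtype coefficient while age and sex are in the model is what "adjusted for age and gender" means operationally.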
Pseudogenes have been regarded as nonfunctional byproducts of evolutionary processes. However, emerging evidence indicates that pseudogenes perform diverse biological roles in human physiology and pathology. We identified the BRCA1 pseudogene (BRCA1P1), a fusion pseudogene derived from the BRCA1 tumor suppressor and RPLP1 ribosomal protein genes, as an immunoregulatory RNA in breast cancer. In this study, we show that BRCA1P1 expression varies across multiple cancer cell types, with no significant association with BRCA1 or BRCA2 somatic mutations in breast and ovarian tumors. Interestingly, BRCA1P1 inhibition elicits antitumor effects in multiple cancer cell types and preclinical tumor models through an antiviral defense mechanism. Loss of BRCA1P1 induces antiviral gene expression, promotes apoptosis, and increases sensitivity to chemotherapy in various cancer cells, without inducing apoptosis in nonmalignant cells. This antiviral response also enhances macrophage-mediated phagocytosis of BRCA1P1-deficient cancer cells. Mechanistically, the majority of BRCA1P1 transcripts are circular RNAs and regulate NF-κB-driven antiviral gene expression. Furthermore, intratumoral expression of BRCA1P1 is elevated in tumor cells, compared to normal breast tissue, and its depletion significantly inhibits the growth of both primary and metastatic breast tumor organoids. Finally, in a humanized mouse model of breast cancer, BRCA1P1 loss stimulates antiviral gene expression and increases T cell infiltration into tumors. These findings support a critical role for BRCA1P1 in regulating innate immune defense and antitumor responses across cancer types, suggesting that targeting pseudogene-derived RNAs may offer innovative therapeutic strategies to enhance antitumor immunity in breast and other cancers.
Major evolutionary transitions in Homo (e.g., increased brain size, complex social behavior) are linked to reliance on high-quality foods. Increased meat consumption likely contributed to this shift, but whether hominins practiced carcass acquisition and processing strategies consistently across time and environments remains unclear. The Koobi Fora Formation spans much of the Plio-Pleistocene and is central to reconstructing the ecology of early Homo. However, zooarchaeological research has focused almost entirely on the Okote Member (~1.56 to 1.38 Ma), while the KBS Member (~1.87 to 1.56 Ma) has yielded important hominin fossils but relatively few faunal assemblages comparably well preserved for similar analysis. We present an analysis of FwJj 80 (~1.6 Ma), an assemblage from the KBS Member that preserves butchered fauna associated with early Homo fossils. Results show that behaviors documented in the Okote Member, including early access to carcasses, selective transport of limbs, and systematic marrow extraction within riparian settings, were also practiced at FwJj 80. This provides the most comprehensive and systematically analyzed evidence of such behaviors within the KBS Member, demonstrating continuity in carcass-exploitation patterns between the KBS and Okote Members. Comparisons with FLK Zinj (~1.84 Ma, Tanzania) and Kanjera South (~2.0 Ma, Kenya) demonstrate a consistent foraging niche sustained across varied environmental contexts, underscoring behavioral flexibility as central to early Homo's evolutionary success.
Kidney transplantation outcomes have improved in the short term, but long-term graft survival gains have plateaued. Aging donors and recipients with increasing comorbidities may alter contemporary allograft outcomes. We retrospectively analyzed 2076 consecutive kidney transplants performed at a single Canadian center from 1969 to 2024, representing up to 50 y of complete follow-up. All-cause graft survival (ACGS) and death-censored graft survival were compared across 5 transplant eras using era-stratified Cox regression. The median recipient age increased significantly across the eras from 35 to 54 y (P < 0.01) and the donor age from 28 to 45 y (P < 0.01), paralleled by 3-5-fold increases in pretransplant diabetes and obesity. Death-censored graft survival improved through the early 2000s but has since plateaued, whereas ACGS declined in the modern era (2018-2024) compared with the 1998-2009 peak (P = 0.007). Death with function accounted for >60% of graft losses in recent years, with infectious deaths rising from 21% to 45% (P < 0.01). Increasing donor and recipient age, comorbidities, and delayed graft function independently predicted inferior survival. These findings reveal a reversal in ACGS gains in the contemporary era, highlighting the need for precision immunosuppression strategies tailored to the aging, comorbid transplant population.
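Era-stratified graft-survival comparisons of this kind are typically built on Kaplan-Meier estimates alongside Cox regression. A minimal Kaplan-Meier estimator on toy follow-up data (not the study's cohort):

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.

    times: follow-up times; events: 1 = graft loss, 0 = censored.
    Returns a list of (time, survival probability) pairs.
    """
    order = np.argsort(times)
    times = np.asarray(times)[order]
    events = np.asarray(events)[order]
    surv = 1.0
    curve = []
    for t in np.unique(times):
        d = np.sum((times == t) & (events == 1))  # events at time t
        n = np.sum(times >= t)                    # at risk just before t
        if d > 0:
            surv *= 1 - d / n                     # multiply in the step factor
        curve.append((t, surv))
    return curve

# Toy follow-up times (years) with a mix of graft losses and censoring.
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 0, 0])
print(curve)
```

Censored subjects (event = 0) leave the risk set without forcing a step down, which is exactly why the estimator handles incomplete follow-up: here survival drops to 0.8 at year 1, holds through the censoring at year 2, then drops again at year 3.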
Early-stage diagnosis of paroxysmal atrial fibrillation (PAF) is challenging owing to its asymptomatic nature. However, the genetic factors underlying PAF and the predictive utility of polygenic risk scores (PRSs) for PAF in Asian populations remain elusive. We aimed to explore PAF-associated genetic variants in a Japanese cohort and evaluate the predictive performance of PAF-specific PRSs. This study included 2,604 participants. Following exclusion, quality control, and genotype imputation, a genome-wide association study (GWAS) was conducted. The predictive performance of 30 sets of PRS models constructed across various thresholds was evaluated using three machine learning methods. Model performance was assessed using area under the curve (AUC) and SHapley Additive exPlanations (SHAP). The GWAS using 1,038 PAF cases and 744 controls identified 82 genome-wide significant variants (P < 5 × 10⁻⁸), all on chromosome 4q25. Of these, 80 variants clustered upstream of PITX2, and two were located in LINC01438. Fine mapping identified two independent intergenic signals, with rs2200732 as the lead single-nucleotide polymorphism. The best PRS-only model achieved an AUC of >0.70, which improved up to 0.737 in additive models incorporating both PRS and clinical variables. SHAP analysis consistently ranked PRS as the most influential predictor among all variables included in this study. These results suggest that genetic risk, particularly at the established 4q25/PITX2 locus, contributes substantially to PAF susceptibility in this Japanese cohort and that PRS may improve early risk stratification when integrated with clinical risk factors.
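The PRS-only versus additive (PRS plus clinical variables) comparison can be sketched as two AUC evaluations on simulated data. The effect sizes, covariates, and sample size below are invented, not the study's:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)

# Simulated cohort: a standardized polygenic risk score plus two clinical covariates.
n = 1000
prs = rng.normal(0, 1, n)
age = rng.normal(60, 10, n)
bmi = rng.normal(24, 3, n)
risk = 0.9 * prs + 0.03 * (age - 60) + 0.05 * (bmi - 24)  # invented true effects
case = rng.random(n) < 1 / (1 + np.exp(-risk))

# PRS-only discrimination: the score itself can be used directly as the predictor.
auc_prs = roc_auc_score(case, prs)

# Additive model: PRS plus clinical variables in one logistic regression.
X = np.column_stack([prs, age, bmi])
additive = LogisticRegression().fit(X, case)
auc_add = roc_auc_score(case, additive.predict_proba(X)[:, 1])
print(round(auc_prs, 3), round(auc_add, 3))
```

The pattern in the abstract (PRS-only AUC >0.70, rising when clinical variables are added) corresponds to `auc_add` exceeding `auc_prs` when the covariates carry independent signal.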
Vascular access devices (VADs), including both peripheral intravenous (IV) catheters and central VADs, are used across the continuum of health care to administer IV fluids, medications, and diagnostics. Needleless connectors (NCs) are a critical component of the infusion system, providing needle-free access to VADs, yet many nurses are unaware of their significant impact on VAD-related complications, including infection, thrombotic occlusion, and catheter malfunction. This article reviews the critical role of the nurse in identifying risks and applying best practices in NC care, including disinfection, clamping, flushing, and timely replacement. Proper technique and adherence to evidence-informed protocols are vital to maintaining VAD performance and reducing adverse events.
Spasticity is a frequent neuromuscular impairment associated with cerebral palsy, stroke, and spinal cord injury, commonly assessed using subjective clinical scales. This exploratory pilot study aimed to develop and preliminarily validate a multimodal instrument for the objective quantification and stratification of spasticity in nine individuals (3 female, 6 male) with upper-limb spasticity due to cerebral palsy (n = 5) or stroke (n = 4). A wearable system integrating surface electromyography (sEMG), inertial measurement units, and force-sensing resistors was designed to simultaneously capture muscle activation, joint kinematics, and generalized resistance force during standardized passive mobilizations. Six simple area-under-the-curve (AUC)-based indicators were derived: force, sEMG, and angular velocity under two conditions (R1, R2), each given a distinct weight depending on its contribution. Principal component analysis revealed that three latent components accounted for 83.86% of the total variance observed across participants. Based on these indicators, a Composite Index was constructed using min-max normalization and weighted linear aggregation. Within the pilot study, the Composite Index could differentiate between spasticity severity levels (F = 6.38, p = 0.0327, η² = 0.68), with sEMG activity during slow stretch (AUC sEMG R2) as the most influential contributing indicator. The proposed multimodal instrument demonstrates preliminary feasibility as a non-invasive and portable approach for objective spasticity quantification, warranting further validation in larger cohorts.
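The Composite Index construction (min-max normalization per indicator, followed by weighted linear aggregation) can be sketched directly. The indicator values and weights below are hypothetical, not the study's:

```python
import numpy as np

# Hypothetical indicator matrix: rows = participants, columns = the six
# AUC-based indicators (force, sEMG, angular velocity under R1 and R2).
indicators = np.array([
    [12.0, 0.8, 30.0, 15.0, 1.1, 28.0],
    [20.0, 1.5, 45.0, 24.0, 2.0, 40.0],
    [8.0,  0.4, 22.0, 10.0, 0.6, 20.0],
    [16.0, 1.2, 38.0, 19.0, 1.6, 33.0],
])
# Illustrative weights summing to 1 (the study weighted indicators by contribution).
weights = np.array([0.15, 0.25, 0.10, 0.15, 0.25, 0.10])

# Min-max normalization per indicator column, then weighted linear aggregation.
mins, maxs = indicators.min(axis=0), indicators.max(axis=0)
normalized = (indicators - mins) / (maxs - mins)
composite = normalized @ weights
print(np.round(composite, 3))
```

Min-max scaling puts indicators with very different physical units (newtons, microvolts, degrees per second) on a common 0-1 range, so the weights, not the units, determine each indicator's influence; the participant at the minimum on every indicator scores 0 and the one at the maximum scores 1.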
Highly sensitive LC-MS/MS methods were developed and validated to quantify nirmatrelvir (NMR). Although the plasma pharmacokinetics (PK) of NMR have been well characterized, its distribution into other biologically relevant compartments, such as cerebrospinal fluid (CSF), peripheral blood mononuclear cells (PBMCs), and tissues, remains poorly understood. Quantitative assessment of NMR across these matrices is essential for evaluating central nervous system and intracellular exposure. To address this gap, we developed and fully validated three independent LC-MS/MS assays for the quantification of NMR in rat plasma, CSF, and PBMC matrices following FDA bioanalytical method validation guidelines. Calibration ranges were 20-10,000 ng/mL for plasma, 1.00-250 ng/mL for CSF, and 0.100-5.00 ng/mL for PBMCs, using NMR-D9 as the internal standard. Matrix-specific extraction procedures were optimized to address physicochemical and protein-binding differences, including a methanolic ammonium hydroxide treatment to mitigate adsorption losses in CSF. Chromatographic separation was achieved on a C18 column with a 60:40:0.1 (v/v/v) acetonitrile:water:formic acid mobile phase, and detection was performed by mass spectrometry in positive multiple-reaction-monitoring mode. All assays demonstrated excellent linearity (r² > 0.99), with precision and accuracy within acceptance criteria (<15% CV and deviation; <20% at the LLOQ) and no significant matrix interference or ion suppression. Stability testing confirmed consistent performance under short-term, freeze-thaw, and long-term conditions. These validated assays provide sensitive, selective, and reproducible quantification of NMR across multiple matrices and will facilitate preclinical pharmacokinetic studies aimed at understanding its distribution in sanctuary compartments relevant to SARS-CoV-2 infection.
A cross-sectional study. This study aimed to investigate cognitive impairment in degenerative cervical myelopathy (DCM) and examine its relationship with radiographic spinal cord compression. Degenerative cervical myelopathy is a leading cause of chronic non-traumatic spinal cord injury. While its motor and sensory manifestations are well established, the potential impact on cognitive function remains underexplored. A total of 965 participants were enrolled: 383 DCM patients (Group A), 122 cervical spondylotic radiculopathy (CSR) patients (Group B), and 460 healthy controls (Group C). Cognitive performance was evaluated with the Montreal Cognitive Assessment (MoCA), the Mini-Mental State Examination (MMSE), and the Basic Cognitive Aptitude Test (BCAT). Propensity-score matching (A:B:C=2:1:2) was used to balance age, sex, and education; additional stratified analyses by age (≤50, 51-60, 61-70, and >70 years) and education (≤6, 7-12, and ≥13 years of education) were performed. Compression ratio (CR) and maximum spinal cord compression (MSCC) were measured on cervical MRI. Correlation analyses were used to explore the association between radiographic spinal cord compression and cognitive function. After matching, DCM patients exhibited significantly lower MoCA (20.61 ± 3.76) and MMSE (26.23 ± 2.84) scores than both the CSR and control groups (all P < 0.001); this disadvantage persisted across every age and educational stratum. MSCC correlated negatively with MoCA (r = -0.118, P = 0.022) and MMSE (r = -0.124, P = 0.017), with stronger associations in single-level DCM (MoCA r = -0.218, P = 0.008; MMSE r = -0.237, P = 0.004). The number of compressed segments did not influence global cognition. Cognitive impairment is significantly associated with DCM and is influenced by age, education, and the degree of spinal cord compression.
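The correlation analyses pairing MSCC with cognitive scores amount to Pearson correlations. A sketch on simulated data with an invented negative slope (not the study's measurements):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)

# Hypothetical data: maximum spinal cord compression (MSCC, as a fraction)
# vs MoCA score, with a weak negative relationship built in for illustration.
n = 200
mscc = rng.uniform(0.1, 0.6, n)
moca = 24 - 8 * mscc + rng.normal(0, 2.5, n)  # invented slope and noise level

# Pearson correlation coefficient and two-sided p value.
r, p = pearsonr(mscc, moca)
print(round(r, 3), round(p, 4))
```

As in the abstract, a small-magnitude r can still reach significance when the sample is large, which is why the effect-size magnitude deserves attention alongside the p value.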
Healthy aging has emerged as a global priority. However, older adults' participation in health promotion programs remains low, and traditional health promotion models have achieved limited success in fostering sustained engagement among this population. Mobile health (mHealth)-based gamification interventions offer a promising way to address these challenges. However, no published reviews support or oppose the use of mHealth-based gamification interventions as health promotion strategies in older adults. The study aimed to identify mHealth interventions using gamification to promote health among older adults. Our scoping review was conducted following the Joanna Briggs Institute recommendations for scoping reviews and Arksey and O'Malley's framework. The process followed PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews) guidelines and PRISMA-S (Preferred Reporting Items for Systematic Reviews and Meta-Analyses Literature Search Extension) checklist. A comprehensive literature search was conducted across 8 databases: PubMed, Scopus, Web of Science, Embase, Cochrane Library, CINAHL, PsycARTICLES, and IEEE Xplore Digital Library, from their inception to December 10, 2025. Two reviewers independently screened titles, abstracts, and full texts via Rayyan, with disagreements resolved by a third reviewer. This scoping review identified 11 studies. Only 1 article was published before 2022. The interventions were found to improve enjoyment and motivation (n=5), cognitive function (n=3), physical activity (n=2), and digital literacy (n=2). Individual studies also reported improvements in mental health (n=1) and adherence (n=1), a reduction in suicidal ideation (n=1), improvements in physical function (n=1), the promotion of social engagement (n=1), and the identification of mild cognitive impairment (n=1). 
Game elements, ranked by frequency of use, were progress, challenges, goals, levels, rewards, sensation, storytelling or narration, leaderboards, surprise, and avatars. No study was found to use the game element of "social sharing." mHealth types included augmented and virtual reality-based training systems, wearable devices, mobile phones, tablets, and Windows platforms and devices. Notably, only 4 studies applied theoretical frameworks, and 3 did not describe their concrete approach to gamification. As the first scoping review to identify and map mHealth-based gamification interventions for older adults, this study highlights their potential as an innovative approach to health promotion. By systematically synthesizing evidence on intervention designs, gamification strategies, and preliminary health outcomes, it establishes a foundation for future inquiry. However, this review is limited by the small number of included studies, precluding broad generalizations. Future research should assess long-term impacts, integrate theoretical frameworks, establish reporting guidelines, design personalized social-interactive interventions, and expand to broader health domains. Ultimately, these insights provide targeted guidance for developing age-appropriate digital health solutions, contributing to the realization of active aging.
Understanding the multifactorial drivers of cardiovascular disease (CVD) risk in LGBTQ+ populations is critical to advancing equitable cardiovascular care. This review aimed to evaluate and synthesize empirical evidence on factors influencing CVD risk among sexual minority populations. A systematic review was conducted following PRISMA guidelines. Five databases (PubMed, Scopus, MEDLINE, ScienceDirect, and ProQuest) were searched for studies published between 2019 and 2024 that examined CVD risk factors among LGBTQ+ adults. Eligible studies were appraised using the Joanna Briggs Institute critical appraisal tool, and findings were synthesized using a convergent integrated approach. Twenty studies met the inclusion criteria, most of which were cross-sectional and conducted in high-income countries. Identified CVD risk factors were classified into six domains: behavioural (e.g., nicotine use, health behaviours, substance use), blood biomarkers (e.g., lipids or total cholesterol, HbA1c, C-reactive protein), physical (e.g., BMI, blood pressure, age), comorbidities (e.g., metabolic syndrome, COPD, HIV), psychological (e.g., positive and negative factors), and social (e.g., discrimination, loneliness, marital status). Bisexual, lesbian, and gay individuals were most frequently represented. Across studies, minority stress and adverse social determinants were consistently associated with elevated CVD risk. Sexual minority populations face disproportionate cardiovascular risk shaped by behavioural, biological, and psychosocial stressors. These findings highlight the need for inclusive, culturally competent nursing practice and identity-informed screening strategies. Nurses should integrate sexual orientation and gender identity into cardiovascular risk assessments, adopt trauma-informed and resilience-promoting approaches, and advocate for policies that reduce structural barriers to equitable care.
No patients or members of the public were directly involved in the design, conduct or reporting of this systematic review.
Obesity is a major global health concern, and scalable digital solutions are urgently needed. While digital lifestyle interventions (DLSIs) have shown promise, prior meta-analyses often included hybrid formats with human support, limiting insights into the effectiveness of fully digital interventions. This study aimed to evaluate the independent effects of stand-alone DLSIs (interventions delivered exclusively via digital platforms, without in-person or adjunctive support) on anthropometric and dietary outcomes in adults with overweight or obesity. We searched MEDLINE, Embase, PsycINFO, Web of Science, and the Cochrane Library from inception through March 4, 2026. Eligible studies were randomized controlled trials (RCTs) evaluating stand-alone DLSIs in adults with overweight or obesity. Interventions were included if they targeted diet or physical activity exclusively through digital platforms. We included fully automated, asynchronous, or one-to-many synchronous systems without individualized support. Studies involving hybrid interventions (including one-to-one synchronous human interaction), nonadult populations, or non-RCT designs were excluded. Two independent reviewers performed study selection and data extraction. Risk of bias was assessed using the Cochrane Risk of Bias 2.0 tool (Cochrane Bias Methods Group). Meta-analysis used a random-effects model with the Hartung-Knapp-Sidik-Jonkman method, and heterogeneity was assessed using the I2 statistic. The certainty of evidence was evaluated using the Grading of Recommendations, Assessment, Development, and Evaluation approach. A total of 19 RCTs involving 3556 participants were included.
Stand-alone DLSIs significantly improved anthropometric outcomes compared to controls (standardized mean difference 0.26, 95% CI 0.14-0.38; 95% prediction interval [PI] -0.16 to 0.68; P<.001; 19 studies; n=3556; I2=56.1%), corresponding to an additional weight loss ranging from 2.62 kg to 6.55 kg, depending on the baseline body weight. Significant improvements were also found in dietary outcomes (standardized mean difference 0.26, 95% CI 0.04-0.48; 95% PI -0.29 to 0.81; P=.008; 8 studies; n=1365; I2=57.5%). Subgroup analyses for anthropometric outcomes revealed significant differences only by control group type (P<.001), with waitlist controls showing the largest effect. For dietary outcomes, no significant subgroup differences were found (P>.05). While most studies showed a low risk of bias, substantial statistical heterogeneity was observed in some outcomes. Consequently, the certainty of evidence for both outcomes was rated as moderate. This review is innovative as it is the first to isolate the pure efficacy of stand-alone DLSIs by excluding synchronous human support. Our findings provide moderate-certainty evidence that these tools are effective for weight management and dietary improvement without human intervention. While stand-alone DLSIs offer a highly scalable, cost-effective first-step intervention, the PIs included zero, and substantial heterogeneity was observed, suggesting that benefits may vary across settings. Future research should identify user characteristics that maximize engagement with unguided digital tools.
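The abstract's translation of a standardized mean difference (SMD) into kilograms is a simple back-conversion: the pooled SMD is multiplied by a standard deviation on the raw scale. A minimal sketch, where the SD values are illustrative assumptions and not necessarily the ones the review used to derive its 2.62-6.55 kg range:

```python
def smd_to_raw(smd, sd):
    """Re-express a standardized mean difference on the raw scale
    by multiplying it by an assumed pooled standard deviation."""
    return smd * sd

smd = 0.26  # pooled effect reported for anthropometric outcomes

# Hypothetical pooled SDs of body weight (kg); varying the SD shows
# why the raw-scale weight loss depends on the study population
for sd in (10.0, 25.0):
    print(f"SD = {sd:.0f} kg -> additional loss {smd_to_raw(smd, sd):.2f} kg")
```

Because the SD varies with baseline body weight across populations, the same SMD of 0.26 maps to a range of absolute weight losses rather than a single figure.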
Sacroiliac (SI) joint dysfunction is a frequently underrecognized source of low back pain, implicated in 15%-30% of cases across select populations. Unilateral SI joint fusion remains the most commonly performed technique, utilizing either iliosacral screws or, more recently, through-and-through (TNT) screws. A 65-year-old woman with a history of failed right-sided SI joint fusion underwent revision surgery with bilateral TNT screw fixation due to worsening symptoms. Follow-up imaging confirmed solid right-sided arthrodesis. However, progressive haloing and sclerosis were observed around the left iliac portion of the cephalad screw. This case represents the first reported instance of persistent motion and screw haloing with TNT screw fixation using two spanning screws. These findings highlight a potential limitation of TNT constructs due to load transfer, stress shielding, and asymmetric osseous integration, particularly in patients with a history of prior unilateral SI joint fusion. https://thejns.org/doi/10.3171/CASE25668.