Extreme heat is a complex public health problem driven by interactions at the individual, community, health system, and policy levels. Participatory systems science engages interested parties to examine the interconnected factors driving heat-related illness and to develop locally tailored adaptation strategies. This article describes how local government agencies and community members from an area with high heat exposure in King County, Washington, engaged in the participatory systems science approach of group model building to co-develop extreme heat solutions during 2024-25. Recommended solutions included community education on heat risk; health system policies that increase access to health care; policy-informed infrastructure changes that expand access to green space; and lifesaving direct support, including distribution of cool kits to unhoused people and energy assistance programs. The insights generated can inform heat adaptation efforts across jurisdictions. The methods described offer a scalable approach to co-designing policies and interventions that can inform national and global climate resilience strategies to reduce health risks from climate-related events.
Healthy aging has emerged as a global priority. However, older adults' participation in health promotion programs remains low, and traditional health promotion models have achieved limited success in fostering sustained engagement among this population. Mobile health (mHealth)-based gamification interventions offer a promising way to address these challenges. Yet no published review has synthesized the evidence for or against mHealth-based gamification interventions as health promotion strategies in older adults. This study aimed to identify mHealth interventions using gamification to promote health among older adults. Our scoping review was conducted following the Joanna Briggs Institute recommendations for scoping reviews and Arksey and O'Malley's framework. The process followed the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews) guidelines and the PRISMA-S (Preferred Reporting Items for Systematic Reviews and Meta-Analyses Literature Search Extension) checklist. A comprehensive literature search was conducted across 8 databases: PubMed, Scopus, Web of Science, Embase, Cochrane Library, CINAHL, PsycARTICLES, and IEEE Xplore Digital Library, from their inception to December 10, 2025. Two reviewers independently screened titles, abstracts, and full texts via Rayyan, with disagreements resolved by a third reviewer. This scoping review identified 11 studies. Only 1 article was published before 2022. The interventions were found to improve enjoyment and motivation (n=5), cognitive function (n=3), physical activity (n=2), and digital literacy (n=2). Individual studies also reported improvements in mental health (n=1) and adherence (n=1), a reduction in suicidal ideation (n=1), improvements in physical function (n=1), the promotion of social engagement (n=1), and the identification of mild cognitive impairment (n=1). 
Game elements, ranked by frequency of use, were progress, challenges, goals, levels, reward, sensation, storytelling or narration, leaderboard, surprise, and avatar. No study used the game element of "social sharing." mHealth types included augmented and virtual reality-based training systems, wearable devices, mobile phones, tablets, and Windows platforms and devices. Notably, only 4 studies applied theoretical frameworks, and 3 did not describe their concrete approach to gamification. As the first scoping review to identify and map mHealth-based gamification interventions for older adults, this study highlights their potential as an innovative approach to health promotion. By systematically synthesizing evidence regarding intervention designs, gamification strategies, and preliminary health outcomes, it establishes a foundation for future inquiry. However, this review is limited by the small number of included studies, precluding broad generalizations. Future research should assess long-term impacts, integrate theoretical frameworks, establish reporting guidelines, design personalized social-interactive interventions, and expand to broader health domains. Ultimately, these insights provide targeted guidance for developing age-appropriate digital health solutions, contributing to the realization of active aging.
Migraine headaches are considered chronic when they occur on 15 or more days per month. Newer medications are available for prevention. This review explored the effectiveness and tolerability of pharmacologic prophylaxis for chronic migraine. Medline, Embase, Cochrane Central Register of Controlled Trials, PsycINFO, Web of Science, and Scopus were searched to October 2025. Independent paired reviewers identified randomized controlled trials (RCTs) of prophylactic pharmacologic interventions for adults with chronic migraine. Paired reviewers independently extracted data and assessed risk of bias using the Cochrane Risk of Bias 2 tool. Random-effects meta-analyses were performed, and certainty of evidence was assessed using the GRADE (Grading of Recommendations Assessment, Development and Evaluation) approach. The review included 43 RCTs (14 725 participants). High- and moderate-certainty evidence suggests that eptinezumab (mean difference [MD], -2.34 [95% CI, -2.76 to -1.92]), erenumab (MD, -2.08 [CI, -2.82 to -1.33]), fremanezumab (MD, -1.77 [CI, -2.45 to -1.09]), galcanezumab (MD, -2.00 [CI, -2.96 to -1.04]), and atogepant (MD, -2.10 [CI, -3.06 to -1.14]) reduce monthly migraine headache days by approximately 2 days versus placebo. Botulinum toxin may slightly reduce monthly migraine days (MD, -1.34 [CI, -2.27 to -0.41]; low certainty), whereas rimegepant probably has no effect (MD, -1.20 [CI, -2.59 to 0.19]; moderate certainty). Galcanezumab probably reduces dropout due to any cause versus placebo (relative risk [RR], 0.52 [CI, 0.33 to 0.83]; moderate certainty). Botulinum toxin probably increases discontinuation due to adverse events (RR, 3.36 [CI, 1.75 to 6.45]; moderate certainty). Studies on topiramate, valproate, and propranolol were sparse and had high risk of bias. Most trials had high risk of bias, with few available comparisons. Most calcitonin gene-related peptide-targeted therapies are probably effective for chronic migraine prophylaxis. 
Evidence for botulinum toxin, propranolol, topiramate, and valproate mostly had high risk of bias. Funding: none (PROSPERO: CRD42023456915).
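Pooled effects like the mean differences above come from random-effects meta-analysis; the sketch below illustrates DerSimonian-Laird pooling on hypothetical per-study values (the review's GRADE workflow and data are not reproduced here):

```python
import math

# Hypothetical per-study mean differences (days) and standard errors;
# these are illustrative values, not data from the review.
md = [-2.3, -1.8, -2.1, -1.2]
se = [0.3, 0.4, 0.25, 0.5]

# Fixed-effect inverse-variance weights and pooled estimate
w = [1 / s**2 for s in se]
fe = sum(wi * yi for wi, yi in zip(w, md)) / sum(w)

# Cochran's Q and the DerSimonian-Laird between-study variance tau^2
q = sum(wi * (yi - fe) ** 2 for wi, yi in zip(w, md))
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(md) - 1)) / c)

# Random-effects weights, pooled estimate, and 95% CI
w_re = [1 / (s**2 + tau2) for s in se]
pooled = sum(wi * yi for wi, yi in zip(w_re, md)) / sum(w_re)
se_pooled = math.sqrt(1 / sum(w_re))
ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
print(round(pooled, 2), round(tau2, 3))
```

The between-study variance tau^2 widens the random-effects weights relative to fixed-effect pooling, which is why heterogeneous comparisons yield wider intervals.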
Pacific Island Jurisdictions are highly affected by climate change. Across the health sector, implementation of planned adaptation strategies has been uneven, constrained by limited human and financial resources and organizational capacity, inadequate climate-health risk assessments, and imprecise estimates about the effectiveness of climate-related interventions. To fill these gaps, this study assessed the degree of implementation of health adaptation activities in Pacific Island Jurisdictions, using a combination of implementation science and climate change adaptation frameworks. We found that Pacific Island Jurisdictions have made advances in the implementation of health adaptation activities such as establishing coordination mechanisms, building awareness, and conducting assessments. However, less progress has been made in operationalizing targeted policies, programs, and interventions, including monitoring, evaluation, and learning. Our findings can help increase resilience if practitioners (for example, public health professionals) and decision makers apply them to inform, and build support for, additional health adaptation investments.
Obesity is a major global health concern, and scalable digital solutions are urgently needed. While digital lifestyle interventions (DLSIs) have shown promise, prior meta-analyses often included hybrid formats with human support, limiting insights into the effectiveness of fully digital interventions. This study aimed to evaluate the independent effects of stand-alone DLSIs (defined as interventions delivered exclusively via digital platforms without in-person or adjunctive support) on anthropometric and dietary outcomes in adults with overweight or obesity. We searched MEDLINE, Embase, PsycINFO, Web of Science, and the Cochrane Library from inception through March 4, 2026. Eligible studies were randomized controlled trials (RCTs) evaluating stand-alone DLSIs in adults with overweight or obesity. Interventions were included if they targeted diet or physical activity exclusively through digital platforms. We included fully automated, asynchronous, or one-to-many synchronous systems without individualized support. Studies involving hybrid interventions, including one-to-one synchronous human interaction, nonadult populations, or non-RCT designs, were excluded. Two independent reviewers performed study selection and data extraction. Risk of bias was assessed using the Cochrane Risk of Bias 2.0 tool (Cochrane Bias Methods Group). Meta-analysis used a random-effects model with the Hartung-Knapp-Sidik-Jonkman method, and heterogeneity was assessed using I2 statistics. The certainty of evidence was evaluated using the Grading of Recommendations, Assessment, Development, and Evaluation approach. A total of 19 RCTs involving 3556 participants were included. 
Stand-alone DLSIs significantly improved anthropometric outcomes compared to controls (standardized mean difference 0.26, 95% CI 0.14-0.38; 95% prediction interval [PI] -0.16 to 0.68; P<.001; 19 studies; n=3556; I2=56.1%), corresponding to an additional weight loss ranging from 2.62 kg to 6.55 kg, depending on the baseline body weight. Significant improvements were also found in dietary outcomes (standardized mean difference 0.26, 95% CI 0.04-0.48; 95% PI -0.29 to 0.81; P=.008; 8 studies; n=1365; I2=57.5%). Subgroup analyses for anthropometric outcomes revealed significant differences only by control group type (P<.001), with waitlist controls showing the largest effect. For dietary outcomes, no significant subgroup differences were found (P>.05). While most studies showed a low risk of bias, substantial statistical heterogeneity was observed in some outcomes. Consequently, the certainty of evidence for both outcomes was rated as moderate. This review is innovative as it is the first to isolate the pure efficacy of stand-alone DLSIs by excluding synchronous human support. Our findings provide moderate-certainty evidence that these tools are effective for weight management and dietary improvement without human intervention. While stand-alone DLSIs offer a highly scalable, cost-effective first-step intervention, the PIs included zero, and substantial heterogeneity was observed, suggesting that benefits may vary across settings. Future research should identify user characteristics that maximize engagement with unguided digital tools.
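Re-expressing a standardized mean difference in kilograms only requires multiplying by an outcome standard deviation; a quick sketch with assumed SDs (the review's actual baseline SDs are not reported here):

```python
# Re-expressing a standardized mean difference (SMD) on the raw scale:
# raw difference = SMD x standard deviation of the outcome.
# The SD values below are assumptions for illustration only.
smd = 0.26
for sd_kg in (10.0, 25.0):  # plausible between-person SDs of body weight
    print(f"SD {sd_kg} kg -> additional weight change {smd * sd_kg:.2f} kg")
```

This is why the same SMD of 0.26 maps to a range of kilogram differences: populations with more weight variability translate the standardized effect into larger absolute changes.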
Fuping goat milk powder is a geographical-indication-protected food in China. To enable rapid and accurate identification of Fuping goat milk, this paper proposes an identification method based on nuclear magnetic resonance (NMR) spectroscopy and machine learning for small-scale geographical authentication. Deuterated chloroform was selected as the extraction solvent, and partial least squares discriminant analysis (PLS-DA), random forest (RF), and support vector machine (SVM) models were constructed using goat milk samples from seven different geographical origins in Shaanxi Province. The PLS-DA model performed unsatisfactorily, the RF model performed better, and the SVM model performed best, reaching 100% accuracy on 17 selected features with a 5-fold cross-validation accuracy of 94.7% ± 7.2%. The results show that samples from different geographic origins differ in chemical composition, particularly in unsaturated fatty acids such as conjugated linoleic acid.
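The 5-fold cross-validation reported above can be sketched generically. The snippet below is an illustrative stand-in (a nearest-centroid classifier instead of an SVM, synthetic 2-feature data instead of NMR spectra), not the study's pipeline:

```python
import random

# Synthetic two-class "spectral feature" data standing in for NMR features;
# a nearest-centroid classifier is a simple stand-in for the SVM.
rng = random.Random(0)
data = [([rng.gauss(0, 0.5), rng.gauss(0, 0.5)], 0) for _ in range(20)] \
     + [([rng.gauss(5, 0.5), rng.gauss(5, 0.5)], 1) for _ in range(20)]
rng.shuffle(data)

def centroid(points):
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def nearest(x, cents):
    # Return the label whose centroid is closest (squared Euclidean distance)
    return min(cents, key=lambda c: sum((a - b) ** 2 for a, b in zip(x, cents[c])))

# 5-fold cross-validation accuracy
k, accs = 5, []
for fold in range(k):
    test = data[fold::k]
    train = [d for i, d in enumerate(data) if i % k != fold]
    cents = {label: centroid([x for x, y in train if y == label]) for label in (0, 1)}
    correct = sum(nearest(x, cents) == y for x, y in test)
    accs.append(correct / len(test))
cv_acc = sum(accs) / k
print(f"5-fold CV accuracy: {cv_acc:.3f}")
```

Each of the 5 folds is held out once, and the mean (and spread) of the per-fold accuracies is what figures like "94.7% ± 7.2%" summarize.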
Facies-controlled heterogeneity, including fine-grained lenses and abrupt textural transitions, produces irregular and non-Gaussian DNAPL source zone architecture (SZA) that strongly influences mass transfer and long-term dissolution. Reconstructing both permeability and DNAPL saturation therefore requires estimating high-dimensional, coupled parameter fields that are poorly constrained by sparse hydraulic and tracer observations. Geophysical methods such as electrical resistivity tomography (ERT) and ground-penetrating radar (GPR) can supplement these data, yet ERT often lacks resolution in shallow, thinly layered systems, whereas GPR is sensitive to dielectric contrasts associated with facies and DNAPL distributions. We develop an inversion framework that integrates hydraulic-partitioning tracer tomography (HPTT) with GPR using a deep-learning parameterization and ensemble data assimilation. A convolutional variational autoencoder (CVAE) is trained to provide a non-Gaussian parameterization (prior) of permeability and DNAPL saturation. An ensemble smoother with multiple data assimilation (ESMDA) updates the latent variables z conditioned on hydrogeophysical observations while keeping the CVAE weights fixed. Synthetic experiments show that GPR resolves facies and SZA geometry more effectively than ERT, and that combining GPR with HPTT reduces permeability error by 7.9% and DNAPL saturation error by 25.7% relative to HPTT-only inversion, while lowering false positives and false negatives in SZA delineation. Enhanced SZA reconstruction further yields more reliable long-term predictions of DNAPL mass decay and dissolved-plume evolution. The results demonstrate the value of joint use of GPR and tracer data to characterize complex, non-Gaussian DNAPL source zones in shallow aquifers.
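The ESMDA step mentioned above follows a standard ensemble-smoother update; as a sketch in generic ESMDA notation (an assumption, since the paper's exact formulation is not given here), each ensemble member's latent vector $z_j$ is updated over $N_a$ assimilation steps with inflation coefficients $\alpha_a$:

\[
z_j \leftarrow z_j + C_{zd}\,\bigl(C_{dd} + \alpha_a C_e\bigr)^{-1}\bigl(d_{\mathrm{obs}} + \sqrt{\alpha_a}\,\varepsilon_j - g(z_j)\bigr),
\qquad \sum_{a=1}^{N_a} \frac{1}{\alpha_a} = 1,
\]

where $C_{zd}$ is the ensemble cross-covariance between latent variables and simulated data, $C_{dd}$ the simulated-data covariance, $C_e$ the observation-error covariance, $\varepsilon_j \sim \mathcal{N}(0, C_e)$ a perturbation, and $g(\cdot)$ the forward model composed with the fixed CVAE decoder. Updating $z$ rather than the fields themselves keeps the posterior samples on the CVAE's learned non-Gaussian prior.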
The relationship between video game experience and cognitive plasticity remains a central focus of research, particularly given its potential applications in clinical rehabilitation. Although both first-person shooter (FPS) and multiplayer online battle arena (MOBA) games have been shown to enhance cognitive functions, the specific associations between the cognitive effects of different game genres and brain network structure remain unclear. This study aimed to examine whether long-term experience with FPS and MOBA games is associated with genre-specific patterns of cortical thickness covariation across brain regions. A total of 116 male participants (mean age 21.2, SD 1.9 y) were recruited via online advertisements for this cross-sectional study. On the basis of strict inclusion criteria (gaming experience >5 years, gaming frequency >5 hours per week, and ranking within the top 15%), participants were categorized into FPS players (n=39, 33.6%) and MOBA players (n=40, 34.5%). An additional group of healthy controls (n=37, 31.9%) with no gaming experience in the past 2 years was also included. High-resolution structural magnetic resonance imaging data were acquired using a 3-T scanner. Individualized differential structural covariance networks were constructed based on the cortical thickness values extracted from 68 brain regions using the Desikan-Killiany atlas. Statistical analysis included one-way ANOVA to identify significant structural covariance edges (SCEs), network-based statistic prediction analysis for weekly gaming hours, and support vector machine analysis for group classification. One-way ANOVA identified 30 significant SCEs across the 3 groups (P<.001, false discovery rate corrected). Post hoc analysis (P<.02, Bonferroni corrected) revealed that, compared to the MOBA and control groups, the FPS group exhibited 2 dominant networks: a temporo-fronto-parietal network anchored in auditory regions and a visuo-sensorimotor network. 
Both gaming groups showed enhanced SCEs in visual-attentional networks compared to the control group. The network-based statistic prediction analysis demonstrated that structural covariance matrices could effectively predict weekly gaming hours in FPS players (r=0.34, 95% CI 0.26-0.42). The positive edges primarily formed a temporo-fronto-parietal-occipital network, whereas the negative edges were centered on the entorhinal cortex. The support vector machine classifier successfully differentiated FPS players from controls (area under the curve=82.95%) and from MOBA players (area under the curve=72.37%). Long-term FPS and MOBA gaming experiences are associated with different brain structural network architectures. The uniqueness of FPS gaming lies in the extensive structural covariance between the primary auditory cortex and regions supporting visual attention and sensorimotor processing, which may reflect higher demands on cognitive skills. This suggests potential utility in auditory-visual rehabilitation and provides a theoretical basis for the assessment and selection of professional electronic sports players. However, the negative edges involving the entorhinal cortex in FPS players indicate that an overreliance on response learning strategies may come at the expense of the spatial memory system. Consequently, caution is warranted when applying such games to ameliorate age-related memory decline.
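The reported area-under-the-curve values have a useful reading: the probability that a randomly chosen member of one group receives a higher classifier score than a randomly chosen member of the other. A minimal Mann-Whitney-style AUC on hypothetical decision values:

```python
# AUC as the Mann-Whitney probability that a positive-class score
# exceeds a negative-class score (ties count half).
# The scores below are hypothetical, for illustration only.
def auc(pos_scores, neg_scores):
    wins = sum((p > n) + 0.5 * (p == n) for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

pos = [0.9, 0.8, 0.4]  # e.g., decision values for FPS players
neg = [0.3, 0.5, 0.2]  # e.g., decision values for controls
print(f"AUC = {auc(pos, neg):.3f}")
```

An AUC of 0.5 corresponds to chance-level ranking, and 1.0 to perfect separation, which is why 82.95% versus controls indicates stronger separability than 72.37% versus MOBA players.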
Artificial intelligence (AI) has the potential to enhance patient safety, particularly in the prevention of in-hospital falls. Recent advances in sensor-based AI systems allow for the analysis of complex, multimodal data to generate real-time alerts, enabling health care professionals to intervene before a fall occurs. By shifting from reactive responses to proactive risk management, these technologies may enable reductions in fall incidence and improvements in care outcomes. As a result, hospitals across Europe are increasingly adopting such systems. Nevertheless, empirical evidence concerning their routine implementation remains limited, particularly concerning their impact on patient safety, clinical workflows, and the usage of health care resources. Addressing these gaps is essential for effective and sustainable integration into hospital care. This paper outlines the protocol for the multicenter, multimethod project SAFE (Safe AI-Assisted Fall Prevention Through Evidence), which investigates the implementation and impact of AI-assisted fall prevention in Swedish hospitals. The research project is a collaboration between Halmstad University and hospitals in the Västra Götaland Region (VGR) and will, during 2026-2028, investigate an ongoing large-scale AI system implementation in VGR hospitals, covering up to 2400 patient beds. Using surveys, interviews, observations, and a retrospective study, it will track the implementation and impact over time. Two learning laboratories involving patients, their relatives, and health care professionals will be conducted to codevelop strategies for the implementation of AI-assisted fall prevention. The project will provide evidence-based insights and practical guidance on AI-assisted fall prevention. The findings will be relevant not only to patients, health care professionals, and hospital organizations, but also to policymakers and stakeholders involved in the digital transformation of health care. 
Although VGR serves as the primary research setting, the project's results will inform future similar initiatives in Sweden and offer transferable lessons for other health care systems internationally. This study will contribute to the evidence base on AI-assisted fall prevention in health care, supporting the responsible and scalable integration of such systems across diverse health care environments.
Whole Slide Images (WSIs) are the gold standard for cancer diagnosis and prognosis. However, the enormous scale of WSIs presents significant challenges for effective information aggregation in survival analysis, limiting prognostic accuracy and clinical applicability. This study aims to develop a computationally efficient framework for accurate and interpretable survival prediction from WSIs. We propose HIPPA-DGCN, a novel framework that utilizes hyper-image patch clustering for feature distillation and a position-aware dense graph convolutional network for global context modeling. This architecture is designed to efficiently integrate pathological features with their spatial relationships. Extensive evaluation on seven public datasets (five from TCGA and two from CPTAC) demonstrates state-of-the-art performance. Our method achieves superior prognostic accuracy while drastically improving computational efficiency, reducing model parameters to 0.73M and computational cost to 0.08 GFLOPs per WSI. The model generates interpretable attention maps that highlight histopathological regions with significant prognostic relevance. HIPPA-DGCN provides an accurate, efficient, and interpretable solution for WSI-based survival analysis. Its lightweight architecture and robust performance make it particularly suitable for clinical deployment in resource-constrained environments, potentially enhancing cancer prognosis workflows.
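The graph convolution at the core of such models aggregates each patch's features from its spatial neighbors through a normalized adjacency matrix. Below is a toy forward pass of the standard propagation rule H' = ReLU(D^-1/2 (A + I) D^-1/2 · H · W); this is generic background, not HIPPA-DGCN's exact architecture, whose details are not given here:

```python
import math

# Toy graph convolution on a 3-node "patch graph".
# Adjacency, features, and weights are all hypothetical illustrations.
A = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]      # patch adjacency
H = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]   # 2-d patch features
W = [[0.5, -0.2], [0.3, 0.4]]              # "learned" weights (made up)

n = len(A)
A_hat = [[A[i][j] + (i == j) for j in range(n)] for i in range(n)]  # add self-loops
deg = [sum(row) for row in A_hat]
# Symmetrically normalized adjacency D^-1/2 (A + I) D^-1/2
A_norm = [[A_hat[i][j] / math.sqrt(deg[i] * deg[j]) for j in range(n)]
          for i in range(n)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y))) for j in range(len(Y[0]))]
            for i in range(len(X))]

# One layer: aggregate neighbor features, mix channels, apply ReLU
H_out = [[max(0.0, v) for v in row] for row in matmul(matmul(A_norm, H), W)]
print(H_out)
```

Stacking such layers lets prognostic signal propagate across spatially related tissue regions, which is what makes the resulting attention maps spatially interpretable.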
Land subsidence poses major threats to human and environmental systems in river deltas worldwide, increasing risks of flooding and damage to civil infrastructure. In deltaic settings, land subsidence can be induced by multiple superimposed processes, including autocompaction, groundwater depletion, and infrastructural surface loading. The quantification of each individual process is often uncertain, yet crucial for effective adaptation and mitigation. The Vietnamese Mekong Delta (VMD) is a prominent example of such a subsiding delta, with satellite-derived subsidence rates of up to 30 mm a⁻¹ and surface elevations largely below 1 m above mean sea level. By presenting a fully coupled flow-deformation model with geomechanical parameterization at high vertical resolution, this study, supported by local geodetic leveling observations, provides an unprecedentedly detailed local-scale assessment of land subsidence dynamics for the VMD. The simulation results indicate subsidence rates of 5-6 mm a⁻¹ due to groundwater depletion and local infrastructure loading. Additionally integrating one or multiple well-casing failures as localized subsurface disturbances in the model yields spatially heterogeneous subsidence patterns and increases local subsidence rates by an additional 1-20 mm a⁻¹, depending on the number of implemented failures. While well-casing failures are known consequences of land subsidence, the hypothesis-driven exploratory simulations employed here indicate that such damage may in turn accelerate subsidence by facilitating subsurface drainage pathways and local head equilibration between aquitards and tapped aquifers. This suggests that well-casing failures could contribute to heterogeneous and locally extreme subsidence dynamics. 
The results reveal significantly delayed subsidence responses to past groundwater depletion at the investigated site, underscoring the need for proactive water management strategies in the VMD, supported by comprehensive land subsidence modelling. The insights derived from this localized high-resolution analysis suggest that effective management will require preventing shallow aquifer depletion to avoid triggering the pronounced, yet largely unactivated, subsidence potential of the Holocene deposits.
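To first order, the compaction that such coupled flow-deformation models resolve layer by layer scales as Δb ≈ S_sk · b · Δh (skeletal specific storage × compressible thickness × head decline). A back-of-the-envelope sketch with assumed parameter values, not the study's calibrated ones:

```python
# First-order compaction estimate: delta_b = S_sk * b * delta_h.
# All parameter values below are illustrative assumptions.
S_sk = 5e-4     # inelastic skeletal specific storage (1/m), assumed
b = 20.0        # compressible layer thickness (m), assumed
delta_h = 10.0  # head decline (m), assumed
delta_b = S_sk * b * delta_h
print(f"compaction ~ {delta_b * 1000:.0f} mm")
```

Even this crude estimate shows why localized head equilibration through a failed well casing matters: it changes delta_h within aquitards that would otherwise drain very slowly.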
The origins and assembly of temperate biodiversity hotspots remain poorly understood. This knowledge gap is particularly evident in low-latitude montane regions of the Americas, where northern lineages have repeatedly colonized and diversified. Here we investigate the evolutionary history of oaks (Quercus) in the Americas, with a focus on their parallel radiation into Mexican and Central American montane forests. Using a time-calibrated phylogeny, we show that white oaks (Q. section Quercus) and red oaks (Q. sect. Lobatae) independently colonized Mexico c. 25 Mya. This coincides with a variety of ecosystem changes at the Oligocene-Miocene boundary, including aridification. Diversification analyses reveal that montane habitats acted as cradles of oak diversity, with higher speciation rates associated with movement into topographically complex, higher-elevation landscapes. Despite 20 Ma of independent evolution in Mexico and Central America, red and white oaks show remarkable convergent evolution in climate niche. Our results highlight how extrinsic factors (migration into novel environments) coupled with intrinsic niche lability can facilitate rapid diversification. Our study thus provides insights into the origins of temperate biodiversity and the evolutionary processes shaping species-rich montane ecosystems.
Low physical activity (PA) is a well-established risk factor for major depressive disorder (MDD). However, the temporal dynamics of PA preceding an incident clinical diagnosis of MDD remain poorly characterized, particularly using long-term, objective measures collected in real-world settings. This study aimed to characterize trajectories of wearable-measured PA during the year preceding incident MDD diagnosis and identify the timing of within-person changes. We conducted a retrospective nested case-control study using linked electronic health record and wearable (Fitbit) data from the All of Us Research Program. Adults with at least 6 months of valid Fitbit PA data in the 12 months preceding diagnosis were included. Incident MDD cases were identified based on a first electronic health record-recorded diagnosis and matched to MDD-free controls on age, sex, BMI, and calendar time of diagnosis, with up to 4 controls per case. Daily steps and moderate to vigorous PA (MVPA) were aggregated into monthly averages. Linear mixed-effects models were used to compare prediagnostic PA trajectories between cases and controls over a retrospective time scale from -12 to 0 months. Among cases, within-person contrasts were used to identify when PA levels first showed statistically significant deviations relative to levels observed 12 months before diagnosis. Exploratory analyses assessed heterogeneity by demographic factors. The analytic cohort included 4104 participants (n=829, 20.2% incident MDD cases and n=3275, 79.8% matched controls; n=3355, 81.7% women; median age 48.4, IQR 36.3-61.3 years). Compared with controls, individuals who developed MDD exhibited consistently lower overall PA and significant downward trajectories in both daily steps and MVPA during the year preceding diagnosis (global trajectory tests; P<.001 for both outcomes). Differences widened progressively over time, indicating accelerating declines as diagnosis approached. 
Among cases, statistically significant changes in daily step counts emerged approximately 4 months before diagnosis (-145, 95% CI -253 to -37 steps vs month -12; P=.02) and reached -428 (95% CI -531 to -326) steps at diagnosis (P<.001). Declines in MVPA emerged approximately 5 months before diagnosis (-2.48, 95% CI -4.32 to -0.64 minutes; P=.02) and reached -5.61 (95% CI -7.35 to -3.86) minutes at diagnosis (P<.001). Furthermore, exploratory analyses suggested heterogeneity in prediagnostic trajectories across demographic subgroups, including steeper declines among men, more pronounced reductions in activity intensity at older ages, and persistently lower activity levels with flatter trajectories among individuals with obesity. Unlike prior studies lacking objective PA assessment before MDD diagnosis, this study linked wearable and clinical data to characterize long-term prediagnostic trajectories in real-world settings. We observed sustained within-person declines emerging 4 to 5 months before diagnosis, providing insights into temporal dynamics preceding clinical recognition. These findings suggest that wearable-based monitoring may offer scalable early signals for risk stratification, prevention, and intervention for MDD.
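The within-person contrast logic (comparing each month's activity with the level 12 months before diagnosis, and flagging the first month whose 95% CI excludes zero) can be sketched on synthetic data. Everything below, including the size and onset of the simulated decline, is made up for illustration and is far larger than the real effects:

```python
import math
import random

# Synthetic illustration of the within-person contrast: flag the months
# whose mean change from month -12 has a 95% CI entirely below zero.
# The simulated decline (400 steps/month starting at month -4) is a
# deliberately exaggerated assumption to make the pattern visible.
rng = random.Random(42)
n, months = 100, list(range(-12, 1))
baselines = [rng.gauss(7000, 300) for _ in range(n)]

def steps(person, month):
    drift = -400 * (month + 5) if month >= -4 else 0.0
    return baselines[person] + drift + rng.gauss(0, 150)

series = {m: [steps(p, m) for p in range(n)] for m in months}
flagged = []
for m in months[1:]:
    diffs = [series[m][p] - series[-12][p] for p in range(n)]
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    half = 1.96 * sd / math.sqrt(n)
    if mean + half < 0:  # CI entirely below zero
        flagged.append(m)
print("months with significant decline:", flagged)
```

Pairing each month with the same person's month -12 value removes stable between-person differences, so the contrast isolates within-person change, as in the study's mixed-model contrasts.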
The translation of big data analytics and artificial intelligence (AI) into clinical decision support systems (CDSSs) has advanced from proof of concept to real-world clinical practice. AI-informed CDSSs show measurable improvements in diagnostic accuracy, risk stratification, resource use, and patient outcomes compared to traditional models, offering the potential to assist clinicians in managing symptom complexity and uncertainty in health care delivery. Despite this potential, access to large amounts of high-quality and granular data remains one of the most significant bottlenecks to AI-enabled CDSSs. We argue that as health care systems increasingly adopt data-driven decision support, addressing the challenges of data accessibility and protection is essential to realizing the full potential of AI in clinical medicine. We use selected case examples of AI-informed CDSSs in oncology, organ transplantation, diabetic retinopathy, epilepsy, spinal cord injury, rare disease diagnosis, and emergency medicine to illustrate opportunities and challenges related to AI's potential to improve patient outcomes. We discuss public and semipublic, medical institutional and commercial, and government and national data sources that are currently available for the development of CDSSs and highlight the practical and ethical constraints associated with these data. We consider alternative data resources and ways in which health care systems can strengthen data ecosystems to increase AI-driven CDSS efficacy and implementation to improve patient outcomes.
Immunomodulatory agents (IMiDs) are critical in treating multiple myeloma (MM). We aimed to identify patterns of adherence to IMiD treatment, patient characteristics associated with adherence, and the corresponding survival among Medicare beneficiaries with newly diagnosed MM. Using group-based trajectory modeling, we identified distinct adherence groups among patients with MM in the SEER-Medicare database. Multinomial logistic regression models were used to identify factors associated with each adherence group. Cox proportional hazards regression models were used to evaluate the impact of different adherence groups on survival. We identified four distinct adherence groups among 4,452 patients with newly diagnosed MM between 2007 and 2019: (1) persistently-high (33.1%), (2) quick-decline-then-increase (17.5%), (3) moderate-decline (21.2%), and (4) quick-decline (28.2%). Factors associated with the poor-adherence groups included earlier treatment initiation years, multiple comorbid conditions, hypercalcemia, renal failure, anemia, bone disease, and receipt of a low-income subsidy. Patients age ≥85 years had higher odds of being in groups 3 and 4 than those age 66-69 years. Additionally, non-Hispanic Black patients were more likely to be in the quick-decline group than non-Hispanic White patients. Mortality was significantly associated with adherence: patients in the poor-adherence groups had a >20% increased mortality risk compared with the persistently-high group. Older patients, low-income subsidy recipients, and those with a high comorbidity burden were less likely to adhere to IMiD treatment, and their survival outcomes were inferior to those of the high-adherence group. Targeted interventions to support adherence to IMiD treatment may improve survival for these populations with newly diagnosed MM.
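Group-wise survival contrasts of this kind rest on standard estimators; as an illustration unrelated to the SEER-Medicare data, a minimal Kaplan-Meier product-limit curve on hypothetical follow-up times:

```python
# Kaplan-Meier product-limit estimator: S(t) steps down by a factor of
# (1 - d_i / n_i) at each event time. Times/events below are hypothetical.
def kaplan_meier(times, events):
    """times: follow-up in months; events: 1 = death, 0 = censored."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    s, curve, at_risk = 1.0, [], len(times)
    for idx in order:
        if events[idx]:                  # step down only at observed events
            s *= 1 - 1 / at_risk
            curve.append((times[idx], s))
        at_risk -= 1                     # censored subjects leave the risk set
    return curve

curve = kaplan_meier([1, 2, 3, 4], [1, 0, 1, 1])
print(curve)
```

Censored subjects (events=0) shrink the risk set without stepping the curve down, which is how differential follow-up between adherence groups is handled before a Cox model estimates the adjusted hazard ratio. (This sketch handles tied event times naively; a production estimator would group ties.)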
Marma therapy, a traditional Ayurvedic practice, involves the precise stimulation of marmas (vital points that regulate prana [vital energy]) and is used to alleviate musculoskeletal pain and dysfunction. While historical texts describe marma's role in pain relief, no randomized controlled trials have evaluated its efficacy and safety in lumbar disc herniation (LDH)-related radiculopathy. This study aims to explore the efficacy and safety of marma therapy in a commonly occurring painful condition, namely, radiculopathy due to LDH. Selected patients with LDH and radiculopathy are randomized into 2 groups using a computer-generated random number sequence. The participants in group 1 (the trial arm) are treated with marma therapy for 4 weeks, and those in group 2 (the control arm) receive physiotherapy for 4 weeks. All the participants in both groups are given an oral medicine, trayodashanga guggulu, an Ayurvedic formulation, for 12 weeks. As of November 2025, a total of 90 patients have been enrolled in both groups. Data analysis is ongoing. The study will be reported following standard guidelines for reporting randomized controlled trials. Clinical trial results will be disseminated through conferences and publication in a peer-reviewed scientific journal. Marma therapy, if proven effective and safe in pain management, can improve the quality of life of patients with LDH. This protocol can be useful in designing large-scale studies to establish this noninvasive and safe treatment as an alternative modality for the management of neurological pain, such as radiculopathy.
Curiosity plays a fundamental role in human learning, development, and motivation, and emerging evidence suggests that reduced curiosity is linked to poorer mental health outcomes, including depressive symptoms (DS). However, to date, most curiosity research relies on self-report assessments and thus risks biased reporting. Virtual reality (VR), a novel tool increasingly used in mental health research and treatment, may offer ecologically valid insights into curiosity-driven behaviors while circumventing issues associated with self-report assessments, including demand characteristics and recall bias. The study aimed to enhance the assessment of curiosity by using a novel VR environment and to examine its relevance to DS. Specifically, we tested 2 hypotheses using a novel VR environment: first, that curiosity, as assessed through spontaneous exploratory interactions and behaviors in VR, positively correlates with self-reported curiosity, and second, that VR-based curiosity is inversely associated with DS. This exploratory study used an observational design that included 100 volunteers. All participants completed self-reported assessments of DS and curiosity before engaging in a novel VR scenario. Although progression in the virtual environment required solving cognitive tasks, these were embedded as structural elements rather than framed as the primary objective. Instead, participants' free explorations and interactions with objects formed the basis for the 4 curiosity metrics used in this study. After VR exposure, participants completed a questionnaire assessing cybersickness symptoms. Hypothesis 1 was not supported: only one curiosity metric, namely object interactions, was positively associated with one aspect of curiosity, the motivation to seek new knowledge and experiences. Moreover, this association lost significance after correction for multiple testing, warranting caution.
Results relating to hypothesis 2 indicated partial support: object interaction was significantly associated with DS while controlling for age, sex, and cybersickness levels. Sensitivity analyses showed no associations between object interactions and self-reported anxiety or stress symptoms. VR may be a powerful tool for assessing exploratory behaviors in a controlled yet ecologically valid environment that avoids issues related to self-report. However, whether such behaviors translate to established curiosity constructs warrants further research. This study also provides preliminary insight into how assessing exploratory interactions in VR may enhance the understanding of the etiology and assessment of DS, particularly in its early stages.
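The covariate-adjusted association tested for hypothesis 2 (object interactions and DS, controlling for age, sex, and cybersickness) amounts to a multiple linear regression. The sketch below runs that model on synthetic data; the variable names, effect sizes, and the negative sign of the simulated effect are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100  # matches the study's sample size; all values below are synthetic

# Synthetic stand-ins for the study's variables
age = rng.normal(40, 12, n)
sex = rng.integers(0, 2, n).astype(float)
cybersickness = rng.normal(0, 1, n)
object_interactions = rng.normal(20, 5, n)
# Simulated DS with an (assumed) inverse association with interactions
ds = -0.3 * object_interactions + 0.1 * cybersickness + 0.02 * age + rng.normal(0, 1, n)

# Multiple linear regression: DS ~ interactions + age + sex + cybersickness
X = np.column_stack([np.ones(n), object_interactions, age, sex, cybersickness])
beta, *_ = np.linalg.lstsq(X, ds, rcond=None)
adjusted_effect = beta[1]  # covariate-adjusted slope for object interactions
```

The coefficient `adjusted_effect` is the quantity whose sign and significance the study's hypothesis 2 analysis would examine (with a proper standard error and p value in practice).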
Blood-based biomarkers (BBMs) for Alzheimer disease (AD) offer greatly expanded access to biomarker-informed diagnosis in community settings that have previously lacked such resources. However, integrating them into current clinical diagnostic practice presents challenges, particularly in the diagnosis of mild cognitive impairment (MCI). Limited data are available to guide clinicians on when to test, how to counsel patients with MCI and a positive AD BBM result about their risk of progression to dementia, or which evidence-based dementia prevention strategies or treatments to recommend. This Perspective article outlines the key challenges faced by clinicians and patients when addressing a diagnosis of "MCI due to AD" and offers suggestions for appropriate pretest counseling and for future research to address current knowledge gaps.
Septic shock is associated with high mortality, yet evidence supporting adjunctive therapies remains limited. Methylene blue (MB) has been proposed as a rescue therapy targeting catecholamine-resistant vasodilation, but its impact on clinically meaningful outcomes is uncertain. Oncologic patients represent a clinically distinct subgroup with high disease severity and are underrepresented in prior studies. We evaluated the association between MB and outcomes in oncologic patients with septic shock requiring multiple vasopressors. We conducted a retrospective cohort study using the TriNetX Research Network, including adult oncologic patients with septic shock between January 1, 2015, and December 31, 2025. Septic shock requiring multiple vasopressors was defined as septic shock requiring norepinephrine plus at least one additional vasopressor (vasopressin or epinephrine). MB recipients were matched 1:1 with non-recipients using propensity score matching. The primary outcome was 30-day all-cause mortality. Secondary outcomes included acute kidney injury (AKI), renal replacement therapy (RRT), invasive mechanical ventilation, and vasopressor escalation within 7 days. Among 266 MB-treated patients and 10,163 controls, 262 balanced pairs were analyzed after matching. MB was associated with lower 30-day mortality (risk ratio [RR] 0.80, 95% CI 0.66-0.97). Favorable associations were observed for AKI and vasopressor escalation, whereas invasive mechanical ventilation did not differ between groups. In oncologic patients with septic shock requiring multiple vasopressors, MB use was associated with lower 30-day mortality; however, within the limitations of this retrospective study, these findings should be interpreted cautiously. Prospective randomized trials are warranted to clarify the role of MB in this high-risk population.
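The matching design described above (propensity scores estimated for MB receipt, then 1:1 matching of recipients to non-recipients) can be sketched as follows. This is a generic logistic-regression-plus-greedy-nearest-neighbor implementation on synthetic data; the abstract does not report the TriNetX analysis's covariate set or matching algorithm, so every detail here is an assumption.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Synthetic confounders and treatment assignment (illustrative only;
# these do not correspond to actual TriNetX variables)
age = rng.normal(60, 10, n)
severity = rng.normal(0, 1, n)
logit = -1.5 + 0.03 * (age - 60) + 0.5 * severity
treated = rng.random(n) < 1 / (1 + np.exp(-logit))

# Step 1: estimate propensity scores with logistic regression
X = np.column_stack([age, severity])
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: greedy 1:1 nearest-neighbor matching without replacement,
# matching hardest-to-match (highest propensity) treated patients first
order = np.argsort(-ps)
treated_idx = [i for i in order if treated[i]]
available = {i for i in range(n) if not treated[i]}
pairs = []
for t in treated_idx:
    best = min(available, key=lambda c: abs(ps[c] - ps[t]))
    available.discard(best)
    pairs.append((t, best))
```

Outcomes such as 30-day mortality would then be compared within the matched pairs, e.g., as a risk ratio of event rates in the treated versus matched-control groups.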
Despite significant advances, deep learning still faces inherent challenges arising from its black-box nature. Existing approaches usually employ explicit data distribution learning methods to enhance interpretability. However, such methods tend to overlook the relational constraints among tokens during interpretation and fall short on natural language understanding tasks. In this paper, we propose TWT, a textual white-box transformer for natural language understanding that optimizes a sparse rate reduction objective under token relational constraints. Specifically, a low-rank sparse embedding strategy (LSES) and a label-interacted mapping mechanism (LMM) in the preprocessing layer exploit the features of natural language, while multi-head subspace self-attention (MSSA) and a token-conditioned iterative shrinkage-thresholding algorithm (T-ISTA) in the transformer layer maximize rate reduction and sparsify feature representations. Extensive experiments on four widely used text classification datasets demonstrate that the proposed method outperforms state-of-the-art baselines and performs consistently while maintaining simplicity and interpretability. Beyond downstream supervised classification, we further investigate a self-supervised pretraining setting for TWT, in which structured textual embeddings are learned without explicit labels, complementing standard transformer architectures for interpretable representation learning in natural language understanding.
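The shrinkage-thresholding idea behind T-ISTA derives from classical ISTA for l1-regularized least squares. The sketch below shows only that generic soft-thresholding (sparsification) step; it does not reproduce the paper's token-conditioned variant, and the problem setup is a standard sparse-coding illustration, not the TWT architecture.

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of the l1 norm: shrink each entry toward zero by lam."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def ista(A, b, lam=0.1, step=None, iters=200):
    """Generic ISTA for min_z 0.5*||Az - b||^2 + lam*||z||_1.

    Alternates a gradient step on the quadratic term with the
    soft-thresholding step that produces sparse representations.
    """
    if step is None:
        # 1/L, where L = ||A||_2^2 is the Lipschitz constant of the gradient
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    z = np.zeros(A.shape[1])
    for _ in range(iters):
        z = soft_threshold(z - step * A.T @ (A @ z - b), step * lam)
    return z
```

In a white-box transformer layer, an unrolled iteration of this kind plays the role of a learned sparsification operator applied to token features, which is what makes the resulting representations interpretable by construction.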