The evidence base for telehealth treatments for mental health has grown substantially in the years following the COVID-19 pandemic. We conducted a narrative synthesis and meta-analysis of reviews investigating the efficacy of live, one-to-one telehealth interventions for common mental disorders. We searched for systematic reviews of randomized controlled trials or controlled before-and-after studies. Two independent reviewers screened potentially eligible references and extracted data from all eligible reviews. A meta-regression was conducted with reviews containing data suitable for quantitative analysis, and a narrative synthesis was undertaken for included reviews that did not. The search yielded 4180 references, from which 11 eligible reviews were identified. Seven systematic reviews with meta-analyses were included in the meta-regression. Only telehealth interventions for depression and anxiety disorders were identified in the eligible literature. A small but significant effect was observed for interventions targeting depression compared with active controls (SMD = -0.316, p = 0.003). The results of the meta-regression and narrative synthesis indicate that live one-to-one telehealth produces similar reductions in symptom severity across diagnoses and comparator groups. We identified that most of the research on telehealth for common mental disorders involves low-intensity, self-help tools with reduced clinician involvement, suggesting that much of the research conducted in this area aims to provide care at reduced cost, despite the importance of the therapeutic alliance in the provision of mental health care. Investigations of telehealth interventions targeting posttraumatic stress disorder in non-military populations are also needed.
This cross-sectional study explored how time-restricted eating (TRE) interacts with metabolic health and obesity (MH&O) status in relation to biological age indices of different organs. Data from the National Health and Nutrition Examination Survey (2003-2018) were analyzed, including 4890 participants. TRE strategies were assessed based on eating frequency and meal timing. Indices of organ-specific biological age (heart, kidney, liver, overall), the frailty index, Life's Essential 8, and the cardiometabolic index were evaluated. Metabolic dysfunction and obesity were associated with elevated indices of organ-specific biological age and impaired cardiovascular health, with metabolically unhealthy (MU) status related to more rapid advancement of cardiovascular biological age indices. Excessively long or short fasting durations were associated with a decline in indices of liver metabolic health and worsened cardiovascular risk markers. Moderate eating frequencies and fasting durations were associated with lower biological age indices and better health metrics across subgroups. The association between better cardiovascular health and healthy metabolism was more pronounced in individuals who ate breakfast on time. This study underscores the independent relationship of MH&O status with advancement in indices of organ-specific biological age and impaired cardiovascular health metrics. It also highlights the potential role of personalized TRE in modulating biological age indices across various MH&O statuses.
Accurate classification of physical activity from accelerometer data is essential for health research and population-scale studies, yet the wide range of computational approaches has created inconsistencies in implementation, validation, and reproducibility. This scoping review identified and categorised methods used to classify physical activities from accelerometer data, emphasising implementation, simplicity, validation, and feasibility for large datasets such as the All of Us Research Program. We searched PubMed, Web of Science, and SPORTDiscus (2015-2025) for studies using accelerometer data to classify activities or activity levels with a reported validation strategy. Of 1851 records screened, 158 met the inclusion criteria. Machine learning was most common (n = 73), followed by deep learning (n = 38), hybrid models (n = 27), rule-based methods (n = 5), and unsupervised or other novel approaches (n = 3). Walking (n = 97), sitting (n = 79), and standing (n = 68) were most frequently studied. Most studies used lab-based protocols with k-fold or leave-one-subject-out validation. Only 16 studies provided public code, and just two examined seasonality, a limitation for generalisability. Substantial variation exists in classification methods and reporting practices. Open-source tools and real-world validation remain limited. Simpler, validated, and reproducible approaches are needed for population-scale datasets such as All of Us and the UK Biobank.
This study aimed to investigate differences in health impairment among workers at coking plants and calcium carbide plants under varying occupational exposures. Male employees from a coking plant (n = 325) and a calcium carbide plant (n = 653) in western Inner Mongolia were selected as subjects. Demographic and occupational health examination data were collected to compare health indicators both pre- and post-employment for each individual and between plants. Cox regression, multiple linear regression, and logistic regression analyses were employed to assess the impact of occupational exposure. Results indicate that employees from both plants exhibited elevated blood pressure after employment, with coking plant workers demonstrating a 1.98-fold higher risk of hypertension than calcium carbide plant workers after adjusting for age, smoking, and alcohol consumption. Hematological analysis revealed significant post-employment elevations in white blood cell, neutrophil, and lymphocyte counts in both plant cohorts; the coking plant demonstrated abnormality risk levels exceeding those of the calcium carbide facility by over 1.5-fold. Divergent trends emerged in other parameters: coking plant workers exhibited decreased red blood cell, hemoglobin, and platelet counts post-employment, whereas their calcium carbide counterparts showed opposing increases. Monocyte counts displayed an upward trajectory at the coking plant but a downward tendency at the calcium carbide facility. Correlation analysis revealed a significant association between the elevated detection rate of hepatic haemangiomas in coking plant workers and their occupational exposure to polycyclic aromatic hydrocarbons (PAHs). These findings indicate that occupational exposure in both coking and calcium carbide plants is associated with adverse health effects, though coking plants pose more pronounced health hazards due to exposure to characteristic pollutants.
This study evaluated the EuroQol 5-dimensional questionnaire, 3-level version (EQ-5D-3L), using Korea Health Panel (KHP) data by examining its factor structure, measurement invariance across gender and age groups, and longitudinal measurement invariance. Panel 1 data from the second survey year (2009), when the EQ-5D-3L was first introduced in the KHP, through the 12th year (2017) were analyzed, along with panel 2 data from 2019 to 2021. Confirmatory factor analysis and measurement invariance tests by gender and age groups were conducted within each period. Longitudinal measurement invariance was also evaluated for each period. A 1-factor model demonstrated good fit for the EQ-5D-3L. In panel 1, full measurement invariance across gender and age groups was supported. In panel 2, partial invariance was achieved after relaxing constraints on item 5. Longitudinal measurement invariance was supported over 5- and 10-year intervals in panel 1 and over a 3-year interval in panel 2, indicating temporal stability of the measurement model. The EQ-5D-3L used in the KHP panel 1 and panel 2 datasets demonstrates a stable 1-factor structure and acceptable measurement invariance across key subgroups and over time. These findings support the use of the EQ-5D-3L as an appropriate instrument for assessing health-related quality of life among Korean adults and for longitudinal analyses within large-scale panel surveys.
The COVID-19 pandemic severely strained public health systems. In Japan, Public Health Centers faced administrative bottlenecks in patient monitoring and allocation. To address this, Sapporo City introduced COVIMARU, a web-based self-reporting application. This study quantitatively evaluated the impact of this digital tool on the efficiency of patient care site allocation. We conducted a retrospective study with a regression discontinuity design (RDD) using data from Sapporo City's comprehensive COVID-19 database from May 1, 2020, to April 25, 2022. The intervention was the implementation of COVIMARU for home care on November 12, 2020. The primary outcome was the time in days from diagnosis to first care site allocation. We quantified the estimated effect at the cutoff to evaluate the localized impact of the intervention. The analysis included 130,500 patients. COVIMARU introduction was associated with a significant reduction in mean allocation time at the intervention threshold (estimated effect: -0.96 days; 95% CI, -1.39 to -0.20; P = 0.0091). This improvement was not attributable to changes in patient volume. A sensitivity analysis excluding non-working days confirmed the finding (estimated effect: -0.87 days; 95% CI, -1.39 to -0.21; P = 0.008). The COVIMARU web application significantly reduced patient allocation times, demonstrating that digital health tools can effectively mitigate administrative bottlenecks and improve pandemic response efficiency. Investing in such digital infrastructure is a critical strategy for building resilient public health systems for future emergencies.
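The sharp-RDD logic described above can be sketched in a few lines: fit separate local linear regressions on either side of the intervention date and read off the jump in expected allocation time at the cutoff. The data below are simulated, not Sapporo City's, and the bandwidth, trend, and effect size are invented for illustration.

```python
# Minimal sharp regression discontinuity sketch on hypothetical data.
import random

random.seed(0)

CUTOFF = 0.0          # intervention date, centred at zero
BANDWIDTH = 30.0      # days considered on either side of the cutoff

def simulate(n=2000):
    """Hypothetical allocation times with a -1 day jump at the cutoff."""
    data = []
    for _ in range(n):
        t = random.uniform(-BANDWIDTH, BANDWIDTH)   # days relative to cutoff
        base = 3.0 + 0.01 * t                       # smooth secular trend
        jump = -1.0 if t >= CUTOFF else 0.0         # simulated intervention effect
        data.append((t, base + jump + random.gauss(0, 0.5)))
    return data

def ols(points):
    """Simple least-squares fit; returns (intercept, slope) for (t, y) pairs."""
    n = len(points)
    mt = sum(t for t, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((t - mt) ** 2 for t, _ in points)
    sxy = sum((t - mt) * (y - my) for t, y in points)
    slope = sxy / sxx
    return my - slope * mt, slope

data = simulate()
b0_left, _ = ols([p for p in data if p[0] < CUTOFF])
b0_right, _ = ols([p for p in data if p[0] >= CUTOFF])
effect = b0_right - b0_left   # jump in expected allocation time at the cutoff
print(f"estimated effect at cutoff: {effect:.2f} days")
```

In practice the bandwidth is chosen data-adaptively and inference accounts for the local fit, but the estimand is the same intercept difference at the threshold.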
Groundwater quality in arid and semi-arid regions is under increasing pressure from both natural and anthropogenic sources. This study presents a comprehensive assessment of groundwater quality in the Gonbad Plain, Iran, to determine its suitability for drinking and to identify the controlling hydrogeochemical processes. A total of 45 groundwater samples were analyzed for their physicochemical parameters, and the data were evaluated using an integrated approach combining hydrochemical facies analysis, geochemical modeling (PHREEQC), multivariate statistical techniques (principal component analysis (PCA) and hierarchical cluster analysis (HCA)), and human health risk assessment (heavy metal pollution index (HPI), hazard quotient (HQ), and hazard index (HI)). The results reveal a hydrochemical evolution from fresh Ca-HCO₃ type water to saline Na-Cl type water, primarily controlled by three key mechanisms: (1) dissolution of evaporite minerals (halite and gypsum), (2) cation exchange, and (3) evaporation. The Gibbs diagrams confirmed that rock-water interaction and evaporation are the dominant processes governing water chemistry. Iron concentrations varied from 0.01 to 4.12 mg/L (mean 0.46, median 0.27, standard deviation 0.66 mg/L), and manganese concentrations ranged from 0.015 to 0.47 mg/L (mean 0.09, median 0.08, standard deviation 0.07 mg/L). The HPI indicated a critical water quality problem, with 77.8% of the samples unsuitable for drinking (HPI > 100), primarily due to elevated concentrations of manganese. The non-carcinogenic health risk assessment revealed that while the dermal exposure pathway poses no significant risk, oral ingestion of groundwater presents a potential health hazard, particularly for children.
The HI exceeded the safe limit of 1.0 in 11.1% of the samples for children, with manganese being the primary contributor to the risk. This is the first study to apply an integrated framework combining multivariate statistics, geochemical modeling, and health risk indices to evaluate groundwater vulnerability in Gonbad-e Kavus. The study aligns with UN Sustainable Development Goal 6 (Clean Water and Sanitation).
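As a rough illustration of how HQ and HI values of this kind are computed, the sketch below applies the generic chronic-daily-intake formulation for oral ingestion. The concentrations, exposure factors, and reference doses are hypothetical placeholders chosen for illustration, not the study's actual inputs.

```python
# Illustrative oral-ingestion hazard quotient (HQ) and hazard index (HI)
# calculation in the generic US EPA style; all numbers are placeholders.

def chronic_daily_intake(c_mg_l, ir_l_day, ef_d_yr, ed_yr, bw_kg, at_days):
    """CDI (mg/kg/day) = (C * IR * EF * ED) / (BW * AT)."""
    return (c_mg_l * ir_l_day * ef_d_yr * ed_yr) / (bw_kg * at_days)

# Hypothetical child exposure assumptions
child = dict(ir_l_day=1.0, ef_d_yr=365, ed_yr=6, bw_kg=15.0, at_days=6 * 365)

# Hypothetical sample concentrations (mg/L) and oral reference doses (mg/kg/day)
samples = {"Mn": 0.40, "Fe": 0.46}
rfd = {"Mn": 0.024, "Fe": 0.7}

# HQ = CDI / RfD per metal; HI sums HQs across metals for one pathway
hq = {m: chronic_daily_intake(c, **child) / rfd[m] for m, c in samples.items()}
hi = sum(hq.values())
print({m: round(v, 2) for m, v in hq.items()}, "HI =", round(hi, 2))
```

With these placeholder inputs manganese alone pushes HQ above 1, mirroring the abstract's finding that manganese is the primary contributor when HI exceeds the safe limit.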
Rates of mental health difficulties among girls and young women in the UK have risen sharply, and disproportionately so for those from marginalised groups. My Story and Me is a new digital public mental health intervention that uses storytelling to reduce stigma, increase awareness and support early help-seeking among girls and young women aged 14-18. This feasibility study aims to determine the acceptability of the intervention and of a future full trial, including assessing optimal settings and meaningful changes in the primary outcome measure (anxiety and depression). This is an 18-month mixed-methods, uncontrolled feasibility study conducted in secondary schools, further education colleges and community organisations across the UK. We will recruit 120-180 participants. Quantitative data will be collected at baseline and 7-month follow-up. The primary outcomes are anxiety and depression, and secondary outcomes are social support, mentalising, stigma, quality of life, loneliness, empowerment, intervention acceptability, resource use and randomisation acceptability. Platform-level engagement data will assess adherence and fidelity. Qualitative interviews with young women and staff will explore acceptability, feasibility, mechanisms of change and views on trial procedures, including randomisation in a future full trial. Analysis will be descriptive and exploratory, including comparisons across settings and priority groups (LGBTQIA+, neurodivergent and those experiencing digital poverty). A framework and reflexive thematic analysis approach will be used for qualitative data. Prespecified progression criteria will inform decisions about advancing to a full cluster randomised trial. The University College London Research Ethics Committee (0692) has approved the My Story and Me protocol.
Interested participants will be required to complete an expression of interest and consent form to take part in the study, and young people under 16 years old will be required to obtain parent/carer informed consent. Results will be disseminated through peer-reviewed publications, lived experience summaries, a policy briefing and academic conference presentations. ISRCTN12191423.
Physical inactivity constitutes a pressing societal problem. To realize physical activity's (PA) potential as a key health resource, mechanisms of PA engagement need to be understood. Laboratory and interventional studies documented that exercise relates to affective well-being (AWB) and suggested that AWB may shape PA behaviour. Digitalization enabled the investigation of how PA relates to AWB in everyday life, but findings from individual studies are ambiguous. Here we compiled 67 datasets (55.2% of eligible records) including 321,345 smartphone-based AWB ratings and nearly 1,000,000 h of accelerometer-measured PA (N = 8,223 participants) until December 2023 to clarify the nature and extent of PA-AWB associations. One- and two-stage individual participant data meta-analyses reveal that momentary AWB is associated with both prior (within, r = 0.05, 99.2% confidence intervals (CI) 0.03 to 0.06; between, r = 0.08, 99.2% CI 0.04 to 0.12) and subsequent (within, r = 0.04, 99.2% CI 0.03 to 0.05; between, r = 0.08, 99.2% CI 0.04 to 0.13) short-term PA in everyday life. Within persons, PA displays a positive association with energetic arousal, positive affective states and valence, yet a negative relation to calmness. The practical effect sizes are comparable to other daily life activities, with energetic arousal evincing the strongest relation to PA. Considerable heterogeneity in associations across individuals can be partially explained by sociodemographic moderators. Between participants, PA relates to positive affective states. The results document the critical relevance of PA-AWB relations in everyday life. They can contribute to the revision and development of health behaviour models and establish a starting point to approach behavioural, physiological and neuronal mechanisms underlying PA-AWB associations.
Magnesium oxide, though poorly soluble with low bioavailability, is widely used for magnesium supplementation due to its high elemental content and affordability. This study aimed to evaluate the safety and short-term, absorption-related pharmacokinetics of oral magnesium oxide at two dosage levels in healthy male volunteers. A randomized, open-label pharmacokinetic study was conducted in 24 healthy male volunteers aged 18-45 years, who were equally assigned to receive a single oral dose of either 750 mg or 2000 mg of magnesium oxide. Serum magnesium concentrations were measured at baseline and multiple time points up to 12 h post-dose using a colorimetric assay. Pharmacokinetic parameters related to systemic exposure and absorption, including Cmax, Tmax, and AUC0-12, were calculated using non-compartmental analysis. Parameters dependent on terminal elimination (t1/2, AUC0-∞, CL/F, MRT) were not estimated because a clear terminal log-linear phase was not identifiable within the 12-hour sampling window. Safety was assessed by monitoring vital signs, laboratory parameters, ECG, and adverse events. The study showed non-dose-proportional systemic exposure, with no proportional increase in Cmax or AUC at the higher dose. Incremental exposure parameters were also comparable between doses, further suggesting non-dose-proportional absorption within the studied dose range. Tmax was significantly shorter in the 2000 mg group (p = 0.025). Both doses were well tolerated without serious adverse events or gastrointestinal side effects. In summary, this study characterizes the short-term absorption pharmacokinetics of oral magnesium oxide, demonstrating non-proportional systemic exposure across the 750-2000 mg dose range within the 12-hour observation period. As only two dose levels were evaluated, further studies with multiple doses and formal dose-proportionality analyses are required to clarify the underlying mechanisms. 
These findings provide exploratory pharmacokinetic insights and should not be extrapolated to clinical efficacy or perioperative dosing.
Cancer represents one of the leading global public health challenges, with its burden shaped not only by biological factors but also by social and economic inequalities. In Brazil, even municipalities with a very high Human Development Index (HDI) exhibit persistent disparities. This study assessed temporal changes in cancer incidence, mortality, and social inequalities in Campinas, São Paulo State, Brazil. A repeated cross-sectional study was conducted using incidence and mortality rates of the most frequent neoplasms among men and women residing in Campinas, SP, Brazil for 2010-2014 and 2015-2019, drawn from the Population-Based Cancer Registry and the Mortality Information System. Age-standardized incidence and mortality rates were estimated for the most common cancers, stratified by levels of social vulnerability based on the São Paulo Social Vulnerability Index. Inequalities were analyzed using the Relative Index of Inequality (RII). Among men, prostate cancer (RR= 0.94; 95% CI: 0.89-0.99) and stomach cancer (RR= 0.82; 95% CI: 0.72-0.93) incidence declined, while mortality remained stable for most cancers, except for an increase in colorectal cancer mortality (RR= 1.15; 95% CI: 1.00-1.32; p = 0.032) and a reduction in stomach cancer mortality (RR= 0.83; 95% CI: 0.66-1.04). Socially vulnerable men showed persistently higher mortality from prostate, stomach, and oral cavity cancers. Among women, breast cancer incidence increased (RR= 1.14; 95% CI: 1.08-1.20), and overall mortality rose (RR= 1.06; 95% CI: 1.01-1.12), particularly from lung cancer (RR= 1.25; 95% CI: 1.07-1.47). Vulnerable women exhibited consistently higher cervix uteri cancer incidence (2010-2014: RII= 2.90; 95% CI: 1.90-4.43 vs. 2015-2019: RII= 2.36; 95% CI: 1.58-3.53) and mortality (2010-2014: RII= 2.74; 95% CI: 1.42-5.26 vs. 2015-2019: RII= 3.60; 95% CI: 1.89-6.85), while breast cancer incidence remained higher among less vulnerable women (2010-2014: RII= 0.42; 95% CI: 0.37-0.49 vs. 
2015-2019: RII= 0.49; 95% CI: 0.43-0.56). Inequalities in colorectal cancer incidence narrowed over time for both sexes; however, mortality inequality among men reversed, becoming higher among the most vulnerable in 2015-2019 (RII= 0.92; 95% CI: 0.66-1.29). Despite high socioeconomic development, substantial social inequalities in cancer incidence and mortality persist in Campinas, with some disparities widening over time. These findings highlight the need for targeted and equity-oriented cancer control strategies to improve access to early diagnosis, treatment, and care among socially vulnerable populations.
Sleep and physical activity play a crucial role in brain and mental health. While traditional self-reported methods and research-grade accelerometers have several limitations, consumer-grade wearable devices, such as Fitbit, allow continuous, objective, and real-life data collection. The purpose of this article is to report the study protocol and participant characteristics of a wearable device survey that primarily aims to examine the relationship between brain health and long-term patterns of daily physical activity and sleep. Nearly 2,000 participants were recruited between September 2022 and December 2023, as part of the Tohoku Medical Megabank Brain Magnetic Resonance Imaging Study, to wear Fitbit Charge 5 devices for 1 year, continuously tracking physical activity and sleep. Additionally, home blood pressure measurements and questionnaire data on housing conditions, psychological distress, and medication were collected every 4 months for a total of four assessments. The mean age of the participants was 58.5 years and 37.4% were men. For the first 30 days after recruitment, mean step count was 8,910 steps/day, and mean total sleep time was 370.4 min/night (approximately 6.2 h). The mean morning systolic and diastolic blood pressures were 126.2 and 75.5 mmHg, respectively. This study provides a unique opportunity to integrate longitudinal consumer-grade wearable data, brain MRI, multiomics data, and comprehensive health records from existing cohorts to advance precision medicine and help prevent mental and neurodegenerative diseases, such as dementia. Limitations include potential selection bias and a smaller sample size than larger global studies.
In epidemiological research, time-to-event outcomes, commonly referred to as survival outcomes, are a common subject of investigation. Here, we first describe the circumstances under which methods of survival analysis are necessary, review the roles of hazard functions, and discuss the limitations of hazard ratios for causal inference. Second, we explain how confounding and dependent censoring, which are common in observational studies, can be addressed by inverse probability weighting and parametric g-formula estimators. We emphasize that hazards serve as building blocks for estimating counterfactual risks. Third, we summarize recent developments in defining causal estimands in the presence of competing risks, including risk without eliminating competing events, net risk, and cumulative incidence under modified treatment. To illustrate how these challenges are addressed in practice, we revisit a recent clinical study on pharmacological interventions for the onset of dementia. We further empirically compare various estimands in the presence of competing events, including separable effects, through simulations. Our overall aim is to elucidate the need to move beyond routinely used methods of survival analysis, particularly the mere estimation of hazard ratios, if the goal is to draw causal inferences. This paper provides an overview of causal survival analysis, focusing on how confounding, dependent censoring, and competing risks can be addressed to estimate causal parameters of interest (counterfactual survival functions or counterfactual risks), which are interpretable and often meaningful for investigators.
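The role of inverse probability weighting highlighted above can be illustrated with a toy example: weighting treated individuals by the inverse of their propensity score recovers a counterfactual risk that a naive comparison, distorted by confounding, misses. The sketch below uses a single binary confounder, a known propensity model, and no censoring; all quantities are simulated assumptions, not the cited clinical study.

```python
# Toy inverse-probability-weighting (IPW) example: recovering the
# counterfactual risk E[Y^{a=1}] under confounding by L. Simulated data.
import random

random.seed(2)
n = 50_000
records = []
for _ in range(n):
    L = random.random() < 0.5            # binary confounder
    pA = 0.8 if L else 0.2               # treatment probability depends on L
    A = random.random() < pA
    pY = 0.1 + 0.3 * L + 0.1 * A         # outcome risk depends on L and A
    Y = random.random() < pY
    records.append((L, A, Y))

# True counterfactual risk had everyone been treated:
# E[Y^{a=1}] = 0.5*(0.1 + 0.1) + 0.5*(0.1 + 0.3 + 0.1) = 0.35
def propensity(L):
    return 0.8 if L else 0.2             # known here; modelled in practice

# Hajek-style IPW estimator restricted to the treated
num = sum(Y / propensity(L) for L, A, Y in records if A)
den = sum(1 / propensity(L) for L, A, Y in records if A)
ipw_risk = num / den

# Naive risk among the treated, confounded by L
treated = [(L, Y) for L, A, Y in records if A]
naive_risk = sum(Y for _, Y in treated) / len(treated)

print(f"naive risk among treated = {naive_risk:.3f}, IPW risk = {ipw_risk:.3f}")
```

The same weighting idea extends to time-to-event settings, where time-varying weights handle dependent censoring and the weighted hazards are cumulated into counterfactual risks.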
Antimicrobial resistance (AMR) is a critical global health threat, intensified in low- and middle-income countries by rampant antibiotic misuse in livestock and poultry. This study assessed the knowledge, attitudes, and practices (KAP) of 537 farmers in Rangpur, Bangladesh, to identify drivers and pathways of high-risk antimicrobial use (AMU). Data on demographics, farm characteristics, and AMR-related KAP scores were collected, with disease treatments mapped to WHO AWaRe classifications. Logistic regression identified predictors of responsible practices, while additional analyses and visualizations explored patterns of antimicrobial use and misuse pathways. Results showed that 42.9% of antibiotic use was high-risk, with nearly half of diseases treated using Critically Important Antimicrobials (CIAs) from the Watch or Reserve groups, intended primarily for human medicine. Colistin and ciprofloxacin, last-resort drugs for human health, were commonly used for routine poultry diseases, raising serious public health concerns. Three misuse pathways emerged: (i) antibiotics applied to viral diseases, (ii) reliance on Watch/Reserve antibiotics for bacterial infections, and (iii) antibiotic use for parasitic diseases. Paravets and veterinarians influenced 76.2% of prescribing decisions, underscoring their pivotal role. AMR training was associated with more responsible practices, yet high practice scores did not consistently align with knowledge or attitudes, revealing a gap between behavior and awareness. Immediate One Health stewardship interventions combining regulatory enforcement, improved diagnostics, and sweeping educational reform are essential to reduce AMR risks and safeguard public health in Bangladesh.
Namaste Care, a non-pharmaceutical daily multicomponent palliative care intervention, aims to improve quality of life for people living with dementia as well as their families and caregivers. This systematic review explores the Namaste Care intervention and its clinical and economic effects in multiple care settings, consolidating existing evidence on clinical and economic outcomes and examining the tools used for data collection. A systematic literature search was conducted (PubMed, Scopus and Web of Science) to identify peer-reviewed studies on Namaste Care's impact on quality of life, costs, health, economic outcomes and benefits up to 22 February 2026. Methodological quality was assessed using the Mixed Methods Appraisal Tool, while the completeness of reporting of economic evaluation studies was evaluated according to the Consolidated Health Economic Evaluation Reporting Standards 2022 (CHEERS). 31 studies reported the clinical and/or economic outcomes of Namaste Care. The results for quality of life and quality of dying were mixed, although 5 of 11 studies evaluating quality of life reported significant improvements. The quality-of-life instruments used include the Quality of Life in Late-Stage Dementia (QUALID), EQ-5D-3L and EQ-5D-5L instruments, ICEpop CAPability Measure for Older People (ICECAP-O), ICECAP Supportive Care Measure (ICECAP-SCM), Quality of Life for People with Dementia (QUALIDEM) and Carers-DEMentia Quality of Life (C-DEMQOL). The clinical outcomes considered included pain, behavioural symptoms and quality of end-of-life care. The Medication Quantification Scale and Minimum Data Set indicated reductions in antidepressant and antianxiety medication use. Seven studies reported significant improvements in well-being, and two studies reported reduced stress among family members following Namaste Care sessions.
A subset of five studies reported a range of economic outcomes. The findings suggest that Namaste Care improves well-being, reduces caregiver stress and lowers the use of antidepressant and antianxiety medications at a moderate cost. However, the current literature is characterised by small, non-randomised, heterogeneous studies. Randomised controlled trials that include economic evaluations would strengthen the evidence base and support funding and implementation decisions on Namaste Care. CRD42024560056.
Recovery after extremity fracture is influenced by injury-related impairments, functional capacity, symptom severity and psychosocial well-being, emphasising the need for patient-reported outcome measures (PROMs). Despite increasing adoption of Patient-Reported Outcomes Measurement Information System (PROMIS) and legacy PROMs in orthopaedic research, substantial knowledge gaps remain regarding their measurement properties, comparability and clinical utility, particularly in fracture populations. Our aim is to develop meaningful guidance for clinicians regarding the use of PROMs in treating orthopaedic fracture patients. A prospective cohort of 1500 patients presenting with isolated extremity fractures, treated operatively or non-operatively, will be recruited across 14 centres. We will aim to recruit 300 patients for each of five fracture types: isolated hip, tibial shaft, ankle/pilon, proximal humerus and distal radius fracture. All procedures and management will be performed according to the site's standard of care and treatment protocol. For patients treated non-operatively, the index visit (first study assessment) will be performed at the first orthopaedic treatment encounter (eg, emergency department visit or outpatient fracture clinic). Surgically treated patients will have PROMs collected any time from the day of surgery up to discharge from hospital for the index visit. Follow-up visits will be performed at 2-3 weeks, 6-8 weeks, 10-13 weeks, 6 months and 1 year after the index visit. At baseline, we will document injury data, demographic and sociodemographic data, and radiographic classification using the AO classification system.
Patient-perceived functioning will be assessed with PROMIS Physical Function (PF), PROMIS Upper Extremity, PROMIS Global Health, Knee and Injury Osteoarthritis Outcome Score-12, Hip Disability and Osteoarthritis Outcome Score-12, Foot and Ankle Ability Measure and Quick Disabilities of the Arm, Shoulder and Hand. The following domains and PROMs will also be captured: symptoms of anxiety and depression (PROMIS Anxiety; PROMIS Depression), patient activation (Patient Activation Measures (PAM)-10) and a patient's ability to fulfil social roles (PROMIS Social Roles). The range and normative limits of the PROMs collected will be defined using standard descriptive statistics. We will crosswalk or validate PROMIS measures with legacy instruments for PF using an Item Response Theory (IRT)-based linking model and compare it to non-IRT models (such as equipercentile linking). Lastly, we will assess the PROM-based recovery trajectory after fracture, overall and after adjusting for relevant demographic, clinical or biopsychosocial factors. Ethics approval for this study was granted from the local Ethics Committees or Institutional Review Board at each of the participating sites prior to patient enrolment.
Austin: Institutional Review Board University of Texas at Austin, STUDY00000262; Boston: Mass General Brigham, 2019P000397; Los Angeles: Cedars-Sinai, Office of Research Compliance and Quality Improvement, STUDY00000081; Miami: University of Miami Human Subject Research Office, 20221353; Bogotá: Comité Corporativo de Ética en Investigación, CCEI-15607-2023; Berlin: GoFitFast: under Homburg approval; Recovery/Linking: Ethikkommission Charité Universitätsmedizin Berlin, EA2/026/21; Homburg: GoFitFast: Ethikkommission der Ärztekammer des Saarlandes, 232/19; Recovery/Linking: under Charité approval; Murnau: under umbrella from Homburg and Charité, GoFitFast: under Homburg approval, Recovery/Linking: under Charité approval; Tübingen: GoFitFast: Ethik-Kommission, Universitätsklinikum Tübingen, 393/2022BO2; Freiburg: Ethik-Kommission Albert-Ludwigs-Universität Freiburg, 21-1401; Rostock: Ethikkommission an der Universitätsmedizin Rostock, A 2024-0113; Innsbruck: Ethikkommission der Medizinischen Universität Innsbruck, 1258/2021; Oxford: HRA and Health and Care Research Wales, 20/EE/0051; London: HRA and Health and Care Research Wales, 20/EE/0051; Groningen: Medical Ethics Review Board University Medical Center Groningen, METc 2023/187 16882; Non-WMO waiver; Zwolle: Medical Ethics Review Board University Medical Center Groningen, METc 2023/187 16882; Non-WMO waiver. The results of this study will be published in peer-reviewed journals and presented at different conferences. NCT04113044.
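Equipercentile linking, one of the non-IRT comparators named in the analysis plan above, maps a score on one instrument to the score on another instrument sharing the same percentile rank. The sketch below uses simulated normal score distributions as stand-ins for a PROMIS measure and a legacy PROM; real linking would use the study's observed score distributions and smoothing.

```python
# Minimal equipercentile linking sketch between two simulated score
# distributions (stand-ins, not real PROMIS or legacy PROM data).
import random
from bisect import bisect_right

random.seed(3)
scores_a = sorted(random.gauss(50, 10) for _ in range(5000))  # "instrument A"
scores_b = sorted(random.gauss(30, 5) for _ in range(5000))   # "instrument B"

def link_a_to_b(x):
    """Map an instrument-A score to the B score with the same percentile rank."""
    p = bisect_right(scores_a, x) / len(scores_a)      # percentile rank on A
    idx = min(int(p * len(scores_b)), len(scores_b) - 1)
    return scores_b[idx]                               # same-rank score on B

# A score one SD above the A mean should land near one SD above the B mean
print(round(link_a_to_b(60.0), 1), round(link_a_to_b(50.0), 1))
```

IRT-based linking instead places both instruments' items on a common latent trait, which is what the planned comparison between IRT and non-IRT models evaluates.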
Legumes, a staple food from the Leguminosae family, are widely consumed and deliver key food components and nutrients, including dietary fibre and plant protein. Regular consumption is associated with improved health outcomes such as a reduced risk of cardiovascular disease, better weight management, and enhanced gastrointestinal health. However, legume consumption varies globally, and the available Australian data is outdated. The aim of this study was to explore perceptions, knowledge and culinary use of legumes in a convenience sample of Australians. An anonymous and voluntary survey was conducted online. Eligible respondents were adults living in Australia who had not received formal nutrition training. Convenience sampling via social media and snowballing were used to recruit respondents. Three hundred and forty-nine respondents (84.8% female; 55.1% aged 56 years or older; 54.1% Bachelor's degree or higher) were included in analyses. Most (97.9%) respondents reported consuming legumes, with 24.5% reporting consumption once or twice per week and 27.2% reporting consumption three to four times per week. Perceptions of legumes included their high content of protein and fibre, though confusion was evident regarding the impact of legume intake on cardiometabolic health. The most influential cultures or cuisines for legume consumption were Mexican (31.4%) and Indian (25.8%). Promotion of legume consumption should focus on their health benefits, affordability, and environmental sustainability attributes. Collaboration between key stakeholders (legume advocacy groups, consumers, food industry) could help drive legume product innovations to extend the range available to consumers.
Cervical cancer disproportionately affects women in LMICs, with treatment delays worsening outcomes. Despite WHO's 90-70-90 elimination goal, global disparities persist. This study systematically reviews worldwide delays from diagnosis to treatment, identifying contributing factors across income levels, healthcare systems, and treatment approaches. We conducted a systematic review and meta-analysis following PRISMA 2020 guidelines. Observational studies reporting the interval between histopathological diagnosis of cervical cancer and initiation of first-line definitive treatment (surgery, chemotherapy, radiotherapy, or combination therapy) were eligible. The search was performed in PubMed, Embase, Web of Science, and Scopus up to October 30, 2025. Two reviewers independently screened records, extracted data, and assessed study quality using the JBI Critical Appraisal Checklist. Pooled mean delays were estimated using random-effects models in Stata 19.5. Subgroup analyses and meta-regression explored heterogeneity by income level, data source, and treatment modality. Sensitivity analyses and publication bias assessments were performed using Galbraith plots, leave-one-out methods, Egger's test, and trim-and-fill procedures. The pooled mean delay from diagnosis to treatment initiation was 71.42 days (95% CI: 47.96-94.88), exceeding recommended benchmarks of 30-60 days. Across studies, 59% of patients experienced treatment initiation delays exceeding 30 days, 54% exceeded 45 days, and 33% exceeded 90 days. Subgroup analysis revealed that high-income countries experienced shorter delays (40.50 days), while upper-middle-income countries faced significantly longer waits (94.63 days). Among treatment types, radiotherapy had the longest delay (79.90 days).
The causes of delay were multifaceted, involving patient-level challenges such as stigma and financial constraints, systemic issues like inefficient referral processes and limited radiotherapy access, and disease-related factors. Despite substantial heterogeneity across studies (I² = 99.98%), sensitivity analyses validated the consistency and reliability of the pooled estimates. This study provides the most comprehensive global estimate of cervical cancer treatment delays to date. The findings highlight critical disparities across income settings and healthcare systems, with actionable insights for policy and practice. Addressing these delays is essential to improving survival outcomes and achieving WHO's cervical cancer elimination targets. Strengthening referral systems, expanding radiotherapy infrastructure, and tailoring interventions to local barriers are key priorities.
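A random-effects pooled mean of the kind reported above is conventionally computed with a DerSimonian-Laird estimator: a between-study variance τ² is derived from Cochran's Q and folded into the study weights. The study means and variances below are illustrative placeholders, not the review's data.

```python
import math

def dersimonian_laird(effects, variances):
    """Pool study-level effects with a DerSimonian-Laird random-effects model."""
    k = len(effects)
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    y_fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                     # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), tau2

# Illustrative mean delays (days) and their variances, NOT the review's dataset
delays = [40.5, 71.4, 94.6, 79.9]
variances = [25.0, 100.0, 64.0, 49.0]
mean, ci, tau2 = dersimonian_laird(delays, variances)
```

With heterogeneity as extreme as the reported I² = 99.98%, τ² dominates the weights, which is why the pooled estimate carries such a wide confidence interval.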
Visual perceptual functions are frequently disrupted in patients with schizophrenia. The nature and degree of the visual abnormalities experienced by patients provide important information about altered cognitive mechanisms and may serve as potential endophenotypes for the classification and diagnosis of schizophrenia. To characterize dynamic and integrative visual functions and distinguish patients with schizophrenia from healthy controls, we compared performance on six visual tasks between patients and healthy controls. The tasks included binocular rivalry (interocular dynamics), structure from motion (3D-surface dynamics), surround suppression of contrast (contextual modulation), contour integration (form integration), coherent motion (motion integration), and motion speed discrimination (motion sensitivity). The same set of tests was conducted in 61 early-stage outpatients, 69 chronic inpatients, and controls, allowing us to examine the effects of long-term treatment on patients' visual functions. Compared with healthy controls, patients experienced slower switching in binocular rivalry but faster switching in structure from motion, along with impaired spatial form integration. Inpatients had reduced motion sensitivity, while outpatients had deficits in motion integration, and the two groups differed in susceptibility to surround contrast suppression. Multidimensional test data supported more accurate classification between patients and controls. Patients with schizophrenia have specific patterns of visual perceptual abnormalities in dynamic and integrative information processing, with deficits in coherent motion (considered more state-linked) more apparent in early-stage outpatients and poor motion speed discrimination (linked to stable traits) seen in chronic inpatients. There is a significant advantage to using multiple independent tests to assist in the classification and diagnosis of schizophrenia.
The Internet holds potential for alleviating cognitive decline through cognitive stimulation and social engagement. However, existing studies often overlook the crucial role of Internet access as the foundational layer of the digital divide. This study investigated the association between Internet access and cognitive function among middle-aged and older adults and examined age-specific differences in this relationship. Data were drawn from the 2015 and 2018 waves of the China Health and Retirement Longitudinal Study (CHARLS). Participants aged ≥ 50 years with complete follow-up cognitive assessments were included (n = 7721; mean age = 60.48, SD = 9.37; male, 56.1%). Lagged dependent variable models were used to evaluate associations between Internet access and cognitive function. Chain mediation analyses were used to test whether family connection mediated this association. Internet access was significantly associated with better cognitive outcomes over time. Participants with Internet access demonstrated significantly greater improvements in episodic memory (β = 0.29, 95% CI = 0.14-0.44) and mental status (β = 0.28, 95% CI = 0.12-0.44) compared to those without access. Age-stratified models indicated stronger effects in adults aged 50-59 for episodic memory (β = 0.36, 95% CI = 0.17-0.55) and aged 60-69 for mental status (β = 0.42, 95% CI = 0.15-0.69). Chain mediation analyses revealed that Internet access enhanced family connection, thereby contributing to better cognitive function. Internet access is positively associated with cognitive function in mid- to later life, partly through strengthened family ties. These findings underscore the importance of digital inclusion policies to support cognitive health and promote healthy aging.
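The mediation logic in the CHARLS abstract (Internet access → family connection → cognition) reduces, in its simplest single-mediator form, to a product-of-coefficients estimate: path a (exposure to mediator) times path b (mediator to outcome, adjusting for the exposure). The sketch below uses tiny synthetic data, not CHARLS, and omits the covariates and serial mediators the actual chain models include.

```python
def slope(x, y):
    """OLS slope of y on x (simple regression with intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

def residuals(x, y):
    """Residuals of y after regressing out x."""
    b = slope(x, y)
    a = sum(y) / len(y) - b * sum(x) / len(x)
    return [yi - (a + b * xi) for xi, yi in zip(x, y)]

def indirect_effect(x, m, y):
    """Product-of-coefficients a*b for a single-mediator model X -> M -> Y."""
    a = slope(x, m)                               # path a: X -> M
    b = slope(residuals(x, m), residuals(x, y))   # path b: M -> Y given X (Frisch-Waugh)
    return a * b

# Tiny synthetic example (NOT study data): M tracks X, Y tracks M
x = [0, 1, 2, 3, 4]
m = [0.1, 2.0, 3.9, 6.1, 8.0]
y = [3 * v for v in m]
effect = indirect_effect(x, m, y)   # a ≈ 1.99, b = 3, so effect ≈ 5.97
```

In applied work the indirect effect's uncertainty is usually assessed with bootstrap confidence intervals rather than a point estimate alone.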