Malnutrition affects 35% to 64% of hospitalised older people and is associated with adverse health outcomes such as disease complications and hospital readmission. Identifying effective nutritional interventions is essential to improve clinical outcomes and reduce healthcare costs in this population. To evaluate the effects of various nutritional interventions, compared with either a control group (standard care or placebo) or each other, on patient-relevant outcomes in hospitalised older people at risk of or with established malnutrition, and to rank the effects of these different interventions using network meta-analysis (NMA) based on individual participant data (IPD). We searched CENTRAL, MEDLINE, five other databases, and two trial registries to 2 July 2024, and checked the reference lists of included studies and relevant systematic reviews. We included randomised controlled trials (RCTs) comparing oral nutritional interventions with control or with each other in older people (≥ 65 years) at risk of or with malnutrition who were hospitalised for different acute conditions. For RCTs that met our inclusion criteria, either fully or partially, we requested IPD from the study authors. If we did not receive a response or IPD were unavailable, we used published aggregated data. We excluded RCTs that only partially met the eligibility criteria if neither IPD nor sufficient aggregated data were obtainable. Critical outcomes were all-cause mortality, serious adverse events (SAEs), and functional status (e.g. activities of daily living). Important outcomes were health-related quality of life (HRQoL), length of hospital stay (LOS), body weight, and fat-free mass. The main outcome assessment time point was at hospital discharge or 30 days after randomisation. We assessed risk of bias using the Cochrane risk of bias 2 (RoB 2) tool. For each outcome, we first analysed IPD within each study. Second, we pooled results in an NMA that also included the aggregated data from RCTs without available IPD.
We performed random-effects NMAs based on the frequentist approach and ranked treatments by P-scores. We rated the certainty of evidence using the GRADE approach. We included 21 RCTs (72 reports; 12 RCTs with IPD) with 3309 older participants (mean age ranged from 75 to 85 years; 1863 participants with IPD) with different acute conditions. Interventions included the provision of additional protein (three studies), energy supplements (two studies), oral nutritional supplements (ONS; eight studies), individualised feeding support (two studies), and comprehensive individualised nutritional care (eight studies). In all but two RCTs, interventions were compared to control (standard care with or without a placebo). We judged 16.1% of outcome assessments to be at low risk of bias and 16.8% at high risk. ONS may reduce all-cause mortality (risk ratio (RR) 0.46, 95% confidence interval (CI) 0.25 to 0.84; absolute risk difference 57 fewer deaths per 1000 people, 95% CI 79 fewer to 17 fewer; low-certainty evidence) compared to control, while comprehensive individualised nutritional care may show little to no effect (RR 0.98, 95% CI 0.55 to 1.73; 1 fewer per 1000 people, 95% CI 26 fewer to 46 more; low-certainty evidence). For all other treatment comparisons, the evidence is very uncertain (NMA with 13 RCTs, 2728 participants; Q between designs: not applicable (NA)). ONS may reduce SAEs compared to control (RR 0.56, 95% CI 0.32 to 0.95; 84 fewer SAEs per 1000 people, 95% CI 131 fewer to 10 fewer; Q between designs: Q 1.95, df 2, P = 0.3772; low-certainty evidence). For all other treatment comparisons, the evidence is very uncertain (NMA with 14 RCTs, 2184 participants). 
Comprehensive individualised nutritional care may make little to no difference in activities of daily living compared to control (standardised mean difference (SMD) 0.06, 95% CI -0.08 to 0.20; low-certainty evidence), as may ONS compared to energy supplements (SMD -0.15, 95% CI -0.53 to 0.23; low-certainty evidence). For all other treatment comparisons, the evidence is very uncertain (NMA with 5 RCTs, 1128 participants; Q between designs: NA). Energy supplements probably make little to no difference in HRQoL compared with ONS (mean difference (MD) 0.01, 95% CI -0.06 to 0.08; Q between designs: NA; moderate-certainty evidence). All other comparisons of different nutritional interventions may make little to no difference to HRQoL (NMA with 3 RCTs, 1513 participants). The provision of additional protein, energy supplements, ONS, and comprehensive individualised nutritional care may make little to no difference in LOS compared to control (18 RCTs, 3013 participants; Q between designs: Q 2.86, df 3, P = 0.4145). Body weight (16 RCTs, 2114 participants; Q between designs: Q 2.03, df 3, P = 0.5655) may increase with ONS compared to control (MD 0.9 kg, 95% CI 0.37 to 1.42) or compared to comprehensive individualised nutritional care (MD 1.00 kg, 95% CI 0.12 to 1.87), but the evidence is very uncertain. Energy supplements and ONS probably have similar effects on body weight (MD 0.11 kg, 95% CI -0.85 to 0.63; moderate-certainty evidence). For fat-free mass, no meta-analysis was possible. One RCT (102 participants) compared ONS with energy supplements and found little to no difference between groups (MD 0.13 kg, 95% CI -0.63 to 0.90; low-certainty evidence), while evidence on the effects of additional protein compared with control was very uncertain (1 RCT, 19 participants). Rankings of treatments by P-scores were not consistent across outcomes.
In older hospitalised people at risk of or with malnutrition, oral nutritional supplements may reduce mortality and SAEs compared to control at 30 days after randomisation. For the other outcomes, there may be little to no difference between interventions. Overall, the evidence was of low to very low certainty, primarily due to the limited number of studies and participants per comparison. The comparison of treatment effects across outcomes was constrained by variations in network structure. When interpreting the results, the heterogeneity of the population in terms of acute and chronic conditions needs to be considered. To improve certainty, adequately powered studies with robust methodology should compare interventions with controls as well as with each other. The German Federal Ministry of Education and Research funded this work (grant number: 01KG2102). Protocol (2022): doi.org/10.1002/14651858.CD015468.
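The absolute effects quoted alongside the risk ratios above follow from simple arithmetic: the risk difference equals the assumed control-group (baseline) risk multiplied by (RR - 1). A minimal Python sketch, in which the baseline mortality of roughly 105 per 1000 is an illustrative assumption back-calculated from the reported figures rather than a number stated in the review:

```python
def absolute_risk_difference(baseline_risk: float, rr: float) -> float:
    """Absolute effect per 1000 people implied by a risk ratio.

    baseline_risk: assumed control-group event risk (a proportion).
    rr: pooled risk ratio for the intervention versus control.
    Returns events per 1000 (negative = fewer events with the intervention).
    """
    return 1000 * baseline_risk * (rr - 1)

# Assuming a control-group mortality of ~105 per 1000, the review's
# RR of 0.46 for ONS implies roughly 57 fewer deaths per 1000:
print(round(absolute_risk_difference(0.105, 0.46)))  # -57
```

Applying the same function to the CI bounds of the RR (0.25 and 0.84) reproduces the reported interval of 79 fewer to 17 fewer per 1000 under the same assumed baseline.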
Background: Parenteral glutamine (Gln) supplementation can enhance the immune function of patients with colorectal cancer (CRC), regulate the inflammatory response, nitrogen balance and protein synthesis, reduce the morbidity of postoperative complications, and shorten the length of hospitalization. However, some guidelines and clinical studies have questioned the rationale for glutamine-containing immunonutrition support in the perioperative period of CRC. Therefore, we conducted a meta-analysis of the effects of perioperative glutamine-enriched parenteral nutrition on short-term postoperative clinical outcomes in patients with CRC. Methods: A comprehensive search of all relevant literature from database inception to June 2025 was performed using the following databases: PubMed, Embase, Web of Science, Cochrane Library, China Biology Medicine Database (CBM), China National Knowledge Infrastructure (CNKI), VIP Medical Information System (VIP), and the Wanfang electronic database. RevMan 5.3 software was used for the meta-analysis. We calculated the outcomes using random- and fixed-effects models. Results: Seventeen single-center randomized controlled trials (RCTs) involving 950 patients with CRC were included. The control group consisted of 470 patients who received traditional parenteral nutrition therapy, and the experimental group consisted of 480 patients who received parenteral nutrition with Gln. The analyses showed that perioperative Gln-enhanced parenteral nutrition reduced the morbidity of infectious complications (relative risk [RR] = 0.36, 95% confidence interval [CI]: 0.23-0.58) and non-infectious complications (RR = 0.27, 95% CI: 0.13-0.55).
The length of hospitalization was reduced by 2.18 days (mean difference [MD] = -2.18, 95% CI: -2.59 to -1.78). Conclusion: Based on current evidence of low-to-moderate certainty, parenteral Gln supplementation may reduce the morbidity of postoperative complications, shorten postoperative hospitalization, and improve some aspects of nutritional status and immune and inflammatory function in patients with CRC.
Stroke-associated pneumonia (SAP) is a common complication after acute ischemic stroke and contributes to worse recovery and greater resource use. Nutritional and inflammatory dysregulation have been implicated in both SAP susceptibility and adverse prognosis. To examine whether admission inflammatory and nutritional markers are associated with the development of SAP and with short-term functional prognosis. We performed a retrospective single-centre cohort study of consecutive patients with acute ischemic stroke admitted between 1 January 2015 and 31 December 2024 (N = 303; SAP n = 108, non-SAP n = 195). Admission laboratory indices (albumin, CRP, fibrinogen, WBC, PCT, and prealbumin) in the first 24 h and clinical variables were analysed. Multivariable logistic regression identified factors independently associated with SAP; the relationship between SAP and early functional recovery was assessed in adjusted outcome models. A nomogram integrating key predictors was developed and its apparent discrimination is reported. SAP occurred in 35.6% of patients. Factors independently associated with SAP included nasogastric tube placement (OR: 7.02, 95% CI: 3.50-14.62), venous thromboembolism (OR: 3.20, 95% CI: 1.62-6.31), cognitive impairment (OR: 2.90, 95% CI: 1.32-6.36), and elevated inflammatory markers (WBC OR: 1.52, 95% CI: 1.28-1.80; fibrinogen OR: 1.37, 95% CI: 1.02-1.84; CRP OR: 1.01, 95% CI: 1.00-1.03). Higher admission serum albumin was associated with lower odds of SAP (OR: 0.92, 95% CI: 0.86-0.98). The nomogram showed strong apparent discrimination (AUC: 0.90, 95% CI: 0.86-0.94). After multivariable adjustment, SAP remained associated with poorer short-term functional improvement (adjusted OR: 6.99, 95% CI: 3.05-17.54) and greater healthcare utilization (median length of stay: 39.6 vs. 30.6 days; median cost: USD 12,836 vs. 6585).
In this retrospective cohort, admission markers of nutritional depletion and inflammatory activation were associated not only with increased likelihood of SAP, but also with adverse early functional outcomes. These association-based findings support early risk stratification using routine admission markers; prospective studies and external validation are required before clinical implementation.
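As an illustration of the modelling approach described above (multivariable logistic regression, with apparent discrimination summarised by an AUC), here is a hedged Python sketch on synthetic data. The predictor names and effect directions loosely mirror the reported variables, but every value below is simulated for illustration, not the study's data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 300
# Hypothetical admission predictors loosely mirroring the study's variables:
# albumin (g/L), WBC (10^9/L), fibrinogen (g/L), nasogastric tube (0/1).
albumin = rng.normal(38, 4, n)
wbc = rng.normal(8, 2, n)
fibrinogen = rng.normal(3.5, 0.8, n)
ng_tube = rng.integers(0, 2, n)

# Simulated outcome: SAP risk rises with inflammation and tube placement
# and falls with albumin (directions taken from the reported odds ratios;
# the coefficients themselves are invented).
logit = (-0.15 * (albumin - 38) + 0.4 * (wbc - 8)
         + 0.3 * (fibrinogen - 3.5) + 1.9 * ng_tube - 1.0)
sap = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([albumin, wbc, fibrinogen, ng_tube])
model = LogisticRegression(max_iter=1000).fit(X, sap)
auc = roc_auc_score(sap, model.predict_proba(X)[:, 1])
print(f"apparent AUC: {auc:.2f}")
```

An AUC computed on the same data used for fitting, as here, is "apparent" discrimination only; as the abstract itself cautions, prospective studies and external validation are needed before such a model is used clinically.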
The last decade has brought a worldwide surge of interest in rewilding (the repopulation of large herbivores and carnivores) as a strategy for conserving species and reviving ecosystem functions. Rewilding initiatives, if closely monitored, can provide unique insights into the ecology of the world's largest animals at otherwise impossible spatial and temporal scales. Capitalizing on these opportunities, and developing a knowledge base to guide future restoration efforts, requires the collection and dissemination of long-term data that document community reassembly. To date, such data are virtually nonexistent: most megafaunal restoration projects are nascent and/or have not been rigorously monitored. Since 2008, the Gorongosa Restoration Project, in Mozambique's Gorongosa National Park, has facilitated the recovery of megafauna populations that were severely depleted or extirpated during the country's civil war (1977-1992). For over a decade, we have monitored Gorongosa's large-herbivore populations to understand how animal behavior and trophic interactions change as communities reassemble. Here, we present spatiotemporally explicit data sets on the movements and diets of large herbivores in Gorongosa between 2013 and 2025, along with annual rainfall data. This period encompassed extremes of climate (including some of the driest and wettest years on record) and the reintroduction, starting in 2018, of locally extinct apex predators and scavengers: African wild dog (Lycaon pictus), leopard (Panthera pardus), spotted hyena (Crocuta crocuta), and side-striped jackal (Lupulella adusta). We used GPS telemetry to monitor 277 herbivores of seven species (listed below with number of individuals collared, median duration of tracking, and median number of locations per individual): Cape bushbuck (Tragelaphus sylvaticus: 103 individuals; 280 days; 6646 fixes), nyala (T. angasii: 37 individuals; 306 days; 6789 fixes), greater kudu (T.
strepsiceros: 80 individuals; 300 days; 17,365 fixes), common eland (T. oryx: 10 individuals; 334 days; 15,783 fixes), waterbuck (Kobus ellipsiprymnus: 22 individuals; 13 days; 2877 fixes), plains zebra (Equus quagga: 7 individuals; 212 days; 1171 fixes), and African savanna elephant (Loxodonta africana: 18 individuals; 706 days; 33,122 fixes). For 295 individuals that were immobilized during this work, we present morphological measurements (chest girth, body length, hind-foot length, weight), reproductive status and nutritional condition (ultrasound measurements, palpation scores), and fate (mortality date and cause, if known). For diet analysis, we used DNA metabarcoding to identify and quantify the relative abundances of plant taxa in 3785 fecal samples from 27 mammal species belonging to 11 families and 7 orders. In all, we recorded 516 food-plant taxa from at least 87 plant families and 39 orders. For Gorongosa's 15 most common large herbivores, the median sampling depth was 216 fecal samples per species (interquartile range 156-279); the overall median sampling depth was 92 samples per species (range 1-499). We include basic metadata collected in the field (e.g., date, time, GPS location, animal sex, and age) along with laboratory notes and information on plant taxonomic identification. These data are valuable not just as a window on one ecosystem's recovery from armed conflict, but also as a resource for macroecology, meta-analysis, and synthetic studies of animal movement, diet, and the dynamics of community reassembly. The data are freely available for use and this paper should be cited whenever data are reused; see Data S1: Metadata S1: Class III.B.4 for additional details.
Understanding interindividual variability in treatment response and toxicity is essential for optimizing outcomes in pediatric acute lymphoblastic leukemia (ALL). Molecular and pharmacogenetic markers hold promise in predicting treatment efficacy and adverse effects, particularly in genetically diverse populations. This protocol outlines the methodology for a prospective, nonrandomized observational cohort designed to evaluate molecular and pharmacogenetic factors associated with treatment response and toxicity in Indian children diagnosed with ALL. The primary objective is to identify genetic markers associated with treatment-related toxicity and therapeutic response. Secondary objectives include evaluating associations between specific pharmacogenetic variants and the occurrence of early toxicities, quality of life during active ALL treatment, and survival outcomes, along with generating data to support the future implementation of personalized treatment strategies in Indian children with ALL. In this prospective, observational cohort, 556 children (≤18 years of age) with newly diagnosed ALL treated under the Indian Childhood Collaborative Leukemia-Acute Lymphoblastic Leukemia 2014 (ICiCLe-ALL-14) protocol at two Indian centers will be enrolled, aiming for a minimum of 500 evaluable children. Eligible participants will be enrolled prior to the initiation of chemotherapy and followed longitudinally throughout treatment. Clinical and laboratory data (demographics, nutritional assessment, quality of life, comorbidities, treatment regimen, toxicity graded by Common Terminology Criteria for Adverse Events v5.0, remission status, and survival) will be collected at predefined intervals up to day 100 of the maintenance phase. Germline and somatic DNA will be sampled at diagnosis and remission. The first phase will use whole-exome sequencing to discover candidate variants by implementing a candidate gene prioritization strategy.
The second phase will genotype the top candidates in the full cohort using array technology. Associations with early treatment-related toxicities, steroid response, and survival will be tested using multivariable regression and Cox models. A machine learning approach using pharmacogenetic predictors as classifier features will then be implemented, with cross-validation and sensitivity analyses. Ethics committees approved protocol version 1.0 in 2020: IEC-1167/06.11.2020 (All India Institute of Medical Sciences, New Delhi), JIP/IEC/2020/201 (Jawaharlal Institute of Postgraduate Medical Education and Research, Puducherry), and AO_2021-00048 (UNIGE, Geneva). Funding was received from the Swiss National Science Foundation, Switzerland; the Department of Biotechnology, India; and the CANSEARCH Foundation, Switzerland. Recruitment began in December 2022 and is likely to conclude by 2027. A comprehensive analysis of the complete study cohort is anticipated to be completed by 2027. The MPGx-INDALL (Molecular and Pharmacogenetic Marker Evaluation in Relation to the Toxicity and Clinical Response of Acute Lymphoblastic Leukemia Treatment in Indian Children) study will generate actionable insights for individualized ALL therapy in India by systematically evaluating germline and somatic markers in a large, ethnically distinct cohort.
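The planned cross-validated machine learning step could be sketched as follows. This is a simplified illustration on simulated genotype data (minor-allele counts coded 0/1/2) with invented effect sizes, not the study's actual pipeline or cohort:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_children, n_variants = 500, 20
# Hypothetical germline genotypes coded 0/1/2 (copies of the minor allele).
geno = rng.integers(0, 3, size=(n_children, n_variants))
# Simulated toxicity label influenced by the first two variants only
# (effect sizes are invented for this sketch).
risk = 0.15 + 0.12 * geno[:, 0] + 0.10 * geno[:, 1]
toxicity = rng.binomial(1, np.clip(risk, 0, 1))

# 5-fold cross-validation guards against the optimism of in-sample fits.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, geno, toxicity, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {scores.mean():.2f} +/- {scores.std():.2f}")
```

In practice the protocol would add sensitivity analyses (e.g. refitting after excluding subgroups) around such a cross-validated core; the classifier choice here is arbitrary.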
To address the high false-positive rate of the fibrosis-4 (FIB-4) index, we hypothesized that combining it with readily available nutritional-inflammatory indices could improve diagnostic accuracy for advanced liver fibrosis/cirrhosis. Diagnostic performance was evaluated sequentially: first, individual metrics and FIB-4 were compared via receiver operating characteristic (ROC) curve analysis; then, their incremental value was assessed using DeLong's test, net reclassification improvement (NRI) and integrated discrimination improvement (IDI); finally, the clinical utility of four combined strategies was examined. In discriminating advanced fibrosis/cirrhosis, the platelet-to-albumin ratio (PAR) showed a significantly higher area under the curve (AUC) of 0.698 than the haemoglobin, albumin, lymphocyte and platelet (HALP) score (AUC = 0.494; p < .001). Similarly, the prognostic nutritional index (PNI) also outperformed HALP in diagnostic performance (AUC = 0.658; p < .0001). Adding PAR or PNI to FIB-4 significantly improved diagnostic performance, as evidenced by DeLong's test, NRI and IDI. Specifically, the combination of FIB-4 and PAR improved specificity (88.5% vs 38.5%) and overall accuracy (63.7% vs 54.8%) compared to FIB-4 alone, while the FIB-4 and PNI combination achieved a specificity of 80.3%. The parallel strategy (FIB-4 + 'PAR or PNI') maintained a sensitivity of 61.9% and achieved the highest negative predictive value (65.2%). The serial strategy (FIB-4 + 'PAR and PNI') provided the highest specificity (95.1%) and positive predictive value (81.8%). In scenarios prioritizing high sensitivity, FIB-4 alone is suitable. For high specificity, a sequential strategy (FIB-4 + 'PAR and PNI') is recommended. For balanced performance, a parallel strategy (FIB-4 + 'PAR or PNI') provides optimal feasibility.
In settings where high sensitivity is prioritized for screening, the FIB-4 index may be used alone to minimize the risk of missed diagnoses. In settings where high specificity is the primary goal, a sequential strategy (FIB-4 + 'PAR and PNI') is recommended to reduce the false-positive rate as much as possible. For routine screening that requires balanced overall diagnostic performance, a parallel strategy (FIB-4 + 'PAR or PNI') demonstrates relatively even performance across metrics and offers good feasibility for clinical application.
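The trade-off behind the serial ('and') and parallel ('or') strategies can be made explicit with the textbook formulas for combining two tests, assuming conditional independence between them (an assumption that rarely holds exactly in practice). The input values below are illustrative only, not the study's estimates:

```python
def serial(sens1, spec1, sens2, spec2):
    """'AND' rule: call positive only if both tests are positive.
    Under conditional independence this raises specificity and
    lowers sensitivity."""
    return sens1 * sens2, 1 - (1 - spec1) * (1 - spec2)

def parallel(sens1, spec1, sens2, spec2):
    """'OR' rule: call positive if either test is positive.
    This raises sensitivity and lowers specificity."""
    return 1 - (1 - sens1) * (1 - sens2), spec1 * spec2

# Illustrative sensitivities/specificities (not the study's numbers):
sens_and, spec_and = serial(0.80, 0.60, 0.70, 0.85)
sens_or, spec_or = parallel(0.80, 0.60, 0.70, 0.85)
print(f"serial:   sens={sens_and:.2f}, spec={spec_and:.2f}")
print(f"parallel: sens={sens_or:.2f}, spec={spec_or:.2f}")
```

Consistent with the abstract, the serial rule pushes specificity up at the cost of sensitivity, while the parallel rule does the reverse.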
Understanding the interplay between nutrition, lifestyle, and environmental awareness is increasingly critical in shaping effective public health strategies. This study aimed to culturally adapt, extend and validate the Nutritional and Social Health Habits (NutSo-HH) scale for use in the Italian context. Cultural adaptation included forward and backward translation, expert review, and pilot testing. In line with current evidence and guidelines promoting health-enhancing and environmentally responsible diets, the Italian version was revised and expanded into the NutSo-HH-Ita inventory, comprising 4 domains: nutrition and food choices, nutritional risk behaviors, sustainability and media influence, and lifestyle and wellbeing. Structural validity was examined through confirmatory factor analysis in a sample of 403 Italian adults. Construct validity was evaluated through known-group comparisons. Reliability was assessed using McDonald's omega for internal consistency, factor score determinacy (FSD) coefficients for factor quality, and intraclass correlation coefficients for test-retest stability. Confirmatory factor analysis (CFA) confirmed the theoretical structure of the inventory, yielding satisfactory to excellent fit indices across the 9 factors (comparative fit index [CFI] = 0.943-0.997; Tucker-Lewis index [TLI] = 0.922-0.994; root mean square error of approximation [RMSEA] = 0.015-0.048). Known-group analyses further supported construct validity. Omega values ranged from 0.51 to 0.80, with the highest reliability observed for Environmental awareness (ω = 0.80; FSD = 0.92) and Digital influence (ω = 0.78; FSD = 0.92). Protein sources and Physical activity showed the lowest internal consistency (ω = 0.51 and 0.54, respectively). Test-retest stability was moderate for most factors, except Media influence, Social habits, and Sleep quality, which demonstrated poor stability.
The NutSo-HH-Ita is a valid and contextually relevant inventory for assessing dietary habits, lifestyle, and the sustainability of nutritional choices in the Italian context. It offers valuable support for epidemiological research, public health intervention planning, and the monitoring of nutritional and environmental behaviors.
Korea's population is aging rapidly, yet systematic management of, and research on, the oral health and nutritional status of long-term care facility residents remain limited, despite their importance to quality of life. This study aimed to compare the oral health and nutritional status of older adults in Long-Term Care Facilities (LTCFs) according to their Long-Term Care (LTC) grades and to examine the interrelationship between these two factors. A cross-sectional study was conducted among 180 older adults aged 65 years or older residing in two public long-term care facilities in Korea. Participants were classified into an institution-based benefits group (Grades 1-2) and a home-based benefits group (Grades 3-5) according to their LTC grades. Oral health status, including oral hygiene management, chewing ability, and oral dryness, and nutritional status, assessed using the Mini Nutritional Assessment (MNA), were evaluated. Multiple linear regression analysis was conducted to identify oral health factors affecting MNA scores. The institution-based group showed significantly poorer physical function, oral hygiene behavior, chewing ability, and nutritional status compared to the home-based group. Malnutrition was observed in 73.9% of the institution-based group versus 26.9% of the home-based group (p<0.001). Regression analysis identified significant predictors of nutritional status: oral dryness (p<0.001), chewing ability (p=0.001), oral care dependency (p=0.024), plaque or calculus (p=0.025), and food debris (p=0.016). Oral health significantly affects nutritional status among older adults in LTCFs. Residents with more severe LTC grades are at greater risk of both oral and nutritional deterioration. These findings highlight the urgent need for integrated, multidisciplinary care approaches addressing oral health and nutrition concurrently in LTC settings.
To evaluate the efficacy of a combined intervention of dietary fiber and probiotics on gut microbiota composition, systemic immune response, nutritional status, and long-term prognosis in patients with advanced colorectal cancer (CRC) undergoing conventional therapy. This retrospective cohort study included 80 patients with advanced CRC and diagnosed malnutrition, treated between May and December 2022. Patients were divided into a control group (n = 40), which received standard nutritional support, and an observation group (n = 40), which received standard support supplemented with a daily formulation of dietary fiber and multi-strain probiotics for 12 weeks. Primary endpoints included changes in immune markers (IgA, IgG, CD4+/CD8+ ratio) and gut microbiota composition (alpha and beta diversity, and specific genera abundance via 16S rRNA sequencing). Secondary endpoints were nutritional status (body mass index [BMI], Patient-Generated Subjective Global Assessment [PG-SGA]), quality of life (WHOQOL-100), and 3-year survival outcomes. Baseline characteristics were well-matched between groups (p > 0.05). After 12 weeks, the observation group exhibited significantly higher levels of IgA (0.95 ± 0.24 vs. 0.78 ± 0.15 g/L), IgG (9.34 ± 1.35 vs. 8.05 ± 1.16 g/L), and an improved CD4+/CD8+ ratio (2.18 ± 0.20 vs. 1.56 ± 0.15) compared to the control group (all p < 0.01). The intervention led to superior improvements in BMI (+ 2.83 kg/m2 vs. + 0.77 kg/m2), PG-SGA scores (4.67 ± 0.44 vs. 6.14 ± 0.50), and WHOQOL-100 scores (98.27 ± 7.38 vs. 72.35 ± 6.79) (all p < 0.001). Microbiota analysis revealed a significant increase in the Shannon diversity index and distinct clustering in beta diversity, alongside enriched relative abundance of Lactobacillus and Bifidobacterium in the observation group (p < 0.001). Importantly, the observation group demonstrated a higher 3-year survival rate (83.5% vs. 
67.4%, p = 0.001) and a numerically lower incidence of overall complications (7.5% vs. 17.5%, p = 0.176), including a reduction in grade ≥ 3 diarrhea and mucositis, although these differences did not reach statistical significance, possibly owing to the limited sample size. In patients with advanced CRC, supplementing conventional therapy with dietary fiber and probiotics effectively modulates the gut microbiota, enhances systemic immune function, improves nutritional status and quality of life, and is associated with better long-term survival. These findings support the integration of gut microbiota-targeted nutritional strategies into standard oncological care.
Pregnancy-related anaemia significantly affects human development across life stages. In sub-Saharan Africa (SSA), country-specific epidemiological variations primarily driven by nutritional practices, socioeconomic factors and health-system disparities contribute to heterogeneity in prevalence, severity and adverse birth outcomes (ABOs). While anaemia and micronutrient deficiencies in pregnancy are well studied globally, comprehensive trimester-specific evidence and their associations with ABOs in SSA remain scarce. This review, therefore, examines the breadth and nature of existing evidence on these associations within SSA, thereby updating current knowledge and informing regionally tailored interventions and future research. A scoping review methodology will be employed due to the limited volume of literature addressing the specific research questions and population. The review will follow the Joanna Briggs Institute (JBI) framework, applying the population-concept-context approach. Comprehensive searches will be conducted across CINAHL, MEDLINE, Cochrane Library, Scopus, Google Scholar, EBSCO Open Dissertations and relevant organisational websites. The planned search period will span from 1 January 2016 to 31 December 2025. Two reviewers will independently screen and extract data using JBI-adapted protocols within the Rayyan review platform. Any discrepancies will be resolved via discussion with the research team. Findings will be synthesised narratively through descriptive content analysis and visual mapping. This review will include peer-reviewed studies and grey literature that investigate the associations between anaemia or deficiencies in iron, folate and vitamin B12 during pregnancy trimesters and ABOs in SSA. All relevant sources of evidence will be considered, regardless of study design or methodology, provided they report on women of reproductive age who experienced anaemia in any trimester and were subsequently identified with ABOs.
Birth outcomes of interest include low birth weight, macrosomia, small or large for gestational age, preterm birth, post-term birth and stillbirth. Only sources published in English from 2016 onward will be included. Study quality will be evaluated using Cochrane's risk of bias assessment and mixed methods appraisal tools, and reporting will follow the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR). This scoping review will not require ethical approval, as it will synthesise published data and reports. It has been registered with the Open Science Framework. This review does not involve human participants. The final report will be submitted for publication in a peer-reviewed journal. The findings will be used to shape subsequent research, serving as a fundamental element of the evidence and knowledge mapping framework. As this study protocol was not reviewed by an ethics committee, the appropriate contact for research integrity matters is the Faculty of Health and Environmental Sciences, Auckland University of Technology.
Comorbidities, common in patients with advanced breast cancer (ABC), may impact survival outcomes and health-related quality of life (HRQoL). Here, we report subgroup analyses by comorbidity status of patients from POLARIS (NCT03280303), a prospective, observational study of patients with hormone receptor-positive/human epidermal growth factor receptor 2-negative (HR+/HER2-) ABC who were treated with palbociclib plus endocrine therapy in routine clinical practice in North America. Real-world progression-free survival (rwPFS) and overall survival (OS) were evaluated by line of therapy (1 LOT, ≥ 2 LOT), Charlson Comorbidity Index (CCI) score (0, 1-2, and 3+), and common comorbid disorder categories (cardiovascular, psychiatric, metabolic and nutritional, and blood and lymphatic disorders). HRQoL was assessed using the global health status (GHS)/QoL subscale of the European Organization for Research and Treatment of Cancer Quality-of-Life Questionnaire Core 30. From January 2017 to October 2019, 1250 patients (median age, 64 years) initiated palbociclib-based therapy. Median rwPFS durations (95% CI) for patients with CCI scores of 0, 1-2, and 3+ were 20.3 (17.1-24.8), 24.2 (19.4-29.5), and 16.8 (11.2-20.8) months in 1 LOT and 13.7 (6.2-19.7), 13.2 (9.4-17.5), and 14.9 (8.0-21.9) months in ≥ 2 LOT, respectively. Median OS durations (95% CI) were 48.8 (37.3-not estimable [NE]), not reached (43.0-NE), and 34.8 (29.1-44.1) months in 1 LOT and 39.0 (30.6-50.5), 37.9 (26.5-42.6), and 31.6 (20.9-45.2) months in ≥ 2 LOT, respectively. Patients with blood and lymphatic disorders had shorter rwPFS and OS than those with other common comorbidities. GHS/QoL was maintained irrespective of CCI score. Patients with a CCI score of 0 had clinically meaningful and statistically significantly higher mean GHS/QoL scores than patients with a CCI score of 3+ at each assessment.
Patients with HR+/HER2- ABC receiving palbociclib who had a higher comorbidity burden, especially blood and lymphatic system disorders, had poorer clinical outcomes. GHS/QoL was preserved regardless of comorbidity burden. Clinical trial number: NCT03280303.
Dairy products are known to improve blood lipid profiles and insulin sensitivity and to reduce risk factors for metabolic syndrome (MetS) and nonalcoholic fatty liver disease (NAFLD). However, the mechanisms through which dairy product consumption influences NAFLD via MetS components remain unclear. This study examined the mediating effects of MetS components on the association between dairy product consumption and NAFLD. This study included 12,775 Korean adults from the Korea National Health and Nutrition Examination Survey (KNHANES) 2019-2021. Dairy product intake was assessed using a 24-hour dietary recall. NAFLD was defined using a hepatic steatosis index score >36, and MetS was classified according to the National Cholesterol Education Program Adult Treatment Panel III criteria. Multivariable logistic regression analyses were conducted to examine the associations among dairy intake, NAFLD, and MetS components. Mediation analyses with bootstrapping (n = 1,000) were performed to investigate the mediating effects of individual MetS components on the association between dairy consumption and NAFLD. Consumption of more than one serving of dairy products was associated with a lower prevalence of NAFLD among women (adjusted odds ratio [AOR], 0.75; 95% confidence interval [CI], 0.59-0.96). Regarding MetS components, intake of one serving of dairy products was associated with lower odds of elevated triglycerides in men (AOR, 0.75; 95% CI, 0.63-0.89). In women, consumption of at least one serving was associated with decreased hyperglycemia (AOR, 0.84; 95% CI, 0.73-0.97), abdominal obesity (AOR, 0.69; 95% CI, 0.55-0.87), low high-density lipoprotein cholesterol (AOR, 0.83; 95% CI, 0.72-0.95), and elevated triglyceride levels (AOR, 0.71; 95% CI, 0.60-0.85). 
Mediation analyses indicated that, among women, significant proportions of the associations between dairy product consumption and NAFLD were mediated by waist circumference (58.0%), systolic blood pressure (18.2%), and high-density lipoprotein cholesterol (51.7%), whereas no significant mediation effects were observed among men. Dairy product consumption was associated with a lower prevalence of MetS and NAFLD among women. Mediation analysis further suggested that dairy product consumption may reduce the risk of NAFLD by mitigating metabolic dysfunction among women.
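The "proportion mediated" reported above follows the standard product-of-coefficients logic with bootstrapped confidence intervals. The sketch below illustrates that logic on synthetic data; the variable names, effect sizes, and unadjusted model specification are hypothetical illustrations, not the study's KNHANES models or covariate set:

```python
# Illustrative product-of-coefficients mediation analysis with bootstrap CIs.
# Synthetic data only; not the study's actual variables or adjustment set.
import numpy as np

rng = np.random.default_rng(1)
n = 1000
dairy = rng.normal(size=n)                               # exposure (standardised servings)
waist = 0.4 * dairy + rng.normal(size=n)                 # mediator (waist circumference)
nafld = 0.5 * waist + 0.1 * dairy + rng.normal(size=n)   # outcome (steatosis index)

def ols(y, *xs):
    """OLS coefficients of y on the given regressors (intercept dropped)."""
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

def prop_mediated(e, m, o):
    a = ols(m, e)[0]      # path a: exposure -> mediator
    b = ols(o, m, e)[0]   # path b: mediator -> outcome, adjusted for exposure
    c = ols(o, e)[0]      # total effect of exposure on outcome
    return a * b / c      # indirect effect as a share of the total effect

point = prop_mediated(dairy, waist, nafld)
# Percentile bootstrap (resample rows with replacement, re-estimate each time)
boot = [prop_mediated(dairy[i], waist[i], nafld[i])
        for i in (rng.integers(0, n, n) for _ in range(1000))]
lo, hi = np.percentile(boot, [2.5, 97.5])
```

With the true values above (indirect effect 0.4 × 0.5 = 0.2, total effect 0.3), the point estimate lands near two-thirds, and the bootstrap interval excluding zero is what "significant mediation" means in the abstract.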
With the global aging population and the expanded application of hematopoietic stem cell transplantation (HSCT) in older adults, understanding quality of life (QOL) outcomes in this vulnerable group is of paramount importance. This narrative review synthesizes current evidence to (1) describe the trajectory of QOL in older HSCT recipients; (2) identify the key transplant-related, physiological, and psychosocial determinants of post-transplant well-being; and (3) evaluate existing and emerging strategies to improve patient outcomes. A literature search was conducted in PubMed/MEDLINE, Web of Science, and Embase (inception to December 2024) using terms related to HSCT, older adults, and QOL. Articles focusing on patients aged ≥60 years were selected to synthesize evidence on QOL outcomes, influencing factors, and interventions. Older adults experience a significant QOL decline that typically reaches its nadir within three months post-transplant, with infrequent full recovery to baseline. Key determinants include transplant modality (allogeneic vs. autologous), regimen intensity, and complications like graft-versus-host disease. These interact with patient-specific factors such as frailty, nutritional status, psychological distress, and social support. Emerging multidisciplinary strategies, including optimized conditioning regimens, structured psychosocial support, and geriatric assessment-informed enhanced recovery programs, show promise in mitigating QOL impairment. The evidence highlights the multifactorial nature of QOL in this population. A critical gap remains the lack of standardized, HSCT-specific geriatric assessment tools to guide personalized interventions. While multidisciplinary care is advocated, its components require further refinement and validation for older adults. A holistic, patient-centered approach integrating geriatric principles is essential to improve long-term survivorship and QOL in the growing population of older adults undergoing HSCT.
To study the therapeutic efficacy of Sijunzi decoction (SJZD) in geriatric sarcopenia and its mechanism. Patients with geriatric sarcopenia were divided into a control group (n = 31) and a research group (n = 31). The control group received basic intervention therapy; the research group received basic intervention therapy plus SJZD. Clinical efficacy (AMIS, 6-meter walking speed, and grip strength), comprehensive geriatric assessment (activities of daily living, Morse, mini-nutritional assessment, Frail, mini-mental state examination, self-rating anxiety scale, and geriatric depression scale), liver and kidney function, blood cell inflammatory indexes (platelet-to-lymphocyte ratio, neutrophil-to-lymphocyte ratio, systemic immune-inflammatory index, systemic inflammation response index, and aggregate inflammation systemic index), and nutritional indicators were measured. Network pharmacology was used to study the mechanism of SJZD in treating sarcopenia. After SJZD treatment, AMIS, 6-meter walking speed, and grip strength, as well as the scores on the activities of daily living, mini-nutritional assessment, and mini-mental state examination scales, were significantly increased compared with the control group, while the scores on the Morse, Frail, self-rating anxiety, and geriatric depression scales were significantly decreased. The levels of aspartate aminotransferase, alanine aminotransferase, hemoglobin, albumin, prealbumin, and body mass index in the research group were significantly higher than those in the control group, whereas the levels of blood urea nitrogen, creatinine, platelet-to-lymphocyte ratio, neutrophil-to-lymphocyte ratio, systemic immune-inflammatory index, systemic inflammation response index, and aggregate inflammation systemic index were significantly decreased after treatment. Network pharmacology results showed that the active ingredients of SJZD acted on 36 targets of sarcopenia, which were mainly enriched in the interleukin-17 and TNF pathways. 
We also found that the serum levels of interleukin-1β, matrix metalloproteinase-9, tumor necrosis factor-α, interleukin-6, myostatin, and C-reactive protein were significantly reduced by SJZD, suggesting an anti-inflammatory role for SJZD. SJZD improved muscle strength, quality, and function in patients with sarcopenia by reducing peripheral blood inflammation.
The Heal-Me (Healthy Eating, Active Living-Mindful Energy) version 1 is a web-based app designed to provide exercise and nutrition support to vulnerable chronic disease populations. As version 1 of the Heal-Me technology has not been evaluated for acceptance and usability within a clinical environment, this study sought to apply the Unified Theory of Acceptance and Use of Technology (UTAUT) framework to investigate its acceptance and usability as part of a randomized controlled trial (RCT) involving adults diagnosed with cancer, chronic lung disease, or those who had undergone liver or lung transplantation. This single-group study used a multimethod design. Participants completed self-administered questionnaires before and after the intervention, without a control group. Near the end of the study, individual interviews were conducted with select participants. Technology acceptance and usability were assessed using five-point Likert scales from "disagree (0)" to "agree (4)." Data analysis included partial least squares structural equation modeling and thematic analysis of interview responses. Participants' expectations for accepting the Heal-Me web-based app were met. Performance expectancy, effort expectancy, and social influence had direct effects on their intention to use the app, explaining 50.9% of the variance in behavioral intention. Both facilitating conditions and behavioral intention significantly influenced actual usage, accounting for 71.4% of its variance. Performance expectancy emerged as the most influential factor predicting participants' intention to use the Heal-Me web-based app, with findings clearly indicating a high level of acceptance of the first version of this technology.
Excessive dietary salt intake is a key controllable risk factor for hypertension. Despite the clear clinical benefits of reducing salt intake, overcoming the "knowledge-action" gap among patients with hypertension poses a significant public health challenge. The Theory of Planned Behavior (TPB) is a powerful framework for understanding the psychological factors that influence behavioral intentions. However, TPB-based studies exploring dietary intentions to reduce salt intake in middle-aged and older patients with hypertension remain scarce in the literature. Using TPB as the theoretical framework and structural equation modeling (SEM) as the methodological basis, we aimed to examine the status of salt reduction dietary intention and its influencing factors in middle-aged and older hypertensive patients. From March to November 2023, a total of 558 middle-aged and older hypertensive patients across 38 districts and counties in Chongqing Municipality, China, were interviewed using a face-to-face questionnaire with a Likert scale. SEM was used to explore the relationship between the salt reduction intention (INT) of middle-aged and older hypertensive patients aged 45 years and above and their attitude toward salt reduction (ATT), subjective norms of salt reduction (SN), and perceived behavioral control of salt reduction (PBC), as well as the bidirectional correlation between ATT, SN, and PBC. Among the 558 participants, 55.2% were female, 61.3% were older adults, and 38.7% were middle-aged; 44.1% were the primary home cook, and 48% had a junior high school education or above. Overall, 70.8% reported an intention to reduce salt in their diets. Attitudes toward a salt-reducing diet (β = 0.22, p < 0.05) and perceived behavioral control (β = 0.70, p < 0.05) positively affected salt-reducing diet intentions, with perceived behavioral control showing the strongest effect. 
Subjective norm (β = 0.14, p > 0.05) did not significantly affect intentions to reduce dietary salt. This study highlights the central influence of salt-reducing dietary attitudes and perceived behavioral control in salt-reducing dietary intentions among middle-aged and older hypertensive patients. Future nutritional health education should prioritize strategies to strengthen intention and perceived behavioral control.
A randomized phase II screening trial of gemcitabine, nab-paclitaxel, and cisplatin with a medically supervised ketogenic diet (MSKD) versus usual diet (non-MSKD) was conducted in patients with treatment-naive metastatic pancreatic ductal adenocarcinoma (PDAC). Patients with untreated metastatic PDAC were randomized 1:1 to MSKD or non-MSKD while receiving gemcitabine, nab-paclitaxel, and cisplatin on days 1 and 8 of a 21-day cycle. The MSKD was guided by a remote health care team and daily ketone (beta-hydroxybutyrate) levels, with goal beta-hydroxybutyrate of 0.5 to 3.0 mM. The primary endpoint was progression-free survival (PFS) using a one-sided alpha level of 0.20. Secondary endpoints included overall survival (OS), safety, and quality of life (QOL). Changes in microbiome were an exploratory endpoint. Overall, there were 32 evaluable patients. In the MSKD arm, 15 of 16 patients achieved nutritional ketosis; the median proportion of days in ketosis was 39.4%. The median PFS was 8.5 months in MSKD patients and 6.2 months in non-MSKD patients: hazard ratio, 0.53 (95% CI, 0.21-1.37); one-sided p = .096. The median OS was 13.7 months with MSKD and 10.2 months in the non-MSKD arm: hazard ratio, 0.58 (95% CI, 0.25-1.37); one-sided p = .107. All MSKD-related adverse events were grade 1-2. There were no significant differences in grade ≥3 chemotherapy-related adverse events between the arms. MSKD patients had no decline in QOL and had significant enrichment of beneficial taxa in the microbiome (p < .05, log-fold change ≥2). The MSKD is feasible in patients with PDAC and, although not powered for definitive outcomes, shows trends in improved PFS and OS when combined with gemcitabine, nab-paclitaxel, and cisplatin, without added toxicity or detriment to QOL. Larger studies are required to confirm these findings and establish the value of the MSKD in pancreatic cancer treatment.
Given the central role of phosphorus in key metabolic processes, including glucose phosphorylation, ATP synthesis, insulin signalling, and energy metabolism, dietary phosphorus availability may influence postprandial metabolic responses. This systematic review evaluates the effects of inorganic phosphorus supplementation on diet-induced thermogenesis, postprandial glycaemia, and postprandial lipidaemia in healthy adults. A systematic search of PubMed, Google Scholar, Scopus, and the Cochrane Central Register of Controlled Trials (CENTRAL) was conducted. Only experimental intervention studies assessing phosphorus supplementation as the primary exposure and postprandial metabolic outcomes as primary endpoints were included. Eligible participants were healthy adults aged 18-64 years. Secondary outcomes included changes in body weight, energy intake, and satiety. Ten randomised crossover trials met inclusion criteria, comprising a total of 225 participants. Three out of four studies reported a significant positive association between phosphorus supplementation and diet-induced thermogenesis (P < 0.05). Evidence regarding the effects of phosphorus on postprandial glycaemia and lipidaemia was inconsistent. An inverse association was observed between phosphorus intake and weight gain (P < 0.001) and energy intake (P < 0.01), alongside a positive association with satiety (P < 0.05). While these findings indicate potential metabolic benefits of dietary phosphorus, particularly in relation to thermogenesis and energy regulation, interpretation is tempered by the small number of available studies, modest sample sizes, and methodological heterogeneity. These limitations restrict causal inference and generalizability. Further rigorously designed, adequately powered clinical trials are therefore warranted to substantiate these associations and to clarify the effects of phosphorus on postprandial glycaemic and lipid outcomes.
Malnutrition in children remains a major global public health concern, especially in sub-Saharan Africa. A cross-sectional study was conducted among 120 children living in orphanages, with a sub-sample of 23 children selected for a 3-day weighed food intake assessment. Data were collected using a validated questionnaire, anthropometric measurements, and dietary intake records. Analysis was performed using SPSS version 21, and results were presented as means, frequencies, and percentages. The daily energy intake of children aged 4 and 5 years was below the recommended levels (74.1% and 64.3%, respectively). However, children aged 2 and 3 years had adequate energy intakes, exceeding the recommendations (102.4% and 111.5%). Iron intake across all age groups was below the recommended dietary intake. Intake of B-complex vitamins (B1, B2, B3) among 2-, 3-, and 5-year-olds exceeded recommended levels. Calcium intake was consistently low across all age groups (2 years: 37.5%, 3 years: 44.6%, 4 years: 23.5%, 5 years: 24.7%); this was attributed to low consumption of calcium-rich protein foods and vegetables. Key factors influencing low nutritional status included inadequate consumption of high-protein food sources, overreliance on carbohydrate foods (cassava flour), poor consumption of fruits and vegetables, and inability to access food due to sickness. The study highlights suboptimal intake of energy and essential micronutrients among orphanage children, particularly older age groups. Nutrition education, improved feeding practices, and increased dietary diversity are essential to improve the nutritional status of children in orphanages.
This study analyzes the causal effect of sleep duration on mental health among young adults using a meta-learner-based causal inference framework. Specifically, we applied a T-Learner model with Random Forest and XGBoost as base learners to data from 1,405 individuals aged 19-34, drawn from the 2022-2023 Korea National Health and Nutrition Examination Survey. The results indicate that adequate sleep increases the probability of maintaining normal mental health. Subgroup analysis comparing individuals with adequate sleep and normal mental health to those with insufficient sleep and poor mental health also reveals a statistically significant causal effect of sleep on mental health improvement. In addition, AST (SGOT) levels and blood creatinine concentration are identified as key confounding factors. Findings suggest that sufficient sleep could enhance mental health among young adults, and policy implications for youth mental health are derived from the perspective of sleep duration. By providing empirically identified causal evidence based on nationally representative data, this study contributes to the growing literature on sleep and mental health and highlights sleep duration as a modifiable target for evidence-based mental health interventions in young adults.
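A T-Learner estimates treatment effects by fitting one outcome model on treated units and another on control units, then differencing their predictions for each individual (the conditional average treatment effect, CATE). The sketch below is a minimal illustration on synthetic data, not the study's pipeline: the covariates, effect size, and the use of a single Random Forest for both base learners are assumptions for demonstration only.

```python
# Illustrative T-Learner sketch on synthetic data (hypothetical variables;
# Random Forest stands in for both base learners).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 3))        # covariates (e.g. age, AST, creatinine)
t = rng.integers(0, 2, size=n)     # treatment: 1 = adequate sleep
# Synthetic outcome with a true homogeneous treatment effect of 0.5
y = 0.3 * X[:, 0] + 0.5 * t + rng.normal(scale=0.1, size=n)

# T-Learner step 1: separate outcome models for treated and control groups
mu1 = RandomForestRegressor(random_state=0).fit(X[t == 1], y[t == 1])
mu0 = RandomForestRegressor(random_state=0).fit(X[t == 0], y[t == 0])

# Step 2: CATE = difference of the two models' predictions per individual;
# averaging the CATEs gives an estimate of the average treatment effect
cate = mu1.predict(X) - mu0.predict(X)
ate = cate.mean()
```

Because the two groups are modeled independently, the T-Learner handles heterogeneous effects naturally; subgroup analyses like the one in the abstract amount to averaging `cate` over the rows belonging to each subgroup.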