Postoperative organ dysfunction is a leading cause of death and disability after hip fracture surgery in older patients. Intraoperative hypotension is a major modifiable risk factor for this complication, yet the optimal management strategy to prevent it remains controversial. We hypothesize that an individualized blood pressure management strategy is superior to standard management in reducing postoperative organ dysfunction. This single-center, randomized, controlled trial will enroll 180 patients aged 65-85 years undergoing hip fracture surgery under general anesthesia. Eligible patients will be randomly allocated in a 1:1 ratio to the individualized management group (targeting systolic blood pressure within ± 10% of baseline) or the standard management group (reactive management, in which intervention is initiated only if systolic blood pressure falls below 90 mmHg or decreases by more than 30% from baseline). A universal mean arterial pressure target of ≥ 65 mmHg will be maintained for all patients. The allocated hemodynamic management strategy will be maintained throughout surgery and during the post-anesthesia care unit stay. The primary outcome is a composite of dysfunction in at least one organ system (respiratory, cardiovascular, renal, or neurological) within 7 days after surgery. Secondary outcomes include the components of the primary outcome, intraoperative variables (including hemodynamic management data, fluid balance, blood loss, and serum lactate levels), intensive care unit and hospital length of stay, and all-cause mortality within 30 days after surgery. This randomized controlled trial aims to determine whether individualized blood pressure management reduces postoperative organ dysfunction more effectively than standard management in older patients undergoing hip fracture surgery.
If proven effective, this proactive approach may represent a significant advance in clinical practice, moving from reactive hypotension correction to preventive stabilization, potentially reducing major complications, shortening hospital stays, and improving functional recovery. The results will provide important evidence to guide hemodynamic management during general anesthesia in this vulnerable population, contributing to standardized, evidence-based protocols for enhancing perioperative outcomes. Trial registration: Chinese Clinical Trial Registry, ChiCTR2400093838. Registered on 12 December 2024. This manuscript presents the study protocol; participant recruitment is ongoing and results will be reported upon trial completion.
Management of rectourinary fistula (RUF) is challenging due to limited data and variability in presentation and treatment. We conducted a systematic review of radical prostatectomy (RP)-related RUF to assess clinical features, diagnostics, treatments, and outcomes, and to propose a structured algorithm to guide management. The review was conducted in accordance with PRISMA guidelines to evaluate the clinical presentation, diagnostic strategies, management approaches, and outcomes of RUF following RP. A comprehensive search of PubMed, Embase, and Web of Science was performed from database inception to January 2025. Data were extracted on patient demographics, symptoms, diagnostic modalities, management strategies, surgical repair techniques, and treatment outcomes. A total of 455 cases of RUF following RP were identified across 34 studies. The reported incidence of RUF ranged from <0.01% to 1.5%. The most frequent presenting symptoms were urine leakage per rectum (60.7%), pneumaturia (50.0%), and fecaluria (44.1%). Fecal and/or urinary diversion was utilized in over 60% of cases, with a median stoma duration of 3 months and a median indwelling urinary catheter duration of 1 month. Conservative management (observation, fluid replacement, and antibiotic therapy) was attempted in a minority of patients and was generally associated with success rates below 50%. Surgical repair was performed in nearly all cases, with the transperineal and transsphincteric approaches being the most commonly employed techniques. The median time to fistula closure ranged from 0.5 to 30 months, with reported surgical success rates varying between 41% and 100%. We propose a management flowchart based on our clinical experience, outlining the diagnostic and therapeutic approach to rectal injury and RP-related RUF. RUF after RP remains a challenging complication, often requiring stepwise management. Conservative treatment rarely succeeds, and surgery is often necessary.
Our proposed algorithm aims to standardize the approach to rectal injury and RUF, guiding treatment decisions and improving outcomes.
Work in eating disorder (ED) services presents unique challenges and rewards that may affect clinicians' work-related and personal wellbeing. However, research on ED clinicians' needs, views, and experiences remains sparse, despite major service changes since the COVID-19 pandemic. This study aims to explore and conceptualise NHS ED clinicians' work-related experiences, challenges, and needs in order to inform future clinician wellbeing and service improvement strategies. Clinicians working in ED services (N = 19) were interviewed using a semi-structured interview guide that probed their professional experiences, work-related needs, and views. Interviews were analysed in NVivo, following Braun and Clarke's (2006) guidance for reflexive thematic analysis. A holistic ecological systems framework for ED services was created, comprising five levels of influence: intrinsic, intra-personal, departmental, systemic, and societal. These levels contain nine themes: (1) clinician motivation for working in ED services, (2) complexities of ED management, (3) clinician personality and emotional disposition, (4) team dynamics, (5) supervision, management, and organizational support, (6) service-level concerns, (7) macro-level systemic concerns, (8) broader societal challenges in ED care, and (9) COVID-related challenges. Key concerns included the chronic nature and risk of EDs, growing service demands amid limited resources, and regulation through guidelines and commissioning targets. The presented framework illustrates the multifaceted array of complexities faced by ED clinicians. The interplay of personal, inter-personal, and systemic factors is explored, with clinicians' interest in and commitment to ED care at the core of the framework. These areas can be targeted to improve clinician job satisfaction and reduce burnout risk, with the goal of providing optimal patient care.
This study explores the experiences and wellbeing of clinicians working in NHS eating disorder (ED) services. Through interviews with clinicians, the research explored both the positive and difficult parts of their job. While staff felt strongly committed to helping people with EDs, many also described feeling emotionally drained and frustrated. This was often due to high workloads, not enough resources, and long waiting lists. Clinicians found it especially hard when they had to follow strict service rules that didn’t work well for individual patients, and when they had to manage complex medical risks. Supportive teams and good supervision helped some staff cope. Wider problems like staff shortages, poor communication between services, and lack of funding compounded emotional strain. The findings show that ED clinicians urgently need more support, including better resources, more flexible ways of working, and proper training, to give safe, effective care without burning out.
The unique environments created within healthcare simulations may create distinct health and safety risks for patients, learners, simulation faculty and staff, and other participants involved in these activities. Guidance to help simulationists manage the physical risks arising from simulation activities is limited, and there is no integrated synthesis of known risks or risk mitigation strategies. This review aimed to identify and examine literature addressing the physical health and safety risks associated with healthcare simulation, in order to inform the development of effective safety management strategies. It included published empirical research and non-empirical literature (e.g., commentaries, editorials) examining physical health and safety risks associated with any form of healthcare simulation programme, event, or facility. A multi-pronged search strategy was used, including electronic databases, Google Scholar searches, and reference-list searches of literature published between 2010 and 2025. Risks, contributory factors, and mitigation strategies were identified, and mitigation strategies were coded according to the Hierarchy of Controls framework. Sixteen articles were included. The literature most frequently identified physical health and safety risks to patients during in situ simulation. Risks to participants and simulation staff or faculty were reported less often and primarily related to exposure to clinical equipment and musculoskeletal injury associated with the physical demands of simulation. Risks to simulated participants included having clinical interventions performed on them. Contributory factors included learner inexperience and failure to recognise risks, rule violations, pursuit of realism in simulation, poor simulation design, inadequate preparation, and lack of formal safety systems.
Mitigation strategies were predominantly administrative, with rules, procedures, and checklists reported in 94% of sources, while elimination and engineering controls appeared in 37% and 19%, respectively. There were calls for clinical governance tools and processes to support a robust approach to simulation health and safety. These findings highlight the need for co-designed, simulation-specific governance tools, including standardised risk assessments, adverse event reporting systems, and safety policies tailored to simulation environments. These tools should be embedded within, and aligned with, the parent organisation's health and safety governance processes rather than constituting a separate or parallel governance process.
A well-functioning public health system relies on a robust workforce. Comprehensive data on the workforce, such as number, distribution, and key characteristics, are crucial for evidence-based workforce planning and development. However, few comprehensive public health workforce assessments exist, especially in low- and middle-income countries. Public health reforms over the years and needs identified during the COVID-19 pandemic prompted this assessment in Georgia. A survey of the core public health workforce, including employees at central and regional units of the National Center for Disease Control and Public Health (NCDC) and Municipal Public Health Centers (MPHC), was conducted online between June and September 2023. The survey collected data on workforce demographics, education, on-the-job training, and time spent across different program areas and job functions, along with questions on career progression, job satisfaction, and motivation. The response rate was 81.3%. Findings showed that the median age was 48 for NCDC and 56 for MPHC employees. Over 80% of NCDC and 90% of MPHC employees are women. More than 50% of the workforce hold a master's degree or higher, and over half of degree-holders specialized in public health or medicine. Mean years of service are 14.9 (NCDC) and 18.0 (MPHC), but career mobility is limited: only 33.3% of NCDC and 10.5% of MPHC staff have ever been promoted. NCDC employees spend most time on administration and surveillance/response, while MPHC staff focus on communicable disease management, administration, and immunization. Training participation is limited, with employees in key positions having better access. Despite limited advancement and relatively low pay, the workforce reported high job satisfaction and strong intrinsic motivation. These findings are pivotal in identifying workforce planning and development bottlenecks and developing targeted strategies.
Key interventions include addressing an aging workforce through targeted recruitment and succession planning, offering competitive salaries to attract younger staff, and strengthening training offerings. This effort to profile the public health workforce could guide similar assessments in the future and in other countries, strengthening capacity to prevent, detect, and respond to public health threats.
Falls are the leading cause of accidental injury among older adults: 30% of community-dwelling adults aged 65 and over fall each year, with nearly half of falls occurring outdoors. These falls are complex, understudied, and insufficiently addressed in current age-friendly city or walkability frameworks. This study aimed to build interdisciplinary consensus on risks, preventive actions, and barriers to fall prevention in outdoor public spaces through a Delphi process. A three-round Delphi study was conducted with 64 participants in round 1, 60 in round 2, and 49 in round 3, drawn from four expert groups: older adults who had fallen outdoors, health and research professionals, urban planners, and decision-makers (local and regional policy-makers, elected officials, and public-space managers involved in urban planning). Round one collected open responses on risks, preventive actions (modification of physical layout, public-space management, and behavior-related factors), and barriers to these actions. Responses were synthesized using AI-assisted analysis with systematic human validation. In rounds two and three, the relevance of 124 propositions was rated on a 10-point Likert scale. Consensus was defined as ≥ 70% of ratings ≥ 7/10 and an interquartile range ≤ 2.5. Consensus was reached for key intrinsic factors such as gait and balance impairments, visual and vestibular deficits, cognitive decline, and polypharmacy, as well as for environmental factors including irregular or inappropriate surfaces, obstacles or signage, and crowding. Highly relevant preventive actions included integrating fall prevention into street and sidewalk design, training urban planning professionals, awareness campaigns, systematic maintenance, safer crossings, participatory co-design of public-space adaptations and urban design features with older adults and local stakeholders, and improved data monitoring through surveillance, mapping, and sharing of fall-related and environmental risk information.
Main barriers were insufficient budgets, high costs, limited integration of fall prevention into planning priorities, and lack of evaluation of the impact of implemented actions. Outdoor fall prevention is a transversal challenge requiring the integration of public health and urban planning. This Delphi study highlights actionable priorities for embedding fall prevention in local and national strategies, particularly in rapidly aging regions.
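The consensus rule used in this Delphi study (≥ 70% of ratings ≥ 7/10 and an interquartile range ≤ 2.5) is straightforward to operationalize. A minimal Python sketch, with illustrative function names and made-up panel ratings, might look like this:

```python
from statistics import quantiles

def reached_consensus(ratings, agree_cut=7, prop_cut=0.70, iqr_cut=2.5):
    """Delphi consensus rule: at least 70% of panelists rate the
    proposition >= 7/10 AND the interquartile range is <= 2.5."""
    prop_high = sum(r >= agree_cut for r in ratings) / len(ratings)
    q1, _, q3 = quantiles(ratings, n=4)  # Q1 and Q3 of the ratings
    return prop_high >= prop_cut and (q3 - q1) <= iqr_cut

# A tightly clustered, favorable panel reaches consensus...
print(reached_consensus([8, 9, 7, 8, 10, 7, 9, 8, 7, 8]))  # True
# ...while a polarized panel does not.
print(reached_consensus([2, 9, 3, 8, 10, 1, 9, 2, 7, 8]))  # False
```

Note that quartile conventions differ slightly across statistical software (`statistics.quantiles` defaults to the exclusive method), so borderline propositions can flip depending on the convention chosen.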
Dyslipidemia remains a major modifiable contributor to China's cardiovascular disease (CVD) burden, yet large-scale evidence on risk-stratified management gaps is lacking. In this nationwide study across 1,785 hospitals in 28 Chinese provinces, 604,250 outpatients with dyslipidemia were enrolled. We analyzed lipid levels, quantified the control rate among the treated population, and estimated the proportion requiring lipid-lowering therapy (LLT) among the untreated population across regions, socioeconomic strata, and demographic groups. Total cholesterol (TC) and low-density lipoprotein cholesterol (LDL-C) levels were higher among females, middle-aged individuals, and individuals with obesity. LDL-C levels were also higher among urban residents, while TC showed no significant urban-rural difference. Triglyceride (TG) levels were higher in males, middle-aged individuals, individuals with obesity, and rural residents. Among the treated population, LDL-C/non-HDL-C control rates reached 95% in low-risk, 70-90% in moderate-risk, 40-50% in high-risk, and 5-15% in very-high-risk groups. Among the untreated population, the proportion requiring LLT was about 20% in the low-risk group and over 70% in moderate- and high-risk groups. After adjusting for covariates, male sex, older age, smoking, comorbid hypertension or type 2 diabetes mellitus, and residence in rural or low-gross-domestic-product areas were associated with lower lipid control rates and higher treatment needs. Our findings highlight the urgent need for risk-stratified lipid management in primary care, improved access to LLT, and policies addressing regional and socioeconomic disparities to enhance lipid control and reduce the CVD burden in China. Lessons from China can inform global strategies to improve lipid management and reduce the CVD burden.
Home environments shape children's dietary habits, but which factors are most influential remains unclear. The purpose of this study was to identify factors in the home environment associated with children's intake of fruit and vegetables (FV) and sugar-sweetened beverages (SSBs) using a national U.S. dataset. Data from 5,138 school-aged children (4-15 years old) in 130 U.S. communities were collected in 2013-2015. Parents and/or children completed a dietary screener and additional survey questions assessing household socioeconomic status (SES), grocery shopping sources, home food availability, social support for healthy eating, eating-out frequency, and other home eating and related behaviors. Other child characteristics included breastfeeding history, intake of school foods, and participation in other nutrition programs. Community variables included predominant race/ethnicity and SES. Classification and regression trees (CART) identified key predictors of intake. The FV and SSB CART models had 14 and 12 terminal groups, respectively. Children with the highest FV intake (0.54 SD from mean cups/day; 13% of sample) had fruit more often available at home, had dark green vegetables more often available at home, ate dinner with family more often, had SSBs less often available at home, and were breastfed longer. Conversely, children in the two groups with the lowest FV intake either had fruit less often available at home and family who never complimented their eating (-0.86; 2%), or had family who rarely or sometimes complimented their eating and perceived school lunches as unhealthy (-0.87; 1%). For SSB intake, the lowest consumers (-0.63 SD from mean tsp/day sugar; 17%) never or rarely had SSBs available at home and lived in higher-SES communities.
Children in the two groups with the highest SSB intakes had SSBs available at home more often and lived in a SNAP-participating household; they either ate out less often, used a phone/computer for social networking, and had SSBs available at home very often (1.3; 1%), or ate out more often and were breastfed for a shorter duration (1.1; 5%). Home availability of FV and of SSBs was the most salient predictor of intake of each, while other predictors differed between FV and SSB intake. The findings highlight several actionable home-environment strategies to test in future studies to improve school-aged children's diets.
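CART grows a tree by repeatedly choosing the predictor and cut-point that most reduce outcome variability within the resulting subgroups. As a rough illustration of that greedy split step (not the authors' actual analysis; the toy data below are invented), the core computation can be sketched in pure Python:

```python
def sse(values):
    """Sum of squared deviations around the group mean."""
    if not values:
        return 0.0
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values)

def best_split(X, y):
    """Greedy CART step: exhaustively search (feature, threshold)
    pairs and return the split that most reduces the summed
    squared error of the outcome."""
    base = sse(y)
    best_feat, best_thresh, best_gain = None, None, 0.0
    for feat in X[0]:
        for thresh in sorted({row[feat] for row in X}):
            left = [yi for row, yi in zip(X, y) if row[feat] <= thresh]
            right = [yi for row, yi in zip(X, y) if row[feat] > thresh]
            if not left or not right:
                continue
            gain = base - sse(left) - sse(right)
            if gain > best_gain:
                best_feat, best_thresh, best_gain = feat, thresh, gain
    return best_feat, best_thresh, best_gain

# Toy data: FV intake (cups/day) vs. home fruit availability (0-4)
# and family-dinner frequency (0-4) -- all values invented.
X = [{"fruit": 0, "dinner": 1}, {"fruit": 0, "dinner": 3},
     {"fruit": 1, "dinner": 2}, {"fruit": 3, "dinner": 0},
     {"fruit": 3, "dinner": 4}, {"fruit": 4, "dinner": 2}]
y = [0.5, 0.6, 0.7, 1.8, 2.0, 2.1]
feat, thresh, _ = best_split(X, y)
print(feat, thresh)  # fruit 1 -> availability dominates this toy split
```

Repeating this step within each resulting subgroup, and stopping when subgroups get too small, yields the terminal groups reported in the abstract.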
The China-ASEAN regional medical procurement platform was launched in early 2025. However, the scope and operational mechanisms of the platform remain unclear. This study aims to assess medicine prices and affordability in China and ASEAN countries, explore potential implementation challenges for the platform, and provide policy suggestions. We selected commonly used medicines from four ATC categories (alimentary tract and metabolism, cardiovascular system, anti-infectives for systemic use, and nervous system). Prices were standardized to WHO defined daily dose (DDD) prices and converted into median price ratios (MPRs) using Management Sciences for Health (MSH) international reference prices (IRPs). All price data were collected from official public sources and converted to US dollars using the official 2024 annual average exchange rate. Affordability was estimated as the number of days of the statutory gross minimum daily wage required to purchase one DDD, with wage data obtained from the International Labour Organization (ILO). Descriptive statistics were performed. A total of 68 medicines were included, with 68, 68, 60, and 59 available in China, Thailand, Indonesia, and the Philippines, respectively. Median MPRs were 0.88 (IQR: 0.46-3.49), 0.97 (IQR: 0.50-2.20), 1.69 (IQR: 0.77-3.16), and 1.86 (IQR: 0.72-5.03), respectively, and 45.6%, 45.6%, 61.7%, and 67.8% of medicines were priced above the IRPs. Prices varied widely across and within countries. For cardiovascular medicines, median MPRs exceeded the IRPs in China (1.72; IQR: 0.53-6.30), Indonesia (1.79; IQR: 0.78-2.80), and the Philippines (2.85; IQR: 1.31-6.43), while Thailand achieved a lower median of 0.78 (IQR: 0.31-1.37). Overall affordability was higher in China, Indonesia, and Thailand, where one DDD of medicine required less than 6% of a day's wage, with median values of 4.8% (IQR: 2.5%-19.3%), 5.1% (IQR: 2.3%-9.6%), and 3.6% (IQR: 1.8%-8.1%), respectively, compared with 14.0% (IQR: 5.4%-37.7%) in the Philippines.
Sensitivity analysis excluding extreme affordability values yielded similar results. Our findings suggest that understanding cross-country disparities in medicine prices and affordability may help inform the design of future regional purchasing strategies. Realizing the benefits of such joint procurement will require strong political commitment to establish a legal framework, enhance price transparency, harmonize regulations, and strengthen supply chains to ensure the platform's effectiveness and sustainability.
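Both metrics in this study are simple ratios: the MPR compares a local DDD price with the MSH international reference price, and affordability expresses the DDD price as a share of one day's minimum wage. A minimal Python sketch (function names and the example figures below are illustrative, not the study's data pipeline) is:

```python
def median_price_ratio(local_ddd_price_usd, msh_irp_usd):
    """MPR: local price per defined daily dose (DDD) divided by the
    MSH international reference price. MPR > 1 means the medicine
    costs more locally than the international benchmark."""
    return local_ddd_price_usd / msh_irp_usd

def affordability(local_ddd_price_usd, daily_min_wage_usd):
    """Share of one day's statutory gross minimum wage needed to buy
    a single DDD (reported as a percentage in the abstract)."""
    return local_ddd_price_usd / daily_min_wage_usd

# Illustrative figures only (not taken from the study):
print(round(median_price_ratio(0.44, 0.50), 2))    # 0.88 -> below the IRP
print(round(affordability(0.60, 12.50) * 100, 1))  # 4.8 (% of a day's wage)
```

A country-level median MPR, as reported in the abstract, is then just the median of these per-medicine ratios.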
Parents of undiagnosed children (POUC) experience significant psychosocial challenges, including anxiety, uncertainty, and isolation, that stem from parenting medically complex children while facing obstacles throughout the diagnostic journey. Despite these well-described challenges, a mental health intervention designed to meet the unique needs of POUC, which is necessary to promote the psychological and overall wellbeing of this population, does not exist. Acceptance and Commitment Therapy (ACT) has proven effective in a wide range of populations and shows promise for POUC. With the goal of designing and implementing an ACT-based intervention tailored to POUC, this pre-implementation study aimed to understand their psychosocial needs and prior mental health support experiences, explore their reactions towards ACT, and determine their anticipated barriers, facilitators, and preferences for participating in an ACT skills group, guided by the Consolidated Framework for Implementation Research (CFIR). Semi-structured, individual interviews were conducted with 18 POUC, including an experiential portion that exposed participants to key ACT concepts and exercises. Inductive coding based on participant responses and deductive coding based on the CFIR were employed to code interview transcripts. Reflexive thematic analysis was performed to identify key findings. Isolation was a psychosocial challenge for which all participants desired support. Many participants reported inadequacies in their prior mental health support, primarily due to lack of understanding from therapy providers regarding their unique circumstances. Although most participants indicated that ACT could help them manage difficult thoughts and emotions and act in alignment with their values, they also described achievability, collaboration, and accountability as key elements that could support their uptake. 
The main barriers, facilitators, and preferences that participants highlighted related to group design (accessibility, flexibility) as well as to their own characteristics as recipients (capability, need, and motivation). This pre-implementation study affirmed the potential value of ACT for POUC and identified key opportunities for tailoring an ACT skills group to meet their needs. Future research, including pilot implementation studies, is needed to evaluate the effectiveness of a tailored ACT skills group and to further refine both the intervention and its implementation strategy.
Potato (Solanum tuberosum L.) productivity in northwestern Ethiopia is limited by poor soil fertility and nutrient imbalances, particularly deficiencies of phosphorus and potassium in Nitisols, resulting in suboptimal yield and tuber quality. This study aimed to evaluate the combined effects of different phosphorus and potassium fertilizer rates on potato yield, yield components, and nutrient accumulation under Nitisol conditions. A field experiment was conducted using three phosphorus rates (0, 34.5, and 69 kg P₂O₅ ha⁻¹) and four potassium rates (0, 100, 200, and 300 kg K₂O ha⁻¹) arranged in a factorial randomized complete block design with three replications. Yield components, total and marketable yields, and plant tissue phosphorus and potassium contents were measured and analyzed using analysis of variance. The interaction between phosphorus and potassium significantly affected major yield components, including marketable tuber number per hill, marketable tuber weight, marketable yield, and total tuber yield. The highest marketable tuber number (11.80) was obtained from 69 kg P₂O₅ ha⁻¹ combined with 300 kg K₂O ha⁻¹, whereas the maximum marketable yield (48.32 t ha⁻¹) and total tuber yield (49.14 t ha⁻¹) were achieved with 34.5 kg P₂O₅ ha⁻¹ combined with 200 kg K₂O ha⁻¹. Fertilizer application also enhanced phosphorus and potassium concentrations in both shoot and tuber tissues. The combined application of 34.5 kg P₂O₅ ha⁻¹ and 200 kg K₂O ha⁻¹ is recommended to improve potato productivity in Nitisols. Future studies should focus on multi-season validation and integrated nutrient management strategies.
Off-site fattening of Tibetan sheep is a key strategy to mitigate the effects of high-altitude grassland degradation and winter forage scarcity, promoting sustainable development in plateau animal husbandry. However, transport stress (TS) presents a significant challenge to realizing its benefits. The mechanism by which TS affects the health of Tibetan sheep by regulating rumen microbial and serum metabolite rhythmic changes remains unclear. This study selected six healthy male Tibetan sheep, aged seven months and of comparable body weight, for the transport experiment. Blood and rumen fluid samples were collected at four-hour intervals during 24-hour periods pre-transport (CON) and post-transport (TS) for serum indicators, serum metabolome, and rumen microbiome analyses. The results showed that TS significantly increased serum concentrations of cortisol (COR), melatonin (MT), lipopolysaccharide-binding protein (LBP), serum amyloid A (SAA), and non-esterified fatty acid (NEFA), while significantly decreasing glucose (GLU), total antioxidant capacity (T-AOC), and glutathione peroxidase (GPx) (P < 0.05). Furthermore, the circadian rhythms of COR, MT, LBP, SAA, NEFA, and GPx were significantly disrupted (ADJ.P < 0.05). TS reduced the proportion of rumen microbial circadian rhythms from 3.46% to 1.99%, with Prevotella, Butyrivibrio, and Ruminococcus losing their circadian rhythmicity in the TS phase (ADJ.P < 0.05). Additionally, TS decreased the proportion of circadian rhythm-regulated serum metabolites from 51.74% to 29.51%. In the TS phase, rhythmically regulated metabolites, including 3',5'-cyclic AMP, fumarate, dopamine, glutathione, and angiotensin (1-7), were enriched in pathways such as oxidative phosphorylation, retinol metabolism, and tryptophan metabolism. 
Multi-omics analyses demonstrated significant correlations between Ruminococcus and energy metabolites (malic acid, 3',5'-cyclic AMP, fumarate, NEFA), and between Butyrivibrio, Anaeroplasma, and inflammatory/antioxidant markers (glutathione, SAA, LBP). In conclusion, this study reveals that TS induces a homeostatic imbalance in Tibetan sheep by disrupting the circadian rhythms of both the rumen microbiota and host metabolism. These findings provide a theoretical basis and molecular targets for developing interventions to alleviate TS in livestock.
Ureteral stents are routinely used following endourological procedures to ensure adequate drainage and prevent obstruction. However, stent-related morbidity remains common, and optimal stent dwell time and removal methods are not well defined. This systematic review aimed to evaluate clinical and procedural factors influencing ureteral stent dwell time and the methods used for stent removal after endourological interventions. A systematic review was conducted in accordance with PRISMA guidelines and registered on PROSPERO. MEDLINE and Embase were searched from inception to October 2025. Randomized controlled trials and comparative observational studies evaluating ureteral stent dwell time and/or removal methods in adults undergoing endourological procedures were included. Risk of bias was assessed using RoB 2 and ROBINS-I tools. Thirty-two studies encompassing 4,373 patients were included. Reported stent dwell times varied widely, most commonly ranging between 10 and 14 days in uncomplicated cases, with longer durations associated with increased rates of encrustation and removal difficulty. Removal techniques included rigid cystoscopy (48.7%), flexible cystoscopy (19.9%), extraction strings (23.5%), and device-assisted methods (7.9%). Less invasive approaches, particularly flexible cystoscopy and extraction-string removal, were consistently associated with reduced pain scores and improved patient comfort, although extraction strings carried a small risk of premature dislodgement. While practice patterns vary, the evidence suggests that a 10-14 day dwell time might be the optimal window to balance healing with the prevention of encrustation. Less invasive removal approaches, particularly flexible cystoscopy and extraction-string techniques, were generally associated with lower pain scores and high procedural success rates in selected patients. 
While these methods are safe and better tolerated, extraction strings carried a small, reproducible risk of premature dislodgement. High-quality prospective studies are needed to define determinant-based, individualized stent management strategies.
The evolving global disease landscape, in conjunction with the significant impact of an aging population, has led to mental-physical multimorbidity, imposing unprecedented pressures on healthcare systems and economies. This study aimed to investigate the interrelationships among multimorbidity, depression, and catastrophic health expenditure (CHE) and to test whether the intensity of CHE mediates these links. The analysis employed data from the China Health and Retirement Longitudinal Study (CHARLS), which conducted a longitudinal survey from 2011 to 2018, tracking 5,274 participants aged 45 years and older over a seven-year timeframe. Multimorbidity was ascertained through self-reported data from participants, whereas depression was evaluated via the 10-item Center for Epidemiologic Studies Depression Scale (CES-D-10). The intensity of CHE was calculated as the ratio of out-of-pocket (OOP) payments to the capacity to pay (CTP), adjusted for a catastrophic threshold of 40%. The relationships among the three variables were analysed via an extension of the random intercept cross-lagged panel model (RI-CLPM), which includes covariates to predict the observed variables. Mediation via the intensity of CHE was tested using 5,000 bootstrap resamples. At the between-person level, multimorbidity and depression were positively correlated (Model 1 r = 0.349; Model 2 r = 0.246; both p < 0.001), whereas the intensity of CHE showed negligible between-person associations with either variable. At the within-person level, all variables showed significant autoregressive stability, with multimorbidity demonstrating the strongest persistence (β = 0.808 in Model 1 and 0.936 in Model 2).
Cross-lagged associations were clearly asymmetric, with prior multimorbidity exerting the largest prospective effects on the intensity of CHE (β = 3.028) and subsequent depression (β = 0.646 in Model 1 and β = 0.789 in Model 2), whereas prior depression and prior intensity of CHE had much smaller effects on later multimorbidity. Mediation analyses indicated that the intensity of CHE (T) partially mediated the association from multimorbidity (T‑1) to depression (T + 1) (indirect effect = 0.063, 95% CI [0.042, 0.084]), but showed negligible mediation for the reverse pathway from depression (T‑1) to multimorbidity (T + 1) (indirect effect = 0.001, 95% CI [0.000, 0.001]). The study identified asymmetric bidirectional relationships among multimorbidity, depression, and the intensity of CHE in Chinese middle-aged and older adults, with effects predominantly running from multimorbidity to increased intensity of CHE and later depression; the intensity of CHE explained only a small portion of the multimorbidity→depression effect and virtually none of the depression→multimorbidity pathway. Policies that integrate multimorbidity management with routine depression screening could help reduce the combined physical, psychological, and financial burdens among middle-aged and older adults.
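The CHE intensity measure at the center of these models is the ratio of out-of-pocket spending to capacity to pay, compared against the 40% catastrophic threshold. The abstract does not detail how the ratio is adjusted for the threshold, so the following Python sketch simply computes the raw ratio and flags catastrophic spending (names are illustrative):

```python
def che_intensity(oop_payments, capacity_to_pay, threshold=0.40):
    """Return the OOP/CTP ratio and whether health spending is
    'catastrophic', i.e., the ratio exceeds the 40% threshold.
    (Any further adjustment used in the study is omitted here.)"""
    ratio = oop_payments / capacity_to_pay
    return ratio, ratio > threshold

# A household paying 500 out of a 1,000 capacity to pay crosses the
# 40% line; one paying 100 does not.
print(che_intensity(500, 1000))  # (0.5, True)
print(che_intensity(100, 1000))  # (0.1, False)
```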
Despite modest improvement, the lifespan of a child on dialysis remains approximately 40 years shorter than that of healthy children. Cardiovascular disease (CVD) is the leading cause of morbidity and mortality in these patients. Risk factors for CVD are present even in early stages of chronic kidney disease (CKD) and accelerate as the child's renal function deteriorates. As a result, the highest burden of CVD exists in patients on chronic dialysis. Although dialysis is life-sustaining, both traditional and non-traditional CVD risk factors drive this acceleration. More concerning, the dialysis procedure itself is cardiotoxic. Because of coexisting poor cardiac reserve and altered sympathetic tone in this patient population, dialysis induces repetitive contractile, ischemic injury termed myocardial stunning. This ischemia-reperfusion injury is initially reversible. With repetitive episodes, however, it triggers alterations in cardiac function that decrease contractility in order to preserve viability. Ultimately, these adaptations lead to remodeling and fibrosis. No targeted therapy is available to reverse cardiovascular damage in these patients. Intensive monitoring and management of modifiable risks such as hypertension and anemia in the early stages of CKD to optimize cardiovascular health is therefore imperative. In late CKD, however, especially in patients who are not candidates for preemptive renal transplantation, optimization of the dialysis procedure is critical to prevent acceleration of the CVD burden. Improved assessment of dry weight as well as data-driven fluid management programs may decrease some risk. More importantly, standard implementation of intensified dialysis prescriptions, by increasing treatment time or adding convective clearance, may mitigate progressive cardiovascular damage and enhance survival.
In this review, we outline the pathophysiology of dialysis-induced cardiovascular damage and discuss management strategies to limit the cardiovascular burden in these patients.
Chronic kidney disease (CKD) attributable to type 2 diabetes mellitus (T2DM) represents a growing global health concern. However, comprehensive long-term epidemiological trends and projections, stratified by sociodemographic and geographic variables, remain inadequately delineated. This study aimed to evaluate the global, regional, and national burden of CKD due to T2DM from 1990 to 2021, and to forecast its trends through 2035 using Bayesian age-period-cohort (BAPC) modeling. This population-based observational study used data from the Global Burden of Disease Study 2021 (GBD 2021), which includes 204 countries and territories across five sociodemographic index (SDI) quintiles and 21 GBD regions. The study covers the period 1990-2021 with projections to 2035. The exposure was a diagnosis of T2DM as the underlying cause of CKD. Outcomes were incident and prevalent cases, mortality, and disability-adjusted life-years (DALYs) attributable to T2DM-related CKD. Age-standardized rates of incidence (ASIR), prevalence (ASPR), mortality (ASDR), and DALYs (ASR) were computed, alongside estimated annual percentage changes (EAPC). From 1990 to 2021, the global number of incident CKD cases due to T2DM increased by 167.2%, while the ASIR rose by 21.0% (EAPC: 0.61). Prevalent cases nearly doubled (+85.1%), although the ASPR declined slightly (-5.1%, EAPC: -0.17). Deaths surged by 222.6%, and the ASDR increased by 37.8% (EAPC: 1.17). DALYs rose by 173.6%, with a 24.0% increase in the ASR (EAPC: 0.81). Males and older adults consistently exhibited a higher burden across all indicators. Low- and middle-SDI nations experienced the most pronounced burden growth, yet high-SDI regions also registered substantial increases in mortality and DALYs. Projections suggest a continued escalation, with incident cases exceeding 2.6 million and deaths surpassing 700,000 annually by 2035.
These findings highlight the importance of targeted prevention, early detection, and improved management strategies, particularly in high-growth regions and vulnerable populations.
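The EAPC metric reported above is conventionally estimated from a log-linear regression of age-standardized rates on calendar year; the sketch below follows that standard definition and is not the study's own code.

```python
import math

def eapc(years, rates):
    """Estimated annual percentage change (EAPC) under the conventional
    log-linear model ln(rate) = a + b * year, with the slope b fitted by
    ordinary least squares; EAPC = 100 * (exp(b) - 1)."""
    n = len(years)
    log_rates = [math.log(r) for r in rates]
    mean_x = sum(years) / n
    mean_y = sum(log_rates) / n
    # OLS slope of ln(rate) on year
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, log_rates))
    sxx = sum((x - mean_x) ** 2 for x in years)
    b = sxy / sxx
    return 100.0 * (math.exp(b) - 1.0)
```

A positive EAPC (e.g., 0.61 for the ASIR above) indicates a rising age-standardized rate. Confidence intervals, as reported in GBD analyses, come from the regression's standard error and are omitted from this sketch.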
Severe acute malnutrition (SAM) affects millions of children globally, yet treatment coverage remains below 30% in many settings, including Ethiopia. Although the Community-Based Management of Acute Malnutrition (CMAM) program has expanded nationwide, persistent service gaps remain, partly due to insufficient evidence for accurate cost estimation and budgeting. To address this gap, this study estimated the total economic cost, including provider-side financial and caregiver costs, of treating SAM through the CMAM program in two operational areas of Action Against Hunger, Ethiopia, and identified major cost drivers. A cross-sectional cost analysis was conducted in Girawa district (Oromia Region) and Adadle district (Somali Region) in 2024 from a societal perspective, including both provider-side financial and caregiver costs. Provider-side financial costs include personnel, medical supplies, therapeutic foods, equipment, transport and storage, and training and supervision. Caregiver costs include both direct costs (transport, food, and hospitalization-related expenses) and indirect costs (lost income and coping strategies). Provider-side financial costs were extracted and estimated using the FANTA CMAM costing tool. The tool automatically generated the total cost per district, and the provider-side financial cost per SAM child was calculated by dividing total annual provider-side financial expenditure by the number of SAM cases treated. Caregiver costs were collected through structured exit interviews, analyzed using Excel, and summarized as the mean cost per treatment episode. The mean caregiver cost was added to the provider-side financial cost to estimate the total economic cost per SAM child. The total annual provider-side financial cost for SAM treatment was USD 386,598 in Girawa district and USD 289,433 in Adadle district.
Supplies, particularly ready-to-use therapeutic food (RUTF) and therapeutic milk, constituted the largest cost category in Girawa (57.7%), whereas repeated SAM-specific training and supervision represented the major share in Adadle (40.9%). The average provider-side financial cost per SAM child was USD 171.1 in Girawa and USD 325.2 in Adadle. The average caregiver cost per SAM treatment episode was USD 53.55. The total economic cost per SAM child, including caregiver expenses, was USD 224.65 in Girawa and USD 378.75 in Adadle. There is substantial variation in the cost of delivering SAM treatment across districts, highlighting the importance of context-specific district-level cost analyses. SAM-specific supplies and training intensity were the primary cost drivers. The incorporation of the household economic burden highlights an important but often overlooked dimension of treatment costs. These findings provide realistic district-level unit costs that can directly guide partners and governments in estimating resource needs for annual response plans while strengthening CMAM budgeting, planning, and scale-up in Ethiopia.
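The per-child cost arithmetic described above can be sketched as follows; figures are taken from the abstract, and the function names are illustrative rather than part of the FANTA costing tool.

```python
def provider_cost_per_child(total_annual_cost: float, cases_treated: int) -> float:
    """Provider-side financial cost per SAM child: total annual provider
    expenditure divided by the number of SAM cases treated in the district."""
    return total_annual_cost / cases_treated

def total_economic_cost_per_child(provider_unit_cost: float,
                                  mean_caregiver_cost: float) -> float:
    """Total economic cost per SAM child: provider-side unit cost plus the
    mean caregiver cost per treatment episode."""
    return provider_unit_cost + mean_caregiver_cost

# Girawa district: USD 171.1 provider-side unit cost plus the USD 53.55
# mean caregiver cost gives the reported USD 224.65 per child.
girawa_total = total_economic_cost_per_child(171.10, 53.55)
```

The same two-step calculation yields the Adadle figure (USD 325.2 + USD 53.55 = USD 378.75), since the mean caregiver cost was applied uniformly across districts.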
Glioblastoma (GBM) is one of the most aggressive and treatment-resistant brain tumors, largely due to the restrictive nature of the blood-brain barrier (BBB). This barrier significantly limits the efficient delivery of therapeutic agents to the tumor site, thereby reducing treatment efficacy. This review evaluates the potential of dextran (Dex)-based nanoparticles (NPs) as an advanced platform for enhancing BBB penetration and enabling targeted GBM therapy. Dex, a biocompatible and biodegradable polysaccharide, offers key advantages including ease of functionalization, high drug-loading capacity, and improved systemic stability. Recent studies demonstrate that Dex-based nanocarriers enhance drug transport across the BBB via receptor-mediated and adsorptive transcytosis mechanisms, resulting in improved accumulation at tumor sites. Furthermore, surface engineering strategies facilitate active targeting of GBM cells, thereby increasing therapeutic efficacy while reducing systemic toxicity. Comparative evidence indicates that Dex-based nanocarriers outperform conventional delivery systems in terms of targeting efficiency, biocompatibility, and tailored drug release. These systems also show potential for co-delivery of multiple therapeutic agents, supporting combination treatment approaches for improved clinical outcomes. Emerging preclinical studies highlight improved survival outcomes and enhanced pharmacokinetic profiles associated with Dex-based nanocarriers, reinforcing their therapeutic relevance. Despite these promising findings, challenges related to large-scale manufacturing, reproducibility, and regulatory approval remain significant barriers to clinical translation. Future research should focus on clinical validation, scalable synthesis approaches, and long-term safety assessment to facilitate successful translation into clinical practice. 
Overall, Dex-based NPs represent a versatile and highly promising strategy to overcome existing limitations in GBM treatment and advance targeted nanomedicine approaches for brain cancer therapy.
Corporate bankruptcy prediction is essential for assessing companies' capacity to maintain sustainable customer relationships and service quality. This study proposes a novel CNN-based hybrid approach that transforms correlation-filtered financial features into 64 × 64 grayscale images, enabling reliable identification of firms at financial risk whose deteriorating conditions could compromise their ability to maintain quality customer service and sustain long-term business relationships. The research explicitly examines the linkage between financial health indicators and customer relationship sustainability by categorizing financial features based on their operational impact on service delivery, relationship management capabilities, and long-term customer commitment fulfillment. The methodology was evaluated on a comprehensive dataset of 43,405 Polish companies (2,091 bankrupt, 41,314 healthy) using two resampling strategies: random downsampling and Synthetic Minority Oversampling Technique (SMOTE). Following correlation-based feature selection that reduced multicollinearity by eliminating features with absolute correlation coefficients exceeding 0.8, retained financial features were normalized and transformed into spatial image representations. Six classification models were implemented: Deep Neural Network (DNN), Support Vector Machine (SVM), Random Forest (RF), Decision Tree (DT), Gradient Boosting (GB), and Logistic Regression (LR), alongside five CNN-hybrid variants, evaluated using 5-fold cross-validation. SMOTE-balanced datasets demonstrated superior performance across all models. Ensemble methods achieved exceptional accuracy, with Random Forest reaching 99.99% and Gradient Boosting 99.97%. The innovative CNN-SVM hybrid model attained 99.77% accuracy with perfect ROC-AUC (1.000), providing reliable indicators for assessing firms' financial stability and their ability to invest in customer experience initiatives. 
Statistical analysis identified company size and working capital as the most discriminative financial indicators directly impacting customer service delivery capabilities. Customer-related metrics such as receivables turnover and collection period indicators emerged as critical predictors of relationship management effectiveness. The study contributes a novel spatial feature representation methodology enabling precise identification of companies whose financial deterioration could compromise customer service quality and relationship sustainability. These findings provide significant implications for stakeholders seeking enhanced risk assessment capabilities that consider both internal financial health and external customer relationship dynamics in bankruptcy prediction.
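The feature-to-image pipeline described above can be sketched in outline. The greedy correlation filter and the tiling scheme below are simplified assumptions for illustration, since the paper's exact pixel-mapping procedure is not given in the abstract.

```python
import numpy as np

def correlation_filter(X: np.ndarray, threshold: float = 0.8) -> list:
    """Greedily retain feature columns so that no kept pair has an
    absolute Pearson correlation above the threshold (columns = features)."""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    keep = []
    for j in range(corr.shape[0]):
        if all(corr[j, k] <= threshold for k in keep):
            keep.append(j)
    return keep

def to_grayscale_image(features: np.ndarray, size: int = 64) -> np.ndarray:
    """Min-max normalize a 1-D feature vector and tile it into a
    size x size grayscale image suitable as CNN input."""
    lo, hi = features.min(), features.max()
    norm = (features - lo) / (hi - lo) if hi > lo else np.zeros_like(features)
    reps = int(np.ceil(size * size / norm.size))
    return np.tile(norm, reps)[: size * size].reshape(size, size)
```

In this sketch, each retained financial ratio becomes a repeated pixel pattern in the 64 × 64 image; the CNN in the hybrid models then acts as a learned feature extractor whose output feeds the downstream classifier (e.g., the SVM in the CNN-SVM variant).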
Serious fungal diseases are a significant yet long-neglected public health problem in India, affecting an estimated 4.1% of the population, with mortality from invasive fungal diseases approaching 50% and surpassing several nationally prioritised conditions. The emergence and spread of drug-resistant pathogens such as Candida auris, azole-resistant Aspergillus fumigatus, and Trichophyton indotineae, together with the COVID-19-associated mucormycosis epidemic, expose critical gaps in preparedness, prevention, and care. Despite the presence of a WHO collaborating centre and an Indian Council of Medical Research (ICMR) Advanced Mycology Diagnostic and Research Centre network, major deficiencies persist in surveillance, diagnostics, antifungal access, stewardship, workforce capacity, and coordinated research. The Indian Fungal Infection National Declaration (I-FIND) seeks to catalyse a structured, time-bound national response that elevates fungal diseases to a recognised public health priority and measurably reduces morbidity, mortality, and antifungal resistance. Its objectives are to identify and target key at-risk populations (including those with immunosuppression, critical illness, chronic lung disease, and poorly controlled diabetes), to acknowledge major community mycoses such as chronic dermatophytosis, vulvovaginal candidiasis, and fungal keratitis, and to define clear domains for action. I-FIND identifies nine core domains: governance, surveillance, diagnostics, clinical management, research, workforce development, public awareness, One Health measures, and accountability.
Proposals include creating a National Task Force, issuing a five-year fungal disease control strategy, integrating the mycology network, setting diagnostic standards, incorporating antifungal stewardship, and prioritising translational research funding. The declaration, endorsed by major Indian and international professional societies, will guide formal adoption and implementation across India.