Rare diseases affect small, dispersed populations and are often studied through multisite designs in which equity-relevant demographic data are essential for inclusive recruitment and accurate interpretation. This study examined how sociodemographic variables are collected and reported in rare disease research and evaluated their alignment with the PROGRESS-Plus framework, which outlines Place of residence, Race/ethnicity/culture/language, Occupation, Gender/sex, Religion, Education, Socioeconomic status, Social capital, and additional "Plus" factors such as age and disability status. A systematic review of peer-reviewed articles was conducted alongside an environmental scan of demographic instruments from governmental, health-system, academic, and rare disease organizations. Screening and extraction coded variables as reported, indirectly derivable, or not reported and compared them with established standards. Of 647 records identified, 37 met inclusion criteria. Reporting was dominated by age and sex, while most other equity-relevant variables, including gender identity, sexual orientation, race/ethnicity, distinctions-based Indigenous identity, socioeconomic position, language, migration, disability/function, religion, occupation, and social capital, were inconsistently captured. Environmental scan instruments were more comprehensive, revealing a capture-to-reporting gap. Demographic reporting in rare disease research is heterogeneous and insufficient for equity-focused analyses. A concise, standards-aligned sociodemographic dataset is needed to improve transparency, comparability, and detection of inequities across rare disease populations.
Mental health conditions account for 18% of years lived with disability worldwide. One in six adults in England is affected, with most mental health conditions beginning in childhood and adolescence. Mental distress and ill health are unequally distributed in the UK, with strong associations with wider determinants of health and higher prevalence among systemically disadvantaged groups. Currently, there is a lack of evidence to inform effective and timely policymaking for primary prevention in the UK. In recognition of these challenges, a national Population Mental Health (PMH) Consortium was established as part of Population Health Improvement UK (PHIUK). PHIUK is a national research network that works to transform health and reduce inequalities through change at the population level. Our aim is to establish an interdisciplinary PMH Consortium, focussing on upstream determinants and the prevention of risks and onset of mental health conditions through interdisciplinary stakeholder engagement, to create new opportunities for population-based improvement of mental health in the UK.
The PMH Consortium brings together leading interdisciplinary representation in population mental health, spanning the sciences and the arts, across the UK. Membership includes six academic institutions, third-sector organisations, lived-experience expertise, and strong links with national bodies to ensure integrated cross-national and regional policy impact. The PMH Consortium comprises four cross-cutting platforms (Partners in policy, implementation, and lived experience; Data, linkages, and causal inference; Narrowing inequalities; Training and capacity building) and three challenge areas (Children and young people's mental health; Prevention of suicide and self-harm; Multiple long-term conditions), which are highly integrated and interdependent. The work will be underpinned by a Theory of Change across an initial four-year life cycle.
This paper describes the aim, objectives, and approach of the PMH Consortium, as well as anticipated challenges and strengths. The goal of the PMH Consortium is to develop a model for population mental health research and policy translation that is both scalable and sustainable. It is critical to ensure continued impact and viability beyond the initial four years, contributing to the prevention of mental health conditions in the UK, with personal, economic, social, and health benefits.
The growing proportion of women in veteran communities internationally highlights a rising need for veteran support services tailored to their unique experiences. Despite this, support services remain predominantly designed for men, leading to underutilization and dissatisfaction among women veterans. This scoping review aimed to provide a comprehensive international review of the current state of knowledge regarding the experiences of women veterans in accessing and engaging with veteran-specific support services. The study followed the Joanna Briggs Institute scoping review methodology. Five databases were searched for papers published from 2000 onwards. Studies reporting barriers and/or facilitators to access and experiences of engaging with veteran-specific support services reported by women veterans were included. There were no limitations on study methodology or country of origin, and all publications reporting primary research were included. A total of 117 studies were included in the review. This research originated predominantly from the US (n = 109), with seven UK papers and one Canadian paper. Eleven themes were identified across the literature, highlighting gendered barriers and facilitators of accessing veteran-specific support for women. Women veterans report feelings of discomfort, exclusion, and discrimination within veteran services, which are perceived as being set up and designed for men. Women report experiencing stigma in help-seeking, compounded by a perception of feminine weakness experienced during military service. Some women did not want to access services they saw as military-adjacent, due to gendered adverse experiences during military service, including discrimination, harassment, and sexual violence. A lack of identification with the term 'veteran' further hinders women's engagement with veteran-specific services.
Enablers of access include care that is sensitive to women's needs, trauma-informed service user-provider relationships, and peer support. The reviewed evidence suggests women experience unique challenges and needs in accessing veteran-specific services. Support services should focus on developing care that is culturally competent, trauma-informed, and sensitive to the needs of women, to address gendered barriers to engagement. More research is needed to confirm these findings outside the US context, and incorporating an intersectional lens in future research will be essential for improving support systems for women veterans internationally.
Work in eating disorder (ED) services presents unique challenges and rewards that may affect clinicians' work-related and personal wellbeing. However, research on ED clinicians' needs, views, and experiences is still sparse, despite major service changes since the COVID pandemic. This study aims to explore and conceptualise NHS ED clinicians' work-related experiences, challenges, and needs, in order to inform future clinician wellbeing and service improvement strategies. Clinicians working in ED services (N = 19) were interviewed using a semi-structured interview guide that probed their professional experiences, work-related needs, and views. Interviews were analysed using NVivo, following guidance from Braun and Clarke (2006) for reflexive thematic analysis. A holistic ecological systems framework for ED services was created, comprising five levels of influence: intrinsic, intra-personal, departmental, systemic, and societal. These levels contain nine themes: (1) clinician motivation for working in ED services, (2) complexities of ED management, (3) clinician personality and emotional disposition, (4) team dynamics, (5) supervision, management, and organizational support, (6) service-level concerns, (7) macro-level systemic concerns, (8) broader societal challenges in ED care, and (9) COVID-related challenges. Key concerns included the chronic nature and risk of EDs, growing service demands amid limited resources, and regulation through guidelines and commissioning targets. The presented framework illustrates the multifaceted array of complexities faced by ED clinicians. The interplay of personal, inter-personal, and systemic factors is explored, with clinicians' interest in and commitment to ED care at the core of the framework. These areas can be targeted to improve clinician job satisfaction and reduce burnout risk, with the goal of providing optimal patient care.
This study explores the experiences and wellbeing of clinicians working in NHS eating disorder (ED) services. Through interviews with clinicians, the research explored both the positive and difficult parts of their job. While staff felt strongly committed to helping people with EDs, many also described feeling emotionally drained and frustrated. This was often due to high workloads, insufficient resources, and long waiting lists. Clinicians found it especially hard when they had to follow strict service rules that did not work well for individual patients, and when they had to manage complex medical risks. Supportive teams and good supervision helped some staff cope. Wider problems such as staff shortages, poor communication between services, and lack of funding compounded emotional strain. The findings show that ED clinicians urgently need more support, including better resources, more flexible ways of working, and proper training, to give safe, effective care without burning out.
The recent introduction of the right to oncological oblivion in some European states raises critical issues. While designed to protect cancer survivors from discrimination, this right may compromise occupational health surveillance for workers exposed to carcinogenic hazards. This commentary raises questions for future policy and research.
Visual impairment affects over 2.2 billion people worldwide, and the major causes include age-related macular degeneration (AMD), glaucoma, and diabetic retinopathy. For research in these areas, animal models offer a more physiologically complex system than in vitro approaches, but their use raises ethical concerns, and species-specific differences, such as variations in protein sequences and signaling pathways, can limit the direct translatability of outcomes. Traditional 2-D cell cultures, in contrast, lack the multicellular organization and dynamic microenvironment necessary to replicate human retinal complexity. Retinal organoids (ROs), three-dimensional tissue constructs derived from pluripotent stem cells, have emerged as a promising model due to their human origin and complex cellular interactions that cannot be achieved in conventional 2-D/3-D co-culture models. In this review, we provide a brief overview of the evolution from 2-D to 3-D retinal models; highlight the structural and functional features of ROs, including layered retinal architecture, photoreceptor outer-segment formation, and light-responsive electrophysiological activity; and summarize their applications in disease modeling, drug discovery, and gene and cell therapy. ROs represent a significant advancement over traditional models by enabling the recapitulation of human-specific retinal development, facilitating the study of patient-derived disease phenotypes, and providing a platform for personalized therapeutic screening. Their development has deepened understanding of pathological mechanisms in conditions such as retinitis pigmentosa and AMD, while enabling preclinical testing of targeted interventions such as CRISPR-based gene editing and photoreceptor cell replacement.
Nonetheless, challenges remain in fully replicating retinal vascularization, long-term functional maturation, and synaptic connectivity, underscoring the need for continued refinement and integration with complementary model systems.
Hidradenitis suppurativa (HS), an inflammatory skin disorder characterized by painful nodules and abscesses, has varying prevalence among different races/ethnicities. This study explored the social drivers of health, burden, and impact of HS among different racial and ethnic groups. An online, cross-sectional survey was conducted among adult patients with HS (September 2023-December 2023) in the USA. Patients were recruited through HS Connect (a patient advocacy group) and AmeriSpeak (a US national sample panel). Descriptive data were collected using patient-reported outcome measures and de novo questions about patients' disease knowledge and perception, healthcare access and utilization, impact on quality of life (QoL), and social impact. All analyses were descriptive and stratified by racial/ethnic group. The study included 583 patients (mean age, 34.8 years; 95.5% female) representing a range of racial backgrounds: Black or African American (n = 273; 46.8%), white (n = 236; 40.5%), Two or More Races (n = 47; 8.1%), American Indian or Alaska Native (n = 18; 3.1%), Asian (n = 7; 1.2%), and Native Hawaiian and Other Pacific Islander (n = 2; 0.3%). Ethnic representation also varied (Hispanic/Latino: n = 76; 13.0%). Patients of all races and ethnicities reported considerable QoL impact (Dermatology Life Quality Index, EQ-5D-5L), with results for smaller subgroups (n < 10) included for descriptive completeness only and not intended for comparison with other groups. During flaring, most patients used over-the-counter products/medications (54.2%) or nonmedical interventions/home remedies (56.9%). Up to 36.5% of patients reported challenges in procuring food, utilities, medicine/healthcare, phone, clothing, or childcare when needed in the past year. Among those who paid out-of-pocket for their HS treatment, 55.6% reported that cost stopped them from visiting a healthcare provider for treatment.
The findings indicate a high burden and impact of HS across all races and ethnicities. Patients reported social drivers of health and challenges with healthcare utilization, indicating the need for integrating social workers and care management teams in dermatology practice, which could facilitate improved care of patients with HS. Hidradenitis suppurativa is a painful skin condition that causes lumps and abscesses. It affects people of all races and ethnicities but is more common in Black or African American individuals. This study surveyed 583 adults in the USA to understand how hidradenitis suppurativa affects people from different racial and ethnic backgrounds. Our focus was on how the disease impacts their daily lives, their ability to access healthcare, how often they visit doctors, their quality of life, and their mental and emotional well-being. Most people said that hidradenitis suppurativa lowers their quality of life and makes daily activities harder. During flaring, many used home remedies instead of seeing a doctor. People suffering from hidradenitis suppurativa also reported trouble getting basic needs such as food, medicine, and transportation. These challenges occurred among patients from different racial and ethnic groups; results for very small subgroups (Asian, Native Hawaiian/Other Pacific Islander) are reported descriptively only and should not be interpreted as representative of these groups or compared with other groups. The research underscores the importance of improving awareness and tailoring care for people with hidradenitis suppurativa, particularly those facing barriers to healthcare.
Periodontitis, a chronic inflammatory disease, is increasingly prevalent among young people and impairs their quality of life. Adverse childhood experiences (ACE), depressive symptoms, and suboptimal health status (SHS) are linked to health risks and chronic diseases, but their interrelationships with periodontitis in Chinese young adults remain unclear. This study aimed to explore associations among these factors. From December 2024 to May 2025, 2,888 participants (aged 18-35) from Tongji Hospital completed surveys on demographics, ACE, depressive symptoms, and SHS. Periodontitis was diagnosed according to the 2018 criteria. Simple, parallel, and chain mediation models were used, controlling for age, sex, marital status, and smoking. Periodontitis prevalence was 25.00% and was higher in married individuals (P < 0.001) and smokers (P = 0.004). ACE correlated positively with depressive symptoms (r = 0.28, P < 0.001), SHS (r = 0.19, P < 0.001), and periodontitis (r = 0.16, P < 0.001). In the simple models, depressive symptoms and SHS each partially mediated the effect of ACE on periodontitis (indirect effect = 0.011 for both); in the parallel model, only SHS significantly mediated the effect (indirect effect = 0.011); and in the chain model, ACE was related to periodontitis via "depressive symptoms → SHS" (indirect effect = 0.010), with significant direct and indirect effects. ACE was associated with higher periodontitis risk in young people. This association included both a direct link between ACE and periodontitis and an indirect link through the chain pathway of "depressive symptoms → SHS"; among these pathways, SHS was a key mediator. The study was registered in the Chinese Clinical Trial Registry (ChiCTR) with the registration number ChiCTR2500103464. Childhood trauma can exert long-term impacts on health, including oral health. This study involving 2,888 Chinese young adults aged 18-35 found that 25% of the participants had periodontitis.
Those who experienced childhood abuse, neglect, or family issues showed a higher association with the disease. The research revealed two pathways linking early trauma to periodontitis: a direct association and an indirect chain of “depressive symptoms → suboptimal health status (e.g., persistent fatigue).” While depressive symptoms played a role, suboptimal health status was the critical mediator. Higher periodontitis rates in married individuals and smokers may relate to stress or lifestyle factors. The findings suggested that early identification of childhood trauma, combined with interventions targeting mental health or overall well‐being (e.g., counseling, health management), could be more effective than oral care alone in prevention. This underscored the association between early‐life experiences and long‐term health and the need for integrated interventions.
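The chain pathway described above can be sketched with the product-of-coefficients approach to mediation. Everything below (variable names, effect sizes, and the simulated data) is illustrative only and is not drawn from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration of the chain mediation pathway
# ACE -> depressive symptoms -> SHS -> periodontitis.
# All effect sizes below are made up for demonstration.
n = 2888
ace = rng.normal(size=n)                               # adverse childhood experiences
dep = 0.28 * ace + rng.normal(size=n)                  # depressive symptoms
shs = 0.20 * dep + 0.10 * ace + rng.normal(size=n)     # suboptimal health status
perio = 0.10 * ace + 0.15 * shs + rng.normal(size=n)   # periodontitis liability

def ols_slopes(y, predictors):
    """OLS coefficients of y on the given predictors (intercept dropped)."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a1 = ols_slopes(dep, [ace])[0]               # ACE -> depressive symptoms
a2 = ols_slopes(shs, [dep, ace])[0]          # depressive symptoms -> SHS (ACE-adjusted)
b = ols_slopes(perio, [shs, dep, ace])[0]    # SHS -> periodontitis (fully adjusted)

chain_indirect = a1 * a2 * b                 # product-of-coefficients estimate
print(round(chain_indirect, 4))
```

The chain indirect effect is the product of the three path coefficients; in the study's notation this corresponds to the ACE → depressive symptoms → SHS → periodontitis pathway, with the study's covariates (age, sex, marital status, smoking) omitted here for brevity.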
Depressive symptoms have been on the rise among young adults, with the transition to college, particularly the first year, being a critical period of vulnerability. Despite prior research on depression trajectories in college students, few longitudinal studies have explored unique depressive symptom trajectory groups among first-year students, their associations with academic achievement (GPA) and sleep patterns, and whether sociodemographic factors are associated with certain trajectories. This study analyzed a pre-existing dataset collected over two waves from a private university (spring semesters 2017 and 2018). The final pooled sample comprised first-year undergraduate students (N = 271) who reported on their depressive symptoms (CES-D scale) at the start and end of the semester, signed a release record for their fall and spring term GPA, and provided continuous sleep data across the academic spring term with Fitbits. K-means++ clustering was conducted to form depressive symptom trajectory groups. ANOVAs, Watson-Williams tests, and Dunnett's post hoc comparisons were employed to examine how the resulting trajectory groups were associated with GPA and sleep outcomes (bedtime, wake time, total sleep time, time in bed). Associations between sociodemographic variables and trajectory groups were investigated using chi-square tests. K-means++ clustering identified four trajectory groups: low-stable (n = 109), increasing (n = 72), decreasing (n = 51), and high-stable depressive symptoms (n = 39). The low-stable and decreasing groups had a higher spring term GPA (M = 3.44 and M = 3.39, respectively) compared to the increasing and high-stable groups (M = 3.22 and M = 3.18, respectively). The low-stable group generally had an earlier wake time and bedtime, and greater total sleep time and time in bed, relative to the decreasing and increasing trajectory groups.
Gender, ethnicity, international student status, and first-generation student status were not associated with trajectory groups. Consistent with prior work, there are unique depression trajectory groups among first-year college students that represent stability and change of depressive symptoms over the course of a spring semester. Favorable trajectories (low-stable and decreasing symptoms) are associated with better academic performance and sleep habits.
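The trajectory-grouping step can be illustrated with a minimal k-means++ sketch in NumPy. The (start, end) symptom scores below are simulated; only the cluster count (k = 4) and sample size (N = 271) are taken from the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans_pp_seed(X, k, rng):
    """k-means++ seeding: each new center is drawn with probability
    proportional to its squared distance from the nearest existing center."""
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        d2 = np.min([np.square(X - c).sum(axis=1) for c in centers], axis=0)
        centers.append(X[rng.choice(len(X), p=d2 / d2.sum())])
    return np.array(centers)

def kmeans(X, k, rng, iters=100):
    """Lloyd iterations from k-means++ seeds; empty clusters keep old center."""
    centers = kmeans_pp_seed(X, k, rng)
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Simulated (start-of-semester, end-of-semester) depressive-symptom scores
# for four hypothetical trajectory types; group sizes echo the abstract.
low_stable = rng.normal([8, 8], [2, 2], size=(109, 2))
increasing = rng.normal([10, 22], [2, 2], size=(72, 2))
decreasing = rng.normal([22, 10], [2, 2], size=(51, 2))
high_stable = rng.normal([26, 26], [2, 2], size=(39, 2))
X = np.vstack([low_stable, increasing, decreasing, high_stable])

labels, centers = kmeans(X, k=4, rng=rng)
print(np.bincount(labels))
```

With only two timepoints per student, each cluster center is an interpretable (start, end) pair, so "low-stable", "increasing", "decreasing", and "high-stable" fall directly out of the center coordinates.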
There have been discussions as to the timing of elective induction of labour to curb continuation of pregnancy that might endanger the lives of both mother and child. This research was conducted to assess foetal and maternal consequences of planned delivery at 40 and 41 weeks in women with low-risk singleton pregnancies. A randomised controlled trial with equal allocation of participants (96 pregnant women in each arm) to induction at 40 weeks or 41 weeks was conducted. Participants were randomised at the antenatal clinic at 39 weeks for induction of labour. The main outcome was the caesarean section rate. Secondary outcomes were maternal (genital tract laceration rate) and foetal (rates of meconium staining of amniotic fluid, SCBU admission, perinatal mortality, birth trauma, birth weight, and neonatal APGAR score at 1 and 5 minutes). Student's t-test and the chi-square test were used for inter-group comparison. Incidence of caesarean delivery (26.6% vs. 21.3%; p = 0.406) and genital laceration (2.1% vs. 5.6%; p = 0.268) did not differ between groups. Significantly higher birth weight was noted among women induced at 41 weeks (3.41 ± 0.37 kg) than at 40 weeks (3.28 ± 0.46 kg) (p = 0.043). There was also significant variation in meconium staining of amniotic fluid between 40 weeks (11.7%) and 41 weeks (25.8%) (p = 0.014). Other foetal outcomes showed no significant difference. Inducing labour at 40 weeks is safe for low-risk women, as it does not significantly increase the caesarean delivery rate or adverse perinatal outcomes. Therefore, elective induction of labour at 40 weeks should be recommended and introduced into obstetric practice without fear of adverse outcomes.
Postpartum hemorrhage (PPH) remains a leading cause of maternal morbidity and mortality globally, particularly in women with antepartum hemorrhage (APH). Current risk assessment methods lack standardized predictive tools that are both simple and reliable for clinical application. We conducted a secondary analysis of a prospectively collected cohort of 100 pregnant women presenting with APH at ≥28 weeks' gestation at a tertiary care centre in northern India. Multivariable logistic regression was used to identify significant predictors of PPH. A point-based clinical risk score was then developed based on the multivariable model and internally validated using bootstrap techniques with 1000 replicates. PPH occurred in 30% of patients (n=30). Multivariable analysis identified four independent predictors of PPH: maternal age (adjusted odds ratio [OR] 1.29 per year; 95% confidence interval [CI] 1.10-1.51; p=0.002), gravidity (OR 2.11 per unit; 95% CI 1.00-4.43; p=0.049), gestational age at delivery (OR 0.64 per week; 95% CI 0.44-0.94; p=0.021), and antepartum blood transfusion (OR 2.44; 95% CI 1.02-5.84; p=0.045). The prediction model demonstrated excellent discrimination with an area under the receiver operating characteristic (ROC) curve of 0.86 (95% CI 0.80-0.92) and good calibration (slope 0.95). Bootstrap internal validation yielded an optimism-corrected AUC of 0.84. The resulting four-factor risk score stratified patients into four risk categories with PPH rates ranging from 4% (low risk) to 100% (very high risk). The four-variable score provides an accurate, easily applicable tool with excellent predictive performance. The score is a promising tool that, pending external validation, may facilitate early identification of high-risk patients and improve maternal outcomes. Further research should focus on external validation of this tool in diverse populations and its integration into clinical practice.
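The modelling pipeline the abstract describes (logistic regression, ROC AUC, bootstrap optimism correction) can be sketched as follows. The predictor names mirror the abstract's four independent predictors, but the data are simulated and all coefficients and outputs are illustrative, not the study's:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated cohort of 100 women with APH; effect directions loosely echo
# the abstract (older age, higher gravidity, earlier delivery, and
# transfusion raise risk), but all numbers are hypothetical.
n = 100
age = rng.normal(28, 5, n)
grav = rng.integers(1, 5, n).astype(float)
ga = rng.normal(35, 3, n)
txn = rng.integers(0, 2, n).astype(float)
X = np.column_stack([np.ones(n), age, grav, ga, txn])
true_logit = 5.7 + 0.25 * age + 0.7 * grav - 0.45 * ga + 0.9 * txn
y = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)

def fit_logistic(X, y, iters=30):
    """Newton-Raphson maximum-likelihood fit with a small ridge for stability."""
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-np.clip(X @ b, -30, 30)))
        H = X.T @ (X * (p * (1 - p))[:, None]) + 1e-4 * np.eye(X.shape[1])
        b = b + np.linalg.solve(H, X.T @ (y - p))
    return b

def auc(y, score):
    """ROC AUC via the Mann-Whitney rank-sum identity."""
    order = np.argsort(score)
    ranks = np.empty(len(score))
    ranks[order] = np.arange(1, len(score) + 1)
    n1 = y.sum()
    n0 = len(y) - n1
    return (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n0 * n1)

b = fit_logistic(X, y)
apparent = auc(y, X @ b)

# Bootstrap optimism: refit on resamples and compare in-resample vs
# original-sample AUC (200 replicates here; the abstract used 1000).
optimism = []
for _ in range(200):
    idx = rng.integers(0, n, n)
    bb = fit_logistic(X[idx], y[idx])
    optimism.append(auc(y[idx], X[idx] @ bb) - auc(y, X @ bb))
corrected = apparent - np.mean(optimism)
print(round(apparent, 2), round(corrected, 2))
```

Subtracting the average gap between in-resample and original-sample AUC is the standard optimism correction for internal validation; a point-based score would then be derived by rounding the fitted coefficients to convenient integer weights.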
As the primary living environment for disabled older adults, the family plays a crucial role in disease prevention and in maintaining their health. However, research has found that both disabled older adults and their family members experience numerous physiological, psychological, and social adaptation problems when adjusting to the changes brought by disability, severely impacting the overall health status of the family. Therefore, guided by the ERG (Existence-Relatedness-Growth) theory, this study aims to understand the family health needs of families with disabled older adults in the community, providing a basis for improving the health level of these families and developing targeted intervention programs. From December 2024 to February 2025, this study employed purposive and snowball sampling to select 12 pairs of disabled older adults and their primary caregivers from communities under the jurisdiction of Zhengzhou City, Henan Province, for semi-structured interviews. Thematic analysis was applied to organize and analyze the interview data. Deductive analysis indicated that the family health needs of families with disabled older adults in the community can be summarized into three themes: existence needs (daily living needs, economic support needs, environmental modification needs), relatedness needs (family communication needs, social resource connection needs, social participation needs), and growth needs (autonomy and dignity maintenance needs, family development needs, demand for technology-enabled solutions). The results show that the family health needs of families with disabled older adults in the community are unique and diverse. Community health workers and social workers can develop and implement effective strategies based on the different levels of family needs to promote the health of these families and improve their overall quality of life.
Soft skills correspond to intrapersonal and interpersonal abilities related to how individuals interact, make decisions, and manage their activities. In the context of undergraduate nursing education, their development is fundamental to the preparation of professionals capable of acting in an ethical, critical, and relational manner, making it relevant to understand how these competencies are incorporated into the teaching and learning process. In this context, the objective of this study is to understand how faculty members in undergraduate nursing programs incorporate soft skills into their pedagogical approaches and practices, identifying the competencies considered essential and the challenges to their implementation. A qualitative study was conducted with 26 nursing faculty members from four federal public universities in southern Brazil. Data were collected between June and September 2025 through semi-structured interviews, following the criteria of the Consolidated Criteria for Reporting Qualitative Research checklist. The interviews were processed using IRaMuTeQ software and analyzed in light of Discursive Textual Analysis. Three analytical categories emerged: faculty understanding of soft skills in nursing education; pedagogical approaches and strategies for the development of these competencies; and perceived difficulties in their promotion within teaching. The faculty members recognize the relevance of soft skills and report the use of active methodologies and reflective strategies for their development. However, most had not received specific training, and the teaching of these competencies occurs predominantly in an implicit manner. The findings demonstrate that, although soft skills are widely valued in nursing education, their promotion still lacks pedagogical systematization and institutional support. 
Challenges such as the subjectivity of these competencies, the prioritization of technical skills by students, and distractions associated with the use of technologies limit their intentional development. These results contribute to the international literature in nursing education by highlighting the need for structured institutional strategies for faculty development and for the explicit integration of soft skills into nursing curricula.
Pediatric sepsis is a leading cause of global morbidity and mortality, yet high-resolution, granular subnational assessments remain scarce. Chile and Mexico are the only countries in Latin America that possess robust vital registration systems and open-access databases with marginal levels of missing cases. This offers a unique opportunity to quantify the subnational burden of pediatric sepsis, identify healthcare system constraints, and guide targeted public health interventions. This retrospective longitudinal study analyzed official hospital discharge and non-fetal death records of pediatric patients (<10 years old) from Chile and Mexico between 2014 and 2024. Age-standardized incidence (ASIR) and mortality (ASMR) rates, standardized ratios, and the mortality-to-incidence ratio (MIR) were calculated to assess mortality relative to subnational hospital output. A novel dynamic risk stratification matrix was developed to classify ICD-10 sepsis-related causes into four risk/severity quadrants based on year-specific ASIR and MIR indicators. A total of 656,234 discharges and 2,035 deaths in Chile, and 964,452 discharges and 77,252 deaths in Mexico, were analyzed. Subnational trends were highly heterogeneous. Chile exhibited a predominantly low pediatric MIR (median < 1%), with isolated hotspots showing significant structural deviations in the north. High-severity sepsis causes in Chile were relatively rare. Conversely, Mexico displayed an alarmingly high MIR (median 7.2%), with systemic persistence in states such as Chiapas and Nuevo León. Strikingly, high-severity causes in Mexico (e.g., unspecified septicaemia, bacterial meningitis) were highly frequent, accounting for 88-97% of pediatric sepsis deaths. Furthermore, systemic instances of code-specific MIR > 1.0 in Mexico suggest significant health system fragmentation and decoupling of hospital discharges from vital statistics registries.
Pediatric sepsis in Latin America encompasses distinct realities, ranging from localized critical care gaps to high-lethality persistence. One-size-fits-all national policies may be inadequate. These findings advocate for precision public health, urging the deployment of decentralized, data-driven interventions and specialized resource allocation based on identification of high-risk subnational hotspots.
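The two headline indicators are straightforward to compute. The sketch below uses made-up counts for a hypothetical region with three pediatric age bands; the reference weights and all numbers are illustrative only:

```python
# Minimal sketch of the two indicators from the study design:
# a directly age-standardized incidence rate (ASIR) and the
# mortality-to-incidence ratio (MIR). All inputs are hypothetical.

def age_standardized_rate(cases, person_years, ref_weights):
    """Direct standardization: reference-weighted sum of
    age-specific rates, expressed per 100,000 person-years."""
    assert abs(sum(ref_weights) - 1.0) < 1e-9
    return sum(w * (c / py) * 100_000
               for c, py, w in zip(cases, person_years, ref_weights))

def mortality_incidence_ratio(deaths, cases):
    """MIR: deaths relative to incident (here, discharged) cases."""
    return deaths / cases

# Hypothetical region: three age bands (<1, 1-4, 5-9 years)
cases = [120, 260, 180]                 # sepsis-related discharges
person_years = [10_000, 42_000, 55_000]  # population exposure per band
deaths = 40                              # sepsis-related deaths
ref_weights = [0.10, 0.42, 0.48]         # reference population shares

asir = age_standardized_rate(cases, person_years, ref_weights)
mir = mortality_incidence_ratio(deaths, sum(cases))
print(round(asir, 1), round(mir, 3))
```

A cause-specific MIR near or above 1.0 means recorded deaths approach or exceed recorded discharges for that code, which is the discharge/vital-statistics decoupling the analysis flags in Mexico.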
To compare elastosonographic changes of the tibial nerve (TN) and Achilles tendon (AT) in patients with type 2 diabetes mellitus (T2DM) and to explore their relationship and respective relevant factors. This case-control study enrolled 165 subjects, comprising 126 patients with T2DM and 39 healthy controls matched for age and gender. The patients were further divided into those with and without diabetic peripheral neuropathy (PN-DM and NPN-DM groups). Clinical and laboratory data were collected. Conventional ultrasound and elastography were performed to assess changes in the morphology and elasticity of the bilateral TN and AT. Sonographic features were compared across the three groups, and factors relevant to TN and AT stiffness were analyzed separately. Diabetic patients exhibited significantly higher HbA1c levels and a higher rate of smoking than healthy controls (P < 0.01 and P = 0.02, respectively). Body mass index (BMI) and total cholesterol also differed significantly between the NPN-DM group and healthy controls (both P = 0.02). Among diabetic patients, the incidence of other microvascular complications was significantly lower in the NPN-DM group (P = 0.04). Compared with healthy controls, the cross-sectional area (CSA) and transverse diameter of the TN in diabetic patients were significantly larger (both P < 0.01), and the CSA and anteroposterior diameter of the AT were notably greater (P = 0.02 and P < 0.01). In addition, TN stiffness in the longitudinal section was significantly higher (P < 0.01), and AT stiffness in the cross-section was markedly lower (P < 0.01). There was no significant difference in the morphology or elastography of the TN or AT between the NPN-DM and PN-DM groups. Furthermore, TN stiffness was not linearly related to AT stiffness but was independently correlated with age, HbA1c, and other microvascular complications (P < 0.05). AT stiffness was independently correlated only with age (P < 0.01).
The size of both the TN and AT in diabetic patients was significantly larger than in healthy controls. The stiffness of the TN increased, whereas that of the AT decreased; however, these changes were independent of each other.
The gut microbiome supports digestion, immunity, and metabolism; its imbalance (dysbiosis) drives inflammation and metabolic dysfunction, contributing to chronic diseases such as diabetes, cardiovascular disease, inflammatory bowel disease, and autoimmune disorders. Medicinal plants provide a wide range of phytochemicals (such as polyphenols, flavonoids, alkaloids, and saponins), which reach the colon and undergo bidirectional interactions with gut microbes, acting as potential microbiome modulators and as substrates for biotransformation into bioactive metabolites. This structured narrative review synthesises evidence from peer-reviewed studies indexed in PubMed, Scopus, and Web of Science over the last 10 years on the role of medicinal plants in microbiome-mediated chronic disease modulation. This literature is organised into three mechanistic axes: (i) perturbations, defined here as measurable shifts in microbial diversity or taxonomic composition relative to a baseline or healthy reference state, together with enrichment of beneficial taxa; (ii) alterations in microbial metabolite output, especially short-chain fatty acids (SCFAs) and other immunometabolic mediators; and (iii) downstream host metabolic and immune signalling. Rather than broad descriptive summaries, the literature is organised using an axis-based mechanistic framework, highlighting key translational constraints (botanical heterogeneity, dose/formulation variability, and inconsistent standardisation of microbiome endpoints) that must be addressed to strengthen human evidence and clinical relevance. Illustrative microbiome-mediated processes involve botanicals such as turmeric (curcumin), ginseng (ginsenosides), and green tea (catechins), though evidence strength varies by study design.
Future progress requires standardised phytochemical characterisation, microbiome-stratified trials, and integration of multi-omics with artificial intelligence analytics to enhance mechanistic insight, identify responders, and enable personalised plant-based microbiome therapies.
Tanzania has adopted artificial intelligence (AI)-assisted chest X-ray screening for tuberculosis (TB), including the use of CAD4TB version 6, which is registered by the Tanzania Medicines and Medical Devices Authority (TMDA). While GeneXpert, the practical reference standard, remains the primary bacteriological confirmatory test in routine practice, there is currently no established national threshold for CAD4TB use in either active case finding (ACF) or passive case finding (PCF) settings. This study evaluates the implementation and operational use of CAD4TB version 6 within mobile TB screening units in Tanzania and highlights challenges affecting its effective use. We conducted a retrospective analysis of screening data from 11,923 individuals collected from mobile clinics equipped with digital X-ray, CAD4TB version 6, and GeneXpert systems. Comparisons were made between manual chest X-ray interpretation, CAD4TB scores, and GeneXpert results within the subset of individuals who underwent confirmatory testing. The findings reveal substantial inconsistencies in screening workflows, including non-uniform use of CAD4TB prior to GeneXpert testing, missing radiological records, and deviations from intended protocols across sites. Descriptive analysis showed that CAD4TB scores generally aligned with GeneXpert-positive cases within the tested subset; however, owing to the selective application of GeneXpert and incomplete data, these observations cannot be interpreted as measures of diagnostic accuracy. This study should be interpreted as an implementation and operational assessment of AI-assisted TB screening rather than a diagnostic accuracy or threshold-setting study. The findings highlight important gaps in protocol adherence, data completeness, and workflow standardization, underscoring the need for prospective, protocol-driven studies to establish validated national thresholds for CAD4TB use in Tanzania.
Ongoing neurodevelopmental care is essential for children with congenital heart disease (CHD). Understanding delivery and uptake of neurodevelopmental care pathways can inform implementation and resource planning. This study applied simulation modelling to explore outcomes from a neurodevelopmental care pathway for children with CHD. The model was developed using data from a Queensland program to explore health service interactions for neurodevelopmental screening, formal assessment, and early intervention, up to five years. Modelling was intended to provide a baseline understanding of the pathway, rather than evaluating against a reference standard. Hypothetical scenarios explored how changes in screening and referrals influenced the identification of developmental concerns, and how developmental concern severity affected intervention referrals. Based on available data, 58% of the cohort remained under routine surveillance and 25% had accessed early intervention for one or more developmental delays. Scenarios defined by increased screening projected up to 55% of the cohort having a developmental concern identified during screening and 45% having a developmental delay identified following assessment. Simulation modelling was useful for understanding outcomes from a neurodevelopmental pathway and how differences in screening and assessment affected health service interactions. Findings may inform policy and resource planning for future neurodevelopmental pathways. This study shows that simulation modelling is a useful approach for evaluating a neurodevelopmental care pathway for children with CHD, to understand movement through neurodevelopmental screening, assessment, and interventions. Scenario-based modelling provides insights into factors influencing pathway engagement, contributing evidence to strengthen understanding of service gaps and areas where improvements can most effectively impact engagement and resourcing. 
This study identifies neurodevelopmental screening as the most influential stage impacting downstream outcomes, underscoring its importance as a strategic intervention point. This study's approach provides a general framework for evaluating similar pathways and a potential baseline for assessing future policy or service changes.
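The scenario analysis described above can be sketched as a toy state-transition cohort simulation. This is a minimal illustration only: the function name and all probabilities are hypothetical assumptions for demonstration, not estimates from the Queensland program or the study's data.

```python
import random

def simulate_pathway(n, p_screened, p_concern, p_assessed, p_delay, seed=0):
    """Toy state-transition simulation of a neurodevelopmental care pathway.

    Each simulated child either remains under routine surveillance or
    progresses through screening -> concern identified -> formal
    assessment -> delay confirmed (early-intervention referral).
    All probabilities are illustrative assumptions, not study estimates.
    """
    rng = random.Random(seed)
    counts = {"surveillance": 0, "concern": 0, "delay": 0}
    for _ in range(n):
        # A concern is only identified if the child is screened.
        if rng.random() < p_screened and rng.random() < p_concern:
            counts["concern"] += 1
            # A delay is only confirmed if the child is formally assessed.
            if rng.random() < p_assessed and rng.random() < p_delay:
                counts["delay"] += 1
        else:
            counts["surveillance"] += 1
    return counts

# Scenario comparison: increasing screening uptake (the most influential
# stage in the study) raises the share of concerns identified downstream.
base = simulate_pathway(10_000, p_screened=0.5, p_concern=0.4,
                        p_assessed=0.8, p_delay=0.6)
boost = simulate_pathway(10_000, p_screened=0.9, p_concern=0.4,
                         p_assessed=0.8, p_delay=0.6)
assert boost["concern"] > base["concern"]
```

The sketch mirrors the study's qualitative finding that screening is the upstream lever: holding assessment and intervention probabilities fixed, only the screening parameter is varied between scenarios.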
In nature, photosynthesis is driven by solar light, and a large proportion of the visible spectrum is absorbed by the light-harvesting complexes (LHCs), which then transfer the energy to the reaction center. Inspired by nature, we implemented a light-harvesting energy transfer cascade within biomimetic lipid bilayers of liposomes built with DPPC (1,2-dipalmitoyl-sn-glycero-3-phosphocholine), using membrane-anchored fluorescein, 2-(3,6-dihydroxy-9H-xanthen-9-yl)-5-dodecanamidobenzoic acid (FlC12), as the primary absorber and membrane-anchored eosin Y, hexadecyl 2-(2,4,5,7-tetrabromo-3,6-dihydroxy-9H-xanthen-9-yl)benzoate (EYC16), as the energy acceptor to sensitize oxygen and generate the reactive oxygen species 1O2. Finally, the model substrate nicotinamide adenine dinucleotide (NADH) is oxidized by 1O2 within the compartmentalizing liposome nanoreactors. We observed that our metal-free LHC system has only a minor effect on the photooxidation rate of NADH when the nanoreactor membrane is functionalized symmetrically. By contrast, asymmetric functionalization of the liposome nanoreactor membranes accelerates NADH photooxidation by 16% to 27% when using multi-colored light-emitting diodes (LEDs) or simulated solar light, respectively.
The deltoid ligament (DL) is the primary stabilizer of the medial ankle, but its injury mechanisms remain poorly understood. This study aimed to investigate the injury risk and mechanisms of individual DL bundles under both acute and chronic conditions to inform prevention and treatment strategies. A validated finite element model of the human foot was used to examine peak stresses in DL bundles under four acute loading scenarios. Chronic loading was simulated by applying gait loads after transecting the lateral ligaments, and the resulting DL stresses were compared with those of the intact model. Additionally, thirty-nine rats were assigned to three groups: a lateral ligament rupture group (LR, n = 13), a tibialis posterior tendon rupture group (TPR, n = 13), and a sham group (n = 13). After 6 weeks of treadmill running, the mechanical properties and histological characteristics of the DL, along with ankle joint morphology and articular stresses, were evaluated to further verify the hypothesized mechanisms of chronic injury. Under acute loadings, the tibiocalcaneal ligament (TCL), anterior tibiotalar ligament (ATTL), and deep posterior tibiotalar ligament (dPTTL) showed the highest stress under pronation-external rotation loading. Lateral ligament rupture increased DL stress during gait. After 6 weeks of treadmill running, the LR and TPR groups showed roughened articular surfaces with osteophyte formation, increased articular stress, decreased talar bone volume fraction, lower failure load and stiffness ratios of the DL (p < 0.01), reduced fluorescence intensity of COL1, and elevated levels of COL3, MMP-2 and IL-1β compared with the sham group (p < 0.01). The TCL, ATTL, and dPTTL bundles are particularly susceptible to acute injury, with pronation-external rotation posing the greatest risk. Chronic degeneration of the DL occurs following rupture of the lateral ligament or tibialis posterior tendon, with a more pronounced effect after lateral ligament rupture.