Appropriate nutrition during the first 1,000 days, from pregnancy to a child's second birthday, is critical for optimal growth, cognitive development, and long-term health, with proper complementary feeding practices playing a pivotal role. Despite this, evidence on the magnitude of appropriate complementary feeding practices and their associated factors remains limited at the local level in Ethiopia. Therefore, this study assessed complementary feeding practices and associated factors among mothers of children aged 6-23 months in Debre Berhan Town, Ethiopia, in 2021. A community-based cross-sectional study was conducted from July 1 to August 30, 2021, using a cluster sampling technique in four randomly selected kebeles, including all eligible mothers. Data were collected using a structured interviewer-administered questionnaire, and appropriate complementary feeding practice was assessed against WHO-recommended indicators. Data were entered into EpiData version 3.1 and analyzed using SPSS version 25. Multivariable logistic regression was used to identify factors associated with appropriate complementary feeding practice at p < 0.05. The prevalence of appropriate complementary feeding practice was 43.7%; significantly associated factors included being a housewife (AOR = 2.53; 95% CI: 1.05-6.08), attending postnatal care (AOR = 3.18; 95% CI: 2.03-4.97), institutional delivery (AOR = 2.14; 95% CI: 1.11-4.14), and child vaccination (AOR = 1.89; 95% CI: 1.15-3.12). The level of appropriate complementary feeding practice in Debre Berhan Town (43.7%) remains suboptimal compared with WHO recommendations for children aged 6-23 months. Postnatal care attendance, being a housewife, institutional delivery, and child vaccination were significantly associated with appropriate complementary feeding practice.
Strengthening postnatal counseling and integrating appropriate infant and young child feeding messages into routine vaccination services may help improve complementary feeding practices.
School belonging is a key protective factor for socioemotional functioning during the preschool period. Counsellors working in preschool institutions play an important role in developing children's sense of school belonging by meeting their emotional and social needs and fostering a positive school environment. This study examined the views of school counsellors working in preschool education institutions in Türkiye regarding the sense of school belonging and, to support these views and to reveal their reflection at the level of institutional practice, analysed routine guidance and counselling documents to identify how belonging-supportive practices are formally planned, implemented, and recorded over the school year. A phenomenological qualitative design was employed. Data were collected through structured interviews with school counsellors and complemented by document analysis of counselling artefacts to examine how belonging practices are formally planned, implemented, and documented across the school year. The document corpus comprised 186 written artefacts from 10 kindergartens. Data were analysed using computer-aided content analysis, and a hierarchical code-subcode model was utilised in MAXQDA 2020. Findings indicated that counsellors conceptualised school belonging as children's feelings of acceptance and security within the school community and described belonging-related differences in children's emotions, participation, and peer interactions. Counsellors also highlighted multi-level strategies targeting emotional safety, peer relationships, teacher practices, and family involvement.
Complementing these findings, the document analysis yielded five practice-oriented domains through which counselling services may support school belonging: (1) emotional safety and adjustment to school routines, (2) support for peer relationships and social problem-solving, (3) cultivation of a positive classroom/school climate through teacher collaboration, (4) systematic family involvement and parent education, and (5) developmental monitoring and evaluation. Document analysis corroborated counsellors' reported perspectives by providing practice-level evidence of how belonging-supportive strategies were formally embedded in institutional plans and documented in artefacts. Preschool children's school belonging is supported through a year-long programme of practice spanning children, teachers, and families. Strengthening preventive counselling services that promote emotional safety, peer inclusion, and family-school collaboration may improve school belonging in early childhood, as this phenomenon has far-reaching implications for children's educational trajectories, working lives, and broader societal and democratic engagement.
Zoonotic diseases are common threats to global health, and a large number of infectious diseases are transmitted from animals to humans. The current study aimed to assess the community's knowledge, attitudes, and practices (KAP) regarding common zoonotic diseases in the Arbaminch district. A cross-sectional survey was carried out between November 2024 and June 2025. A total of 384 participants were interviewed in the study. Participants residing in these areas were randomly chosen. Data were collected using a structured questionnaire. The collected data were analyzed using Stata 17, and the results were reported using descriptive statistics and the chi-square test. The findings of this study revealed that a majority (55%) of participants had good knowledge about zoonotic diseases. Respondents recognized several modes of transmission for zoonotic diseases, with animal bites (32.5%) being the most recognized, followed by direct contact (15.5%), ingestion of raw products (10%), and inhalation (10%). Regarding attitudes, 63.2% of respondents exhibited a positive attitude towards the importance of zoonotic disease prevention and control, and 67.4% of respondents followed relatively good hygiene and preventive behaviors. However, risky practices were still common. Knowledge score showed a significant association with age. Attitudes of participants were significantly associated with education, age, occupation, and income. Similarly, practices were significantly associated with gender, education level, occupation, and income, with all associations being statistically significant (p < 0.05). The overall community knowledge, attitudes, and practices regarding zoonotic diseases were relatively good.
This study aimed to characterize practices and decision-making for extracorporeal membrane oxygenation (ECMO) in congenital anomalies of the kidney and urinary tract (CAKUT). A survey comprising general practices (GP) and hypothetical cases (HC) components was used: the GP section inquired about institutional practices and barriers, ECMO criteria, and dialysis, while the HC section illustrated four clinical scenarios with varying degrees of renal severity for ECMO candidacy. In total, 99 physicians (42 centers) and 91 physicians (38 centers) completed the GP and HC components, respectively. The majority considered ECMO on a case-by-case basis (66%). Bilateral renal agenesis was the most common diagnosis for exclusion (52%). Prenatal markers used for ECMO exclusion included anhydramnios (43%) and lung volumes (43%). The majority of centers had nephrology involved in ECMO decision-making. Challenges to implementing ECMO included disease heterogeneity (79%) and poor evidence on outcomes (66%). HC responses demonstrated variability in considering ECMO for CAKUT. Variability among providers and institutions underscores the need for consensus-based guidelines to optimize decision-making and outcomes.
Dizziness and vertigo affect millions annually, creating a $13.3 billion economic burden in the US. Physical therapists (PTs) are key in treating vestibular disorders, but entry-level training varies significantly, necessitating specialized post-professional education. This study evaluated whether the Advanced Vestibular Physical Therapist (AVPT) Certificate Program effectively prepares PTs to meet patient needs. A cross-sectional survey was distributed to 230 graduates from four AVPT cohorts. The 18-question survey assessed program impact on clinical practice, professional development, and patient outcomes using 5-point Likert scales, plus demographic information and open-ended feedback. Sixty-seven graduates responded (29% response rate). Nearly all (97%) agreed the program prepared them as frontline vestibular providers, while 92% reported improved patient outcomes. All participants (100%) agreed the program met continuing education needs, and 98% reported changed clinical practice. Average likelihood to recommend was 9.69/10. However, 30% reported employers did not value the certification, and 91% received no additional compensation. The AVPT program successfully creates competent expert vestibular providers, improving clinical confidence, differential diagnosis skills, and patient outcomes while addressing critical gaps in entry-level education.
Population-based organized prostate-specific antigen (PSA) screening is implemented in 80% of Japanese municipalities; however, Shiga Prefecture remains a unique exception without such a systematic program. This study characterized the longitudinal clinical features and treatment patterns in this opportunistic testing environment using data from 1716 patients diagnosed via prostate biopsy in 2012, 2017, and 2022. While median PSA levels remained stable (10.40-11.43 ng/mL), median age at diagnosis increased from 72 to 74 years. Over the decade, the incidence of International Society of Urological Pathology Grade Group 1 and cT1c stages decreased significantly (p < 0.001), with nearly 90% of cases being cT2 or higher in 2022. Risk classification showed a decrease in low-risk cases and a rise in high-risk cases. Regarding treatment, radical prostatectomy rates remained stable at approximately 25%, whereas the overall use of active surveillance (AS) increased from 1% to 9%. Notably, among low-risk patients, AS adoption rose markedly from 2.3% in 2012 to 68% in 2022. While clinical practices have evolved to successfully minimize unnecessary invasive intervention, these findings suggest that clinical progress alone cannot fully compensate for the lack of organized efforts to improve early detection.
Rare diseases affect small, dispersed populations and are often studied through multisite designs where equity-relevant demographic data are essential for inclusive recruitment and accurate interpretation. This study examined how sociodemographic variables are collected and reported in rare disease research and evaluated their alignment with the PROGRESS-Plus framework, which outlines Place of residence, Race/ethnicity/culture/language, Occupation, Gender/sex, Religion, Education, Socioeconomic status, Social capital, and additional "Plus" factors such as age and disability status. A systematic review of peer-reviewed articles was conducted alongside an environmental scan of demographic instruments from governmental, health-system, academic, and rare disease organizations. Screening and extraction coded variables as reported, indirectly derivable, or not reported and compared them with established standards. Of 647 records identified, 37 met inclusion criteria. Reporting was dominated by age and sex, while most other equity-relevant variables, including gender identity, sexual orientation, race/ethnicity, distinctions-based Indigenous identity, socioeconomic position, language, migration, disability/function, religion, occupation, and social capital, were inconsistently captured. Environmental scan instruments were more comprehensive, revealing a capture-to-reporting gap. Demographic reporting in rare disease research is heterogeneous and insufficient for equity-focused analyses. A concise, standards-aligned sociodemographic dataset is needed to improve transparency, comparability, and detection of inequities across rare disease populations.
Congenital anomalies and genetic disorders contribute substantially to perinatal morbidity and mortality, particularly in low- and middle-income countries. Prenatal healthcare providers play a key role in identifying affected pregnancies and referring patients for genetic counselling; however, referral practices remain suboptimal. To assess the utilisation of genetic counselling services and perceptions of genetic counselling among prenatal healthcare providers in Gauteng Province, South Africa. An electronic survey was distributed to prenatal healthcare providers working in public and private healthcare sectors in Gauteng. The survey assessed access to genetic counselling services, referral practices, knowledge of referral indications, understanding of the genetic counsellor's role, and perceived barriers to referral. Fifty-four respondents were included. Seventy-four percent of participants reported being able to refer patients for genetic counselling, but only 57% had utilised the service. No participant correctly identified all appropriate referral indications, and only 24% understood the scope of practice of genetic counsellors. Only 6% felt confident in their knowledge of genetic counselling. Although genetic counselling services are available and utilised in Gauteng, they are not accessed to their full potential. Improved education and clearer referral guidance are required to optimise prenatal genetic care in this setting.
A position paper released by the European Association of Nuclear Medicine emphasised the need for multidisciplinary engagement to establish dosimetry-based personalised treatment in radionuclide therapy (RNT). Uncertainty analysis results, often ignored in routine clinical practice, should be incorporated into dose calculations to improve the efficacy and accuracy of treatment. In this study, patients with haematological malignancies undergoing radioimmunotherapy were evaluated. Our study aimed to calculate the uncertainties associated with each parameter of the single time point (STP) dosimetry chain and compare them with multiple time point (MTP) results in the bone marrow and liver. 28 patients received an intravenous injection of 111In-besilesomab (0.17 ± 0.01 GBq) for pre-therapeutic dosimetry and were subsequently treated with 90Y-besilesomab (2.43 ± 0.53 GBq). A dosimetry analysis was performed on bone marrow (BM) and liver with MTP and STP. We investigated the uncertainty in population mean effective half-life, volume, recovery coefficient, counts, measured activity, fitting parameters, time-integrated activity, S-factors, and absorbed dose (AD) for a group of patients. The mean absorbed dose per unit administered activity (DpA) to BM was 5.8 ± 1.7 mGy/MBq with MTP and 5.8 ± 1.6 mGy/MBq with STP, and to the liver was 2.9 ± 1.9 mGy/MBq with MTP and 3.1 ± 2.4 mGy/MBq with STP. The mean fractional uncertainty associated with total absorbed dose to BM was 13.18 ± 3.46% with MTP and 18.75 ± 3.22% with STP, and to liver was 5.77 ± 3.13% with MTP and 49.78 ± 25.36% with STP. A moderate positive relationship (R2 = 0.7) was noted between post-injection acquisition time and AD uncertainty with STP for BM, whereas a strong positive relationship (R2 = 1) was noted for the liver. The absorbed dose uncertainty in STP was significantly higher compared to MTP.
Incorporating uncertainty analysis for STP dosimetry parameters into routine clinical practice is strongly recommended. Accuracy in the acquisition time, the population-based half-life, and the fitting function for the time-activity curve is vital for minimising uncertainty in STP dosimetry, which is less time-consuming and easier to implement in clinical practice than MTP.
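As a generic sketch of how the fractional uncertainties above combine (standard uncertainty propagation, not the authors' exact model), the dose-chain terms can be treated as independent and summed in quadrature. For a mono-exponential STP model with population effective decay constant $\lambda_{\mathrm{eff}}$ and acquisition at time $t$:

\[
D = \tilde{A}\,S, \qquad
\tilde{A}_{\mathrm{STP}} = \frac{A(t)\,e^{\lambda_{\mathrm{eff}} t}}{\lambda_{\mathrm{eff}}}
\]
\[
\left(\frac{u(D)}{D}\right)^{2} \approx
\left(\frac{u(A(t))}{A(t)}\right)^{2} +
\left(\frac{u(S)}{S}\right)^{2} +
\left(t - \frac{1}{\lambda_{\mathrm{eff}}}\right)^{2} u(\lambda_{\mathrm{eff}})^{2}
\]

The factor $t - 1/\lambda_{\mathrm{eff}}$ comes from $\partial \ln\tilde{A}/\partial\lambda_{\mathrm{eff}}$ and illustrates why the post-injection acquisition time and the accuracy of the population half-life dominate STP uncertainty, consistent with the correlations reported above.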
Tanzania has adopted artificial intelligence (AI)-assisted chest X-ray screening for tuberculosis (TB), including the use of CAD4TB version 6, which is registered by the Tanzania Medicines and Medical Devices Authority (TMDA). While GeneXpert remains the primary bacteriological confirmatory test in routine practice, there is currently no established national threshold for CAD4TB use in either active case finding (ACF) or passive case finding (PCF) settings. This study evaluates the implementation and operational use of CAD4TB version 6 within mobile TB screening units in Tanzania and highlights challenges affecting its effective use. We conducted a retrospective analysis of screening data from 11,923 individuals collected from mobile clinics equipped with digital X-ray, CAD4TB version 6, and GeneXpert systems. Comparisons were made between manual chest X-ray interpretation, CAD4TB scores, and GeneXpert results within the subset of individuals who underwent confirmatory testing. The findings reveal substantial inconsistencies in screening workflows, including non-uniform use of CAD4TB prior to GeneXpert testing, missing radiological records, and deviations from intended protocols across sites. Descriptive analysis showed that CAD4TB scores generally aligned with GeneXpert-positive cases within the tested subset; however, due to selective application of GeneXpert and incomplete data, these observations cannot be interpreted as measures of diagnostic accuracy. This study should be interpreted as an implementation and operational assessment of AI-assisted TB screening rather than a diagnostic accuracy or threshold-setting study. The findings highlight important gaps in protocol adherence, data completeness, and workflow standardization, underscoring the need for prospective, protocol-driven studies to establish validated national thresholds for CAD4TB use in Tanzania.
Crop diversification through crop rotation or cover cropping is widely recognized as an important strategy to improve agroecosystem sustainability, enhance soil health, and suppress soilborne diseases. Rotating crops or introducing cover crops can disrupt pathogen life cycles, improve nutrient cycling, and promote beneficial microbes. However, the outcomes of diversification practices are often complex, influenced by soil type, crops, and pathogen pressures. Evaluating how cover crops and crop phase affect soilborne crop diseases and the root-associated microbiome is critical for designing resilient cropping systems. This study evaluated the legacy effects of cover crops and crop phase on soybean root diseases and the root-associated microbiome. Soybean plants were grown in soils collected from a corn-soybean rotation field experiment with and without cover crops, and then challenged with either Fusarium graminearum inoculum or soybean cyst nematode (SCN) in the growth chamber. Soils with a cover crop history significantly reduced F. graminearum-induced root rot, but had a limited impact on SCN, indicating divergent disease responses. Microbial profiling revealed that F. graminearum inoculum strongly reshaped bacterial communities, reducing Shannon diversity and enriching fast-growing copiotrophic taxa, including Bacteroidota genera (Pedobacter, Chitinophaga, Flavobacterium, and Mucilaginibacter) and Proteobacteria genera (Dyella, Pseudomonas, Rhizobium, and Paraburkholderia) regardless of cover crops. In contrast, SCN infection increased bacterial Shannon diversity in soybean-phase soils regardless of cover crops but decreased fungal Shannon diversity in soybean soils without cover crops, highlighting pathogen-specific microbial shifts.
Cover cropping enhanced microbial heterogeneity under both pathogen pressures, enriching microbial taxa potentially involved in nutrient cycling (Chitinophaga and Mucilaginibacter), antagonism (Flavobacterium, Streptomyces, Pseudonocardia, and Nocardioides), and competitive interactions (Paraburkholderia). Correlation analyses further linked specific bacterial and fungal genera with disease suppression. Soilborne pathogens and cropping practices exerted interconnected, pathogen- and crop-specific effects on root microbial communities. Cover cropping offers a promising strategy to enhance microbial-mediated disease resilience in soybean systems, providing ecological insights into microbiome-driven plant health.
Federated learning (FL) has become a highly promising paradigm for privacy-preserving distributed model training by enabling edge devices to train without sharing raw data. In practice, however, edge environments are both non-stationary and asymmetric, with data distributions that vary as user behaviour, sensing conditions, and environmental dynamics shift. This causes concept drift (sudden, gradual, and recurrent), leading to poor model performance, slower convergence, and predictive bias. Current FL approaches (e.g., FedAvg, DP-FedAvg) do not jointly address drift adaptation, differential privacy (DP), and resource efficiency. To address these constraints, we present FedDriftGuard, a federated learning layer that unifies client-level drift detection, drift-adaptive aggregation, and adaptable differential privacy into a single, FLE architecture-compatible system. The proposed DP-DriftNet model implements attention-based time encoding to capture changing data patterns and drift-directed feature weighting to allow greater flexibility under distributional change. A drift-optimal privacy scheduler allocates noise probabilistically, subject to a limited privacy budget, thereby enforcing an appropriate privacy-utility trade-off without weakening formal DP guarantees. In addition, update sparsification, compression, and periodic transmission techniques are used to reduce communication overhead. Extensive experiments on real-world and synthetic drift datasets show that FedDriftGuard outperforms baseline FL techniques, achieving accuracy and F1-score gains of 9-14% and 11-17%, respectively, with 28% shorter adaptation latency and 20-35% lower communication cost. These findings are statistically significant and confirm the soundness of the proposed method. FedDriftGuard offers effective, scalable, privacy-preserving learning in drift-prone edge environments.
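To make the client-level drift detection and drift-adaptive aggregation ideas concrete, the following is a minimal Python sketch. It is an illustration of the general pattern, not the FedDriftGuard implementation: the loss-increase detection rule, the 0.5 down-weighting factor, and the function names are all hypothetical choices.

```python
def detect_drift(prev_losses, curr_losses, threshold=0.2):
    """Flag concept drift when the client's mean loss rises by more
    than `threshold` between rounds (hypothetical detection rule)."""
    prev = sum(prev_losses) / len(prev_losses)
    curr = sum(curr_losses) / len(curr_losses)
    return (curr - prev) > threshold

def drift_adaptive_aggregate(updates, drift_flags):
    """Weighted average of client model updates, down-weighting clients
    flagged as drifting (assumed 0.5 factor) so that stale or shifting
    distributions pull the global model less strongly."""
    weights = [0.5 if drifting else 1.0 for drifting in drift_flags]
    total = sum(weights)
    dim = len(updates[0])
    return [
        sum(w * u[i] for w, u in zip(weights, updates)) / total
        for i in range(dim)
    ]

# Example: client 2 drifts, so its update contributes with half weight.
flags = [detect_drift([1.0, 1.0], [1.02, 1.04]),   # stable client
         detect_drift([1.0, 1.0], [1.5, 1.6])]     # drifting client
global_update = drift_adaptive_aggregate([[1.0, 0.0], [3.0, 0.0]], flags)
```

In a full system, the detection signal would also feed the privacy scheduler (e.g., adjusting per-round noise within the remaining DP budget), but the weighting step above captures the core aggregation idea.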
Prostate, breast, stomach, colon-rectum, lung, and hematologic cancers represent a regional public health burden among older adults in the Caribbean. We described 15-year trends in cancer incidence and mortality among older adults in Martinique. We conducted a population-based cohort study (2008-2022) from the population-based cancer registry of Martinique. Age-Standardized Incidence and Mortality Rates (ASIR and ASMR) were estimated using the Segi/Doll world standard population and Annual Average Percent Change (AAPC) with 95% confidence interval in patients ≥ 65 years, by tumor site and sex. We recorded 15,400 cancer cases over the entire study period (2008-2022), with 64.4% diagnosed in males. Overall ASIR remained broadly stable: in men, 2,257.8 per 100,000 (95% CI: 2,172.6-2,342.9) in 2008-2012 and 2,053.1 (95% CI: 1,982.8-2,230.4) in 2018-2022; in women, 847.8 (95% CI: 801.8-893.8) in 2008-2012 and 862.1 (95% CI: 821.2-902.9) in 2018-2022. Prostate cancer ASIR decreased from 1,355.5 per 100,000 (95% CI: 1,288.6-1,422.4) in 2008-2012 to 1,303.7 (95% CI: 1,242.0-1,365.5) in 2013-2017, and then significantly to 1,102.2 (95% CI: 1,049.6-1,154.9) in 2018-2022. Prostate cancer ASMR also declined from 293.50 per 100,000 (95% CI: 263.30-323.70) in 2008-2012 to 225.80 (95% CI: 205.00-246.50) in 2018-2022. Female breast cancer ASIR increased significantly from 177.3 per 100,000 (95% CI: 155.4-199.2) in 2008-2012 to 198.0 (95% CI: 176.5-219.6) in 2013-2017 and to 230.7 (95% CI: 208.9-252.5) in 2018-2022. Male lung and bronchial cancer incidence showed a significant AAPC decrease of -12.5% (95% CI: -23.3 to -0.13) in 2018-2022. Colorectal cancer ASMR in women remained stable: 68.2 per 100,000 (95% CI: 55.1-81.3) in 2008-2012, 63.8 (95% CI: 53.0-74.6) in 2013-2017, and 71.4 (95% CI: 60.3-82.5) in 2018-2022, and male stomach cancer AAPC decreased significantly during 2018-2022: -11.5% (95% CI: -20.1 to -1.94).
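For reference, the two summary measures used above follow standard definitions (generic formulas, not registry-specific ones). Direct age standardization weights the age-specific rates $r_i$ by the Segi/Doll world-standard population weights $w_i$:

\[
\mathrm{ASR} = \frac{\sum_i w_i\, r_i}{\sum_i w_i}
\]

and the AAPC summarizes joinpoint regression segments, where $b_j$ is the slope of log rate on calendar year in segment $j$ and $\ell_j$ its length in years:

\[
\mathrm{AAPC} = \left( \exp\!\left( \frac{\sum_j \ell_j b_j}{\sum_j \ell_j} \right) - 1 \right) \times 100
\]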
Between 2008 and 2022, cancer incidence among older adults in Martinique showed decreasing prostate and lung cancer but increasing breast cancer, consistent with Caribbean and global trends. These patterns reflect screening practices, treatment advances, and COVID-19 disruptions. Future research should prioritize age-tailored cancer management and uninterrupted treatment access.
Ureteral stents are routinely used following endourological procedures to ensure adequate drainage and prevent obstruction. However, stent-related morbidity remains common, and optimal stent dwell time and removal methods are not well defined. This systematic review aimed to evaluate clinical and procedural factors influencing ureteral stent dwell time and the methods used for stent removal after endourological interventions. A systematic review was conducted in accordance with PRISMA guidelines and registered on PROSPERO. MEDLINE and Embase were searched from inception to October 2025. Randomized controlled trials and comparative observational studies evaluating ureteral stent dwell time and/or removal methods in adults undergoing endourological procedures were included. Risk of bias was assessed using RoB 2 and ROBINS-I tools. Thirty-two studies encompassing 4,373 patients were included. Reported stent dwell times varied widely, most commonly ranging between 10 and 14 days in uncomplicated cases, with longer durations associated with increased rates of encrustation and removal difficulty. Removal techniques included rigid cystoscopy (48.7%), flexible cystoscopy (19.9%), extraction strings (23.5%), and device-assisted methods (7.9%). Less invasive approaches, particularly flexible cystoscopy and extraction-string removal, were consistently associated with reduced pain scores and improved patient comfort, although extraction strings carried a small risk of premature dislodgement. While practice patterns vary, the evidence suggests that a 10-14 day dwell time might be the optimal window to balance healing with the prevention of encrustation. Less invasive removal approaches, particularly flexible cystoscopy and extraction-string techniques, were generally associated with lower pain scores and high procedural success rates in selected patients. 
While these methods are safe and better tolerated, extraction strings carried a small, reproducible risk of premature dislodgement. High-quality prospective studies are needed to define determinant-based, individualized stent management strategies.
Soft skills correspond to intrapersonal and interpersonal abilities related to how individuals interact, make decisions, and manage their activities. In the context of undergraduate nursing education, their development is fundamental to the preparation of professionals capable of acting in an ethical, critical, and relational manner, making it relevant to understand how these competencies are incorporated into the teaching and learning process. In this context, the objective of this study is to understand how faculty members in undergraduate nursing programs incorporate soft skills into their pedagogical approaches and practices, identifying the competencies considered essential and the challenges to their implementation. A qualitative study was conducted with 26 nursing faculty members from four federal public universities in southern Brazil. Data were collected between June and September 2025 through semi-structured interviews, following the Consolidated Criteria for Reporting Qualitative Research checklist. The interviews were processed using IRaMuTeQ software and analyzed in light of Discursive Textual Analysis. Three analytical categories emerged: faculty understanding of soft skills in nursing education; pedagogical approaches and strategies for the development of these competencies; and perceived difficulties in their promotion within teaching. The faculty members recognize the relevance of soft skills and report the use of active methodologies and reflective strategies for their development. However, most had not received specific training, and the teaching of these competencies occurs predominantly in an implicit manner. The findings demonstrate that, although soft skills are widely valued in nursing education, their promotion still lacks pedagogical systematization and institutional support.
Challenges such as the subjectivity of these competencies, the prioritization of technical skills by students, and distractions associated with the use of technologies limit their intentional development. These results contribute to the international literature in nursing education by highlighting the need for structured institutional strategies for faculty development and for the explicit integration of soft skills into nursing curricula.
There has been debate over the timing of elective induction of labour to curb the continuation of pregnancy that might endanger the lives of both the mother and child. This research was conducted to assess foetal and maternal consequences of planned delivery at 40 and 41 weeks in women with low-risk singleton pregnancy. A randomised controlled trial was conducted with equal allocation of participants (96 pregnant women in each arm) to the 40-week and 41-week groups. Participants were randomised at the antenatal clinic at 39 weeks for induction of labour. The main outcome was the caesarean section rate. Secondary outcomes were maternal (genital tract laceration rate) and foetal (rates of meconium staining of amniotic fluid, SCBU admission, perinatal mortality, birth trauma, birth weight, and neonatal APGAR score at 1 and 5 minutes). Student's t-test and the chi-square test were used for inter-group comparison. Incidence of caesarean delivery (26.6% vs. 21.3%; p = 0.406) and genital laceration (2.1% vs. 5.6%; p = 0.268) did not differ between groups. Significantly higher birth weight was noted among women induced at 41 weeks (3.41 ± 0.37 kg) than at 40 weeks (3.28 ± 0.46 kg) (p = 0.043). There was also significant variation in meconium staining of amniotic fluid between 40 weeks (11.7%) and 41 weeks (25.8%) (p = 0.014). Other foetal outcomes showed no significant difference. Inducing labour at 40 weeks is safe for low-risk women, as it does not significantly increase the caesarean delivery rate or adverse perinatal outcomes. Therefore, elective induction of labour at 40 weeks should be recommended and introduced into obstetric practice without fear of adverse outcomes.
Fast-track and outpatient surgery have significantly reduced postoperative hospital stays across many surgical specialties. As a result, patients are increasingly discharged with strong opioid prescriptions, contributing to the global opioid crisis. Careful follow-up and opioid tapering are essential. While multidisciplinary Transitional Pain Services (TPS), involving pain specialists, psychologists, and physiotherapists, have shown promise, their widespread implementation is limited by costs and complexity. To address these barriers, we implemented a nurse-led TPS, supervised by a pain specialist and embedded within a multidisciplinary pain clinic. The aim of this study was to evaluate its effectiveness in clinical practice, including a mechanism-based treatment approach to postsurgical pain aimed at opioid tapering and optimizing the use of adjuvant analgesics. This observational cohort study included postoperative patients discharged with >20 mg oral oxycodone equivalents and/or those experiencing or at risk for neuropathic pain. Referred patients received telephone consultations by a nurse practitioner (NP) one to two weeks post-discharge. Each consultation included assessment of pain severity, neuropathic characteristics (using the first two items of the DN4 questionnaire), current analgesic use, and willingness to taper opioids. Patient education and motivational interviewing techniques were employed to support opioid tapering. Descriptive statistics and paired t-tests were used to analyze the data. Between June 2019 and July 2025, 243 patients were enrolled in the TPS. Following nurse-led counseling, 73% of patients discontinued opioid use entirely, 23% significantly tapered their dosage (from a mean of 101 mg to 43 mg oral oxycodone equivalent), and 4% continued at the same dose. Anti-neuropathic medications were initiated in 22% of patients.
A nurse-led Transitional Pain Service is a feasible and effective approach to support opioid tapering in postoperative patients. In addition, early screening for neuropathic pain allows for targeted treatment. This model offers a scalable alternative to traditional multidisciplinary TPS programs.
Clinical empathy refers to a healthcare professional's ability to understand a patient's experiences and emotions through cognitive and affective perspective taking, and to communicate that understanding through compassionate and appropriate professional behaviors. Aging simulation suits are experiential educational tools designed to replicate the sensory and physical limitations associated with aging. However, evidence regarding their effectiveness in enhancing clinical empathy among active healthcare professionals remains limited. This study aimed to evaluate the effects of an aging simulation suit on clinical empathy among healthcare professionals working in long-term care settings. A randomized controlled trial was conducted with 82 healthcare professionals from four nursing homes in Madrid and Asturias (Spain). Participants were randomly assigned to an experimental group (EG) (n=41) or a control group (CG) (n=41). Both groups received the same structured educational session on empathy and aging. The experimental group additionally participated in an immersive experience using the GERT aging simulation suit, whereas the control group did not receive the simulation component. Self-reported empathy was measured pre- and post-intervention using the Interpersonal Reactivity Index (IRI) and the Jefferson Scale of Empathy-Health Professions version (JSPE-HPS). No significant differences were found between groups in IRI scores. However, the experimental group showed significant improvements in total JSPE-HPS scores and in the Perspective Taking and Compassionate Care subscales (p < 0.05) compared with the control group. These findings suggest that the immersive intervention enhanced both cognitive and affective components of clinical empathy. The use of an aging simulation suit was associated with improvements in specific dimensions of clinical empathy among healthcare professionals working in long-term care.
This educational tool offers a valuable experiential approach that enhances understanding and compassion toward older adults. However, these findings are limited to short-term, self-reported measures, and no behavioral or patient outcome data were collected. Further longitudinal studies are needed to determine the long-term sustainability of these effects and their translation into clinical practice. ClinicalTrials.gov, Unique Protocol ID: 2711201916919; ClinicalTrials.gov ID: NCT07280689. Date of registration: 10/10/2025. Retrospectively registered.
Postpartum hemorrhage (PPH) remains a leading cause of maternal morbidity and mortality globally, particularly in women with antepartum hemorrhage (APH). Current risk assessment methods lack standardized predictive tools that are both simple and reliable for clinical application. We conducted a secondary analysis of a prospectively collected cohort of 100 pregnant women presenting with APH at ≥28 weeks' gestation at a tertiary care centre in northern India. Multivariable logistic regression was used to identify significant predictors of PPH. A point-based clinical risk score was then developed based on the multivariable model and internally validated using bootstrap techniques with 1000 replicates. PPH occurred in 30% of patients (n=30). Multivariable analysis identified four independent predictors of PPH: maternal age (adjusted odds ratio [OR] 1.29 per year; 95% confidence interval [CI] 1.10-1.51; p=0.002), gravidity (OR 2.11 per unit; 95% CI 1.00-4.43; p=0.049), gestational age at delivery (OR 0.64 per week; 95% CI 0.44-0.94; p=0.021), and antepartum blood transfusion (OR 2.44; 95% CI 1.02-5.84; p=0.045). The prediction model demonstrated excellent discrimination with an area under the receiver operating characteristic (ROC) curve of 0.86 (95% CI 0.80-0.92) and good calibration (slope 0.95). Bootstrap internal validation yielded an optimism-corrected AUC of 0.84. The resulting four-factor risk score stratified patients into four risk categories with PPH rates ranging from 4% (low risk) to 100% (very high risk). The four-variable score provides an accurate, easily applicable tool with excellent predictive performance. The score is a promising tool that, pending external validation, may facilitate early identification of high-risk patients and improve maternal outcomes. Further research should focus on external validation of this tool in diverse populations and its integration into clinical practice.
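The mechanics of a point-based risk score and its discrimination metric can be sketched briefly. The abstract does not publish the actual point weights or risk-category cut-offs, so the weights in `risk_points` below are invented placeholders for illustration only; the `auc` function is the standard Mann-Whitney formulation of the area under the ROC curve.

```python
def risk_points(age_years, gravidity, ga_weeks, transfusion):
    """Toy point score over the four predictors named in the abstract.
    Point weights are HYPOTHETICAL; the published score differs."""
    points = 0
    points += max(0, age_years - 20) // 5   # older maternal age -> more points
    points += min(gravidity, 4)             # higher gravidity -> more points
    points += max(0, 37 - ga_weeks)         # earlier delivery -> more points
    points += 3 if transfusion else 0       # antepartum blood transfusion
    return points

def auc(scores, labels):
    """AUC as the Mann-Whitney statistic:
    P(score_case > score_control), counting ties as 0.5."""
    cases = [s for s, y in zip(scores, labels) if y == 1]
    controls = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((c > d) + 0.5 * (c == d) for c in cases for d in controls)
    return wins / (len(cases) * len(controls))
```

In an internal validation like the one described, the whole model-building process would be repeated in each bootstrap resample and the average drop in AUC (the "optimism") subtracted from the apparent AUC, which is how an optimism-corrected value such as 0.84 is obtained from an apparent 0.86.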
Industrial textile production is a significant contributor to freshwater pollution, particularly in water-intensive processes such as denim finishing. This study presents a process-based assessment of wastewater quality and rinse-water reuse potential using a stage-specific Water Quality Index (WQI) framework. By analyzing effluents from six major denim finishing stages, critical pollution hotspots were identified and the feasibility of cascading rinse-water recovery was evaluated. Results demonstrate that WQI values decreased significantly after rinsing, with reductions of up to approximately 70% depending on the process stage. Furthermore, the proposed stepwise reuse strategy has the potential to reduce freshwater consumption by nearly two-thirds while maintaining operational performance. Beyond its site-specific application, the proposed methodology offers a transferable framework for industrial water optimization and circular water management in textile facilities globally. The findings contribute to advancing sustainable industrial practices and support progress toward SDG 6 and SDG 12.
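A stage-specific WQI can be illustrated with the widely used weighted-arithmetic formulation. The abstract does not state which WQI variant, parameters, or permissible standards the study used, so the formula choice, the proportionality constant k=1, and the sample values below are generic assumptions rather than the study's method.

```python
def weighted_arithmetic_wqi(params):
    """Weighted-arithmetic WQI over a list of (measured, standard, ideal)
    tuples, one per water-quality parameter (e.g. COD, TDS, pH).
    NOTE: generic textbook formulation, not the study's exact index."""
    k = 1.0                 # proportionality constant (assumption)
    num = den = 0.0
    for measured, standard, ideal in params:
        w = k / standard    # unit weight, inversely proportional to the standard
        q = 100.0 * (measured - ideal) / (standard - ideal)  # sub-index (%)
        num += w * q
        den += w
    return num / den        # lower WQI = cleaner effluent in this convention
```

Computing such an index separately for each finishing stage, before and after rinsing, is one way to quantify the stage-wise reductions and to rank candidate streams for cascading reuse.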