A Queensland Health report points out that Māori and Pacific Island communities in Queensland are vulnerable due to social disadvantage and limited access to health services. The researcher applies decolonisation and cultural safety models, emphasising the importance of understanding the historical, political, and socio-contextual determinants affecting these groups. Ethical considerations in this research must be flexible and culturally safe, with a reflective and respectful approach to data collection. The literature review, grounded in Charmaz's constructivist contextual framework, reveals that the socio-contextual determinants for Māori and Pacific Island communities in Queensland are multifaceted, one direct factor being disengagement from child and family health services. To explore this, the qualitative research drew on both decolonisation and cultural safety methodologies, focusing on ethical considerations and the service user's experience, specifically examining how care is delivered, not just the type of care provided. Embracing a decolonial perspective, the Talanoa method guided data collection, which involved interviewing 29 Māori and Pacific Island families in Townsville and Brisbane, as well as eight child health service providers in these areas. During data analysis, key interpretations were informed by conceptualisations of culturally safe health service delivery, supported by Charmaz's constructivist grounded theory, which aligns with the cultural safety methodology. The analysis revealed key themes: power dynamics; positionality and identity versus cultural differences; the deficit discourse of trans-culturalism; and cultural disconnection in the portrayal of health service delivery to Queensland Māori and Pacific Island families. In this research context, trans-culturalism is associated with a deficit discourse.
The application of the theoretical framework of cultural safety and decolonisation was twofold: it reveals the importance of working in partnership, and it exposes significant power imbalances that cause a lack of engagement between service users and service providers. It further highlights the unexamined privileges of service providers and the inherent marginalisation of service users. Of significance to the research problem statement, the 'lack of uptake of health service delivery', the grounded theoretical and traditional perspectives of cultural safety and decolonisation shift the focus from service users (Māori and Pacific Island communities) to the experiences of child health service providers in their interactions with Māori and Pacific Island families. The analysis uncovered misconceptions and a tendency to assume that healthcare delivered using a trans-cultural approach was culturally safe. Challenging the deficit discourse of trans-culturalism in order to comprehend participants' positionality is essential for addressing the health issues faced by Queensland Māori and Pacific Island families accessing services. This recognition indicates an ongoing reliance on colonial imposition rather than promotion of self-determination (a principle of cultural safety) in access to health service delivery for Queensland Māori and Pacific Island families.
The costs associated with advanced Parkinson's disease (aPD) extend beyond direct medical expenditure. As symptoms become more severe, professional and informal personal care costs are likely to exceed those incurred for medical and pharmacological treatment. The objective of this analysis is to explore the societal cost impact of treatment with subcutaneous foslevodopa/foscarbidopa (LDp/CDp) in the UK, France, Germany, Spain, and Canada. A model was developed to aggregate the expected costs incurred by a cohort with aPD over a 5-year time frame. Resource use for direct medical, non-medical, and informal care was estimated from a real-world data source (Adelphi), mapped to disease severity as estimated by the extent of OFF-time experienced by patients. Indirect societal costs were estimated from published literature. Unit costs for each of the included countries were then applied to these resource-use estimates. Symptom progression of individuals within the cohort was derived from a previously developed Markov model, which captures the differential effect on OFF-time of LDp/CDp versus best medical treatment (BMT). Overall costs for aPD patients rose over the 5-year time horizon as symptoms progressed. LDp/CDp incurred greater drug costs than BMT but, by delaying the exacerbation of OFF-time, this additional cost was more than offset by other savings, principally attributable to professional and informal care. Aggregated results showed a net cumulative saving of €96,273 per patient over the 5-year time horizon. Results for the five individual countries ranged from €50,297 to €135,208 saved per patient. LDp/CDp has been shown to significantly improve OFF-time burden in patients with aPD compared with BMT. Once the costs of professional and informal care are taken into account, the additional acquisition costs of LDp/CDp are more than offset, yielding a net societal saving.
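The cost-aggregation step described above can be sketched as follows. This is a minimal illustration of summing state-occupancy-weighted costs over the model horizon; the severity states, occupancy probabilities, per-state care costs, and drug costs below are all invented placeholders, not the model's actual inputs.

```python
def cumulative_cost(occupancy_by_year, annual_cost_by_state, drug_cost_per_year):
    """Sum state-occupancy-weighted care costs plus drug costs over the horizon."""
    total = 0.0
    for occupancy in occupancy_by_year:          # one dict per model year
        care = sum(p * annual_cost_by_state[s] for s, p in occupancy.items())
        total += care + drug_cost_per_year
    return total

# Severity states indexed by daily OFF-time band; all figures hypothetical.
costs = {"low_off": 10_000, "mid_off": 25_000, "high_off": 100_000}

# Two-year illustration: the LDp/CDp arm delays drift into high OFF-time.
bmt_occ = [{"low_off": 0.5, "mid_off": 0.3, "high_off": 0.2},
           {"low_off": 0.3, "mid_off": 0.4, "high_off": 0.3}]
ldp_occ = [{"low_off": 0.7, "mid_off": 0.2, "high_off": 0.1},
           {"low_off": 0.6, "mid_off": 0.3, "high_off": 0.1}]

bmt_total = cumulative_cost(bmt_occ, costs, drug_cost_per_year=2_000)
ldp_total = cumulative_cost(ldp_occ, costs, drug_cost_per_year=15_000)
net_saving = bmt_total - ldp_total   # higher drug cost offset by care savings
```

Under these illustrative inputs the higher acquisition cost of the LDp/CDp arm is more than offset by avoided high-OFF-time care costs, mirroring the direction of the reported result.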
To assess the accuracy of oxidative stress markers in predicting delayed graft function (DGF) lasting more than one week after deceased-donor kidney transplantation (KTx), a translational pilot study was carried out, applying a five-step hierarchical diagnostic accuracy strategy to evaluate the predictive value of oxidative stress markers for DGF >7 days in adult patients with end-stage kidney disease undergoing deceased-donor KTx. Statistical analysis comprised multiple conventional methods, C-statistics, and diagnostic test performance measures. Cohort analysis of 27 consecutive patients revealed that 33.3% (9/27) developed DGF >7 days. Assessment of the oxidative stress markers selected in the first statistical step, comparing donors and controls, revealed a significant correlation of DGF duration with the recipient's protein oxidation (ρ=0.466; p=0.022) as well as the donor's hydrogen peroxide (ρ=-0.489; p=0.014). The areas under the curve were 0.683 (95% confidence interval [95%CI], 0.458-0.859; p=0.224) and 0.746 (95%CI=0.534-0.897; p=0.019), respectively. A cutoff of ≤14 μM H2O2 had the highest discriminating power for predicting DGF >7 days. The sensitivity, specificity, positive predictive value, negative predictive value, and overall diagnostic accuracy in identifying patients at risk of DGF >7 days were 85.7%, 61.1%, 46.1%, 91.7%, and 68%, respectively. Donor hydrogen peroxide was identified as a potential predictor of DGF >7 days in patients who underwent deceased-donor KTx. A cut-off value of ≤14 μmol/L is suggested to stratify patients at high risk of requiring dialysis beyond the first post-transplant week.
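The reported performance of the ≤14 μM cutoff is consistent with a simple 2×2 table. The counts below (TP=6, FN=1, FP=7, TN=11, n=25 with donor H2O2 available) are back-calculated from the reported percentages for illustration; they are not taken from the study's raw data.

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Standard test-performance measures from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fn + fp + tn),
    }

m = diagnostic_metrics(tp=6, fn=1, fp=7, tn=11)
# sensitivity 6/7 ≈ 0.857, specificity 11/18 ≈ 0.611,
# PPV 6/13 ≈ 0.46, NPV 11/12 ≈ 0.917, accuracy 17/25 = 0.68
```

Reconstructing the table this way makes the trade-off visible: the cutoff misses few true DGF cases (high NPV) at the cost of many false positives (modest PPV).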
Family physicians (FPs) are critical in providing outpatient end-of-life (EOL) care, yet most continuity measures do not consider temporal changes in visits. The relative variance index (RVI), which measures visit-interval variation, has not been applied to the EOL context. The objective was to measure the association between outpatient FP visit regularity and acute care use in the last month of life. This was a retrospective cohort study using linked population-level health administrative data. FP visit regularity was measured using the RVI, calculated from outpatient visits during the last two years of life (excluding the last month). Outcomes included hospitalizations, emergency department (ED) visits in the last month of life, and deaths in acute care settings. The cohort comprised adults with cardiorespiratory conditions who died in Ontario, Canada, between 2017 and 2019. Patients' (N = 151 030) median age two years before death was 80, 55% were male, and patients had a median of 9 FP outpatient visits in the last two years of life. Higher FP RVI scores were associated with increased hospitalizations (IRR [95% CI]: 1.07 [1.05, 1.10]), ED visits (1.06 [1.03, 1.08]), and acute care deaths (OR [95% CI]: 1.30 [1.26, 1.35]). Sensitivity analyses identified that RVI scores varied across different look-back periods, while model performance remained stable when adjusting for specialist visit regularity. Higher outpatient FP visit regularity was associated with increased EOL acute care use, despite low visit regularity during the last two years of life. However, the RVI's sensitivity to observation period lengths limits its utility as an EOL care-quality indicator. Family doctors play a crucial role in caring for patients nearing the end of life, but how regularly they see patients may impact healthcare use.
This study examines whether the consistency of outpatient family doctor visits, measured with the relative variance index, affects hospitalizations, emergency department visits, and place of death for people with cardiorespiratory conditions. Using health data from over 151 000 patients who died in Ontario, the study found that outpatient family doctor visit regularity was generally low. However, patients with more consistent family doctor visits had slightly higher rates of hospitalizations, emergency department visits, and deaths in acute care settings. While this may seem counterintuitive, other factors, such as patient complexity and homecare access, had a much stronger influence on outcomes. Additionally, the relative variance index's sensitivity to timing makes it difficult to use as a stand-alone quality measure for end-of-life care. The findings suggest that simply measuring visit regularity may not be enough to assess care quality. Future research should explore better ways to evaluate continuity of care at the end of life, especially given Canada's growing elderly population and family doctor shortages.
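A visit-regularity metric in the spirit of the RVI can be sketched from visit dates alone. Here it is computed as the variance of inter-visit intervals relative to the squared mean interval (a squared coefficient of variation); the study's exact RVI formula may differ, so this only illustrates the idea that evenly spaced visits score low and clustered visits score high.

```python
from statistics import mean, pvariance

def visit_regularity(visit_days):
    """Relative variance of inter-visit intervals: 0 = perfectly regular."""
    intervals = [b - a for a, b in zip(visit_days, visit_days[1:])]
    m = mean(intervals)
    return pvariance(intervals) / (m * m)

regular = visit_regularity([0, 90, 180, 270, 360])   # evenly spaced quarterly visits
clustered = visit_regularity([0, 7, 14, 21, 360])    # a burst, then a long gap
```

The evenly spaced pattern yields 0.0, while the bursty pattern with one long gap scores far higher, even though both patients have the same number of visits over the same window.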
The prevalence of leg ulcers (LUs) increases with age. Nevertheless, a significant number of working-age individuals are affected by hard-to-heal (chronic) wounds that persist throughout their adult lives, impacting work capacity and leading to absenteeism, unemployment and subsequent financial burden. A retrospective observational study was carried out between January 2022 and June 2024 at a Portuguese tertiary hospital, including working-age patients living with LUs. Patients who were unemployed or retired for reasons other than LUs were excluded from the study. Clinical data on wound care and work activity were recorded, and the Work Productivity and Activity Impairment questionnaire, General Health version 2 (WPAI-GH), was applied. A total of 24 patients were included, with a mean age of 53.6±11.0 years. LUs had a median evolution of six years (interquartile range: 3-20), and 19 (79%) patients had recurrent ulcers. A total of 11 (46%) patients were unemployed or had retired early due to their LUs. A reduction in monthly income after developing the ulcer was noted in 43.5% of patients. The median number of work days missed by employed patients in the previous year was 39 (interquartile range: 2.5-317.5); three patients were on leave for ≥1 year. LUs were related to incapacity for work and loss of income in almost half of the study cohort. Longer ulcer duration, higher recurrence rates and work requiring prolonged standing were positively associated with unemployment and retirement. By establishing earlier and more efficient treatment plans, clinicians can reduce the duration of treatment and work incapacity in people with LUs.
Gastric cancer is among the most common cancers in the world, and it has a significant negative impact on the health and economies of many countries, particularly developing ones. This study formulates a deterministic model for the transmission dynamics of gastric cancer through gastric ulcers, incorporating screening and treatment strategies. The model is thoroughly analyzed qualitatively, quantitatively, and numerically. The key properties considered in the model analysis are positivity, the invariant region, equilibria, stability, and bifurcation analysis. We compute the control reproduction number $ \mathcal R_{C} $ using the next-generation matrix approach. This enables us to prove that the model has a unique disease-free equilibrium (DFE) and admits a unique endemic equilibrium, which are locally and globally asymptotically stable whenever $ \mathcal R_{C} < 1 $ and $ \mathcal R_{C} > 1 $, respectively. Sensitivity analysis indicates that increasing the screening rate decreases the control reproduction number, consequently reducing the rate of transmission of infections. Simulation results demonstrate that the combination of screening and treatment is the most effective intervention for reducing infection transmission. Furthermore, a combination of early screening and treatment proves more effective than a combination of late screening and treatment of gastric ulcers. Screening the infected population alone is identified as the least effective strategy for curtailing transmission of infection in the susceptible population. The findings of this study will guide public health officers in making decisions regarding the screening and treatment of individuals exposed to Helicobacter pylori infection and gastric ulcer patients, thereby aiding the fight against gastric ulcers and their progression to gastric cancer.
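The screening-sensitivity result has a simple mechanical reading. In many compartmental models, the next-generation matrix reduces the control reproduction number to a ratio of the transmission rate to the total exit rate from the infectious class; with an assumed form R_C = β/(μ + γ + σ), where σ is the screening rate, increasing screening lowers R_C. This functional form and all parameter values are assumptions for illustration, not the paper's actual model.

```python
def control_reproduction_number(beta, mu, gamma, sigma):
    """Illustrative R_C: transmission rate over total exit rate, with
    screening rate sigma adding to the exits from the infectious class.
    Assumed form for illustration only, not the paper's derived expression."""
    return beta / (mu + gamma + sigma)

no_screening = control_reproduction_number(beta=0.6, mu=0.02, gamma=0.1, sigma=0.0)
with_screening = control_reproduction_number(beta=0.6, mu=0.02, gamma=0.1, sigma=0.4)
```

Under these toy parameters, adding screening drives R_C from 5.0 down toward the epidemic threshold of 1, which is the qualitative behaviour the sensitivity analysis reports.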
Background: Globally, disability is a pressing issue that affects people of all ages and backgrounds, especially women with disabilities. Quality of life (QoL) of women with disabilities is a multidimensional concept encompassing physical, emotional, social, and economic aspects. Understanding the challenges and determinants influencing their well-being is crucial for developing effective interventions and policies. Objectives: This study aimed to determine the prevalence of poor QoL and the factors associated with it among women with disabilities in the Eastern Province of Saudi Arabia. Design: A cross-sectional study was conducted among a representative population of women with disabilities in the eastern region of Saudi Arabia. Methods: A convenience sample of 301 participants was selected for this study. Data were collected using a self-administered questionnaire that included sociodemographic and medical characteristics, as well as the World Health Organization Quality of Life-BREF (WHOQOL-BREF) scale. Results: 88% of participants had poor quality-of-life scores. Factors significantly associated with QoL included education level and hospital type (p < 0.05). Conclusion: The high prevalence of poor QoL among women with disabilities highlights a critical public health concern. Addressing these challenges requires targeted interventions to improve their overall well-being.
Survivors of nasopharyngeal carcinoma (NPC) frequently suffer from a prolonged reduction in quality of life (QoL) following successful treatment. This study investigates the determinants of QoL among NPC survivors, with a focus on the temporal evolution post-treatment and the influence of socioeconomic and clinical factors. In this cross-sectional study, NPC patients who received treatment in the last 10 years and underwent follow-up at our institute from 2021 to 2023 were enrolled. They completed the EuroQol five-dimension (EQ-5D), World Health Organization Quality of Life-Brief (WHOQOL-BREF), Sinonasal Outcome Test 22 (SNOT-22), Eustachian Tube Dysfunction Questionnaire-7 (ETDQ-7), and Eating Assessment Tool-10 (EAT-10). Kernel smoothing techniques were applied to delineate the trends in QoL scores, and linear mixed models were utilized for evaluating the unadjusted and adjusted influences of household income and other predictors on QoL. A total of 355 QoL evaluations were performed among the 248 participants. Kernel-smoothed trajectories revealed that higher household income correlated with superior QoL scores throughout the majority of the post-treatment timeline. Although initial analyses showed that both higher household income and Karnofsky Performance Scale (KPS) scores were predictive of better generic and condition-specific QoL, the effect of income was attenuated upon adjusting for KPS in the multivariate analysis. Mediation analysis indicated that the association between income and QoL was partly mediated by the patients' performance status. Performance status is a pivotal mediator in the interplay between socioeconomic status and QoL outcomes in NPC survivors. These insights underscore the need for prospective studies to confirm these relationships.
Functional decline is a major challenge in aging, particularly for those with a history of cerebrovascular events. Yet, prior research relies heavily on self-reported measures. The impact of objective physical activity patterns, particularly sedentary accumulation, on long-term functional prognosis remains unclear. Using the National Health and Aging Trends Study (NHATS), we analyzed 480 community-dwelling older adults (weighted N = 1.9 million). Group-Based Trajectory Modeling (GBTM) was applied to 7 years of follow-up data to identify trajectories of ADL disability. Wrist-worn accelerometry captured objective activity profiles. Weighted logistic regression and restricted cubic splines assessed associations between activity metrics and adverse trajectories. Three distinct trajectories were identified: "Robust," "Progressive Decline," and "Persistent Severe Disability." Stroke survivors were disproportionately represented in the severe disability group compared to controls (35% vs. 10%). Although stroke survivors constituted a small proportion of the unweighted sample, they exhibited a profoundly amplified vulnerability to sedentary behaviors. Average sedentary bout length emerged as a core risk factor independent of total activity volume (aOR 1.15 per 10-min increase). Total activity exhibited an "L-shaped" protective association, while sedentary time showed a "J-shaped" risk threshold (>700 min/day). Interaction analysis revealed that frequent interruptions of sedentary time could partially offset the risks associated with low total activity volume. Beyond total activity volume, continuous sedentary patterns are critical biomarkers of functional decline in older adults. Stroke survivors represent a highly vulnerable subpopulation where clinical rehabilitation strategies should prioritize 'sedentary interruption'.
To describe the prevalence of Missed Nursing Care and its predictors in Greek public hospitals. Missed Nursing Care is defined as any aspect of required patient care that is omitted or delayed. Despite the available studies, little is known about countries with significant nursing shortages, such as Greece, where 2.23 registered and assistant nurses per 1000 population have been reported, significantly below the EU-27 average. A national cross-sectional study was conducted in 28 of the 124 public hospitals in Greece. Nurses and nursing assistants working in medical or surgical units, providing direct care to adult patients, and with at least 3 months of experience were eligible. The MISSCARE Survey Part A (5-point Likert scale; 1 = never, 5 = always missed), Part B (reasons; 4-point Likert scale; 1 = not significant, 4 = significant reason) and the Practice Environment Scale of the Nursing Work Index (4-point Likert scale; 1 = totally agree, 4 = totally disagree) were used. Descriptive and inferential statistics were applied. A total of 676 nurses participated. The Missed Nursing Care Part A total score was 2.06 (±0.65), with patients' daily activities (mean = 2.32 ± 0.73) receiving higher scores than activities related to health status and treatments (mean = 1.80 ± 0.63). The overall score on the Practice Environment Scale of the Nursing Work Index was 2.53 (±0.49). Multiple linear regression analysis showed that issues in nursing care standards for quality of care, staffing adequacy, and communication within the team were the most significant predictors of Missed Nursing Care. Missed Nursing Care is a major problem in Greek hospitals. Inadequate staffing is a key factor in missed care according to nurses' perceptions. Increasing nursing staff, along with implementing standards for nursing care and improving communication among team members, will enhance the quality of health services in Greece.
Strengthening staffing levels and reinforcing nursing standards are essential strategies for reducing Missed Care in Greek public hospitals.
In the global context of the HIV pandemic, the biopsychosocial environment of key populations remains marked by a culture that fosters stigma and discrimination. These attitudes, based on misconceptions of what it means to live with HIV, transcend the healthcare sphere and negatively impact people's quality of life. In Mexico, systematic documentation of this issue within hospital settings remains limited. The objective of this study was to estimate the frequency of HIV-related stigma and discrimination among healthcare personnel of a secondary-level public hospital, as well as to identify differences by job category. We conducted an observational, descriptive, cross-sectional study from February to March 2025 at a secondary-level public hospital in Cancún, Quintana Roo, Mexico. We applied the short version of the HIV stigma questionnaire proposed by Nyblade et al. A total of 316 healthcare workers from different areas participated. To identify differences in stigmatizing attitudes across professional groups and to explore potential associations between sociodemographic variables and HIV-related beliefs, descriptive statistical analyses were performed, along with non-parametric tests (Kruskal-Wallis, Mann-Whitney U, and Spearman correlations) and post hoc comparisons using Dunn's test with Bonferroni correction. A total of 22.5% of participants reported having witnessed refusal to work with people living with HIV, and 30.1% observed lower-quality care toward them. Additionally, 32.9% agreed that people living with HIV are irresponsible, and 30.4% believed they "do not care about spreading the infection." Significant differences in stigma levels were found across occupational categories (p < 0.001). Medical assistants and interns showed higher levels of stigmatizing attitudes, while family physicians, nursing supervisors, and laboratory staff demonstrated greater empathy. 
Personally knowing someone living with HIV was significantly associated with greater acceptance of the right to become pregnant (p = 0.047). Stigma toward people living with HIV remains present in hospital environments. There is an urgent need to implement institutional training and awareness programs to reduce discriminatory attitudes and promote respect, accurate information, and empathy among all healthcare staff.
The coronavirus disease 2019 (COVID-19) pandemic left lasting effects on all sectors globally. Among women of reproductive age in particular, it created difficulty in accessing healthcare, which has been linked to fear of being exposed to the virus at health facilities. However, most studies, especially in East Africa, have narrowed COVID-19-related research to qualitative reviews and descriptive analyses due to inadequate COVID-19 data. The purpose of this study was to determine the relationship between health facility visits and healthcare access difficulty due to fear of COVID-19 at health facilities, while exploring both individual and contextual determinants. The study applied a multilevel logistic mixed-effects regression model to account for both individual and contextual dimensions. Overall, 30.0% of the women reported difficulty in accessing healthcare. Women who had visited a health facility in the 4 weeks preceding the survey and women who had health insurance reported less healthcare access difficulty (AOR = 0.73; 95% CI, 0.57-0.95 and AOR = 0.74; 95% CI, 0.59-0.94, respectively). In contrast, women who watched television and women in households that had partially or completely lost household income due to the virus reported more difficulty (AOR = 1.38; 95% CI, 1.10-1.74; AOR = 2.09, 95% CI, 1.51-2.88; and AOR = 2.40, 95% CI, 1.65-3.47, respectively). About 72% of the total variance in healthcare access difficulty was attributable to differences between enumeration areas (ICC = 0.72; 95% CI, 0.66-0.76), with a smaller individual-level contribution. Interventions that encourage routine health facility visits can boost healthcare access among women, and enumeration-area-specific interventions may be more effective.
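The reported ICC of 0.72 follows the usual latent-variable convention for multilevel logistic models: ICC = var_u / (var_u + π²/3), where var_u is the enumeration-area random-intercept variance and π²/3 is the level-1 variance on the logistic latent scale. The variance value below is back-calculated for illustration, not taken from the study.

```python
import math

def icc_logistic(random_intercept_variance):
    """Latent-variable ICC for a multilevel logistic model."""
    residual = math.pi ** 2 / 3  # fixed level-1 variance on the logit scale
    return random_intercept_variance / (random_intercept_variance + residual)

# A cluster variance of ~8.46 reproduces the reported ICC of 0.72.
icc = icc_logistic(8.46)
```

Working backwards like this shows how large the between-area variance must be relative to the fixed residual of about 3.29 for nearly three-quarters of the variance to sit at the cluster level.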
Chronic obstructive pulmonary disease (COPD) is widely underdiagnosed in Colombia, especially in rural departments with limited access to spirometry. We conducted a department-level ecological study using aggregated administrative data from 2020-2023 to generate diagnosis-based COPD prevalence estimates that explicitly account for regional disparities in diagnostic capacity and socioeconomic conditions. We assembled department-year data from the Individual Registry of Health Services Delivery, national mortality statistics, and the National Quality of Life Survey. A Bayesian generalized additive model with a Gamma family and log link was fitted to a Composite Bias-Correction Multiplier that captured under-ascertainment as a function of spirometry utilization, COPD lethality, outpatient contact rates, multidimensional poverty, household fuel type, and age structure. Posterior estimates of this multiplier were applied to diagnosis-based COPD prevalence in adults aged ≥40 years to obtain detection-adjusted departmental and national estimates. Model performance was summarized using the Bayesian R2 (proportion of variability in the multiplier explained by the covariates) and the leave-one-out information criterion (LOOIC) as a measure of expected predictive fit. The model estimated a population-weighted national COPD prevalence of 2.22% (95% credible interval [CrI], 2.21-2.23). Detection-adjusted departmental prevalence ranged from 0.81% in Vichada to 3.50% in Caldas, whereas diagnosis-based prevalence ranged from 0.27% to 2.22%. Spirometry utilization correlated strongly with diagnosis-based prevalence (r = 0.85, p < 0.001), and departments with higher COPD lethality and greater multidimensional poverty required larger adjustment multipliers. The model explained most of the variability in the Composite Bias-Correction Multiplier (Bayesian R2 = 0.99) and showed good expected predictive performance (LOOIC = -461.1). 
COPD prevalence in Colombia shows marked regional heterogeneity driven by demographic risk and uneven diagnostic capacity. Detection-adjusted estimates indicate that the highest burden lies in Andean departments such as Caldas, Boyacá, and Risaralda, while remote Amazon and Orinoco territories experience substantial underdiagnosis. These findings support targeted expansion of spirometry and chronic respiratory care in underserved regions and illustrate how accounting for detection bias can improve chronic disease surveillance in low- and middle-income settings. Chronic Obstructive Pulmonary Disease (COPD) is a serious lung condition that makes breathing difficult. In Colombia, many people with COPD are not properly diagnosed, especially in rural areas where healthcare services are limited. Previous studies estimating how many people have COPD in Colombia may not be accurate because they did not consider these differences in healthcare access between regions. We studied COPD rates across all 33 departments (states) in Colombia using advanced statistical methods. We looked at factors such as how often spirometry tests are performed, poverty levels, and access to modern cooking stoves. We estimated that 2.22% of Colombians have COPD (95% credible interval [CrI]: 2.21–2.23%), but this varies widely between regions—from 0.81% to 3.50%. Areas with better access to spirometry, a breathing test used to confirm COPD, showed higher diagnosis rates, which suggests that many cases are missed where testing capacity is limited. These results indicate that diagnosis-based COPD prevalence during 2020–2023 is lower than historical spirometry-based estimates, likely reflecting under-ascertainment in places where spirometry access is limited. None of Colombia’s 33 departments reaches the National Administrative Benchmark of 1105 spirometry tests per 100,000 adults aged ≥40 years, and only Bogotá approaches half of this target. 
This information can guide health authorities to expand lung function testing and chronic respiratory care in underserved regions. The same modeling strategy could be applied to other chronic diseases that are underdiagnosed in low-resource settings.
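The final adjustment step described above reduces to a simple operation: the posterior bias-correction multiplier scales each department's diagnosis-based prevalence, and the national figure is a population-weighted average of the adjusted values. The department names, multipliers, and populations below are hypothetical placeholders, not the study's estimates.

```python
def detection_adjusted(prev_observed, multiplier):
    """Scale diagnosis-based prevalence by the bias-correction multiplier."""
    return prev_observed * multiplier

departments = {
    # name: (diagnosis-based prevalence, posterior multiplier, adult population)
    "A": (0.0222, 1.3, 5_000_000),   # good spirometry access: small correction
    "B": (0.0027, 3.0, 300_000),     # limited access: large correction
}

adjusted = {d: detection_adjusted(p, m) for d, (p, m, _) in departments.items()}

total_pop = sum(pop for _, _, pop in departments.values())
national = sum(detection_adjusted(p, m) * pop
               for p, m, pop in departments.values()) / total_pop
```

The population weighting matters: a small, heavily under-ascertained department can triple its own estimate while barely moving the national figure.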
Early parent-child interactions are crucial for child development. Existing assessment scales have several limitations: intensive training often exceeding 40 hours, administration times of up to two hours, and an unbalanced distribution of items to the detriment of the dyadic dimension of interactions. Our study aims to develop and validate the MIPPE (Measure of Early Parent-Child Interactions), a scale adapted to daily clinical practice that assesses the quality of early interactions while integrating the dyadic dimension. The MIPPE scale was developed within the PERL program in Eastern France. After several revisions, the final version includes 9 items scored from 0 to 3. Validation is based on the analysis of 228 parent-child interaction videos from 123 dyads at 4 and 24 months, divided between intervention (N = 62) and control (N = 61) groups. The MIPPE demonstrates high internal consistency (Cronbach's α = 0.92, McDonald's ω = 0.93). Exploratory factor analysis reveals a unidimensional structure explaining 66% of the variance, confirmed by confirmatory factor analysis (CFI = 1.000, TLI = 1.002, RMSEA = 0.000). Inter-item correlations range from 0.34 to 0.67, indicating satisfactory cohesion. Clinical thresholds have been established at 15 and 20 (optimal sensitivity: 87%). The MIPPE constitutes an accessible, rapid, and psychometrically validated assessment tool for early childhood professionals. It facilitates early screening of interactional difficulties and referral to interventions, thus contributing to the prevention of developmental disorders and to parental support.
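The internal-consistency figure can be illustrated with the standard Cronbach's alpha formula, α = k/(k−1) · (1 − Σ item variances / variance of total scores). The toy ratings below are invented to show the computation, not MIPPE data.

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha; item_scores is a list of per-item score lists,
    one inner list per item, aligned across the same respondents."""
    k = len(item_scores)
    totals = [sum(vals) for vals in zip(*item_scores)]  # per-respondent totals
    item_var = sum(pvariance(vals) for vals in item_scores)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Three perfectly correlated items: alpha reaches its maximum of 1.0.
alpha = cronbach_alpha([[0, 1, 2, 3], [0, 1, 2, 3], [0, 1, 2, 3]])
```

With uncorrelated items the total-score variance shrinks toward the sum of item variances and alpha falls toward 0, which is why the MIPPE's α of 0.92 across 9 items indicates that the items move together strongly.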
Pulmonary complications remain a major challenge after esophagectomy for esophageal cancer. While rehabilitation interventions aligned with Enhanced Recovery After Surgery (ERAS) principles show promise, the term "rehabilitation" is often applied to highly variable approaches, from isolated breathing exercises to comprehensive multimodal programs, raising concerns about clinical interpretability and generalizability. This systematic review evaluates the effectiveness of structured perioperative rehabilitation, with emphasis on intervention comprehensiveness, timing, and implications for real-world implementation. We systematically searched PubMed, CINAHL, Cochrane Library, Web of Science, CNKI, Wanfang, and CBM from inception to October 31, 2024. Of 37 included studies, only 12 delivered interventions integrating ≥2 core components (e.g., exercise plus nutrition or education). Due to high clinical and statistical heterogeneity (I² > 90% for most outcomes), we prioritized narrative synthesis and conducted meta-analyses only within homogeneous subgroups, particularly those delivering continuous care across both preoperative and postoperative periods. Outcomes included functional capacity (6-min walk distance), cardiopulmonary function, pneumonia incidence, length of hospital stay (LOS), and health-related quality of life (HRQoL). Comprehensive perioperative rehabilitation (integrating prehabilitation and postoperative rehabilitation) was uniquely associated with a significant reduction in postoperative pneumonia [RR = 0.34, 95% CI (0.19, 0.61); p < 0.0001]. In contrast, the pooled effect across all rehabilitation interventions showed a modest but statistically significant benefit [RR = 0.70, 95% CI (0.52, 0.96); p = 0.02], suggesting that while even simplified protocols may confer some protection, maximal risk reduction requires continuous, multimodal support spanning the entire surgical trajectory.
In contrast, prehabilitation alone achieved the greatest reduction in hospital length of stay [MD = -2.94 days, 95% CI (-5.50, -0.37)] and significantly improved FEV₁ (SMD = 0.60). Multimodal programs consistently enhanced health-related quality of life [SMD = 0.76, 95% CI (0.60, 0.92)] and functional capacity (6-min walk distance, SMD = 0.90). Single-component interventions (e.g., respiratory training alone) showed inconsistent or negligible effects. The current evidence base is predominantly derived from studies conducted in China, reflecting strong regional research momentum while highlighting the need to validate these findings in other healthcare contexts. Multimodal, multi-phase rehabilitation may meaningfully improve short-term recovery after esophagectomy, but its benefits are outcome-specific and contingent on intervention design: pneumonia prevention requires integrated perioperative care, while reductions in resource use and improvements in baseline physiology are primarily driven by prehabilitation. The current literature suffers from conceptual dilution of "rehabilitation," methodological variability, and limited long-term data. To enhance public health impact, future efforts should focus on scalable, standardized delivery models, such as community-integrated or telehealth-supported pathways, that extend support beyond hospital discharge, particularly in resource-constrained settings. PROSPERO registration number: CRD42024617815.
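The pooled risk ratios above come from inverse-variance meta-analysis of trial-level effects. A minimal fixed-effect sketch, recovering each trial's standard error from its reported 95% CI; the input numbers are illustrative, not values from the included studies, and the review itself also ran random-effects and trial sequential analyses:

```python
import math

def pooled_rr(rrs_with_ci, z=1.96):
    """Fixed-effect inverse-variance pooling of risk ratios.

    Each input is (RR, lower, upper) from a 95% CI; the SE of log(RR)
    is recovered as (log(upper) - log(lower)) / (2 * z)."""
    weights = weighted = 0.0
    for rr, lo, hi in rrs_with_ci:
        se = (math.log(hi) - math.log(lo)) / (2 * z)
        w = 1.0 / se ** 2
        weights += w
        weighted += w * math.log(rr)
    log_rr = weighted / weights
    se_pooled = math.sqrt(1.0 / weights)
    return (math.exp(log_rr),
            math.exp(log_rr - z * se_pooled),
            math.exp(log_rr + z * se_pooled))

# two hypothetical trials of the same intervention
rr, lo, hi = pooled_rr([(0.5, 0.3, 0.8), (0.7, 0.5, 1.0)])
```

The pooled estimate lands between the trial-level estimates, weighted toward the more precise trial.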
Elderly individuals often experience multiple comorbidities, requiring the use of several pharmacological therapies to improve their quality of life and extend life expectancy. However, medication non-adherence is a common issue in this population, potentially leading to significant clinical complications and increased healthcare costs. This study aims to evaluate medication regimen complexity and adherence among senior citizens in Nepal. Specifically, it explores factors associated with medication complexity and adherence, and examines the relationship between the two variables in the elderly population. A cross-sectional descriptive study was conducted in selected wards of Chandragiri Municipality. Medication regimen complexity was assessed using the validated Medication Regimen Complexity Index (MRCI), while medication adherence was measured using the Morisky Green Levine Adherence (MGLA) scale. Descriptive statistics (mean, standard deviation, frequency, and percentage) were used to summarize the data. Pearson's chi-square test and correlation analysis were applied to examine the associations between MRCI levels and medication adherence. Ordinal logistic regression analysis was used to determine the impact of MRCI and other associated variables on adherence. A p-value <0.05 was considered statistically significant, and results were reported using adjusted odds ratios (AOR) with 95% confidence intervals (CI). Of the 422 eligible participants, the majority had high MRCI scores. A statistically significant association was found between MRCI and medication adherence (p < 0.05). Specific comorbidities, including diabetes mellitus, benign prostate hyperplasia (BPH), gastroesophageal reflux disease (GERD), thyroid disorder, chronic obstructive pulmonary disease (COPD), and coronary artery disease, were significantly associated with increased MRCI scores.
A weak positive correlation was observed between overall MRCI and adherence (r = 0.21), indicating a slight increase in adherence with increasing regimen complexity. Dosing frequency showed a strong correlation with MRCI (r = 0.92, p < 0.01), followed by total medication count, additional directions, and dosage forms. In adjusted analysis, participants with higher MRCI levels had significantly higher odds of non-adherence (AOR = 2.31, 95% CI: 1.03, 5.15). Additionally, individuals with ≥4 illnesses were more likely to be non-adherent (AOR = 3.82, 95% CI: 1.86-7.83) compared with those with two illnesses. Most participants exhibited high medication regimen complexity, which negatively impacted adherence. These findings underscore the need for healthcare providers to implement strategies aimed at simplifying medication regimens and supporting adherence, particularly among elderly individuals with multimorbidity.
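The adjusted odds ratios above come from ordinal logistic regression with covariate adjustment. As a simpler, hedged illustration of the odds-ratio concept only, an unadjusted OR with a Woolf log-scale 95% CI can be computed from a hypothetical 2x2 table:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Woolf (log-scale) 95% CI.

    a, b = non-adherent / adherent counts in the high-MRCI group;
    c, d = the same counts in the low-MRCI group. The counts below are
    hypothetical; the study's AORs came from adjusted ordinal logistic
    regression, not from a 2x2 table."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(30, 70, 15, 85)  # illustrative counts
```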
The growing population of breast cancer survivors (BCS) underscores the need to address ongoing challenges that persist beyond active treatment. This scoping review aimed to systematically map the unmet needs of BCS and inform the development of an evidence-based survivorship care framework. A systematic search of databases identified 13,401 studies, of which 170 met the inclusion criteria. Data were analyzed using qualitative content analysis and descriptive statistics to categorize unmet needs across studies. Six major categories of unmet needs were identified: physical symptom burden, psychosocial/emotional needs, informational needs, healthcare support needs, relational/sexual needs, and financial/occupational needs. Physical symptoms (33.5% of studies) and psychosocial/emotional concerns (32.4%) were most frequently reported. Based on these domains, a three-phase conceptual framework was developed: (1) universal needs screening, (2) targeted, multidisciplinary interventions tailored to individual survivors, and (3) ongoing evaluation through patient-reported outcomes to optimize care. Unmet needs among BCS are widespread and multifaceted, particularly regarding chronic physical symptoms and persistent psychosocial distress. Existing care frameworks are fragmented and insufficient. The proposed three-phase framework provides a structured approach to delivering proactive, patient-centred survivorship care. Implementing this framework can improve identification and management of unmet needs, enhance quality of life, support long-term physical and emotional recovery, and foster resilience throughout survivorship.
In 2023, the American Heart Association Cardiovascular-Kidney-Metabolic Scientific Advisory Group introduced the Predicting Risk of Cardiovascular Disease Events (PREVENT) equations, a race-free, sex-specific model for cardiovascular disease (CVD) risk prediction in adults aged 30 to 79 years. While initial validations showed strong performance, their reliability under missingness conditions remains unclear. To evaluate discrimination and calibration of the PREVENT equations in an electronic health record (EHR) cohort and assess robustness to missingness. This retrospective cohort study used EHR data from March 2014 to December 2024, with up to 8 years of follow-up, from Duke University Health System, a health network encompassing tertiary hospitals, regional hospitals, and primary care practices across North Carolina. Patients without baseline CVD with sufficient data to calculate PREVENT risk were included. Two cohorts were defined: a relaxed cohort, allowing for missing laboratory and vital sign data with race-sex median imputation, and a strict cohort, restricted to those with complete records. Data were analyzed from October 2024 to June 2025. The published PREVENT equations were evaluated alongside locally fitted Cox proportional hazards, discrete-time neural network, and recalibrated PREVENT models. The primary outcomes were the estimated 5-year risk of incident CVD and discrimination (C-index) and calibration (expected vs observed event rates) at 5 years, assessed by race, sex, and socioeconomic subgroups. The local adaptation via Duke retraining was compared with machine learning-based recalibration of PREVENT scores.
The study included 406 230 patients in the relaxed cohort (239 764 females with a mean [SD] age of 49 [20] years and 166 466 males with a mean [SD] age of 49 [20] years; 16 291 Asian [4.0%], 107 114 Black [26.4%], and 256 403 White [63.1%]) and 127 151 patients in the strict cohort (71 086 females with a mean [SD] age of 54 [13] years and 56 065 males with a mean [SD] age of 53 [12] years; 8210 Asian [6.5%], 29 033 Black [22.8%], and 83 515 White [65.7%]). PREVENT showed strong discrimination in both cohorts (C-index, 0.77 for both males and females in the strict cohort vs 0.75 for males and 0.77 for females in the relaxed cohort), indicating robustness to missing data. Calibration ratios were higher in the strict cohort, indicating more risk underestimation in the relaxed cohort. Local adaptations minimally affected discrimination and modestly improved calibration. In this cohort study, the PREVENT equations showed strong discrimination and generalizability, including with missing laboratory and vital sign data when imputation was applied, supporting reliable CVD risk identification and ranking in routine practice.
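The C-index used above to summarize PREVENT's discrimination measures how often a higher predicted risk accompanies the observed event. A minimal binary-outcome sketch with made-up risks; the study's C-index handles censored time-to-event data, which this simplification does not:

```python
def c_index(risks, events):
    """Concordance (C-index) of predicted risks against binary outcomes.

    Fraction of (event, non-event) pairs in which the event case received
    the higher risk; ties count half. Simple binary version for
    illustration only (no censoring, O(n^2) pairwise loop)."""
    concordant = ties = pairs = 0
    for r_event, e in zip(risks, events):
        if not e:
            continue
        for r_none, n in zip(risks, events):
            if n:
                continue
            pairs += 1
            if r_event > r_none:
                concordant += 1
            elif r_event == r_none:
                ties += 1
    return (concordant + 0.5 * ties) / pairs

c = c_index([0.9, 0.8, 0.2, 0.1], [1, 0, 1, 0])  # hypothetical patients
```

A value of 0.5 means risk ranking is no better than chance; 1.0 means every event case outranked every non-event case, so the ~0.77 reported above indicates strong ranking.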
To evaluate the effectiveness of perioperative non-drug interventions in reducing postoperative pulmonary complications (PPCs) in adults undergoing abdominal surgery. Systematic review and meta-analysis. Ovid MEDLINE, Embase, and Web of Science from database inception to January 2025 and updated in January 2026, with no language restrictions. Randomised controlled trials assessing the effectiveness of perioperative non-drug interventions for the prevention of PPCs in adults undergoing elective abdominal surgery under general anaesthesia, with clearly defined PPCs. The primary outcome was the proportion of patients developing PPCs. Secondary outcomes included the proportion of patients with PPC subtypes according to European Perioperative Clinical Outcome definitions (respiratory infection, respiratory failure, pleural effusion, atelectasis, or pneumothorax) and hospital length of stay. Two reviewers independently screened studies, extracted data, and assessed risk of bias with the Cochrane RoB 2.0 tool. Data were synthesised using meta-analyses and trial sequential analyses, with the evidence certainty assessed using the Grading of Recommendations, Assessment, Development and Evaluation (GRADE) approach. 255 trials including 55 260 participants were included, evaluating 10 types of interventions with 39 subtypes for PPC prevention. PPCs occurred in 6467 (11.7%) participants across all included trials. High certainty evidence showed that low fraction of inspired oxygen (FiO2) significantly reduced PPCs (risk ratio 0.81, 95% confidence interval 0.71 to 0.92). Moderate certainty evidence showed benefit for four intervention types: lung protective ventilation (risk ratio 0.66, 0.57 to 0.76), physiotherapy (0.55, 0.46 to 0.65), analgesia (0.73, 0.64 to 0.84), and nutrition (0.74, 0.63 to 0.87), with individualised positive end expiratory pressure, composite lung protective ventilation, early mobilisation, and epidural analgesia also showing benefit at the subtype level. 
Trial sequential analysis confirmed sufficient cumulative evidence for all the above interventions except early mobilisation. By contrast, goal directed haemodynamic therapy, targeted blood pressure management, restrictive fluid therapy, and postoperative bi-level positive airway pressure showed no evidence of benefit, with moderate certainty. This synthesis establishes an evidence hierarchy for PPC prevention in abdominal surgery. Low FiO2 is the only intervention supported by high certainty evidence and should be prioritised in clinical practice. Other beneficial strategies include lung protective ventilation, physiotherapy, analgesic techniques, and nutrition interventions. Conversely, the role of goal directed haemodynamic therapy, despite its widespread use, warrants reconsideration for PPC prevention. These findings facilitate prioritisation of effective interventions and development of evidence based guidelines. PROSPERO CRD42025637449.
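At the trial level, the risk ratios synthesised above are computed from arm-wise event counts. A minimal sketch with a log-scale 95% CI, using illustrative counts rather than data from any included trial:

```python
import math

def risk_ratio_ci(e1, n1, e0, n0, z=1.96):
    """Risk ratio with a log-scale 95% CI from single-trial counts.

    e1/n1 = events/participants in the intervention arm, e0/n0 in the
    control arm. Counts below are hypothetical, not from the review."""
    rr = (e1 / n1) / (e0 / n0)
    se = math.sqrt(1 / e1 - 1 / n1 + 1 / e0 - 1 / n0)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# hypothetical trial: 20/200 PPCs with low FiO2 vs 25/200 with control
rr, lo, hi = risk_ratio_ci(20, 200, 25, 200)
```

Meta-analysis then pools such trial-level log risk ratios, weighting each by the inverse of its variance.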
Federal policy changes, including restrictions on research topics and proposed National Institutes of Health (NIH) budget reductions, may affect perceptions of career stability among early-career biomedical researchers. To evaluate how NIH K awardees perceive the effects of federal policy on research career stability and intentions to pursue independent NIH funding. In this cross-sectional survey study, an anonymous online national survey was administered from April 15 to May 27, 2025, via REDCap using contact information from the NIH RePORTER database. Participants were principal investigators holding individual mentored K awards initially funded between 2019 and 2025. Data were analyzed from June 11, 2025, to February 11, 2026. The exposures were self-reported perceptions of recent federal policy changes affecting research funding and institutional support. The main outcomes were perceived likelihood of remaining in science and significant funding disruptions, measured via the online national survey. Secondary outcomes included perceived institutional support and the likelihood of applying for R01-equivalent funding. Of 6118 K award recipients, 1904 respondents representing all NIH institutes completed the survey (adjusted response rate, 34%); 1230 (65%) identified as women and 366 (19%) identified as underrepresented in biomedical research by race, ethnicity, or disability. A total of 1819 (96%) reported that federal policy changes negatively affected the stability of their research careers. Compared with 1 year prior, 988 (52%) believed they were somewhat less likely and 341 (18%) much less likely to continue conducting research. Significant funding disruptions were reported by 343 (18%). Postdoctoral researchers were more likely than associate professors to report being much less likely to stay in science (adjusted odds ratio [aOR], 2.57; 95% CI, 1.35-4.88; P < .001).
Respondents identifying as disabled (aOR, 2.21; 95% CI, 1.38-3.54; P < .001), American Indian or Alaska Native (aOR, 5.32; 95% CI, 1.01-28; P = .048), Black (aOR, 2.48; 95% CI, 1.52-4.05; P < .001), or Hispanic (aOR, 2.67; 95% CI, 1.78-3.98; P < .001) were more likely to report significant funding disruptions. A total of 506 respondents (27%) reported high institutional support; 523 of 1338 (39%) who had not yet applied for an R01 reported decreased likelihood of doing so. In this cross-sectional survey study of NIH K-award recipients, nearly all respondents reported a decrease in career stability, and many reported reduced intent to pursue independent NIH funding amid federal policy shifts. These findings indicate widespread perceived instability among early-career NIH-funded investigators.