Patient safety competency among nurses is increasingly emphasized, yet research on the patient safety competency of psychiatric nurses remains scarce. This study aimed to explore in depth how psychiatric nurses understand patient safety competency and what they need in order to enhance it. The study was conducted from June to July 2025 with 16 psychiatric nurses. Data were collected through in-depth personal interviews and thematically analyzed using Colaizzi's seven-step phenomenological method. Three main themes and 13 subthemes were identified: (1) the connotation of patient safety competency, including "protecting one's own safety is the prerequisite for ensuring patient safety," "understanding patients," "observation," "nurse-patient communication," "nurse-patient relationship," and "handling adverse events;" (2) influencing factors of patient safety competency, including "manpower resources," "leadership," "teamwork," and "adverse event management system;" and (3) demands for the improvement of patient safety competency, such as "emphasis on theory over practice," "outdated training content and monotonous formats," and "desire for more experience sharing." This study investigated in depth the connotation, influencing factors, and improvement demands of patient safety competency among nurses in the mental health field, revealing the specificity of psychiatric nurses' perceptions of patient safety competency. It provides a basis for health administration departments and hospital managers to develop on-the-job training programs and intervention strategies that enhance the patient safety competency of psychiatric nurses, thereby improving the quality of nursing care and delivering better services to patients. It holds practical significance for the optimization of specialized psychiatric nursing and the broader healthcare field.
The efficacy and adverse reactions of Maren Runchang Pills in patients of different genders remain unclear. This study aimed to analyze gender differences in the efficacy and safety of Maren Runchang Pills, providing a basis for gender-precise treatment in clinical practice. A gender-stratified analysis was conducted in patients with constipation-predominant irritable bowel syndrome (IBS-C) who met the criteria for the syndrome of intestinal heat and accumulation. The efficacy and safety of Maren Runchang Pills were evaluated after 4 weeks of treatment. The main evaluation indicators included the complete spontaneous bowel movement response rate, visual analogue scale scores for abdominal pain and distension, traditional Chinese medicine syndrome scores, irritable bowel syndrome-symptom severity scale and irritable bowel syndrome-quality of life scale scores, and safety indicators. A total of 126 patients were included, of whom 101 were female and 25 were male. After 4 weeks of treatment, the proportion of female patients with a complete spontaneous bowel movement response rate of ≥75% was significantly higher than that of male patients (71.29% vs 48%, P = .0270), and the improvement in the abdominal distension visual analogue scale score was also significantly greater in females (decrease of 2.78 points vs 1.84 points, P = .0199). However, the total traditional Chinese medicine syndrome score in male patients was higher than that in female patients (7.08 points vs 5.09 points, P = .0341). There were no statistically significant differences between the two groups in the irritable bowel syndrome-symptom severity scale scores, irritable bowel syndrome-quality of life scale scores, or safety indicators. Maren Runchang Pills can effectively alleviate constipation symptoms in patients with IBS-C, with greater benefit in female patients.
The research results support considering gender factors in the clinical treatment of IBS-C to achieve individualized and precise intervention.
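The abstract does not state which statistical test produced the P = .0270 comparison of response rates; a standard pooled two-proportion z-test, applied to responder counts consistent with the reported percentages (72/101 female, 12/25 male — inferred, not stated), reproduces a figure close to it:

```python
from math import sqrt, erfc

# Inferred responder counts, consistent with the reported 71.29% vs 48%
# response rates (the abstract reports percentages, not raw counts)
x1, n1 = 72, 101   # female patients with CSBM response rate >= 75%
x2, n2 = 12, 25    # male patients with CSBM response rate >= 75%

p1, p2 = x1 / n1, x2 / n2
p_pool = (x1 + x2) / (n1 + n2)                       # pooled proportion under H0
se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2)) # pooled standard error
z = (p1 - p2) / se
p_value = erfc(abs(z) / sqrt(2))                     # two-sided normal-tail P

print(round(p1, 4), round(p2, 2), round(p_value, 4))
```

With these counts the two-sided P lands near the reported .0270, which supports the inferred counts, though the original analysis may have used a chi-square or exact test instead.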
Objective: To investigate the efficacy and safety of 36 000 IU recombinant human erythropoietin (rhEPO) in the treatment of cancer-related anemia (CRA) and to evaluate whether 36 000 IU rhEPO can serve as a rational "reduced-dose alternative" to the 40 000 IU rhEPO regimen. Methods: The multicenter, open-label, non-inferiority, randomized controlled trial was conducted from March 2023 to July 2024 across 12 hospitals in China, including Liaoning Cancer Hospital. A total of 119 patients with CRA were enrolled and randomly assigned to receive the 36 000 IU rhEPO (n=61) or 40 000 IU rhEPO (n=58). The primary efficacy endpoint was the change in hemoglobin (Hb) levels from baseline at weeks 9-13. Secondary efficacy endpoints included hematologic and other biochemical parameters, transfusion requirements, quality of life (QOL), which was assessed by QOL scores and Karnofsky performance status (KPS) scores, and overall survival. Safety was evaluated by the incidence of treatment-emergent adverse events (TEAEs). Results: The least-squares mean changes in Hb from baseline to weeks 9-13 were (12.9±2.3) g/L in the 36 000 IU group and (13.4±2.4) g/L in the 40 000 IU group. Analysis of covariance showed no statistically significant difference between groups (F=-0.21, P=0.836), with a between-group difference of (-0.5±2.5) g/L (95% CI: -5.4 g/L, 4.4 g/L). The lower limit of the 95% CI (-5.4 g/L) exceeded the predefined non-inferiority margin of -10 g/L, indicating non-inferiority of the 36 000 IU dose compared to the 40 000 IU. At week 13, the proportions of patients with Hb increase ≥10 g/L were 82.0% (50/61) in the 36 000 IU group and 86.2% (50/58) in the 40 000 IU group, with no significant difference between groups (Qmh=0.40, P=0.527). The average weekly transfusion rate was 2.0% in both groups. 
No statistically significant differences were observed between the two groups in changes in hematocrit, reticulocyte percentage, folate, vitamin B12, albumin, iron metabolism markers, QOL scores, KPS scores, or overall survival (all P>0.05). Regarding safety, the incidence of TEAEs was 80.3% (49/61) in the 36 000 IU group and 84.5% (49/58) in the 40 000 IU group, with nausea, fever, and fatigue being the most common symptoms (incidence >5%). No drug-related serious adverse events were reported, and there were no significant differences between the groups (P>0.05). Conclusions: The 36 000 IU dose of rhEPO is non-inferior to the 40 000 IU dose in terms of efficacy and has a favorable safety profile for the treatment of CRA. These findings support the use of 36 000 IU rhEPO as a reasonable clinical option for managing CRA.
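The non-inferiority conclusion above rests on a simple decision rule: the dose is declared non-inferior when the entire confidence interval for the between-group difference lies above the predefined margin. A minimal sketch using the reported numbers:

```python
# Reported between-group difference in Hb change (36 000 IU minus 40 000 IU)
diff = -0.5                       # g/L, least-squares mean difference
ci_lower, ci_upper = -5.4, 4.4    # reported 95% CI, g/L
margin = -10.0                    # predefined non-inferiority margin, g/L

# Non-inferiority holds when the lower confidence limit exceeds the margin,
# i.e. even the worst plausible deficit is smaller than -10 g/L.
non_inferior = ci_lower > margin
print(non_inferior)
```

Here -5.4 g/L > -10 g/L, so the 36 000 IU dose meets the prespecified non-inferiority criterion, matching the trial's stated conclusion.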
The voices of implementers are crucial in enhancing policy implementation. In the North West Province of South Africa, there have been no studies on the implementation of the national guidelines for patient safety incident reporting since their introduction. Hence, this study explored the implementation of the national guidelines for patient safety incident reporting in selected public hospitals in the Dr Kenneth Kaunda district from the implementers' perspectives. This study employed a qualitative exploratory design with purposive sampling of hospital leaders, nurses, doctors, physiotherapists, occupational therapists, and pharmacists, leading to a total of 23 focus group discussions across three participating hospitals, with four to seven participants in each focus group. The policy triangle framework of context, content, actors, and process guided the development of the focus group discussion guide and informed a deductive thematic analysis. Seven themes emerged: contextual issues leading to implementing the guidelines, common incidents and contributing factors, clarity of policy content, actors' knowledge, role clarity, motivation, and the implementation process. The context for implementing the guidelines was a need to standardise reporting practices, improve record-keeping, and mitigate potential litigation risks. Key contributing factors to patient safety incidents were inadequate security response, staff shortages, and resource constraints. The content of the standard operating procedures was clear, yet lengthy. Discrepancies between the reporting tool and the standard operating procedures complicated the reporting process. Actors' knowledge gaps hindered accurate reporting. Managers lacked effective strategies for motivating their reporting staff, further impeding the system's efficacy. While intrinsic motivation, grounded in professional accountability, drove some reporting, fears of consequences were present.
The process of reporting was considered burdensome, and insufficient feedback mechanisms left staff uncertain about the value of their contributions. To improve implementation of patient safety incident reporting, a system-wide approach is necessary: healthcare providers' and leaders' knowledge must be improved, strategies to motivate reporting must be explored, leaders must create environments conducive to reporting, including protection of reporters, and visible improvements after every reported system-level weakness are needed to encourage reporting. If reporting is for learning, anonymous reporting should be emphasised. Reporting processes must be made easy and should make use of available technology.
Integrating clinical research within patient care environments has been shown to enhance patient outcomes, foster a culture of clinical inquiry among staff, and improve hospital efficiency. This study aimed to evaluate the self-reported knowledge, beliefs and attitudes of hospital staff to organisational research and innovation at a large inner-city tertiary referral centre in Dublin, Ireland. The hospital is undergoing a physical and service expansion on its journey to becoming an academic health science centre. A 26-item semi-qualitative survey, iteratively refined by the hospital Research Steering Committee, was disseminated to hospital staff through electronic and face-to-face recruitment, achieving 640 responses. Quantitative data were analysed in R and Microsoft Excel. Content analysis was carried out on qualitative data. Findings revealed that while most staff recognise the clinical value of research and have engaged in research activities, important barriers exist. Predominant challenges include time constraints, limited research support, and inadequate resources. Despite these barriers, over half of the respondents expressed a strong interest in additional research training, emphasising the need for structured support, including protected time for research, statistical assistance, and enhanced patient engagement initiatives. The results highlight both strengths and limitations within the hospital's current research culture. While a positive foundation exists, with evident interest in research among staff, infrastructural and logistical gaps must be addressed to facilitate greater engagement. Targeted interventions, such as streamlined research approval processes, resource allocation, and strategic support for multidisciplinary collaboration, could enhance the hospital's capacity to integrate research into clinical practice. 
These findings contribute to the ongoing discourse on embedding research in healthcare systems, underscoring the pivotal role of institutional commitment in nurturing a sustainable research culture.
Globally, the older population is increasing rapidly, becoming one of the most significant demographic trends of the 21st century. This growth poses important social, health, and technological challenges for societies that must adapt their environments and services to promote independent and healthy aging. In Spain, the population aged 65 years and older reached 18% of the total population in 2020, and projections indicate that this proportion will continue to rise in the coming decades. Within this context, smart homes have emerged as one of the most promising avenues to support aging in place and improve the quality of life. Smart homes encompass a wide variety of functions, including environmental control, safety monitoring, communication, and other assistive technologies, that may help older people stay healthy, safe, and independent in their own homes. However, older people are not a homogeneous group. Their lifestyles, health conditions, and technological experiences differ substantially, which means that, as with any assistive technology, smart home functions must match the real and perceived needs of the target users to ensure acceptance, adoption, and long-term use. In this study, as a step forward toward the adaptability of smart home technology, we present a method to analyze the practical needs of smart home functions for older people. Specifically, we aim to understand the Spanish older population's readiness and needs for smart homes and to provide insights that can guide the design of more adaptive and user-centered solutions. We conducted an online survey focusing on residentially based lifestyles, health conditions, and preferences for smart home functions, targeting older adults living in Spain. The survey collected information about participants' demographic profiles, daily activities, health self-assessment, and attitudes toward technology. A total of 102 valid responses were analyzed. 
We then classified the older adults according to their residentially based lifestyles using clustering techniques and analyzed the preferences and needs for smart home functions in each identified group. Four clusters emerged based on the information provided by the participants: (1) high quality of life and independent life, (2) poor quality of life, (3) social-centered life, and (4) creative and personal-centered hobbies at home. On the basis of this classification, we explored each group's specific needs for smart homes and estimated their readiness to embrace different aspects of technology. As a result, the top-priority smart home functions for each group were identified and compared. This research contributes to understanding the practical user needs of smart homes as assistive technologies for older people. It provides a methodological approach to anticipate and prioritize functions according to user characteristics, supporting the development of personalized, adaptive, and more acceptable smart home solutions for aging populations.
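The abstract does not name the clustering algorithm used to derive the four lifestyle groups; a minimal k-means sketch on hypothetical standardized survey features (quality-of-life, social activity, and at-home hobby scores are illustrative stand-ins for the real questionnaire items) shows the general approach of grouping respondents before profiling each cluster's smart home needs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical standardized survey features per respondent; the actual
# feature set and algorithm are not specified in the abstract.
X = np.vstack([
    rng.normal(loc=c, scale=0.3, size=(25, 3))
    for c in ([2, 0, 0], [-2, 0, 0], [0, 2, 0], [0, 0, 2])
])

def kmeans(X, k=4, iters=50, seed=1):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each respondent to the nearest cluster centre
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centre to the mean of its assigned respondents,
        # keeping the old centre if a cluster happens to be empty
        centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
    return labels, centers

labels, centers = kmeans(X, k=4)
print(np.bincount(labels, minlength=4))
```

Once respondents are labeled, each cluster's stated preferences for smart home functions can be tabulated separately, which is the per-group prioritization step the study describes.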
Evaluating resident physicians is essential for resident development and patient safety. Fear of retaliation from residents may be a barrier to faculty completing resident physician evaluations. This study examined family medicine program directors' perceptions of fear of retaliation from resident physicians as a barrier to faculty completing honest, high-quality evaluations. The study was conducted as part of the 2024 Council of Academic Family Medicine Educational Research Alliance study of family medicine residency program directors. The 10-item survey assessed program directors' perceptions of faculty fear of retaliation, the impact of this fear, and rates of retaliation occurring in their programs in the last 3 years. The response rate was 45.39% (320/705). More than half (56.4%, 172/305) perceived that faculty in their programs are reluctant to give critical feedback on evaluations; nearly half (48.9%, 150/305) believed that fear of retaliation is a barrier. Fear of a reciprocal negative evaluation (34.5%, 106/305) and fear of formal complaints (38.9%, 119/305) were prevalent. Lack of adequate documentation was attributed to a failure to remediate and dismiss a resident in 19.8% (61/307) and 11.7% (36/306) of programs, respectively. Formal complaints against an evaluator or program occurred in 18.6% (57/307) of programs, and civil lawsuits were filed in 5.2% (16/306) in the preceding 3 years. Family medicine program directors perceive fear of retaliation from residents as a barrier to faculty completing honest, high-quality evaluations. Formal complaints and even civil lawsuits against evaluators or programs are not uncommon.
Sarcopenia has emerged as a potential prognostic factor in patients with advanced prostate cancer (PCa), requiring interventions for its prevention and treatment. We aimed to systematically identify, critically assess and synthesize the available evidence on the effectiveness and safety of interventions for preventing or treating sarcopenia in advanced PCa patients. MEDLINE, Embase and Web of Science were searched. Randomized and non-randomized controlled trials or longitudinal observational studies with a control group focusing on PCa patients aged 60 years and older were considered. Study selection, data extraction and risk-of-bias assessment of the included studies were performed in duplicate. When possible, pooled effect estimates were calculated. Twenty studies (n = 1275) were included. Resistance training (RT) (MD = 3.22 kg; 95% CI 0.69, 5.75) and the use of antimyostatin peptibody (MD = 2.2 kg; SE 0.8%) demonstrated statistically significant prevention of lean body mass loss in men undergoing androgen deprivation therapy (ADT). Exercise improved leg press (MD = 25.17 kg; 95% CI [8.71, 41.62]), leg extension (MD = 9.63 kg; 95% CI [4.83, 14.42]), seated row (MD = 4.38 kg; 95% CI [1.54, 7.22]) and chest press strength (MD = 1.70 kg; 95% CI [-1.48, 4.88]) and enhanced patients' physical functioning in chair sit-to-stand tests (MD = -1.02; 95% CI [-1.70, -0.34]). RT improved health-related quality of life (HRQoL) in both general and specific domains and also reduced somatization (MD = -0.69; 95% CI [-1.32, -0.07]) and psychological distress (MD = -1.63; 95% CI [-3.10, -0.15]). The findings highlight the potential benefits of RT and selected pharmacological interventions on muscle-related and functional outcomes.
However, the significant heterogeneity and lack of comprehensive outcome reporting underscore the need for more standardized and long-term research through larger, well-designed randomized controlled trials with standardized measurement methods to draw conclusive evidence and enhance the reliability and applicability of findings in clinical practice.
Background and Objectives: Venous thromboembolic disease (VTE), including deep vein thrombosis (DVT) and pulmonary embolism (PE), is a major cause of morbidity and mortality worldwide and imposes a substantial financial burden on health systems due to both the direct and indirect costs of acute management and long-term complications. This systematic review aimed to assess patient satisfaction with anticoagulation therapy for VTE and to highlight potential differences according to the type of anticoagulant. The review focused on factors influencing the patient experience, such as perceived efficacy, ease of use, adverse effects, and health-related quality of life. Materials and Methods: A systematic review, without quantitative meta-analysis, was conducted in accordance with PRISMA 2020 guidelines. Articles were identified through searches in major databases (PubMed, Scopus, Cochrane Library and others) using keywords including "patient satisfaction", "anticoagulation", "venous thromboembolic disease", and "quality of life". In total, 21 studies published between 2009 and 2025 met the inclusion criteria. The studies assessed patient satisfaction with different types of anticoagulation, including vitamin K antagonists (VKAs), direct oral anticoagulants (DOACs), and low-molecular-weight heparin (LMWH) injections. Results: Across the included studies, patients generally reported higher levels of treatment satisfaction with DOACs compared with VKAs, mainly due to the absence of routine laboratory monitoring and fewer dietary restrictions. However, satisfaction varied according to age, sex, and clinical status. In specific patient populations, such as those with cancer-associated thrombosis, factors including fewer drug-drug interactions and perceptions of safety with LMWH appeared to influence treatment choice and satisfaction. Adverse effects, particularly bleeding, were identified as major drivers of dissatisfaction. 
Several studies suggested that higher treatment satisfaction was associated with better adherence, while quality of life appeared to improve in patients treated with DOACs in comparison with VKAs. Conclusions: Patient satisfaction is a critical component of successful VTE management. Overall, DOACs appear to be associated with higher treatment satisfaction than traditional therapies such as VKAs, although further high-quality research is needed to individualise anticoagulation strategies. Systematic incorporation of patient-reported satisfaction into clinical decision-making and into international guidelines may improve adherence, enhance quality of life, and ultimately increase the effectiveness of anticoagulation therapy.
Objective: To evaluate the efficacy and safety of low-dose compound tropicamide eye drops for pupillary dilation in premature infants undergoing fundus examination. Methods: A non-inferiority, randomized, controlled, crossover trial was conducted. Preterm infants hospitalized in the Department of Neonatology, the First Affiliated Hospital of Zhengzhou University scheduled for retinopathy of prematurity screening between October 2024 and June 2025 were enrolled. Participants were randomly assigned to group A or group B using simple randomization. Group A received standard-dose (90 μl) compound tropicamide eye drops in the first fundus examination, but low-dose (45 μl) drops in the second examination, whereas group B received the two doses in the reverse order. The primary efficacy outcomes were the pupil diameter measured 60 to 75 minutes after dilation and the difficulty of fundus examination as assessed by the ophthalmologist (classified as easy, difficult, or impossible). The difference in pupil diameter between the two regimens and its 95% confidence interval (CI) were calculated and compared with the non-inferiority margin to determine non-inferiority. Safety was assessed based on changes in vital signs and adverse events. A linear mixed model and generalized estimating equations were used to compare differences between groups. Results: A total of 72 infants were enrolled. Group A comprised 34 infants, of whom one was lost to follow-up during the second examination; group B comprised 38 infants, of whom three were lost to follow-up during the second examination. In the two groups, the gestational ages at birth were (30.85±2.38) and (31.81±2.61) weeks, the birth weights were (1 262.06±422.56) and (1 357.63±389.31) g, and male infants accounted for 35.29% (12 cases) and 50.00% (19 cases), respectively. No statistically significant differences were observed in demographic characteristics or clinical complications between the two groups (all P>0.05).
At 60 to 75 minutes after dilation, the difference in pupil diameter between the low-dose and standard-dose regimens was 0.24 mm (95% CI: 0.18 to 0.29), with the upper limit of the 95% CI falling below the pre-specified non-inferiority margin. The rate of examination difficulty was 3% (95% CI: 1% to 11%) after standard-dose administration and 7% (95% CI: 3% to 16%) after low-dose administration, with a rate difference of 4% (95% CI: -1% to 9%, P=0.087; after Bonferroni correction, P=0.435). The lower limit of the 95% CI was above -5%. After low-dose administration, the oxygen saturation by pulse oximetry was 0.09% lower (95% CI: -0.32 to 0.15, P=0.470; after Bonferroni correction, P=1.000), and the heart rate was 0.34 beats/min lower (95% CI: -1.11 to 0.43, P=0.381; after Bonferroni correction, P=1.000), compared with standard-dose administration, with no statistically significant differences. The systolic blood pressure was 0.45 mmHg (1 mmHg=0.133 kPa) lower (95% CI: -0.86 to -0.05, P=0.028; after Bonferroni correction, P=0.140), and the diastolic blood pressure was 0.65 mmHg lower (95% CI: -1.05 to -0.25, P=0.001; after Bonferroni correction, P=0.005) after low-dose administration compared with standard-dose administration, with only the difference in diastolic blood pressure reaching statistical significance. Adverse events occurred 11 times after standard-dose administration and 3 times after low-dose administration. Conclusion: For fundus screening in preterm infants, low-dose (45 μl) compound tropicamide eye drops are non-inferior to the standard dose (90 μl) in terms of mydriatic efficacy, while offering advantages in safety profile.
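The corrected P values reported above are all consistent with a Bonferroni factor of 5 capped at 1 (the count of comparisons is inferred from the values, not stated in the abstract). A sketch of the correction:

```python
# Raw two-sided P values reported for the five secondary comparisons
# (difficulty rate, SpO2, heart rate, systolic BP, diastolic BP);
# m = 5 is inferred from the reported corrected values, not stated.
raw_p = [0.087, 0.470, 0.381, 0.028, 0.001]
m = len(raw_p)

# Bonferroni: multiply each raw P by the number of comparisons, cap at 1
corrected = [min(1.0, p * m) for p in raw_p]
print(corrected)
```

Multiplying 0.087, 0.470, 0.381, 0.028, and 0.001 by 5 (capping at 1) reproduces the reported 0.435, 1.000, 1.000, 0.140, and 0.005, so only the diastolic blood pressure difference survives the correction.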
Background/Objectives: Providing an appropriate diet to older adults with dysphagia can prevent aspiration, choking, and nutritional deficiencies and help preserve their quality of life. Therefore, assessments for determining the appropriateness of food types are required. This multicenter study aimed to determine the reliability and validity of the Meal Rounds Observation Form (MROF), which was developed to identify food forms that can be safely consumed by older adults with dysphagia. Methods: We analyzed 532 food-texture observations obtained from 155 participants (114 men and 41 women). The reliability and validity of the MROF were compared with those of videofluoroscopic (VF) or videoendoscopic (VE) examinations of swallowing. Results: The food-form categories were water (108 pairs), 0j (54 pairs), 0t (118 pairs), 1j (20 pairs), 2-1 (28 pairs), 2-2 (37 pairs), 3 (68 pairs), 4 (67 pairs), and normal food (32 pairs) based on JDD 2021 codes. The AUC was lowest for the water (0.568) category and highest for food forms requiring chewing, such as those of the 4 and normal food (0.678) categories. The sensitivity and specificity of the Gugging Swallowing Screen were 60.1% and 69.1%, respectively (p < 0.001). The agreement between the Gugging Swallowing Screen and the MROF evaluation for food types requiring mastication was 73.2%. Logistic regression analysis revealed asymmetric movement of the corners of the mouth and coughing as important indicators when evaluating food types requiring mastication. Conclusions: The MROF is useful for determining food intake safety when VF or VE tests cannot be performed in medical and nursing care settings and can guide clinical decision-making. However, caution is required in applying it clinically because of its relatively low specificity.
Valoctocogene roxaparvovec, a gene therapy for severe hemophilia A, enables endogenous factor VIII (FVIII) expression and confers bleed control. To present the final 5-year efficacy and safety results from the phase 3 GENEr8-1 trial. Adult men (N = 134) with severe hemophilia A without inhibitors who were using FVIII prophylaxis received one 6 × 1013 vg/kg infusion of valoctocogene roxaparvovec. End points included annualized bleeding rate (ABR), FVIII infusion rate, FVIII activity, Haemophilia-Specific Quality of Life Questionnaire for Adults, adverse events, and immunosuppressant use. Median follow-up was 261.1 weeks; 128 of 134 participants completed the study. For the 112 participants who enrolled from a previous noninterventional study (rollover population), mean ABR for treated bleeds declined by 83.3% (P < .0001), mean FVIII infusion rate declined by 94.9% (P < .0001), and mean ABR for all bleeds declined by 78.1% in the 5-year efficacy evaluation period vs baseline. In 132 modified intention-to-treat participants, mean and median FVIII activity at week 260 was 13.7 and 6.2 IU/dL, respectively (chromogenic assay). In year 5, 77.8% of rollover participants had 0 treated bleeds. One participant resumed prophylaxis since the 4-year data cutoff (25/134 across all follow-up). Haemophilia-Specific Quality of Life Questionnaire for Adults total score was improved from baseline at year 5. In year 5, alanine aminotransferase elevations occurred in 51 of 129 participants who entered year 5. Immunosuppressants were not used to manage them. No serious treatment-related adverse events occurred after year 1. Valoctocogene roxaparvovec provides durable hemostatic efficacy, FVIII activity, and improved health-related quality of life for ≥5 years, with no new safety signals.
Predicting the progression or regression of coronary plaque burden is challenging. We aimed to develop a deep learning model to forecast changes in percent atheroma volume (ΔPAV) using intravascular ultrasound (IVUS). We analysed data from IBIS-4 and PACMAN-AMI. Core lab measurements of plaque burden were available from IVUS pullbacks. Each model consists of a bidirectional Long Short-Term Memory (biLSTM) layer followed by two fully connected layers with one neuron each, producing both a progression/regression classification and an estimate of ΔPAV. For derivation and validation, a total of 1,960 regions of interest (ROIs) from the IBIS-4 dataset were used. The mean±standard deviation of the model accuracy was 0.85±0.02, the Matthews correlation coefficient was 0.70±0.04, and the F1 score was 0.85±0.02 for both the progression and regression classes. In the testing (external validation) process with the PACMAN-AMI dataset, 5,283 ROIs were utilised. The mean ΔPAV was -0.31±5.63; 2,665 ROIs featured regression with a mean ΔPAV of -4.57±3.73, and 2,618 featured progression with a mean ΔPAV of 4.02±3.55, representing a plaque progression prevalence of 49.6%. The predictive performance across the 100 trained models in the testing dataset showed an accuracy of 0.84, a Matthews correlation coefficient of 0.68, and an F1 score of 0.84 for both the progression and regression classes. This is the first deep learning model capable of detecting changes in plaque progression by analysing the rate of plaque burden change between adjacent frames.
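As a rough illustration of the architecture described above — a bidirectional recurrent encoder feeding two fully connected heads with one neuron each — the following NumPy sketch uses a plain tanh recurrence as a stand-in for the LSTM cells; all shapes, features, and weights are hypothetical and do not reproduce the trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sequence: per-frame IVUS features for one ROI (shapes are illustrative)
T, F, H = 20, 4, 8          # frames per ROI, features per frame, hidden units
x = rng.normal(size=(T, F))

# Random weights for the forward and backward directions (untrained)
Wf, Uf = rng.normal(scale=0.1, size=(F, H)), rng.normal(scale=0.1, size=(H, H))
Wb, Ub = rng.normal(scale=0.1, size=(F, H)), rng.normal(scale=0.1, size=(H, H))

def rnn_pass(seq, W, U):
    """Simple tanh recurrence standing in for one LSTM direction."""
    h = np.zeros(H)
    for t in range(len(seq)):
        h = np.tanh(seq[t] @ W + h @ U)
    return h

# Bidirectional encoding: run forward and backward, concatenate final states
h_fwd = rnn_pass(x, Wf, Uf)
h_bwd = rnn_pass(x[::-1], Wb, Ub)
h = np.concatenate([h_fwd, h_bwd])        # shape (2H,)

# Two one-neuron heads, as in the abstract: a sigmoid classification head
# for progression vs regression, and a linear regression head for ΔPAV
w_cls, w_reg = rng.normal(size=2 * H), rng.normal(size=2 * H)
p_progression = 1 / (1 + np.exp(-(h @ w_cls)))
delta_pav = h @ w_reg

print(round(float(p_progression), 3), round(float(delta_pav), 3))
```

The real model would be trained end-to-end with a joint classification and regression loss; the sketch only shows how one encoder can drive both outputs.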
The prison population in England and Wales exceeds 88,000, with a high turnover - 47% of sentenced admissions in 2023 served less than 12 months. Transitions from prison to the community are recognised as high-risk periods for medication-related harm, driven by complex health needs, short custodial stays, and fragmented healthcare systems. While national and international guidance exists to support safe medication management, implementation during prison-community transitions remains inconsistent, and evidence on both the drivers of unsafe medication practices and potential solutions is limited. This study explored the human, organisational, and environmental factors influencing medication safety during transitions from prison to the community, as well as potential solutions for improvement, from the perspective of staff involved in these transitions. Qualitative semi-structured interviews were conducted with 12 staff members working in roles relevant to transitions from prison to the community, including general practitioners, pharmacists, and prison officers. Participants were recruited through professional networks and snowball sampling. Data were thematically analysed using the Systems Engineering Initiative for Patient Safety (SEIPS) framework. Five main factors impacting medication safety during transitions were identified: release practices, care coordination and communication issues, staffing shortages, IT system limitations, and patient-related factors. Key findings highlighted risks associated with immediate releases, discontinuity in medication regimens, insufficient staffing for discharge planning, and poor information transfer between prison and community healthcare providers. These challenges were further compounded by patient-level issues such as low health literacy, substance use, and housing instability. 
Staff proposed several improvements to enhance medication safety during prison-to-community transitions, including electronic prescribing for timely access to medication, improved information transfer, dedicated discharge teams to ensure medication follow-up, early discharge planning to address medication needs, and multi-disciplinary meetings to coordinate complex care. Medication safety during transitions from prison to community healthcare requires coordinated efforts to address organisational challenges, including short-notice releases and inadequate information transfer, as well as human factors such as communication barriers and staffing constraints. Improvements that clarify roles, enhance processes and technology, and foster cross-system collaboration are essential to ensuring continuity of care and medication safety. Two patient and public involvement (PPI) members, one with lived experience of care transitions and one carer, contributed to study design, recruitment strategies, participant materials, and the analysis plan through quarterly input. Findings were shared with a wider group of lived experience representatives, carers, professionals, and policy makers, who informed interpretation and dissemination. While PPI members did not directly participate in coding or analysing the data, their input ensured that the study design and interpretation were informed by real-world perspectives.
Chronic tetracycline (TET) exposure through water or food triggers drug resistance, immune damage, and allergies, requiring rapid, sensitive TET detection to safeguard food and human safety. Herein, a novel photoelectrically active multivariate copper-based metal-organic framework (Cu-MOF) constructed from tetra(4-carboxyphenyl)porphine (TCPP) and 5,10,15,20-tetra(4-pyridyl)porphyrin (TPyP) (denoted as Cu-TCPP/TPyP) was employed as the bioplatform for the fabrication of a photoelectrochemical (PEC) aptasensor for the efficient detection of TET. The Cu-TCPP/TPyP was synthesized via the coprecipitation method using TCPP and TPyP as dual ligands and copper ions as the metal precursor. Owing to the dual-coordination metal nodes, the attained Cu-TCPP/TPyP possessed rich defects, large pore size, and high specific surface area relative to Cu-MOFs prepared with a single ligand. The Cu-TCPP/TPyP also showed enhanced photoelectric conversion efficiency due to suppressed recombination of electron-hole pairs, enhanced visible-light utilization, and high carrier density. The Cu-TCPP/TPyP-based PEC aptasensor thus exhibited an ultralow limit of detection of 0.76 fg mL⁻¹ toward TET within the concentration range of 1 fg mL⁻¹ to 10 ng mL⁻¹, markedly lower than most reported TET biosensors. In view of its high selectivity, good reproducibility, and long-term stability, the constructed aptasensor is widely applicable to the sensitive determination of TET in diverse samples, as was also confirmed by the standard conventional determination method. The presented PEC aptasensor advances MOF-based biosensors in the field of food safety analysis.
Chronic kidney disease (CKD) affects approximately 850 million people worldwide and is associated with a substantial and growing symptom burden. Fatigue is one of the most common and debilitating symptoms across all stages of CKD, with prevalence far exceeding that of the general population. It profoundly affects quality of life, daily functioning, and clinical outcomes, underscoring the need to identify and evaluate effective pharmacological and non-pharmacological strategies for its management. This protocol outlines methods for a systematic review to evaluate the efficacy, safety, and perceived effectiveness of interventions for fatigue in people with CKD. We will include randomised controlled trials, non-randomised controlled trials, before-and-after studies, and qualitative evaluations in adults and children with any stage of CKD. Searches will be conducted in MEDLINE, Embase, CINAHL, Web of Science Core Collection, PsycINFO, and ClinicalTrials.gov, alongside grey literature via Google Scholar. No language or publication date restrictions will be applied. Study selection will follow a two-stage screening process, with two reviewers independently assessing titles/abstracts and then full texts using predefined eligibility criteria. Data will be extracted using a standardised form, capturing study characteristics, interventions, and outcomes. Risk of bias will be assessed with the NIH Study Quality Assessment Tools or the Joanna Briggs Institute Critical Appraisal Checklist for Qualitative Research. Results will be synthesised narratively, and, where appropriate, pooled using meta-analysis. Certainty of the evidence for the primary outcome will be assessed using the GRADE approach and presented in a Summary of Findings table. This review will provide a comprehensive synthesis of evidence on interventions for fatigue in CKD, highlight gaps to guide future research, and inform the UK Kidney Association Symptom Guidelines.
Myofascial pain syndrome (MPS) is an exceedingly painful musculoskeletal disorder distinct from other chronic pain syndromes. MPS can occur on its own or in conjunction with other muscle disorders. Symptoms of MPS and its myofascial trigger points (MTrPs) include tense bands in muscles, weakening at the afflicted region, radial or repeated pain, restricted range of motion (ROM), and/or hot and red skin. Kenzo Kase, a Japanese chiropractor, invented Kinesiology Tape (KT) in the 1970s, with a potential role in modulating pain, enhancing muscle function, and improving ROM. This review aims to examine the underlying mechanisms of MPS and evaluate current therapeutic strategies, with particular emphasis on the mechanism and clinical application of KT in managing MPS. Recent literature highlights advances in understanding the pathophysiology of MTrPs, conventional therapies, and their limitations. KT is a dynamic, stretchable tape that resembles human skin and may be used to treat pain, modulate muscle activity, and increase ROM; it is most commonly applied in the treatment and prevention of sports injuries. Clinical studies report improvements in pain intensity, muscle flexibility, and ROM following KT application in individuals with MPS, particularly in sports and rehabilitation settings. MPS is a complex pain disorder requiring multimodal management. KT represents a promising non-invasive intervention that may address both pain and functional restoration through neurophysiological and biomechanical mechanisms. While clinical findings are promising, protocols are yet to be standardised to ensure the long-term safety and efficacy of KT in the treatment of MPS.
Quantitative detection of Salmonella typhimurium is of vital importance for food safety monitoring and control. This study developed a 3D-printed microchannel device that integrates cleaning and detection functions, enabling sensitive, semi-automated colorimetric detection of Salmonella typhimurium. First, monoclonal antibody-modified magnetic beads (MBs-Anti), Salmonella typhimurium, and PtRu@ZrFe-MOFs@Apt nanozymes were successively added to a centrifuge tube to form the MBs-Anti-Salmonella typhimurium-PtRu@ZrFe-MOFs@Apt sandwich complex. The incubated mixture was then injected into the reaction tank of the microchannel device and washed with PBS containing hydrogen peroxide to separate the sandwich complex from impurities in the sample solution. Subsequently, TMB substrate solution was added, and its catalytic oxidation by the sandwich complex formed the blue oxidized TMB product. An RGB image of the blue product was captured using a portable smartphone device, and the colorimetric signal of the image was analyzed to determine the concentration of Salmonella typhimurium. The microchannel device can detect Salmonella typhimurium over a concentration range of 10¹ to 10⁶ CFU/mL within 75 min, with a detection limit of 3.3 CFU/mL. Notably, the constructed 3D-printed microchannel device has good universality: by replacing the corresponding antibodies, aptamers, and other biological recognition elements, it can be extended to the detection of other pathogenic bacteria.
Severe scabies, a rare parasitic skin disease characterized by abundant skin mites, may be life-threatening and poses public health concerns worldwide. A combination of standard-dose oral ivermectin and topical scabicides is recommended for treatment. However, data from randomized clinical trials are lacking, and the probability of cure is uncertain. Ivermectin at higher doses has been effective in the treatment of some parasitic diseases. We conducted a blinded randomized trial involving adults with severe scabies (i.e., profuse or crusted), as confirmed by parasitologic or dermoscopic assessment. The patients were assigned in a 1:1 ratio to receive oral ivermectin (to be taken with food) at a dose of 400 μg per kilogram of body weight (higher-dose group) or 200 μg per kilogram (standard-dose group) on days 0, 7, and 14, combined with head-to-toe application of 5% permethrin cream on days 0 and 7 and daily application (as recommended) of an emollient cream. The primary end point was cure of severe scabies, which was defined as the absence of mites and mite-related products (i.e., eggs and feces), as confirmed by parasitologic or dermoscopic assessment on days 18 and 21, and the absence of active clinical lesions on physical examination on day 28. A total of 132 patients (66 in each group) were included in the main analysis. Cure was observed in 75% of the patients in the higher-dose group and in 82% of those in the standard-dose group (odds ratio for cure, 0.64; 95% confidence interval, 0.25 to 1.67). No safety issues were identified. Among adults, the 400-μg-per-kilogram dose of ivermectin plus 5% permethrin cream was not superior to the standard 200-μg-per-kilogram dose of ivermectin plus 5% permethrin cream in curing severe scabies. (Funded by the French Ministry of Health and French Society of Dermatology; ClinicalTrials.gov number, NCT02841215.).
The adverse impacts of chronic pain extend far beyond the physical sensation itself. Chronic pain, an age-related condition, exacerbates the geriatric disease burden and drives a central sensitivity-neuropsychiatric complex, necessitating urgent preventive care. This study aimed to investigate the impact of chronic pain on two central sensitivity-related conditions, depression and decline in basic or physical activities (functional limitation), among older adult inpatients, and to explore the hospital costs attributable to chronic pain. Participants were sourced from the 2021-2022 Inpatient Discharge Dataset of Sichuan Province. Diagnoses of depression, functional limitation, and chronic pain were identified using International Classification of Diseases, 10th Revision (ICD-10) codes. Logistic regression models were employed to analyze the associations of chronic pain with depression and functional limitation. Furthermore, total hospital costs, out-of-pocket costs, and length of stay (LOS) were compared between patients (with depression or functional limitation) with and without chronic pain using propensity score matching and multivariable linear regression. The analysis included 38,372 and 4,996 inpatients in the depression and functional limitation cohorts, respectively. Chronic pain was significantly associated with both outcomes, yielding odds ratios (ORs) of 1.24 (95% CI: 1.20-1.27) for depression and 1.60 (1.44-1.78) for functional limitation (all p < 0.001), and the effect intensified as the number of painful areas increased. Compared with those without pain, depression patients with chronic pain incurred 68.2% higher total hospital costs (β = 0.52, p < 0.001), 169.1% higher out-of-pocket costs (β = 0.99, p < 0.001), and 60.0% longer LOS (β = 0.47, p < 0.001).
Among patients with functional limitation, chronic pain also significantly increased log-transformed total costs (β = 0.20), out-of-pocket costs (β = 0.51), and LOS (β = 0.30), representing relative increases of 22.1%, 66.5%, and 35.0%, respectively (all p < 0.05). These economic impacts were more pronounced among patients with multi-area pain. This study provides empirical evidence linking chronic pain to deteriorated psychological and physical health among older adults. It highlights the increased burden of disease and hospitalization, with particular emphasis on the dangers of multi-area pain. These findings emphasize that prioritizing mental health-focused pain management in outpatient and emergency settings is crucial for preventing avoidable hospitalizations and hospitalization costs in older adults.
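The reported percentage increases are consistent with exponentiating the log-scale regression coefficients, i.e., a relative increase of 100·(exp(β) − 1)%. A quick check, using only the β values and percentages quoted in the abstract:

```python
import math

# (β from log-linear model, percentage increase reported in the abstract)
pairs = [(0.52, 68.2), (0.99, 169.1), (0.47, 60.0),   # depression cohort
         (0.20, 22.1), (0.51, 66.5), (0.30, 35.0)]    # functional limitation cohort

for beta, reported_pct in pairs:
    pct = 100 * (math.exp(beta) - 1)  # back-transform from the log scale
    print(f"beta={beta:.2f} -> {pct:.1f}% (reported {reported_pct}%)")
```

Each computed value matches the reported percentage to one decimal place, confirming the back-transformation used for the cost and LOS estimates.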