Better evaluation of the contribution of the main diseases, injuries, and risk factors for mortality and life expectancy is crucial for more efficient policy making at the national and subnational levels in Iran. The aim of this study is to assess the effect of emerging causes of mortality on health, specifically COVID-19, which can help policy makers implement preventive measures in similar situations. In this systematic analysis of the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2023, we present estimates of cause-specific mortality at the national and subnational levels in Iran from 1990 to 2023. New to this iteration of GBD, we present a decomposition analysis of the contribution of specific causes of death to net gain or loss in life expectancy across 31 provinces of Iran. We used an array of data sources including censuses, vital registration, and surveys for national and subnational estimates. The two leading causes of death in Iran were ischaemic heart disease and stroke in both 1990 and 2019. However, in 2020 and 2021, COVID-19 displaced these as the leading cause of death, ranking first with age-standardised mortality rates of 286·2 deaths (95% uncertainty interval 267·9-310·5) per 100 000 in 2020 and 250·0 deaths (233·2-272·5) per 100 000 in 2021. COVID-19 ranked second and tenth in 2022 and 2023, respectively. Life expectancy at birth for both sexes combined declined from 78·0 years (77·7-78·1) in 2019 to 74·3 years (74·0-74·4) in 2020. It steadily recovered to 78·8 years (78·5-79·2) in 2023. COVID-19 was the main cause of the loss in life expectancy between 2019 and 2020, accounting for 4·19 years. There was a net gain of 12·4 years in life expectancy in Iran from 1990 to 2023. The net gain at the national level can be mostly attributed to reduced mortality from ischaemic heart disease (2·61 years), stroke (1·63 years), neonatal disorders (1·26 years), transport injuries (0·88 years), and neoplasms (0·64 years).
The decline in mortality rates of major causes continued to 2023 despite the pandemic. An exception was Alzheimer's disease, which showed a 4·0% increase in rate between 2019 and 2023 and led to a net loss of 0·04 years in life expectancy since 1990. Diabetes led to a net loss of 0·09 years since 1990. Age-standardised rates and the net change in life expectancy before and after the COVID-19 pandemic varied between provinces. The pandemic disrupted the rising trend of life expectancy in Iran, with the extent of disruption differing across provinces. Findings show that the health-care infrastructure and policies in Iran were not effective in controlling the pandemic in 2020 and 2021, mainly due to inadequate vaccination coverage and timeliness, specifically for vulnerable subgroups. Sanctions may have aggravated the effect of COVID-19 on the loss in life expectancy of Iranians. Despite the pandemic, the declining trend in age-standardised rates for top causes of mortality has continued to 2023, leading to a full recovery of life expectancy and underscoring the ultimate resilience of Iran's health system. Gates Foundation.
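The age-standardised rates quoted above (e.g. 286·2 deaths per 100 000 in 2020) rest on direct standardisation against a reference population's age structure. A minimal sketch of that calculation, with hypothetical inputs and a function name chosen for illustration (GBD's actual pipeline is far more elaborate):

```python
def age_standardised_rate(deaths, person_years, std_weights):
    """Direct age standardisation: weight each age-specific death rate
    by the standard population's age shares; result is per 100 000."""
    assert abs(sum(std_weights) - 1.0) < 1e-9  # weights must sum to 1
    rates = [d / py for d, py in zip(deaths, person_years)]
    return 1e5 * sum(r * w for r, w in zip(rates, std_weights))
```

With two toy age groups, `age_standardised_rate([10, 90], [1000, 1000], [0.5, 0.5])` yields 5000 per 100 000, regardless of the observed population's own age mix.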
Despite unprecedented medical advances, global healthcare systems are failing to deliver universal, equitable and quality care. Many systems also have low resilience to surges in demand, are highly fragmented, or suffer from unsustainable funding models. The crisis of poor accessibility of healthcare services, which includes both lack of availability and unacceptably long waiting times, stems from systemic failures: inadequate provision of primary healthcare, suboptimal deployment of human and non-human healthcare assets, metric- and profit-centric models that exacerbate inequality, fragmented and siloed services, unsustainable costs and a reactive focus on treatment over prevention. Climate change and demographic shifts threaten to overwhelm already strained systems. In this discourse we argue that achieving the fundamental human right to healthcare requires a radical reconstitution of primary healthcare, centred on unlocking previously un- and under-exploited resources, capacities and productivity, governed by the principle of 'Networked Agency with a Safety Net'. We propose a holistic transformation that increases accessibility, resilience, integration, sustainability and, crucially, equity, centred on three synergistic pillars. First, a digital and patient-agency revolution, designed to radically increase access to, and the productivity of, primary healthcare. This involves creating self-care ecosystems such as Do-It-Yourself Digital Medical Centres and Home Clinics, supported by a National Clinical Informatics Centre. By enabling patients to manage routine care, this system frees highly trained healthcare professionals to focus on the complex clinical work that demands their full expertise. 
This enables and fosters patient empowerment while ensuring continuous clinical oversight (to prevent any misunderstanding: clinical oversight means that all clinical recommendations/decisions are made by healthcare professionals; patient agency involves, inter alia, implementing such recommendations/decisions). Second, acceleration of the strategic exploitation of microbial technologies-frugal, sustainable tools for diagnostics, prophylactics and therapies, especially mental health interventions, and environmental health (One Health). Third, a decisive shift towards disease prevention and health creation, integrating 'Health in All Policies', targeted health education, and a comprehensive, systematic dismantling of healthcare accessibility barriers-such as transport impediments-and legacy forms of discrimination like restricted sexual/reproductive healthcare and failure to adequately care for the most chronically underserved, including the ageing population. This model is inherently sustainable and designed to drastically reduce the healthcare sector's carbon footprint and environmental impact through service consolidation, transport-oriented siting and green infrastructure. The measures constitute not only a technical upgrade but a fundamental recasting of the primary healthcare system and mindset. This is also a moral imperative. Governments, while increasingly delegating service provision to commercial actors, hold a non-delegable duty of care. Fulfilling this duty necessitates a covenant that transitions healthcare from a market commodity to a publicly-accountable system sustainably designed for long-term resilience, equity and dignity. The roadmap we provide-encompassing governance, infrastructure, innovation and education-charts a course from crisis to a sustainable future where universal access to quality healthcare can finally be realised.
Traditional in-person handovers from the Post-Anaesthesia Care Unit (PACU) to inpatient wards often require PACU nurses to leave their post-operative patients for extended periods, which may delay the handover and management of other patients awaiting transfer. Alternatively, the PACU nurse-in-charge may hand over patient information to another individual to assist with in-person handover in the inpatient ward. Multiple care transitions increase the risk of omission of critical handover information, which compromises patient safety. A Virtual Nursing Handover (VNH) model may address these workflow and safety challenges by enabling direct, real-time communication between PACU and ward nurses, while reducing unnecessary staff movement. To evaluate the safety, feasibility, and usability of a VNH from the PACU to the inpatient ward. This process innovation study was underpinned by the Technology Acceptance Model (TAM). A three-component evaluation was conducted to assess the safety, feasibility, and usability of a VNH from the PACU to the inpatient ward. The study was undertaken in an acute care hospital in Singapore between October and November 2024. Handover safety was evaluated using structured audits based on the Queensland Health Handover Audit Tool, aligned with National Safety and Quality Health Service standards. The tool comprised 17 items assessing handover completeness and safety, along with additional open-ended questions to capture technical quality and notable observations. Feasibility was assessed using implementation metrics, including handover duration, adherence to the intended workflow, and technical reliability. Usability was evaluated using the System Usability Scale (SUS) administered to PACU and ward nurses, supplemented by qualitative feedback. Consistent with the TAM, safety and feasibility were interpreted as indicators of perceived usefulness, while usability reflected perceived ease of use. 
Quantitative data were analysed using SPSS version 26.0, and qualitative data were analysed thematically. A total of 31 handover safety audits were conducted. Adherence to handover safety standards was high, with all audits meeting at least 15 of the 17 audit criteria (≥ 88%). The VNH demonstrated high feasibility, resulting in an estimated time saving of approximately 4.7 h per day for PACU nurses. No major technical failures were observed, and handovers were conducted as intended. In contrast, system usability was rated below average, with a mean SUS score of 41.9. VNH from the PACU to the inpatient ward was safe and feasible in routine clinical practice; however, suboptimal usability may limit technology acceptance. Further system refinement is required to improve user experience and support sustained adoption.
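The mean SUS score of 41.9 sits well below the conventional benchmark of 68. For readers unfamiliar with the instrument, the standard SUS scoring rule (not spelled out in the abstract) can be sketched as follows; the function name and example responses are illustrative:

```python
def sus_score(responses):
    """System Usability Scale: ten 1-5 Likert items. Odd-numbered items
    are positively worded and contribute (r - 1); even-numbered items
    are negatively worded and contribute (5 - r). The summed
    contributions (0-40) are scaled to a 0-100 score."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    contrib = sum(r - 1 for r in responses[0::2]) \
            + sum(5 - r for r in responses[1::2])
    return contrib * 2.5
```

A respondent answering 5 on every positive item and 1 on every negative item scores 100; uniform neutral answers (all 3s) score 50, still above the 41.9 reported here.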
To describe the process of development and validation of the content and semantics of the "Autonomy & Management in Prison Nursing" application for nurses working in prison units in Minas Gerais, Brazil. This methodological study employed the Contextualized Instructional Design method, following its four recommended stages: analysis, design and development, implementation, and evaluation. In the first phase, 164 nursing professionals participated by completing a semi-structured questionnaire addressing working conditions and professional autonomy. In the second phase, the application was developed with the support of an information technology professional. The third phase consisted of configuring and making the application available. Finally, in the fourth phase, content validation was carried out by 23 specialized judges and semantic validation by 51 nurses. The content validation of the application was satisfactory, as all indicators - total agreement rate, Content Validity Index (0.962; p<0.01), and overall Cronbach's Alpha coefficient (0.969) - reached satisfactory levels. Semantic validation was also satisfactory, as all Cronbach's Alpha coefficients achieved high values, indicating correlation among the items and excellent reliability of the application. The "Autonomy & Management in Prison Nursing" application was developed and demonstrated satisfactory evaluation regarding both content and semantics, showing good adaptation to the proposed criteria and high internal consistency among the items.
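The reliability and validity indices reported above (Cronbach's alpha, Content Validity Index) follow standard formulas. An illustrative sketch with toy ratings; the function names and the 4-point relevance convention for the CVI are assumptions, not details taken from the study:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents x items matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def scale_cvi(ratings, relevant=(3, 4)):
    """S-CVI/Ave: for a judges x items matrix of 4-point relevance
    ratings, average across items the share of judges rating 3 or 4."""
    ratings = np.asarray(ratings)
    return np.isin(ratings, relevant).mean(axis=0).mean()
```

Perfectly consistent items (e.g. every respondent giving identical ratings across both columns) yield alpha = 1.0; values near the study's 0.969 indicate very high internal consistency.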
Persistent toxic substances (PTS), including heavy metals, persistent organic pollutants (POPs), and persistent, mobile, and toxic/very persistent and very mobile (PMT/vPvM) substances, pose a growing threat to soil health, food systems, air quality, and human health. Despite the extensive literature on each of these contaminant groups, no unified model yet connects soil-atmosphere dynamics, bioaccumulation through the food chain, new detection techniques, and policy measures. This review presents an interdisciplinary synthesis of PTS dynamics in agricultural environments, explicitly incorporating (i) historic contaminants and emerging PMT/vPvM chemicals, (ii) soil-crop-livestock-human transfer pathways, and (iii) state-of-the-art remediation and monitoring technologies into a single management framework. We critically evaluated conventional remediation methods alongside next-generation approaches, such as engineered microbial consortia, synergistic plant-microbe phytotransformation, biochar-assisted immobilization, nanosensor-based detection, IoT-based soil sensing, precision agriculture, machine-learning-driven risk prediction, and blockchain-based traceability. Unlike previous reviews, which treat remediation, detection, and policy separately, this study presents a systems-based approach that integrates technological innovation, sustainable agronomic practices, and multilayered governance tools (such as the Stockholm Convention, REACH, and national soil action plans). We highlight that combining smart agricultural technology with regenerative land management can reduce the accumulation of PTS while maintaining productivity, especially in resource-scarce settings.
The review outlines remaining research gaps, including contaminant-microbiome interactions, longitudinal deterioration of ecosystem services, and socioeconomic barriers to technology adoption. We propose a transdisciplinary roadmap that aligns environmental toxicology, soil science, public health, and policy innovation to mitigate PTS and safeguard food security. This integrative approach provides a strategic framework for advancing sustainable management of persistent toxic substances in agricultural systems. This study looks at persistent toxic substances (PTS): harmful chemicals, such as some pesticides, industrial pollutants, and heavy metals, that do not easily break down in the environment. Because they linger in soil, water, air, and food, they can move through the food chain and affect both ecosystems and people. In this study, the authors reviewed recent research and real-world cases to explain where PTS come from (e.g., farming chemicals, industrial waste, plastics), how they spread (air, water, and soil), and what health problems they can cause (such as hormone disruption, breathing issues, nerve damage, and cancer). They also examined solutions, from traditional cleanup methods to newer, nature-based options.
The primary aim of this study is to examine the mechanisms linking achievement motivation (including motive for success (MS) and motive to avoid failure (MF)) and social support to nurses' health-related procrastination (NHRP) and to determine whether social support plays a mediating role in this relationship. The health of nurses has long been a subject of widespread concern. Compared with general procrastination, NHRP is more directly linked to nurses' own health, harming their well-being and potentially affecting nursing quality and patient safety. Exploring its intrinsic mechanisms with achievement motivation and social support from the perspective of positive psychology is essential for developing effective strategies to reduce NHRP and serves as a cornerstone for ensuring nursing quality and patient safety. From February to July 2025, 346 nurses from a first-class comprehensive hospital in Shanghai were recruited as participants. The mediating effect of social support between achievement motivation and NHRP was investigated using the General Data Questionnaire, Nurses' Health-related Procrastination Scale, Social Support Rating Scale, and Achievement Motivation Scale. The mediation effects were tested using the PROCESS macro (Version 4.0), which utilizes linear regression analysis and the bootstrap method. A total of 320 valid questionnaires were collected, yielding an effective response rate of 92.47%. The results indicated that social support was positively correlated with MS (r = 0.339, p < 0.01) and negatively correlated with both MF (r = -0.283, p < 0.01) and NHRP (r = -0.482, p < 0.01). Furthermore, NHRP was negatively correlated with MS (r = -0.433, p < 0.01) and positively correlated with MF (r = 0.397, p < 0.01). Notably, social support was found to partially mediate the relationship between achievement motivation and NHRP.
The specific mediating effects of social support were significant, with indirect effect values of -0.468 between MS and NHRP and 0.422 between MF and NHRP, accounting for 31.0% and 29.8% of the total effects, respectively. Achievement motivation can directly influence NHRP and can also indirectly affect it through the mediating role of social support. From the perspective of positive psychology, stimulating nurses' achievement motivation regarding their own health responsibility and fostering a supportive organizational atmosphere can effectively reduce their health-related procrastination, thereby laying a foundation for enhancing overall nursing service quality. Nursing managers should pay attention to nurses' physical and mental health and actively identify and guide the orientation of their achievement motivation: on the one hand, cultivate their positive motivation for "striving for success," and on the other hand, alleviate the anxiety stemming from "fear of failure," thereby fostering the formation of health behaviors through intrinsic mechanisms. Furthermore, it is essential to consciously build a team atmosphere with high social support and to reduce NHRP by establishing effective peer support systems and communication channels.
Diagnosing acute respiratory conditions in adults older than 65 years is challenging due to age-related physiological changes, which increase diagnostic complexity and the risk of hospital admission. We investigated whether extended point-of-care technology (POCT), used at home by acute community nurses, reduced hospital admissions in older adults. In this parallel-group, open-label, randomised controlled trial, conducted in a primary care setting in Denmark, adults aged 65 years and older with potential respiratory symptoms were randomly assigned (1:1) using computer-generated allocation, with blocked randomisation (variable block sizes of two, four, and six) stratified by gender, to standard nurse-led care (clinical assessment, vital signs, and C-reactive protein and haemoglobin) or standard nurse-led care with extended POCT (additional blood tests, focused lung ultrasound, and tele-ultrasound). Participants and clinicians were not masked to assignment. The primary outcome was hospital admission within 30 days. Mortality was assessed as a prespecified safety outcome. Analyses were done on an intention-to-treat basis. The trial is registered with ClinicalTrials.gov, NCT05546073. Between Oct 14, 2022, and Nov 18, 2023, 623 participants were randomly assigned. 313 (50%) of 623 participants were assigned usual care and 310 (50%) were assigned the intervention. No difference was observed in hospital admissions within 30 days (112 [38%] participants in the usual care group vs 119 [40%] participants in the intervention group; risk ratio [RR] 1·06, 95% CI 0·86-1·30). 30-day mortality was lower in the intervention group (25 [8%] deaths in the usual care group vs 11 [4%] deaths in the intervention group; RR 0·44, 95% CI 0·22-0·88). No intervention-related harms were observed. Compared with standard nurse-led community care including basic POCT, extended POCT did not reduce hospital admissions among older adults with potential acute respiratory conditions. 
These findings suggest that extended diagnostic technology might add little to admission decisions in community-based acute care and warrants further evaluation focused on clinical decision making and patient trajectories. Region of Southern Denmark, Karen Elise Jensen Foundation, AP Moeller Foundation, Hartmann Foundation, Gangsted Foundation, and Odense University Hospital Innovation Foundation.
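The reported 30-day mortality risk ratio can be reproduced, to rounding, with a standard log-scale Wald interval (the trial's exact interval method is not stated, so this is a plausible reconstruction rather than the published analysis):

```python
from math import exp, log, sqrt

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio of group 1 vs group 2 with a Wald CI on the log
    scale: SE(log RR) = sqrt(1/a - 1/n1 + 1/b - 1/n2)."""
    rr = (a / n1) / (b / n2)
    se = sqrt(1/a - 1/n1 + 1/b - 1/n2)
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

# 30-day mortality: 11/310 deaths (intervention) vs 25/313 (usual care)
rr, lo, hi = risk_ratio_ci(11, 310, 25, 313)
```

This gives RR ≈ 0.44 with an interval of roughly 0.22-0.89, matching the abstract's 0.44 (0.22-0.88) up to rounding.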
This study aims to explore the associations between wildfire behavior parameters and various emissions, analyze the impacts of these emissions on environmental PM2.5 and O3, and reveal the causal pathways linking fire behavior to PM2.5 and O3 through emissions. The research focuses on three key regions: tropical America, tropical Africa, and tropical Asia. Using a multi-output random forest (RF-MTL) model and structural equation modeling, we analyzed the relationships between fire behavior parameters, emissions, and air quality indicators. The results show that: 1) there is a significant spatial correlation between fire behavior parameters and fire-related emissions, and this correlation exhibits different patterns in different tropical regions; 2) pollutants emitted by fires have a significant impact on the formation of PM2.5 and O3: in tropical Asia, emissions of NMVOCs, NOx, NH3, CH4, and SO2 play a particularly prominent role, whereas in tropical America and Africa, NH3, N2O, and NMVOCs are the main influencing factors. The RF-MTL model demonstrated high predictive performance in these analyses, achieving R2 values of up to 0.87 for O3 in tropical Asia and 0.83 for PM2.5 in tropical America; 3) the study not only confirms the role of fire behavior in the synergistic effect between PM2.5 and O3 but also reveals the key role of mediating factors such as N2O, NOx, and SO2 in the pathway through which fire behavior affects pollutants: Fire Radiative Power (FRP) exerted a total positive effect of 0.366 on PM2.5 in tropical America and a total negative effect of -0.594 in tropical Africa. These conclusions provide important insights into the impact of wildfire behavior on air quality and offer significant reference value for formulating air quality management and wildfire control policies in different tropical regions.
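The "total effect" figures for FRP combine a direct path with indirect paths through mediating emissions. A numpy-only sketch of that path decomposition on simulated data; the path coefficients, noise levels, and function names are invented for illustration and do not reproduce the study's structural equation model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Hypothetical path model: FRP drives mediator emissions (N2O, NOx, SO2),
# which in turn drive PM2.5, alongside a direct FRP -> PM2.5 path.
frp = rng.normal(size=n)
n2o = 0.6 * frp + rng.normal(scale=0.5, size=n)
nox = 0.4 * frp + rng.normal(scale=0.5, size=n)
so2 = -0.3 * frp + rng.normal(scale=0.5, size=n)
pm25 = 0.1 * frp + 0.5 * n2o + 0.3 * nox + 0.2 * so2 \
       + rng.normal(scale=0.5, size=n)

def ols(y, *cols):
    """OLS slopes of y on the given predictors (intercept included)."""
    A = np.column_stack([np.ones(len(y)), *cols])
    return np.linalg.lstsq(A, y, rcond=None)[0][1:]

a = [ols(med, frp)[0] for med in (n2o, nox, so2)]  # FRP -> each mediator
coefs = ols(pm25, frp, n2o, nox, so2)              # full model
direct, b = coefs[0], coefs[1:]                    # direct path, mediator paths
total = direct + sum(ai * bi for ai, bi in zip(a, b))
```

By the omitted-variable algebra of linear models, `total` equals exactly the slope from regressing PM2.5 on FRP alone, which is how a positive total effect in one region and a negative one in another can emerge from opposing mediated paths.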
Background/Objectives: Liver fat represents an early metabolic lesion in the development of diabetes and its cardiometabolic complications. Diets high in free sugars, particularly from sugar-sweetened beverages (SSBs), are associated with abdominal obesity and increased cardiometabolic risk, prompting global guidelines to limit SSBs as a major public health strategy. Low-fat cow's milk is promoted as the preferred caloric replacement strategy for SSBs due to its high nutritional value and cardiometabolic advantages. Fortified soymilk is a plant-based alternative with approved health claims for cholesterol and coronary heart disease risk reduction that offers an equivalent nutritional value to cow's milk. However, given concerns about its classification as an ultra-processed food (UPF), it is unclear whether soymilk offers comparable metabolic health benefits to milk as part of clinical and public health strategies to reduce SSB intake. The Soy Treatment Evaluation for Metabolic (STEM) health trial seeks to evaluate the impact of replacing SSBs with either 2% soymilk or 2% cow's milk on liver fat and other cardiometabolic risk factors in habitual adult consumers of SSBs with obesity. Methods: The STEM trial is a 24-week, pragmatic, 3-arm, parallel, randomized trial. We recruited adults with obesity (high BMI plus high waist circumference based on ethnicity-specific cut-offs) consuming ≥1 SSB/day. Participants were randomized to one of three groups based on their usual SSB intake at baseline (servings/day): continued SSB (355 mL can) intake; replacement with fortified, sweetened 2% soymilk (250 mL); or replacement with 2% cow's milk (250 mL). The primary outcome is the change in intrahepatocellular lipid (IHCL) measured by 1H-MRS at 24 weeks. Hierarchical testing will be done to reduce the familywise error rate. The superiority of cow's milk to SSBs will be assessed first to establish assay sensitivity.
If superiority is established, then the non-inferiority of soymilk to cow's milk will be assessed using a pre-specified non-inferiority margin of 1.5% IHCL units (assessed by difference of means using a 90% confidence interval [CI]). Analyses will be conducted according to the intention-to-treat (ITT) principle using inverse probability weighting (IPW) for superiority testing and per-protocol analyses for non-inferiority testing, using ANCOVA adjusted for age, sex, metabolic dysfunction-associated steatotic liver disease (MASLD) status, medication use, intervention dose, and baseline levels. We hypothesize that soymilk will be non-inferior to cow's milk (Clinicaltrials.gov NCT05191160). Results: Recruitment began in November 2021. A total of 3050 individuals were screened. We randomized 186 participants (62 per group) between 19 April 2022 and 16 April 2024. Participants are 57% male, with a mean [SD] age of 39.9 [11.8] years, BMI of 34.6 [6.1] kg/m2, waist circumference of 112.6 [13.8] cm, IHCL of 10.0 [8.2]% (64.1% meeting the criteria for MASLD), and SSB intake of 2.3 [1.3] servings/day. Conclusions: Baseline characteristics were balanced across the study arms, with participants representing adults with a high-risk metabolic phenotype. Findings will contribute to evidence on the cardiometabolic benefits of soymilk, informing clinical practice guidelines and public health policy.
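The non-inferiority decision rule described above (upper bound of a two-sided 90% CI below the 1.5% IHCL margin) can be sketched as follows. This simplified z-based version ignores the ANCOVA adjustment, and all numbers in the usage note are hypothetical:

```python
from statistics import NormalDist

def noninferiority_test(diff, se_diff, margin=1.5, level=0.90):
    """Two-sided CI for the soymilk-minus-cow's-milk difference in
    IHCL change; non-inferior if the upper bound stays below the
    pre-specified margin. Illustrative z-approximation only."""
    z = NormalDist().inv_cdf(0.5 + level / 2)  # ~1.645 for a 90% CI
    lo, hi = diff - z * se_diff, diff + z * se_diff
    return hi < margin, (lo, hi)
```

For a hypothetical observed difference of 0.2 IHCL units with SE 0.5, the 90% CI is roughly (-0.6, 1.0), so its upper bound clears the 1.5-unit margin and non-inferiority would be declared; an observed difference of 1.0 with the same SE would not.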
Diabetes has a greater impact on people living in rural and remote Australia because of limited access to specialist diabetes services and ongoing workforce shortages. Primary healthcare professionals in these regions play a central role in diabetes care, but they often lack targeted training and professional support. This project introduced a co-designed, face-to-face workshop model to build clinical skills, strengthen confidence in culturally responsive communication and foster professional networks among rural healthcare workers. Developed through an academic-clinical partnership with health organisations for Aboriginal and Torres Strait Islander peoples, the workshops used evidence-based teaching methods to deliver professional development tailored to the realities of rural primary healthcare. Two full-day workshops took place in Queensland and New South Wales in 2024. A pretest-posttest survey design was used to evaluate the workshops, assessing changes in clinical confidence and professional engagement. Of 120 participants, 73 completed post-workshop surveys, and 48 provided matched pre- and post-workshop data. Participants reported substantial increases in clinical confidence, particularly in using diabetes technologies (+72.4%) and understanding recent advances (+66.5%). Notably, 96.3% reported enhanced professional connections, with 87.0% intending to maintain them post-workshop. High levels of participant satisfaction and partner support highlight the model's potential for broader implementation. This innovative workshop model addresses critical gaps in diabetes care training for rural health professionals by integrating clinical education, culturally responsive communication skills and opportunities for peer support. It offers a sustainable approach to rural health workforce development that aligns with national rural health priorities and supports quality improvement in primary health care.
Osteoporosis is a chronic skeletal disorder characterized by reduced bone mineral density and disrupted bone microarchitecture, affecting over 200 million individuals worldwide. The lumbar spine, containing the largest volume of metabolically active trabecular bone, is particularly vulnerable to osteoporotic degeneration and compression fractures. This narrative review examines recent advances in imaging modalities for lumbar spine osteoporosis assessment, emphasizing the diagnostic utility and emerging clinical applications of ¹⁸F-sodium fluoride (NaF) positron emission tomography/computed tomography (PET/CT). A comprehensive narrative review was conducted, synthesizing findings from pivotal studies investigating conventional imaging methods and newer PET-based technologies for osteoporosis evaluation. Particular focus was given to studies utilizing quantitative and kinetic PET biomarkers for assessing bone metabolic activity with ¹⁸F-NaF. While dual-energy X-ray absorptiometry (DXA) remains the clinical standard for bone mineral density assessment, it has significant limitations including poor spatial resolution, lack of three-dimensional capability, and inability to differentiate cortical from trabecular bone. In contrast, ¹⁸F-NaF PET/CT demonstrates superior image quality, rapid tracer kinetics, and quantitative assessment of regional osteoblastic activity. Studies show strong correlations between ¹⁸F-NaF uptake and bone turnover markers, mineral density measurements, and therapeutic response. Kinetic modeling approaches provide detailed insights into bone remodeling dynamics, supporting personalized treatment planning and prognostic assessment. Diagnostic performance studies report area under the receiver operating characteristic curves as high as 0.96 for osteoporosis detection when evaluated against DXA-derived BMD, though no study has yet compared both modalities against an independent gold standard such as fracture outcomes. 
¹⁸F-NaF PET/CT offers optimal clinical applications for early treatment response monitoring, evaluation of patients with discordant clinical risk and DXA findings, pre-surgical assessment in patients with borderline bone density, and investigation of complex metabolic bone disorders. Ideally, ¹⁸F-NaF PET/CT should be utilized in a complementary fashion to DXA. Primary barriers to clinical adoption include cost, limited accessibility, and absence of standardized kinetic modeling protocols. Future research should focus on establishing reference ranges across age and sex demographics, validating fracture prediction models, and determining cost-effectiveness thresholds for specific clinical scenarios such as high-risk patients with discordant DXA and fracture history.
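The AUC of 0.96 quoted above has a direct probabilistic reading: it is the chance that a randomly chosen case scores higher than a randomly chosen control on the classifier's output. A minimal sketch of this Mann-Whitney formulation, using hypothetical uptake-like scores (illustrative values, not study data):

```python
def auc_mann_whitney(pos_scores, neg_scores):
    """AUC as the probability that a positive case outscores a negative
    one, with ties counted as half a win. This equals the Mann-Whitney U
    statistic normalised by the number of positive-negative pairs."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical scores for osteoporotic cases vs controls
osteoporotic = [8.1, 7.4, 9.0, 6.8]
controls = [5.2, 6.9, 4.8, 5.5]
print(auc_mann_whitney(osteoporotic, controls))  # 15 of 16 pairs ranked correctly: 0.9375
```

A perfect separation of the two groups would give 1.0; chance-level scoring gives 0.5.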
Depression is highly prevalent yet undertreated among people living with heart failure, indicating barriers to mental health services. Although various digital mental health interventions have been developed to detect, treat, and manage depression in this population, these interventions have seen limited integration into clinical care and a lack of implementation research. Stepped care is a service innovation that may promote the implementation of these technologies into clinical settings, but few studies have examined how these services are designed in clinical settings. This study aimed to identify strategies to address health system barriers to accessing mental health care from the perspective of people living with heart failure, clinicians, and researchers, and to incorporate these strategies into the design of a virtual mental health stepped care service within a heart failure remote management program. A qualitative description study was conducted using purposive recruitment of people living with heart failure, clinicians, and researchers from a heart failure remote patient management program. As part of a service design approach, semistructured interviews explored potential strategies to address barriers to accessing mental health services. Two researchers coded the data descriptively and constructed themes to guide the development of a virtual stepped care service. A total of 22 participants were interviewed, comprising 13 people living with heart failure and 9 clinicians and researchers. Six themes were identified, comprising 4 requirements and 2 foundational principles. The requirements were to (1) adopt a collective approach to identify distress across methods, people, and time points; (2) maintain a referral-based approach; (3) rely on existing mental health human resources; and (4) offer patient choice among various mental health care options. 
These requirements were supported by two principles: (1) building on organizational strengths and (2) reducing treatment burden. Based on these findings, a virtual stepped care service was developed, incorporating a depression screening module, referral-based workflows, and, where clinically appropriate, patient choice in treatment selection. The stakeholder-informed design of this virtual stepped care service contributes to the limited literature on stepped care service design and demonstrates how such models can be tailored to their intended contexts. Although each component was designed to address health system barriers to mental health care for people living with heart failure, resource limitations may constrain the balance between feasibility and quality of care. Future research should evaluate the acceptability of this model among people living with heart failure and clinicians.
Non-communicable diseases are a growing public health challenge, shaped not only by biological predispositions but also by geo-demographic, socioeconomic, psychological, and lifestyle factors. A comprehensive understanding of these determinants is essential for developing targeted public health strategies. This study aimed to examine the multifactorial determinants of individual health status by analyzing geo-demographic, socio-economic, behavioral, psychological, and lifestyle variables. Data were collected from 4,010 participants (age: 37.2 ± 15.4 years; 59.5% female) across 10 Mediterranean and neighboring countries using the multinational MEDIET4ALL e-survey. Health status was categorized as healthy, at-risk, or with diseases. Multinomial logistic regression, Quade's Rank ANCOVA and series of multiple regression models were conducted. Collectively, around 25% of respondents reported being at risk of or having a known disease. BMI emerged as the strongest negative predictor of health status (β = -0.145), with both obesity and underweight significantly increasing the odds of being at risk (OR = 1.8 and 5.2, respectively) and having diseases (OR = 2.2 and 11.9, respectively). Other significant negative predictors included psychological distress (notably anxiety, β = -0.091), insomnia (β = -0.084), alcohol consumption (β = -0.053), and prolonged sitting time (β = -0.037). Conversely, life satisfaction was the strongest significant protective factor (β = 0.066), followed by higher education, better sleep quality, and adherence to the Mediterranean Diet and lifestyle (β = 0.034 to 0.050). Socio-economic disparities, including employment status (β = -0.045) and living environment (β = -0.031), also significantly influenced health outcomes, with rural environments and employed individuals showing lower odds ratios of being at-risk and/or having diseases (p < 0.001).
Furthermore, individuals residing in Mediterranean regions, females, married or cohabiting individuals, and non-smokers exhibited significantly lower odds of being at-risk or having diseases (p < 0.05). While gender remained a significant predictor in the final refined comprehensive regression model (β = -0.049), marital status lost significance, suggesting that its protective effect may be mediated by psychological well-being and health-related behaviors. These findings highlight the complex interplay of lifestyle, mental health, and socio-environmental factors in determining health outcomes, while emphasizing the urgent need for multi-level public health interventions, including policies promoting physical activity, healthy eating, mental well-being, and equitable healthcare access. Future research should employ longitudinal designs to establish causal relationships and guide preventive strategies.
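The odds ratios reported above are the exponentiated coefficients of the fitted logistic model: OR = exp(β) for each category relative to the reference level. A minimal sketch of the conversion, using illustrative coefficients rather than the MEDIET4ALL estimates:

```python
import math

def odds_ratio(beta: float) -> float:
    """Convert a logistic regression coefficient (a log-odds change per
    unit of the predictor) into an odds ratio."""
    return math.exp(beta)

# Hypothetical log-odds coefficients for the "at-risk" outcome relative
# to "healthy"; chosen to reproduce the ORs quoted in the abstract, not
# taken from the study's fitted model.
coefficients = {
    "obesity": math.log(1.8),      # OR = 1.8
    "underweight": math.log(5.2),  # OR = 5.2
}

for name, beta in coefficients.items():
    print(f"{name}: beta = {beta:.3f}, OR = {odds_ratio(beta):.1f}")
```

An OR above 1 (positive β) raises the odds of the outcome; an OR below 1 (negative β) is protective, which is why the negative βs for employment and rural living correspond to lower odds of being at risk.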
Agriculture is highly sensitive to climate variability, and ongoing warming is expected to modify the thermal conditions controlling fruit tree phenology and production. This study analyzes the spatiotemporal variability of winter chilling, spring heat accumulation, and spring frost probability across fruit-growing areas in Aragón (northeast Spain) using a regional, high-resolution agroclimatic approach. Chill portions (CP), growing degree hours (GDH), and spring frost probability of occurrence (SFPO) were computed from daily gridded maximum and minimum temperature data at 1 km² spatial resolution. Agroclimatic indicators were derived using statistical, non-experimental methods based on established chilling and forcing models and empirical temperature thresholds. Recent historical variability was characterized using overlapping 30-year historical climate periods, while future climate conditions were assessed using daily temperature projections from an ensemble of 18 regional climate models (EURO-CORDEX), dynamically downscaled from global climate models and bias-corrected. Projections were analyzed for a historical reference period (1971-2000) and under the intermediate and high emissions scenarios RCP4.5 and RCP8.5 for mid-century (2041-2070) and late-century (2071-2100) periods. Climate projections at ~5 km spatial resolution were interpolated to 1 km and bias-corrected using Empirical Quantile Mapping with high-resolution observational data as reference. Changes in CP and GDH distributions were quantified using spatial differences and standardized anomalies relative to historical conditions, and indicators were extracted at the location of existing fruit orchards. Results indicate that effective winter chilling remains within broad ranges compatible with fruit production across all scenarios, with persistent spatial contrasts between western and eastern sectors.
In contrast, spring heat accumulation shows a strong and spatially coherent increase, particularly in low-elevation and eastern areas, indicating an increasing influence of spring temperatures on phenological dynamics. Although future scenarios project a substantial reduction in the probability of frost occurrence after early March, increasing heat accumulation may advance phenological development, potentially shifting frost exposure to earlier periods in late winter. Overall, the results indicate that agroclimatic conditions in Aragón are strongly structured by regional climatic gradients, and that climate change is likely to intensify spatial contrasts between colder and warmer production areas rather than producing uniform changes across the region. These results provide a regional agroclimatic framework that can support adaptation planning and the long-term management of fruit production under climate change.
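Growing degree hours, the heat-accumulation indicator used above, are obtained by summing each hour's temperature excess over a base threshold, usually capped at an optimum. The study's forcing models and calibrated thresholds are more elaborate; the sketch below is a simplified linear version, with t_base = 4.5 °C and t_cap = 25 °C as illustrative assumptions:

```python
def growing_degree_hours(hourly_temps, t_base=4.5, t_cap=25.0):
    """Simplified linear growing-degree-hour accumulation.

    Each hour contributes (T - t_base) degrees when T exceeds t_base,
    capped at (t_cap - t_base). The thresholds here are illustrative,
    not the study's calibrated values.
    """
    total = 0.0
    for t in hourly_temps:
        total += max(0.0, min(t, t_cap) - t_base)
    return total

# One illustrative spring day: cold night, mild afternoon, cool evening
day = [3.0] * 8 + [10.0] * 8 + [6.0] * 8
print(growing_degree_hours(day))  # 8*0 + 8*5.5 + 8*1.5 = 56.0
```

Because warming raises the number of hours above the base threshold, GDH accumulates faster under the projected scenarios, which is what drives the earlier phenological development discussed above.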
The rapid evolution of technology has triggered profound cultural, social, and psychological changes, along with a constant demand for human adaptation to new challenges. Digital wellbeing (DW) refers to the individual's positive and healthy relationship with information and communication technology (ICT), characterized by feelings of comfort, support, safety, satisfaction, and low levels of stress during their interactions with ICT. However, DW can be threatened by ergonomic, organizational, and psychological issues, particularly in the workplace. Despite efforts to improve the objective conditions of the user experience, the role of psychological factors in human-computer interaction (HCI) remains only partially understood. This study aimed to validate a new multidimensional psychometric tool designed to assess the perceived quality of HCI in the workplace, the Work-Related Human Computer Interaction Questionnaire (wrHCI-Q). A sample of 1,198 employees of a large Italian banking group (52% females; age: 49.04 ± 8.7) underwent an online survey. Reliability, exploratory factor analysis (EFA), and confirmatory factor analysis (CFA) were performed. The wrHCI-Q consisted of 35 items and a four-factor structure (technostress, self-efficacy, positive attitude, and HCI aversion), supported by the CFA indices. The wrHCI-Q is a new, valid, and reliable scale to capture crucial human factors affecting the quality of HCI. It is expected to foster the understanding of the determinants of individual DW and to shed light on the factors that undermine employees' healthy interaction with ICT.
Thailand has entered an aging society ahead of its economic readiness, with demographic pressures outpacing the capacity of existing health and social systems. Although the universal coverage scheme and the national long-term care (LTC) initiative provide an important policy foundation, elderly care remains fragmented across ministries, unevenly implemented, and heavily reliant on unpaid or undertrained community caregivers. This commentary synthesizes current evidence through 2025 and identifies key system gaps, including low utilization of local-level funds, shortages of trained personnel, and weak inter-ministerial coordination across the health, welfare, and local government sectors. Recent developments, such as community-based LTC models, expanded telehealth services, and Thailand's emerging regulatory framework for advanced therapy medicinal products (ATMPs), illustrate the country's growing innovation readiness, though major challenges remain. Digital platforms for chronic disease monitoring and post-stroke rehabilitation show early promise, whereas ATMPs require careful evaluation through health technology assessment, ethical oversight, and long-term cost-effectiveness considerations. As Thailand moves toward becoming a super-aged society, strategic reforms in financing, workforce development, digital infrastructure, and ATMP regulatory preparedness are essential. Without coordinated efforts, the country faces the risk of "aging before prosperity" and increasing care disparities. However, with strengthened collaboration and the careful implementation of existing initiatives and emerging technologies, Thailand can build an elderly care system that truly supports healthy and dignified aging.
This study aims to investigate the therapeutic effect of ethanol extract from fermented broth of Acanthopanacis senticosi Radix et Rhizoma seu Caulis-Ganoderma applanatum (EEA-GF) on insomnia (IM) and clarify its active material basis and mechanism of action. IM model mice were established by induction with p-chlorophenylalanine (PCPA). The IM status was evaluated through a pentobarbital sodium-induced righting reflex test, hematoxylin-eosin (H&E) staining, and biochemical index detection. The composition of EEA-GF and its blood-migrating components were analyzed based on ultra-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) technology. Core active components were screened by combining weighted gene co-expression network analysis (WGCNA), network pharmacology, and molecular docking technology. Animal experiments showed that EEA-GF exhibits excellent therapeutic efficacy in the treatment of IM. A total of 19 compounds were identified from the serum of mice administered with EEA-GF via gavage using UPLC-MS/MS technology. Through WGCNA, HIF1A, REN, NFE2L2, and PARP1 were determined as core targets. Gene Ontology (GO) enrichment analysis yielded 87 results. In addition to cancer-related pathways, Kyoto Encyclopedia of Genes and Genomes (KEGG) pathway enrichment analysis mainly included pathways such as the thyroid hormone signaling pathway and neuroactive ligand-receptor interaction. Molecular docking results showed that five components including kaempferol exhibited good binding activity with the aforementioned core targets, indicating that they may be the key material basis for EEA-GF to exert its anti-IM effect. This study is the first to confirm the therapeutic effect of EEA-GF on IM. This finding and the research on its related mechanisms provide a new theoretical basis for the clinical treatment and management of IM, as well as a candidate target with great development prospects.
Anti-vascular endothelial growth factor (anti-VEGF) therapy is the mainstay of management for diabetic macular edema (DME), but marked variability in response, high injection frequency, and cumulative treatment burden highlight the need for tools that can individualize treatment beyond protocol-driven regimens. Artificial intelligence (AI) offers a pathway toward more individualized risk stratification and prognostic support, primarily by capturing statistical associations rather than biological mechanisms. Deep learning systems have achieved high accuracy in detecting diabetic retinopathy (DR) and DME. Several autonomous DR/DME screening solutions are in clinical use. Recent advances have applied supervised machine learning, convolutional neural networks, generative adversarial networks, and ensemble methods to multimodal data from fundus images, baseline and follow-up optical coherence tomography (OCT), along with clinical and biochemical data, to classify likely responders and non-responders. These models automatically quantify and track imaging biomarkers to accurately predict central subfield thickness and vision outcomes after loading doses, and estimate future injection burden. AI-driven decision-support tools analyze vast amounts of patient data and treatment histories, and integrate multimodal data, including fundus images, OCT images, and systemic data, to provide recommendations for optimal treatment and follow-up, tailored to each individual profile. These AI systems can potentially generate individualized risk and response profiles that can support decisions on initiating therapy, choosing between agents, tailoring treat-and-extend intervals, and timing switches to steroids or combination strategies. However, issues of generalizability, transparency, workflow integration, and ethical deployment need to be systematically addressed.
AI-enabled decision support for patient selection and treatment response prediction is poised to become an integral component of anti-VEGF therapy.
Clinical documentation is a major contributor to clinician workload and burnout, with physicians spending more than half of their workday on electronic health record (EHR) tasks. Artificial intelligence (AI)-based speech recognition (ASR) tools promise to reduce this burden by generating draft notes from dictated or conversational clinical encounters. Despite rapid adoption, concerns remain about their real-world accuracy, reliability, and ability to capture clinically relevant information. This review aimed to systematically map the breadth of published evidence on the accuracy, reliability, efficiency, and clinical information capture of ASR systems used in healthcare settings for clinical documentation. The scoping review employed the methodology developed by Arksey and O'Malley in 2005 and further expanded by Levac and Colquhoun in 2010. Four databases (PubMed, Scopus, Web of Science, and MEDLINE) were searched for studies published between 2008 and 2025. All findings were reported according to PRISMA guidelines for scoping reviews. Of 3,520 records, 32 met the inclusion criteria, using benchmarking studies, controlled comparisons, qualitative methods, and retrospective reviews. Across settings, ASR showed substantial accuracy limitations, with word error rates ranging from moderate in dictated notes to very high in conversational and emergency contexts. Common errors included deletions, substitutions, and misrecognition of medication names or brief utterances. Although some studies reported reduced typing burden and improved workflow efficiency, systems frequently missed clinically relevant details. Evidence for improvements in note completeness was mixed, and little research linked system accuracy to patient safety or diagnostic outcomes. ASR can reduce typing and improve documentation efficiency, sometimes capturing richer narrative detail.
However, frequent and clinically significant errors shaped by linguistic complexity, context, and speaker variation make unsupervised use unsafe. Human oversight remains essential, and continued refinement, rigorous evaluation, and attention to workflow, cognitive burden, and equity are required.
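Word error rate, the accuracy metric behind these findings, is the word-level Levenshtein distance (substitutions + insertions + deletions) divided by the number of reference words. A minimal sketch with a hypothetical dictation (the drug name and phrasing are invented for illustration):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance divided by the
    reference length, counting substitutions, insertions, deletions."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over word sequences
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution or match
    return d[len(ref)][len(hyp)] / len(ref)

# Illustrative: a misrecognized medication name plus a dropped word
ref = "start metoprolol 25 mg twice daily"
hyp = "start metropol 25 mg daily"
print(round(word_error_rate(ref, hyp), 2))  # 0.33 (1 substitution + 1 deletion over 6 words)
```

Note that a single substituted medication name contributes the same to WER as any other word error, which is why aggregate WER alone understates clinical risk and must be paired with error-severity analysis.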
The effect of Early Technology Review (ETR), through early engagement with multiple stakeholders, on strategic development for technologies at the prototype-development and proof-of-concept stages was examined through two generic case studies of relevant outcomes. In both examples, advice to companies could have significantly changed strategic direction to become more relevant to payers and clinical experts. In one instance, the advice was followed and resulted in an expedited first-in-human study, and a second ETR was being considered to inform the proof-of-concept study. In the second example, it was reported that changes in strategic direction were being considered. These reports provide descriptive accounts of very early applications of the ETR process, which now spans the entire preclinical trajectory. Had the second case study, at proof of concept, been able to benefit from this approach at the point of prototype development, it could have avoided unnecessary costs and research through earlier advice. This raises the question of whether a sequential, iterative approach to evidentiary multiple-stakeholder advice across the technology life cycle may reduce risk and cost while benefiting from the efficiencies of adaptive design.