Environmental sustainability has become a critical priority in global healthcare, as medical services substantially contribute to carbon emissions, resource use, and waste generation. Ophthalmology, with its high surgical volume and dependence on disposable materials, represents an important component of this burden. This study aimed to assess Turkish ophthalmologists' awareness, attitudes, and clinical and surgical practices related to environmental sustainability. A nationwide cross-sectional survey was conducted between January and April 2025 among 640 actively practicing ophthalmologists across diverse clinical settings in Türkiye. The questionnaire included 26 multiple-choice items addressing demographics, sustainability knowledge and training, personal and institutional practices, and attitudes toward surgical materials. Only 10.3% (n = 66) of respondents reported good or very good sustainability knowledge, and 10.8% (n = 69) had received formal training. Younger ophthalmologists demonstrated significantly lower knowledge levels than older colleagues (P = .03). Conversely, senior physicians more frequently agreed that sustainable practices would not increase healthcare costs (P = .007). Although many participants supported implementing sustainability measures in ophthalmology, a considerable proportion indicated they would modify their use of surgical materials if legal restrictions were relaxed. Notably, 69.3% (n = 440) stated that disposable materials were not indispensable for surgical procedures. Although Turkish ophthalmologists demonstrate strong ethical support for sustainability, gaps in knowledge and regulatory constraints limit practical implementation. Targeted education, updated guidelines, regulatory reform, and institutional engagement are essential to advance environmentally responsible ophthalmic care.
Hepatitis B virus (HBV) infection is a significant public health problem that causes substantial mortality and morbidity worldwide. The hepatitis B virus birth dose vaccine (HepB-BD) is administered within the first 24 h after birth; when given during this period, along with at least two additional doses, it effectively prevents perinatal HBV infection and induces immunity. Uptake of the HepB-BD vaccine varies across regions, with more than 70% birth dose coverage observed only in the Americas and the Western Pacific regions. In contrast, coverage in the African Region is generally low, and sub-Saharan Africa (SSA) experiences even lower coverage than the region as a whole. Despite its importance, there is a lack of evidence regarding uptake of the HepB-BD in SSA. Therefore, this study aimed to assess the prevalence and factors associated with uptake of the birth dose HBV vaccine among children in SSA. We used recent Demographic and Health Survey (DHS) datasets from 2015 to 2023 for seven SSA countries. Stata version 17 software was used for data analysis. After assessing the intraclass correlation coefficient (ICC) and performing the likelihood ratio (LR) test, we determined that multilevel analysis, which accounts for the hierarchical or nested structure of DHS data, did not provide a significantly better fit than the simpler logistic regression model. As a result, we used the rare-event logistic regression model in our analysis. A Hosmer-Lemeshow test (Prob > χ2 = 0.4763) suggested that the logistic regression model fit the data well. Variables with a p-value of less than 0.25 in the bivariate rare-event logistic regression model were included in the multivariable rare-event logistic regression analysis. Variables with p-values less than 0.05 were considered significantly associated with uptake of the birth dose HBV vaccine.
The prevalence of birth-dose HBV vaccine uptake in SSA was 2.76% (95% CI: 2.1%-3.6%). Children whose mothers were aged 35-39 years (AOR = 4.21, 95% CI: 1.11-5.95) and 40-44 years (AOR = 5.36, 95% CI: 1.15-6.13) had higher odds of receiving the birth-dose HBV vaccine compared with those whose mothers were aged 15-19 years. The odds of receiving the birth-dose HBV vaccine were higher among children whose fathers had higher education (AOR = 2.88, 95% CI: 1.20-8.63) as well as those whose fathers' education level was unknown (AOR = 2.61, 95% CI: 1.25-5.46) compared with children whose fathers had no formal education. Furthermore, children from households in the middle wealth index (AOR = 1.86, 95% CI: 1.23-3.51) had higher odds of receiving the birth-dose HBV vaccine than those from the poorest households. Our study revealed that only about three out of every one hundred children in SSA countries received the birth dose of the HBV vaccine within the first 24 h of delivery. Increased maternal age, higher or unknown paternal education level, and belonging to the middle household wealth index were factors that significantly increased the odds of receiving the HBV birth-dose vaccine among children in SSA countries. Targeted strategies are needed to improve HepB birth-dose coverage, including integrating counselling into maternal health care, involving fathers, ensuring timely facility-based and outreach delivery, targeting younger mothers, improving vaccination record-keeping, and training midwives.
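The choice above between a multilevel model and ordinary logistic regression hinges on the intraclass correlation coefficient. As an illustrative sketch (not the authors' code), the latent-scale ICC for a random-intercept logistic model can be computed from the estimated between-cluster variance:

```python
import math

def latent_icc(between_cluster_variance):
    """Latent-scale ICC for a random-intercept logistic model.

    On the logit scale the level-1 residual variance is fixed at
    pi^2 / 3, so the ICC is the share of total variance that lies
    between clusters. A value near zero supports using a simpler
    single-level logistic regression, as in the study above.
    """
    residual = math.pi ** 2 / 3
    return between_cluster_variance / (between_cluster_variance + residual)
```

For example, a between-cluster variance equal to pi^2/3 yields an ICC of 0.5, while a variance near zero yields an ICC near zero.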
Stigma remains a major determinant of impaired quality of life (QoL) in people with epilepsy (PwE). We aimed to investigate perceptions of epilepsy-related stigma in the Italian general population, assess knowledge of epilepsy and seizure first aid, and identify factors associated with stigmatizing attitudes. We conducted a cross-sectional, anonymized online survey using a structured questionnaire. The instrument comprised four sections: (1) sociodemographic characteristics; (2) epilepsy-related knowledge and perceptions; (3) seizure first aid (12 items on correct and incorrect actions during a seizure, summarized into the Seizure First Aid Knowledge Score (SAFE score; range 0-12)); and (4) stigma assessment using the validated Stigma Scale of Epilepsy (SSE). Multivariate linear regression was used to explore predictors of stigma. A total of 1159 individuals completed the survey (mean age 36.9 ± 16.2 years; 64.6% women). The SSE demonstrated excellent internal consistency (Cronbach's α = 0.89-0.90). Women reported higher stigma than men (53.1 ± 16.8 vs. 48.6 ± 17.6; p < 0.001). Participants from Southern Italy had significantly higher SSE scores compared to Central or Northern regions (53.4 ± 17.5 vs. 48.1 ± 16.4 and 49.8 ± 16.0; p < 0.001). Healthcare professionals reported lower stigma compared with PwE, relatives, or individuals without epilepsy (47.6 ± 17.2 vs. 55.6 ± 18.1, 51.8 ± 16.0, and 52.0 ± 17.2; p = 0.023). A higher SAFE score was inversely associated with stigma (β = -0.89, p = 0.013). Although most participants recognized appropriate first aid measures, misconceptions persisted: 38.1% endorsed inserting hands into the mouth during seizures, and 24.5% considered physical restraint appropriate. Stigma persists in Italy, particularly among women and residents of Southern regions. Tailored educational interventions may both improve seizure safety and mitigate stigma, ultimately enhancing QoL for PwE. 
Epilepsy is often associated with negative attitudes that can affect the well-being of people living with the condition. In this study, we surveyed adults across Italy to understand how epilepsy is perceived, how much people know about seizure first aid, and how stigma varies across different groups. We found that stigma remains common, especially among women and people living in Southern Italy, while better knowledge of how to help during a seizure was linked to lower stigma. These results highlight the importance of public education to improve understanding, safety, and social inclusion.
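The 12-item first-aid score described above sums correct endorsements and correct rejections into a 0-12 range. A minimal sketch of such a scoring rule follows; the item names are hypothetical, not the survey's actual wording:

```python
def safe_score(responses, correct_items, incorrect_items):
    """Score seizure first-aid knowledge: one point for endorsing each
    correct action and one point for rejecting each incorrect action.
    With 12 items in total, the score spans 0-12.

    responses: dict mapping item name -> True (endorsed) / False.
    """
    points = sum(1 for item in correct_items if responses.get(item, False))
    points += sum(1 for item in incorrect_items if not responses.get(item, False))
    return points
```

For instance, a respondent who endorses one correct action, misses another, correctly rejects restraint, but endorses putting something in the mouth would score 2 out of 4 on a four-item version.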
Dry eye disease (DED) is a common ocular condition that, if left untreated, can lead to visual impairment. The present study was conducted to investigate the prevalence of DED among ophthalmologists, optometrists, and nurses. This cross-sectional study included ophthalmologists, optometrists, and nurses who attended the Emirates Society of Ophthalmology (ESO) conference in 2024 in the United Arab Emirates. Eligible participants were asked to complete the Dry Eye Questionnaire (DEQ-5), based on which the presence and severity of DED were assessed. The study included 203 participants (53.2% males), with the majority being ophthalmologists (47.8%), followed by nurses (17.2%), optometrists (13.8%), and others (21.2%). The overall prevalence of DED was 82.8%. DED prevalence was significantly higher among females than males (88.4% vs 77.8%, p = 0.046), and female optometrists had a 4.36-fold higher risk of DED than male optometrists (p = 0.013). Regarding DED severity, 27.6% of the participants had mild DED, 35.5% had moderate DED, and 19.7% had severe DED. Females who worked as ophthalmologists (RR = 1.72, p = 0.045) and those engaged in other professions (RR = 3.31, p = 0.002) had a significantly higher risk of severe DED. This study highlights alarmingly high rates of DED among ocular care professionals. Female optometrists had a significantly higher risk of DED than male optometrists.
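The fold-risk comparisons reported above (e.g., female vs. male optometrists) come from risk ratios on a 2x2 table. A minimal sketch of that calculation, with illustrative counts rather than the study's data:

```python
def risk_ratio(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Risk ratio from a 2x2 table: the risk (case proportion) in the
    exposed group divided by the risk in the unexposed group."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed
```

With 8 of 10 exposed and 2 of 10 unexposed participants affected, the risk ratio would be 4.0, i.e., a 4-fold higher risk in the exposed group.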
Breast cancer is a leading cause of mortality and morbidity among females worldwide. As part of the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2023, we provided an updated comprehensive assessment of the epidemiological trends, disease burden, and risk factors associated with breast cancer globally, regionally, and nationally from 1990 to 2023. Breast cancer incidence, mortality, prevalence, years lived with disability (YLDs), years of life lost (YLLs), and disability-adjusted life-years (DALYs) were estimated by age and sex for 204 countries and territories from 1990 to 2023. Mortality estimates were generated using GBD Cause of Death Ensemble models, leveraging data from population-based cancer registration systems, vital registration systems, and verbal autopsies. Mortality-to-incidence ratios were calculated to derive both mortality and incidence estimates. Prevalence was calculated by combining incidence and modelled survival estimates. YLLs were established by multiplying age-specific deaths with the GBD standard life expectancy at the age of death. YLDs were estimated by applying disability weights to prevalence estimates. The sum of YLLs and YLDs equalled the number of DALYs. Breast cancer burden attributable to seven risk factors was examined through the comparative risk assessment framework. The GBD forecasting framework was used to forecast breast cancer incidence and mortality from 2024 to 2050. Age-standardised rates were calculated for each metric using the GBD 2023 world standard population. In 2023, there were an estimated 2·30 million (95% uncertainty interval [UI] 2·01 to 2·61) breast cancer incident cases, 764 000 deaths (672 000 to 854 000), and 24·1 million (21·3 to 27·5) DALYs among females globally. 
In the World Bank low-income group, where a low age-standardised incidence rate (ASIR) was estimated (44·2 per 100 000 person-years [31·2 to 58·4]), the age-standardised mortality rate (ASMR) was the highest (24·1 per 100 000 [16·8 to 31·9]). The highest ASIR was in the high-income group (75·7 per 100 000 [67·1 to 84·0]), and the lowest ASMR was in the upper-middle-income group (11·2 per 100 000 [10·2 to 12·3]). Between 1990 and 2023, the ASIR in the low-income group increased by 147·2% (38·1 to 271·7), compared with a 1·2% (-11·5 to 17·2) change in the high-income group. The ASMR decreased in the high-income group, changing by -29·9% (-33·6 to -25·9), but increased by 99·3% (12·5 to 202·9) in the low-income group. The increase in age-standardised DALY rates followed that of ASMRs. Risk factors such as dietary risks, tobacco use, and high fasting plasma glucose contributed to 28·3% (16·6 to 38·9) of breast cancer DALYs in 2023. The risk factors with a decrease in attributable DALYs between 1990 and 2023 were high alcohol use and tobacco. By 2050, the global incident cases of breast cancer among females were forecast to reach 3·56 million (2·29 to 4·83), with 1·37 million (0·841 to 2·02) deaths. The stable incidence and declining mortality rates of female breast cancer in high-income nations reflect success in screening, diagnosis, and treatment. In contrast, the concurrent rise in incidence and mortality in other regions signals health system deficits. Without effective interventions, many countries will fall short of the WHO Global Breast Cancer Initiative's ambitious target of achieving an annual reduction of 2·5% in age-standardised mortality rates by 2040. The mounting breast cancer burden, disproportionately affecting some of the world's most vulnerable populations, will further exacerbate health inequalities across the globe without decisive immediate action. Gates Foundation, St Jude Children's Research Hospital.
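The GBD summary-measure arithmetic described in the methods above (YLLs from age-specific deaths and standard life expectancy, YLDs from prevalence and disability weights, DALYs as their sum) can be sketched as follows; all numbers in the usage example are illustrative, not GBD estimates:

```python
def ylls(deaths_by_age, life_expectancy_at_age):
    """Years of life lost: age-specific deaths multiplied by the
    standard life expectancy at the age of death, summed over ages."""
    return sum(deaths * life_expectancy_at_age[age]
               for age, deaths in deaths_by_age.items())

def ylds(prevalent_cases, disability_weight):
    """Years lived with disability: prevalent cases weighted by a
    disability weight between 0 (full health) and 1 (death-equivalent)."""
    return prevalent_cases * disability_weight

def dalys(yll_total, yld_total):
    """Disability-adjusted life-years are the sum of YLLs and YLDs."""
    return yll_total + yld_total
```

For example, 100 deaths at age 50 (33 remaining years) and 200 deaths at age 70 (16 remaining years) give 6,500 YLLs; 10,000 prevalent cases at a disability weight of 0.05 give 500 YLDs, for 7,000 DALYs.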
We present the life and work of William White Cooper, with a focus on his 1850 study On the conical cornea, the earliest known multicentre epidemiological investigation of keratoconus. We contextualise Cooper's contribution within mid-19th-century ophthalmology, reassessing his findings in light of modern epidemiological knowledge. Historical research was conducted using Gallica (the digital platform of the National Library of France (BnF)) and the historical resources of The Lancet website. The keywords used were 'conical cornea' or 'keratoconus,' which were the most common names for keratoconus. William White Cooper, a leading Victorian ophthalmic surgeon and later Surgeon-Oculist in Ordinary to Queen Victoria, conducted what appears to be the first epidemiological study of keratoconus, nearly a century before the earliest study typically cited in modern literature. Drawing on data from multiple ophthalmic hospitals across England, Scotland, Ireland, and Macao, Cooper compiled prevalence figures from over 200,000 examined patients between 1814 and 1850. Prevalence rates, recalculated for comparison with modern standards, varied markedly between regions: from 0 in Edinburgh to 254.0 cases per 100,000 in Plymouth, with an overall prevalence of 85.92 per 100,000 for the United Kingdom. The highest prevalence was observed in Macao (324.15 per 100,000), leading Cooper to hypothesise that warmer climates might favour the occurrence of keratoconus. Despite the diagnostic limitations of the period, these values are strikingly close to those reported in parts of modern Europe, underscoring the scale and relevance of his pioneering work. Cooper's 1850 investigation represents the earliest known multicentre epidemiological study of keratoconus, predating modern reports by nearly a century. 
Despite methodological limitations inherent to the mid-19th century, his findings anticipated contemporary prevalence patterns and introduced the hypothesis of environmental influences on disease occurrence. This work not only enriches the historical record of keratoconus, but also highlights the enduring value of early clinical observation in shaping epidemiological understanding.
Spinal cerebrospinal fluid (CSF) leak is a disabling and often misdiagnosed condition characterised by CSF hypovolemia. Associated neurological symptoms are diverse and often leave individuals bed-bound due to their orthostatic nature. Prior literature describing the difficulties in diagnosis, treatment, and ongoing impact of CSF leak is, thus far, confined to Europe and North America. This study provides a novel account of lived experiences of spinal CSF leak in Australia and Aotearoa New Zealand (NZ). An online survey exploring symptoms, diagnosis, treatment, and effect on daily life of a person's "first" CSF leak was designed with consumer involvement. Responses were received from May to August 2025. Open-text responses were analysed using thematic analysis. In total, 106 surveys were completed. Over 70 symptoms were reported; the most common were orthostatic headache (95.3%), neck pain (85.8%), and brain fog (79.2%). Most people considered their diagnosis (73.6%) and treatment (65.3%) difficult, underscored by limited clinician awareness and access to care, leaving individuals to self-advocate. Amongst symptomatic participants (73.6%), median EuroQol Visual Analogue Scale score was 40 (interquartile range 25-64; indicating low health-related quality-of-life) and mean Headache Impact Test-6 score was 69 ± 5 (indicating severe impact). Other challenges identified included navigating change to social identity and daily functioning. The spinal CSF leak experience in Australia and NZ is comparable to reports from other high-income countries, highlighting the global need to increase awareness of spinal CSF leak, support timely diagnostic, referral and treatment pathways, and mitigate its impact on quality of life.
The therapeutic potential of the norrin protein, Frizzled-4 (FZD4) receptor, and the β-catenin signaling pathway is gaining increasing importance in ophthalmology due to their multifaceted effects on the function of the blood-retinal barrier (BRB). Disruption of BRB integrity represents a key mechanism underlying pathological vascular permeability and the progression of diabetic retinopathy and age-related macular degeneration. There remains an unmet need for the development of therapies specifically targeting the protection and regeneration of the BRB, as current treatment strategies primarily focus on symptom alleviation rather than addressing the underlying mechanisms of barrier breakdown. Interest in norrin signaling within the Wnt/β-catenin pathway arises from its crucial role in maintaining BRB integrity through the activation of the FZD4 receptor and LRP5/6 co-receptors, leading to the stabilization of endothelial junctional complexes and the maintenance and restoration of retinal vascular barrier properties, particularly following vascular endothelial growth factor (VEGF)-induced barrier disruption. Modulation of this pathway may represent a novel therapeutic approach aimed at protecting and regenerating the BRB in degenerative retinal diseases. We identify current research directions regarding new therapeutic strategies for ophthalmic diseases associated with BRB dysfunction, with particular emphasis on the role of norrin protein, Frizzled-4 (FZD4) receptors, and β-catenin. A comprehensive search was conducted in electronic databases (PubMed, Google Scholar) using the following keywords: blood-retinal barrier, norrin protein, Frizzled-4 receptors (FZD4), β-catenin, Wnt signaling, retinal diseases, Wnt signaling agonists, Wnt signaling inhibitors, monoclonal antibodies, gene therapy. 
English-language publications from 2015 to 2025 were analyzed, focusing on the role of norrin, FZD4, and β-catenin in maintaining BRB integrity and their potential therapeutic applications in retinal diseases. Additionally, reference lists of selected publications were reviewed. Current research directions focus on three potential therapeutic strategies: modulation of Wnt/norrin signaling using pathway agonists and inhibitors, application of monoclonal antibodies targeting receptors, and gene therapy. Although all these approaches demonstrate promising therapeutic potential, further clinical studies are required to assess their long-term efficacy and safety.
Fundus autofluorescence captures the fluorescence of the retina, specifically the retinal pigment epithelium (RPE). Within the RPE, fluorophores, including lipofuscin, melanolipofuscin, and melanosomes, fluoresce when excited by light of varying wavelengths. Quantitative fundus autofluorescence (QAF), which has been implemented using blue excitation light, enables mapping and quantifying lipofuscin and melanolipofuscin fluorescence. This is achieved by using a reference bar to standardize measurements. The technical functionality of QAF relies on repeatability and high image quality. Multicenter studies, however, indicate variability. Age-related lens opacities also affect QAF, necessitating individualized correction formulas for accuracy. Accordingly, this review focuses on the development and application of QAF, addressing technical considerations and its relevance to structural and cellular changes in age-related macular degeneration (AMD), and highlighting how QAF can provide clinically meaningful information for AMD diagnosis and therapy monitoring. In AMD, numerous independent studies have found reduced autofluorescence at the posterior pole, with autofluorescence decreasing further as AMD progresses. Typical AMD lesions, such as subretinal drusenoid deposits, have been associated with significant QAF reduction. On a cellular level, granule aggregation and degranulation are identified as histological correlates. Subcellularly, technologies like serial block-face scanning electron and structured illumination microscopy have revealed a reduced lipofuscin volumetric density and RPE dysmorphia associated with AMD's reduced autofluorescence. New software tools enhance QAF's potential for detailed lesion analysis. Future advancements in QAF include integrating new excitation wavelengths and combining quantified emission spectra and fluorescence lifetimes to improve early detection of AMD-related changes. 
Despite challenges, QAF continues to evolve, promising better insights into retinal health and disease.
Objective: This study aimed to evaluate the associations between body mass index (BMI), dietary nutrient intake, and the risk of branch retinal vein occlusion (BRVO) in a nationally representative Korean population, with a focus on sex- and BMI-specific differences. Methods: This population-based cross-sectional study analyzed data from adults aged ≥40 years who participated in the 7th and 8th Korea National Health and Nutrition Examination Surveys (KNHANES) conducted between 2017 and 2021. BRVO was diagnosed using standardized fundus photography and optical coherence tomography interpreted by certified retinal specialists. BMI was categorized according to Asia-Pacific criteria, and dietary intake of macronutrients and micronutrients was assessed using structured nutrition surveys. Comparisons were performed between participants with and without BRVO, stratified by sex and BMI category. Survey-weighted logistic regression models were applied to account for the complex sampling design and to adjust for age, smoking status, alcohol consumption, hypertension, diabetes mellitus, and BMI. Results: A total of 15,790 participants (6,629 men and 9,161 women) were included, of whom 96 (43 men and 53 women) were diagnosed with BRVO. Participants with BRVO were significantly older and had a higher prevalence of hypertension (all P < 0.001). Overall, individuals with BRVO showed significantly lower intakes of protein, crude fiber, iron, sodium, and potassium (all P < 0.05). Although overall BRVO prevalence did not differ significantly across BMI categories, sex-stratified analyses revealed an increasing prevalence with higher BMI in women. Associations between dietary nutrient intake and BRVO varied across BMI strata. Conclusions: Lower nutrient intake and metabolic risk factors, including hypertension and BMI, may be associated with BRVO risk. The observed sex- and BMI-specific differences highlight the importance of tailored nutritional and metabolic risk assessment. 
Longitudinal studies are warranted to clarify causal relationships.
Although cancer screening is essential for early detection and an improved prognosis, screening beyond the recommended guidelines may increase the risk of false-positive results. Consequently, educating individuals about the potential harm of non-guideline-based cancer screening is essential; however, effective communication methods remain unclear. This study aimed to evaluate the effectiveness of different types of educational videos in encouraging preferences for guideline-based cancer screening. This 3-arm pseudorandomized controlled trial was conducted in June 2025 using a Japanese online survey platform. Eligible respondents were working adults aged 30 to 60 years with no history of major cancer. Respondents were assigned to 1 of the following 3 video conditions: video A, which provided a logical explanation of false-positive risks; video B, which presented the narrative of a woman who received a false-positive result from breast cancer screening; and video C, which depicted a man who underwent unnecessary follow-up testing after tumor marker screening. The primary outcome was the preference for guideline-based cancer screening after watching the videos. The secondary outcomes included 7 self-reported video evaluation items, such as perceived relevance and clarity, assessed using a 5-point Likert scale. Differences in the primary outcome between video groups were analyzed using multivariable logistic regression with adjustment for covariates. Means and 95% CIs were calculated for each secondary outcome according to sex and video group. In addition, before-and-after changes in screening preferences were assessed using the McNemar test, with a significance level of .05. In total, 1200 respondents (400 per group) completed the survey. No statistically significant differences in the primary outcome were observed among the video groups. 
With video A as the reference, the adjusted odds ratios for preferring guideline-based screening were 0.89 (95% CI 0.59-1.32) for video B and 0.98 (95% CI 0.65-1.46) for video C. Regarding secondary outcomes, male respondents rated video B less favorably than female respondents in terms of relevance and willingness to undergo guideline-based screening. The before-and-after comparison showed a significant change in preference for guideline-based screening (P=.04). These videos appeared to be more effective for individuals with an annual history of colorectal cancer screening than for those without such a history. Educational videos have the potential to influence cancer screening preferences; however, no single video format has demonstrated clear superiority. These findings underscore the importance of tailoring educational materials to the target audience characteristics. Further research is required to develop effective strategies for encouraging guideline-based cancer screening. University Hospital Medical Information Network Clinical Trials Registry UMIN000060549; https://center6.umin.ac.jp/cgi-open-bin/ctr/ctr_view.cgi?recptno=R000066119.
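The before-and-after preference comparison above uses the McNemar test, which depends only on the discordant pairs (respondents whose preference changed in one direction or the other). A minimal stdlib sketch of the exact (binomial) version, not the authors' implementation:

```python
from math import comb

def mcnemar_exact_p(b, c):
    """Two-sided exact McNemar p-value from discordant-pair counts:
    b respondents changed yes -> no, c changed no -> yes. Under the
    null of no systematic change, b ~ Binomial(b + c, 0.5)."""
    n = b + c
    if n == 0:
        return 1.0  # no discordant pairs: no evidence of change
    tail = sum(comb(n, i) for i in range(min(b, c) + 1)) / 2 ** n
    return min(1.0, 2 * tail)
```

For example, with 1 respondent switching away from and 5 switching toward guideline-based screening, the exact two-sided p-value is 14/64 ≈ 0.219.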
Ultrasound imaging has played an important role in ophthalmic diagnostics due to its real-time capability, safety, and cost-effectiveness. In recent years, several novel ultrasound modalities have been applied to the diagnosis of ocular diseases, including contrast-enhanced ultrasound (CEUS), photoacoustic imaging (PAI), 3-dimensional ultrasound (3D-US), microvascular flow imaging (MFI), super-resolution ultrasound localization microscopy (SRULM), ultrasound elastography, and high-frequency ultrasound (HFUS). These technologies offer improvements in spatial resolution, tissue characterization, and functional imaging. Specifically, CEUS, PAI, MFI, and SRULM allow for the evaluation of ocular blood flow and vasculature, while HFUS and elastography enhance the assessment of intraocular structures and tissue stiffness. 3D-US contributes to the volumetric analysis of ocular lesions. In parallel, the integration of artificial intelligence with ultrasound has enabled automated image interpretation and disease classification, with applications in various ocular diseases, such as retinal detachment, intraocular tumor, and glaucoma. Despite these advances, limitations remain, such as the difficulty in balancing image resolution with penetration depth. Further development in multimodal imaging, algorithm optimization, and clinical validation is needed. Therefore, we review current progress in novel ultrasound modalities, explore their clinical potential in ophthalmic applications, and outline existing challenges as well as future research directions.
BACKGROUND Missed appointments (patient no-shows) are a critical challenge undermining healthcare system efficiency globally. This study aims to characterize the patient no-show phenomenon in Poland, identify factors associated with missed appointments, and propose potential measures to reduce the no-show phenomenon in the Polish healthcare system. MATERIAL AND METHODS A nationwide cross-sectional survey was conducted using computer-assisted web interviews (CAWI) from August 1 to 4, 2025. The study used quota sampling stratified by sex, age, and residence to obtain a nationwide sample of 1162 Polish adults aged 18 to 96 years. A self-prepared questionnaire was used. RESULTS Among all respondents, 88.5% used healthcare services within the previous 12 months. Among healthcare users (n=1014), 14% missed appointments without cancellation. Forgetting appointments (42.3%) and communication barriers (27.5%) were identified as the primary reasons for no-shows. Text message (SMS) reminder systems received 62.5% support, while 67.5% endorsed the implementation of a penalty fee for public system non-attendance. Multivariable analysis revealed significantly (P<0.05) increased odds of no-shows among adults aged under 60 years, parents with children <18 years (aOR, 2.09; 95% CI, 1.28-3.40), and individuals with moderate (aOR, 1.80; 95% CI, 1.19-2.72) or poor financial status (aOR, 2.60; 95% CI, 1.47-4.60). CONCLUSIONS This study showed a relatively high prevalence of missed appointments in Poland. Young age, parental responsibilities, and economic constraints were associated with higher odds of no-shows. Findings support expanding digital notification systems and multi-channel communication infrastructure to reduce no-shows, rather than using punitive approaches.
Cognitive and functional decline are key indicators of mortality risk in older adults. However, temporal onset and progression trajectories of these declines before death, and their cross-national applicability, remain poorly understood. This study aimed to investigate the trajectories of memory performance, depressive symptoms, activities of daily living (ADL), and instrumental activities of daily living (IADL) across four large-scale cohorts: CHARLS (China), ELSA (England), HRS (USA), and SHARE (Europe), to identify patterns associated with imminent mortality. Backward timescale analyses were applied to examine the temporal progression of memory performance, CES-D depressive symptoms, ADL limitations, and IADL impairments leading up to death. Particular attention was given to identifying the onset point and rate of decline associated with mortality. Across the CHARLS, ELSA, HRS, and SHARE cohorts, distinct trajectories in memory performance, depressive symptoms (CES-D scores), ADL, and IADL were observed, highlighting varying patterns associated with mortality as individuals approached death. Memory performance declined significantly in all cohorts, with deceased individuals in CHARLS showing a sharp drop starting 3 years before death (coefficient: -0.0189, p < 0.0001), consistent with trends in ELSA, HRS, and SHARE. CES-D scores progressively increased among deceased individuals, with divergence beginning 6 years before death in CHARLS and 4 years before death in ELSA, peaking at 1 to 2 years before death in most cohorts. ADL scores demonstrated an early decline, with significant differences emerging at 7 years before death in CHARLS and even earlier in HRS (24 years before death), while IADL scores showed more gradual changes, with notable increases in the last 4 to 5 years before death across all cohorts. 
Memory performance exhibited the strongest association with mortality, followed by ADL scores, emphasizing the critical role of cognitive and functional abilities in end-of-life health trajectories. This study provides cross-national evidence delineating the temporal onset and progression trajectories of cognitive and functional decline before death. Memory performance was the most powerful indicator of imminent mortality, followed by ADL decline. These findings underscore the value of early recognition of declining cognitive and functional abilities among middle-aged and older adults, offering actionable insights for targeted screening, preventive interventions, and end-of-life care planning.
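The backward-timescale approach used above aligns each participant's repeated measurements by time before death rather than by study entry. A minimal sketch of that alignment step, with hypothetical visit data:

```python
from collections import defaultdict
from statistics import mean

def align_to_death(visits):
    """Backward-timescale alignment: group repeated measurements by
    years before death and average within each bin, so trajectories
    are compared relative to death rather than to baseline.

    visits: iterable of (years_before_death, score) pairs.
    Returns {years_before_death: mean score}, nearest to death first.
    """
    bins = defaultdict(list)
    for years_before, score in visits:
        bins[years_before].append(score)
    return {t: mean(scores) for t, scores in sorted(bins.items())}
```

In practice, the binned trajectories of deceased and surviving participants are then contrasted to locate the onset point at which their curves diverge.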
To evaluate the risk factors and quality of life (QoL) in geographic atrophy (GA) in a Korean population. This cross-sectional study, based on data from the Korea National Health and Nutrition Examination Survey (2017-2021), included 15,030 subjects aged ≥ 40 years. Participants were classified into four categories: early age-related macular degeneration (AMD) (n = 2,068), wet AMD (n = 94), GA (n = 36), and no AMD (n = 12,832). Demographics and activity limitations were compared across these groups. Additionally, the subjects' QoL was assessed using the EuroQoL-5 Dimension (EQ-5D) and Health-Related Quality of Life Instrument with 8 Items (HINT-8) methods. Among the four groups, the GA group showed the highest mean age (72.47 ± 7.45 years), prevalence of current smokers (61.11%), and prevalence of diabetes (41.67%). The GA group reported activity limitations (16.67%) and vision problems (5.56%) at relatively higher rates compared to the other groups. In the QoL assessment, the EQ-5D evaluation showed that the GA group experienced a similar decline in QoL as the wet AMD group (0.93 ± 0.02 vs. 0.93 ± 0.01; P = 0.971), while the HINT-8 evaluation indicated a relatively more severe QoL decline in the GA group compared to the wet AMD group (22.74 ± 3.70 vs. 20.35 ± 3.33; P = 0.007). The GA group frequently reported difficulties with mobility (27.27%), usual activities (15.15%), and climbing stairs (61.54%), which were comparable to the frequencies observed in patients with wet AMD. In South Korea, patients with GA tended to have relatively higher rates of current smoking and diabetes compared to individuals without GA. A decline in QoL in these patients was primarily associated with activity limitations. To improve patients' QoL, social support is needed to help overcome these activity limitations. Furthermore, it will be necessary to introduce therapeutic interventions to slow the progression of GA.
Parental stress is a critical yet understudied dimension of childhood total blindness, a condition that imposes substantial developmental, emotional, and functional challenges on families. This cross-sectional study assessed parenting stress, maternal health symptoms, and children's functional vision-related quality of life in 81 mothers of children aged 0 to 12 years with total congenital blindness. Parenting stress was assessed in the full sample using the Parenting Stress Index-Fourth Edition (PSI-4). Children's functional vision-related quality of life was evaluated in age-specific subsamples using the Quality of Family Vision Impact (QFVI-3 for children aged 0-3 years and QFVI-7 for children aged 3-7 years). All participants also completed a sociodemographic and maternal health survey. Total Parent Stress showed moderately elevated percentile scores (mean ≈ 67), with the highest PSI-4 subdomains in Adaptability, Depression, and Health. Approximately 21% of mothers scored within the clinical range for high stress. Maternal symptoms including sadness, insomnia, headaches, forgetfulness, and musculoskeletal pain were significant (all p < 0.01). QFVI global scores indicated moderate impairments in functional vision-related quality of life across age groups. Life Stress demonstrated a small-to-moderate negative correlation with QFVI-7, suggesting that cumulative environmental stressors may adversely affect children's functional outcomes. Several factors were associated with more favorable outcomes. Among children under three years of age, maternal engagement in physical activity was associated with higher QFVI scores, whereas among children aged 3-7 years, school attendance was associated with higher functional vision-related quality of life scores. In contrast, sociodemographic disadvantage, limited access to educational adaptations, and reduced maternal participation in work or leisure activities were associated with higher levels of parental stress. 
These findings highlight the importance of multidisciplinary, family-centered care incorporating psychosocial assessment, early stimulation, orientation and mobility support, and maternal mental health interventions in pediatric ophthalmology.
Dry eye syndrome (DES) is a prevalent ocular condition that significantly impacts affected individuals' quality of life and occupational performance. This study investigates the prevalence and contributing factors of DES among pilots, a group particularly susceptible to environmental and occupational stressors. A descriptive, observational study was conducted involving 794 pilots. Based on the severity of DES, these pilots were assigned to mild, moderate, and severe groups. Data were collected through surveys and analyzed using multiple linear regression to determine the relationship between DES scores and potential influencing factors. The study revealed that all pilots included in the present study were affected by DES, of whom 88.40% experienced moderate DES and 11.60% reported severe DES. After adjusting for other covariates in the model, the multivariate analysis revealed that eyelid diseases, ocular surface disease, poor sleep quality, and fatigue were significantly and positively associated with higher DES scores (p < 0.05), while residing in the southern region and engaging in physical activities were significantly and negatively associated with DES scores (p < 0.05). The high prevalence of DES in pilots highlights the urgent need for tailored occupational health interventions. Strategies to mitigate DES risk should include promoting regular physical exercise, improving sleep quality, and addressing fatigue. Future research should prioritize longitudinal studies to establish causal relationships and develop targeted management approaches for this high-risk occupational group.
We synthesized current evidence on the spectrum of ophthalmic alterations in prematurely born children without retinopathy of prematurity (ROP). A literature review was conducted in accordance with PRISMA guidelines. Scopus, PubMed, Web of Science, and Cochrane Library were searched for English-language studies published between 2000 and 2025. Eligible studies included observational designs reporting ophthalmic outcomes in infants born before 37 weeks of gestation without ROP. A total of 32 studies comprising 4,548 individuals were included. Data were qualitatively synthesized across visual, refractive, anterior segment, and posterior segment domains. Mean visual acuity in preterm children without ROP ranged from -0.06 to 0.16 logMAR and was generally comparable to term peers. Stereoacuity was reduced but showed improvement with age. Strabismus was reported in 5-20% of cases, increasing with age, and most commonly presented as esotropia. Nystagmus incidence ranged from 0% to 14%. Anisometropia was reported in 1.4-16.7% of preterm children without ROP, with one study demonstrating a higher prevalence compared to term peers. Hyperopia was highly prevalent, ranging from 20.9% to 86.2%, while astigmatism was reported in 14.1-65.1% of cases. Astigmatism predominated at one year of age, whereas hyperopia became the most common refractive error by six years. Hyperopia tended to increase with age, astigmatism gradually declined, and myopia rates remained relatively stable. Anterior segment findings in infancy included greater central corneal thickness, smaller corneal diameter, reduced axial length, shallower anterior chamber depth, and increased lens thickness. Posterior segment alterations included thicker central macula and focal thinning of the retinal nerve fiber layer in some reports. Premature birth, even in the absence of ROP, is independently associated with persistent structural and functional ophthalmic alterations.
These findings highlight the need for long-term surveillance and early ophthalmologic intervention in this population.
Hereditary optic neuropathies are characterized by bilateral visual loss due to the degeneration of retinal ganglion cells, resulting in optic nerve degeneration and atrophy. Although the genetic origins of the main isolated and syndromic hereditary optic neuropathies have been characterized, the clinical phenotypes exhibit significant and poorly understood variability in both penetrance and expressivity. Additionally, the genetic and environmental factors that influence the onset of these optic neuropathies remain poorly understood, with few biomarkers available to predict disease progression or to serve as readouts for therapeutic trials. Data-driven omics strategies allow deep phenotyping to improve our understanding of pathophysiological mechanisms and to search for new biomarkers and therapeutic targets. We explore whether the omics strategies applied to patients with hereditary optic neuropathies have provided such new insights. The MEDLINE, Web of Science and EMBASE databases were screened for studies with terms relating to hereditary optic neuropathies, transcriptomics, epigenomics, proteomics, metabolomics and lipidomics in clinical studies exploring patient samples. Out of 1244 references identified, 22 articles were included after double-masked data curation. These articles focused only on the three main forms of hereditary optic neuropathies, namely OPA1-related dominant optic atrophy (n = 4), Leber hereditary optic neuropathy (n = 13) and Wolfram syndrome (n = 5). While the methodological designs and results of these studies were highly heterogeneous, they revealed molecular alterations that we have attempted to discuss at the integrated multi-omics level. This data integration highlighted several common pathophysiological mechanisms, such as energetic impairment, endoplasmic reticulum stress, proteotoxic and oxidative stresses, lipid remodeling and altered amino acid and purine metabolisms, while suggesting potential new biomarkers and therapeutic targets.
These findings underscore the potential of integrated multi-omics approaches to deepen our understanding of the phenotypic complexity of hereditary optic neuropathies and to support the development of innovative diagnostic and therapeutic strategies.
Ocular toxocariasis (OT) is an inflammatory eye disorder caused by intraocular migration of Toxocara canis or Toxocara cati larvae, constituting a significant cause of visual impairment among children in developing countries. Early and precise diagnosis is crucial for optimal treatment and a favorable prognosis. We review two commonly employed laboratory diagnostic approaches, early pathological diagnosis and the currently predominant immunological diagnosis, and discuss the limitations inherent to each. Although pathological diagnosis is the gold standard for OT, the technical difficulty and inherent risks of ocular tissue biopsy limit its routine use in clinical practice. The first detection of anti-Toxocara IgG (T-IgG) in the vitreous and aqueous humor (AH) in 1979 established a basis for the immunological diagnosis of OT. ELISA is a commonly applied immunological detection method, with various commercial kits available targeting different antigens. Isolated serum T-IgG testing cannot support an OT diagnosis, whereas intraocular T-IgG detection now constitutes the preferred laboratory diagnostic method. AH provides a reliable proxy when vitreous sampling is unfeasible. When T-IgG detection alone proves inconclusive, supplementary measurement of total IgE levels and eosinophil counts in intraocular fluids (IF) may enhance diagnostic accuracy. Furthermore, Western blot (WB) analysis and cytokine profiling of IF can serve as auxiliary diagnostic tools. Currently, calculation of the Goldmann-Witmer coefficient (GWC) is recommended for the definitive diagnosis of OT. Given the technical complexity and economic burden of total IgG assays, the feasibility of omitting GWC correction in patients with positive intraocular T-IgG and compatible ocular manifestations requires validation through large-scale multicenter studies.
In aggregate, definitive parasite identification through ocular histopathology or imaging remains limited in OT clinical management, rendering immunological methods essential for diagnosis when direct parasitological evidence is absent.
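The Goldmann-Witmer coefficient referenced above compares the ratio of pathogen-specific to total IgG in intraocular fluid against the same ratio in serum, so that values well above unity point to local intraocular antibody production. A minimal sketch of the calculation follows, with illustrative numbers and an assumed cutoff of 3 (cutoffs in the literature vary; this is not clinical guidance):

```python
# Minimal sketch of the Goldmann-Witmer coefficient (GWC) calculation.
# Units must match between compartments; the numeric values and the
# cutoff of 3 used below are illustrative assumptions only.

def goldmann_witmer(
    ocular_specific_igg: float,  # anti-Toxocara IgG in aqueous/vitreous
    ocular_total_igg: float,     # total IgG in the same intraocular fluid
    serum_specific_igg: float,   # anti-Toxocara IgG in serum
    serum_total_igg: float,      # total IgG in serum
) -> float:
    """GWC = (ocular specific / ocular total) / (serum specific / serum total)."""
    ocular_ratio = ocular_specific_igg / ocular_total_igg
    serum_ratio = serum_specific_igg / serum_total_igg
    return ocular_ratio / serum_ratio

gwc = goldmann_witmer(
    ocular_specific_igg=12.0,
    ocular_total_igg=40.0,
    serum_specific_igg=30.0,
    serum_total_igg=1200.0,
)
print(round(gwc, 1))  # 12.0 -> well above the assumed cutoff of 3
```

Normalizing by total IgG in each compartment is what corrects for passive leakage of serum antibody across a compromised blood-ocular barrier, which is why omitting the total IgG assays (as discussed above) remains a question for validation rather than a settled shortcut.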