Despite the introduction of the two-dose measles-containing vaccine (MCV2) in 2014, measles remains a persistent public health problem in Tanzania, characterized by recurrent outbreaks and shifting epidemiological patterns. Understanding the epidemiology of measles in the country is therefore essential to inform context-specific elimination strategies. During the 2023 and 2024 measles outbreaks, data and samples from the National Measles Epidemiological Surveillance System were used. Sera were processed to detect specific measles immunoglobulin M (IgM) antibodies using an indirect enzyme-linked immunosorbent assay. In addition, in-depth interviews and focus group discussions were conducted with key stakeholders and caregivers, respectively, to explore factors associated with measles outbreaks. A total of 17,902 suspected measles cases were enrolled; of these, 11,576 (64.7%) were confirmed measles cases. The overall incidence rate declined markedly from 163.9 cases per 1,000,000 population in 2023 to 15.4 cases per 1,000,000 population in 2024, with the highest incidence occurring among children aged <1 year. Of the suspected measles cases, 10,239 (57.2%), 1,311 (7.3%), and 26 (0.15%) were confirmed as epi-linked, laboratory-confirmed, and clinically compatible, respectively. Higher proportions of confirmed measles cases were observed among children aged >5 years and infants aged <9 months than among children aged 12-59 months (p < 0.001). Significantly higher proportions of cases were enrolled from rural areas and were zero-dose. Of 7,637 suspected measles cases tested, measles IgM antibodies were detected in 1,311 (17.2%), with significantly more IgM-positive cases in 2023 than in 2024. In multivariate analysis, factors significantly associated with laboratory-confirmed measles were age ≥5 years, being under-vaccinated or unvaccinated, and unknown vaccination status. 
The estimated effectiveness of two vaccine doses in preventing laboratory-confirmed measles during the 2023 and 2024 outbreaks was 80.7%. Under-vaccination and non-vaccination, driven by unavailability of vaccination services, stockouts, and missed opportunities, were highlighted by stakeholders as contributing factors to the measles outbreaks. The study revealed that the vast majority of confirmed cases occurred among inadequately immunized children, with limited access to vaccination services, itself driven by multiple underlying factors, emerging as a key contributor to the outbreaks. These findings highlight the importance of implementing context-specific strategies to close immunity gaps.
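Two-dose effectiveness figures like the 80.7% above are conventionally computed as VE = (1 − RR) × 100, where RR compares the attack rate in vaccinated with unvaccinated children. A minimal sketch with invented counts (not the study's data), chosen only so the result lands near the reported value:

```python
def vaccine_effectiveness(risk_vaccinated: float, risk_unvaccinated: float) -> float:
    """VE (%) = (1 - RR) * 100, where RR is the risk ratio comparing
    vaccinated with unvaccinated children."""
    rr = risk_vaccinated / risk_unvaccinated
    return (1 - rr) * 100

# Invented counts for illustration: 29 confirmed cases among 3,000 two-dose
# children versus 150 cases among 3,000 zero-dose children.
ve = vaccine_effectiveness(29 / 3000, 150 / 3000)
print(round(ve, 1))  # 80.7
```

The same formula applies whether the risks come from a cohort comparison or a screening-method estimate; what differs is how the two risks are obtained and adjusted.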
The taxonomic boundaries of Ceratocystis species related to C. manginecans have remained contentious due to limited morphological variation, interfertility in laboratory mating studies, and the application of different species concepts. However, recent studies have highlighted significant differences among species recently reduced to synonymy with C. manginecans, including host associations and biological traits, all with important implications for disease management. We examined the phylogenetic relationships, genetic diversity and population structure of isolates considered broadly as Ceratocystis manginecans, specifically those treated as C. eucalypticola and C. manginecans in various previous studies. The study included a comprehensive dataset of isolates from multiple hosts and regions, including both historical and recent outbreaks of Ceratocystis canker and wilt disease. Multi-locus phylogenetic analyses, supported by genotyping at 16 SSR loci, resolved two genetically distinct lineages: Lineage 1 (previously treated as C. eucalypticola) and Lineage 2 (previously treated as C. manginecans). These lineages showed structured genetic differentiation across geographic regions and host associations, consistent with previously reported differences in aggressiveness and infection biology. Detailed morphological comparisons also provided support for the two groupings. Although the results revealed the existence of two lineages among isolates currently treated as C. manginecans, discordance among loci and signals of admixture in a subset of isolates were consistent with mixed ancestry and/or incomplete lineage sorting. This suggests incomplete reproductive isolation, which contributes to the current taxonomic ambiguity. The results support treating the isolates in the two groups as distinct lineages (Lineage 1 and 2) within C. 
manginecans, recognising their differences while maintaining a conservative taxonomic approach and thus facilitating important quarantine considerations and future studies focused on disease management.
Pathogen genomics plays a central role in infectious disease surveillance and outbreak response. However, information about available training initiatives remains fragmented, limiting visibility into how programmes are structured, delivered, and assessed. We conducted a structured survey to characterise pathogen genomics training initiatives identified through the PHA4GE Training and Workforce Development Working Group and affiliated professional networks. Eighty-one courses, representing pathogen genomics training initiatives from 17 countries, were analysed. Over half (52%) targeted academic or research audiences and 46% targeted public health professionals. The majority of courses were delivered as short, limited-duration standalone courses. Beginner-level courses accounted for 58% of offerings, whereas only 6% were classified as advanced. Bioinformatics or genomic data analysis was widely represented (72%), while specialised areas such as biostatistics and systems administration were less frequently included. Nearly half (48%) of courses focused on broadly applicable genomic methods rather than being restricted to a single pathogen. Among courses centred on specific organisms, viral pathogen themes were the most commonly represented. Over one-third of courses (38%) did not include structured assessments, and only 7% incorporated quizzes or exams. Most courses relied on local computing resources such as laptops or desktops during delivery (93%). Use of high-performance computing (HPC) and cloud platforms was limited during training but higher afterwards, with 37% and 39% of courses indicating use, respectively. This landscape analysis identifies structural patterns, including geographic concentration of providers, predominance of introductory formats, and variability in assessment practices and in the use of advanced computing infrastructure across training phases. 
The findings provide empirical insight into characteristics of pathogen genomics training that may inform efforts to strengthen coordinated and sustainable workforce development strategies.
RNA viruses are historically characterized by acute, strictly transient infections, with few exceptions. However, the emerging phenomenon of viral RNA persistence is fundamentally challenging this paradigm. Accumulating data across non-retroviral RNA viruses, including several families of Risk Group 4 (RG4) pathogens, demonstrate that viral genetic material can persist within the host long after clinical recovery and the clearance of systemic viremia. This review explores this new frontier in infectious diseases, synthesizing recent evidence for viral RNA persistence in RG4 zoonotic viruses and its profound clinical and public health implications, from post-acute sequelae and fatal recrudescence to the risk of re-igniting outbreaks. We further attempt to integrate findings from disease-susceptible hosts with those from natural reservoir hosts and propose a hypothetical, central role of viral RNA persistence in the maintenance and transmission of zoonotic RNA viruses, an idea we refer to as "the Latency Hypothesis". These updates underscore the critical need to elucidate the molecular mechanisms underlying persistence and to develop targeted medical countermeasures.
In the midst of a worldwide outbreak of lepidopteran chewing pests such as the fall armyworm (FAW; Spodoptera frugiperda), innovative strategies for pest control are needed. Recent findings have linked the resistance of rice (Oryza sativa) to lepidopteran pests to diterpenoid phytoalexins (DPs), antimicrobial compounds produced in response to disease and abiotic stress conditions. In this study, we explored the involvement of DP biosynthesis-related genes in the response and resistance of rice plants to lepidopteran pests. In interactions between rice and FAW or the oriental armyworm (Mythimna separata), feeding damage from larvae induced the expression of DITERPENOID PHYTOALEXIN FACTOR (DPF), which encodes a key transcription factor in DP biosynthesis, as well as DP biosynthetic genes. Transcriptional analysis suggested that DPF promotes DP biosynthesis in response to chewing by lepidopteran herbivores and other stresses. When lepidopteran larvae were reared on the leaves of DPF-overexpressing transgenic rice plants, which persistently accumulate high DP levels, FAW larvae exhibited poor growth within days. Overexpression of DPF also suppressed the growth of larvae from the oriental armyworm, although the suppression was more moderate. These results demonstrate that DPF overexpression enhances plant resistance to lepidopteran pests, highlighting the potential of DPF as a tool for biotechnological pest control.
A comprehensive synthesis of current evidence on surveillance, prevention and control measures for 25 selected vector-borne diseases (VBDs) affecting animals in the EU is presented here. The assessment integrates evidence from systematic literature reviews, modelling studies, field investigations and expert judgement. The aim is to map the availability and effectiveness of risk mitigation measures (RMMs), including surveillance, movement restrictions, vaccination, culling, medicinal treatments, on-farm biosecurity and vector control. Of the 25 VBDs considered, 13 are listed under the EU Animal Health Law (AHL) and categorised according to disease control priorities, while 12 additional non-listed diseases were included based on vector presence and epidemiological relevance. Surveillance emerges as the most comprehensively documented mitigation category and is recognised as the cornerstone of VBD risk management. Evidence supports its critical role in early detection, situational awareness and timely response through integrated monitoring of hosts, vectors, pathogens and environmental drivers. Movement restrictions are supported by a moderate evidence base, primarily derived from modelling studies, indicating that they can reduce transmission under specific epidemiological conditions; however, their stand-alone effectiveness is difficult to quantify in real-world outbreaks. Vaccination shows high efficacy and effectiveness for several key VBDs, including African horse sickness, bluetongue, lumpy skin disease and Rift Valley fever, with substantial reductions in infection, morbidity and mortality. Evidence for culling is limited and highly disease-specific, and its effectiveness is generally contingent on combination with other control measures. Medicinal treatments are limited to some groups of pathogens and demonstrate variable effectiveness, with emerging concerns regarding drug resistance. 
Biosecurity and vector control measures are supported by heterogeneous evidence, with chemical vector control showing robust entomological efficacy, but limited empirical data linking vector reduction to disease incidence. The report identifies key knowledge gaps and uncertainties across RMM categories and provides targeted recommendations for research to strengthen future VBD mitigation capacity in the EU.
Extensive-stage small cell lung cancer (ES-SCLC) is an aggressive malignancy with an extremely poor prognosis. For a long time, platinum-based chemotherapy combined with etoposide has been the primary treatment option. Although the initial response rate is high, the vast majority of patients face the dilemma of rapid recurrence and drug resistance. In recent years, the application of immunotherapy has brought about a significant breakthrough in the treatment of ES-SCLC. Multiple Phase III clinical trials have demonstrated that combining immune checkpoint inhibitors with traditional chemotherapy regimens as first-line treatment significantly improves the median overall survival (OS) and progression-free survival (PFS) in patients, while maintaining manageable safety profiles. Therefore, chemotherapy combined with immunotherapy has become the new standard for first-line treatment of ES-SCLC worldwide. However, the absolute survival benefit from immunotherapy remains limited. Against this backdrop, thoracic radiotherapy (TRT), as an effective local treatment modality, shows potential for further survival gains. The combination of chemoimmunotherapy and TRT is emerging as a key area of current clinical exploration. However, the characteristics of the patient population that may benefit most from this treatment modality, as well as the optimal dose and timing of TRT, remain under investigation. Furthermore, the predictive value of previously discussed biomarkers in this combination therapy strategy for ES-SCLC remains unclear. Therefore, this paper reviewed recent advances in treatment strategies and candidate biomarkers for ES-SCLC, with a particular focus on the evolving role of thoracic radiotherapy in the era of immunotherapy.
Crohn's disease (CD) is a chronic inflammatory bowel condition with increasing global incidence. Diet is a key modulator of chronic inflammation, often assessed by the Dietary Inflammatory Index (DII). However, prospective evidence linking the DII to CD risk remains limited in large populations. This study used the UK Biobank (UKB) to address this gap. Cox regression models were used to examine the association between energy-adjusted dietary inflammatory index (E-DII) quartiles and CD, with sequential adjustment for confounding variables. Restricted cubic spline (RCS) regression was additionally adopted to determine the association between the continuous E-DII and CD risk. Furthermore, sensitivity and subgroup analyses were conducted to explore the consistency of the results. Among 207,582 participants included in the analysis, 455 incident CD cases were documented over a mean follow-up period of 12.65 years. A positive association was observed between higher E-DII scores, indicating a more pro-inflammatory diet, and an increased risk of CD. In the fully adjusted multivariable model, participants in the highest quartile of E-DII had a significantly elevated risk of CD [HR: 1.45, 95% CI (1.11-1.90); p < 0.01] compared with those in the lowest quartile. This association remained consistent across several sensitivity analyses. In this large prospective cohort, a pro-inflammatory dietary pattern, as reflected by higher E-DII scores, was associated with a significantly increased risk of developing CD. From a public health perspective, this finding highlights the potential importance of dietary inflammation in CD prevention, warranting further investigation into targeted interventions.
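The exposure coding behind such quartile-based analyses can be sketched in a few lines. The cutpoints, case counts, and person-years below are invented, and the study's actual estimates came from adjusted Cox models rather than the crude rates computed here:

```python
def quartile(value, cutpoints):
    """Assign a value to quartile 1-4 given three cutpoints.
    The cutpoints used here are invented; the study derived them
    from the E-DII distribution in the UK Biobank sample."""
    q1, q2, q3 = cutpoints
    if value <= q1:
        return 1
    if value <= q2:
        return 2
    if value <= q3:
        return 3
    return 4

def incidence_rate(cases, person_years, per=100_000):
    """Crude incidence rate per `per` person-years within a quartile."""
    return cases * per / person_years

# Invented example: 90 incident CD cases over 650,000 person-years in the
# top quartile versus 60 cases over 660,000 person-years in the bottom one.
rate_q4 = incidence_rate(90, 650_000)
rate_q1 = incidence_rate(60, 660_000)
ratio = rate_q4 / rate_q1  # crude rate ratio, roughly 1.5 here
```

A Cox model replaces this crude ratio with a hazard ratio that accounts for censoring and covariates, but the quartile indicator entering the model is built exactly as above.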
Stroke is a leading cause of death and disability worldwide. Mortality in acute intracerebral hemorrhagic (ICH) stroke is influenced by many factors, and early identification of high-risk patients is crucial for guiding clinical management. This study aimed to evaluate the role of blood pressure, blood glucose level, and Glasgow Coma Scale (GCS) score on admission as predictors of 10-day in-hospital mortality and to develop a predictive scoring system for patients with acute ICH stroke. A cross-sectional study was conducted at Dr. Zainoel Abidin Hospital, a provincial referral hospital in Banda Aceh, Indonesia, in 2025. Patients with acute ICH were consecutively recruited. Clinical parameters on admission, including systolic and diastolic blood pressure, random blood glucose level, and GCS, were recorded. Associations with 10-day mortality were assessed with a Chi-squared test, and a predictive scoring system was developed based on independent predictors. A total of 62 patients were included in this study. Higher systolic blood pressure (≥140 mmHg), diastolic blood pressure (≥90 mmHg), and GCS <9 on admission were significantly associated with 10-day mortality (p=0.031, p=0.023, and p<0.001, respectively). Multivariate analysis identified GCS <9 as the only independent predictor. A predictive scoring system assigned 8 points for GCS <9, 5 points for systolic ≥140 mmHg, 4 points for diastolic ≥90 mmHg, and 1 point for random blood glucose ≥200 mg/dL, estimating patient-specific mortality risk, which was highest when all risk factors were present. This study indicates that GCS <9 and elevated blood pressure on hospital admission are key predictors of 10-day mortality in acute ICH. The developed scoring system may assist in early risk stratification and management, and further exploration of predictive models is warranted to optimize clinical outcomes.
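The scoring rule described above maps directly onto a small function. The thresholds and point values are exactly those reported; the example patient is hypothetical:

```python
def ich_mortality_score(gcs: int, sbp: float, dbp: float, glucose: float) -> int:
    """Points per the scoring system described above: 8 for GCS < 9,
    5 for systolic BP >= 140 mmHg, 4 for diastolic BP >= 90 mmHg,
    and 1 for random blood glucose >= 200 mg/dL (maximum 18)."""
    score = 0
    if gcs < 9:
        score += 8
    if sbp >= 140:
        score += 5
    if dbp >= 90:
        score += 4
    if glucose >= 200:
        score += 1
    return score

# A hypothetical patient with all four risk factors receives the maximum score.
print(ich_mortality_score(gcs=7, sbp=160, dbp=95, glucose=220))  # 18
```

The point weighting (8 of 18 points from GCS alone) reflects the finding that GCS <9 was the only independent predictor in the multivariate analysis.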
Multiple guidelines recommend single-inhaler triple therapy (SITT) for some patients with COPD. The first two SITTs approved for COPD were beclomethasone-glycopyrronium-formoterol (BEGF, twice daily) and fluticasone-umeclidinium-vilanterol (FUV, once daily). No study has compared the effectiveness and safety of these SITTs on major outcomes. We identified a cohort of patients with COPD, 40 years of age or older, from the United Kingdom's Clinical Practice Research Datalink. Patients who initiated treatment with FUV or BEGF were compared on the incidence of moderate or severe COPD exacerbations, and of pneumonia, over one year, after balancing baseline characteristics by propensity score weighting. The study cohort included 34,825 initiators of FUV and 39,288 initiators of BEGF, well balanced after weighting. The adjusted hazard ratio (HR) of a first moderate or severe exacerbation with FUV compared with BEGF was 0.91 (95% CI: 0.89-0.93), while for severe exacerbation it was 0.92 (95% CI: 0.86-0.97), corresponding to 27.9 fewer subjects with a moderate or severe exacerbation and 1.0 fewer with a severe exacerbation per 100 treated with FUV for one year. The HR of pneumonia requiring hospitalisation, comparing FUV with BEGF, was 1.06 (95% CI 0.99-1.13) over all patients. It was 1.14 (95% CI 1.06-1.24) among those classified as GOLD Group E, and 1.10 (95% CI 1.02-1.19) among those with a blood eosinophil count ≤ 300 cells/µL, corresponding to 1.7 and 1.0 more subjects with a severe pneumonia per 100 treated with FUV for one year, respectively. In a real-world clinical practice setting of COPD treatment, initiating triple therapy with FUV was associated with a lower incidence of moderate and severe exacerbations than with BEGF. On the other hand, the incidence of severe pneumonia requiring hospitalisation was higher with FUV among GOLD Group E subjects and those whose blood eosinophil count was not elevated.
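Absolute differences such as "1.0 fewer severe exacerbations per 100 treated" can be recovered from a hazard ratio and an assumed baseline one-year risk under proportional hazards. The 13% baseline risk below is invented for illustration, not taken from the study:

```python
def risk_from_hr(baseline_risk: float, hr: float) -> float:
    """One-year risk in the comparator arm implied by a hazard ratio,
    assuming proportional hazards: r1 = 1 - (1 - r0)**HR."""
    return 1 - (1 - baseline_risk) ** hr

# Invented baseline: a 13% one-year risk of severe exacerbation under BEGF.
r0 = 0.13
r1 = risk_from_hr(r0, 0.92)          # implied risk under FUV
fewer_per_100 = (r0 - r1) * 100      # roughly 1 fewer event per 100 treated
```

With these invented inputs the HR of 0.92 translates into just under one fewer severe exacerbation per 100 treated per year, illustrating why modest relative effects yield small absolute differences at moderate baseline risks.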
Spine injuries represent a significant concern in the National Basketball Association (NBA) due to their implications for player health and competitive performance. While studies have examined operative spine injuries in the NBA, none have comprehensively examined both operative and nonoperative spinal injuries with respect to return to play (RTP) times. To examine injury characteristics and RTP times of all spine injuries sustained by NBA players during the 2021 to 2024 seasons. Descriptive Epidemiology Study. A retrospective analysis of official NBA injury reports was conducted for the 2021-2022, 2022-2023, and 2023-2024 seasons, including the playoffs. Spine injuries were classified according to NBA terminology and analyzed for injury distribution, treatment modalities, and games missed. Data validation was conducted by cross-referencing news reports and publicly available data. Of 285 spine injuries (in 217 athletes), lumbar and lower back injuries comprised 56.49% of all injured regions, while cervical/neck injuries accounted for 11.93%. The 4 most common injury types were soreness (26.83%), spasm (23.48%), tightness (12.80%), and contusion (11.59%), collectively representing nearly 75% of injuries. Nonsurgical management was utilized in 98.78% of cases, with only 4 injuries (1.22%) requiring surgery. Surgical cases involved disc-related pathology (bulging disc: 2 cases; herniated disc: 1 case; nerve impingement: 1 case). Common injuries resulted in brief absences, with median missed games ranging from 1 to 2. However, several injury categories demonstrated high variability in recovery times. Our study showed that spine injuries in NBA players are predominantly managed nonoperatively with rapid RTP. Lumbar injuries far exceed cervical injuries, likely reflecting sport-specific biomechanical demands. The substantial variability in recovery times emphasizes the importance of individualized treatment approaches for elite athletes. 
Surgical cases are predominantly related to disc pathology. These findings provide valuable baseline data for clinicians managing spine injuries in professional basketball players.
Respiratory virus infections in cattle cause an estimated $1 billion or more in production losses and can threaten human health. During February 2024 to May 2025, we employed a One Health approach to surveil for respiratory viruses among cattle, farm workers, and environmental samples from 11 US and Mexican beef or dairy cattle farms. We studied nasal and ocular swabs from cattle, nasal swabs from cattle workers, bioaerosol samples, and other environmental farm samples using molecular and virological techniques. Among 26 distinct viruses identified in cattle, we detected bovine nidovirus 1, influenza D virus (D/OK-like and D/660-like), bovine coronavirus, bovine rhinitis A and B viruses, bovine respirovirus 3, and bovine respiratory syncytial virus (BRSV); 11 of the 26 detected viruses were non-bovine-associated. Two bovine rhinitis A virus strains were markedly divergent (provisionally designated BRAV-4). Environmental metagenomics additionally identified influenza D virus, bovine coronavirus, and bovine rhinitis B virus. One human nasal swab tested positive for SARS-CoV-2 (clade LF.7.3). Our findings reveal the presence of emerging, co-circulating, and environmentally linked pathogens at the human-animal-environment interface, underscoring the constant need for One Health surveillance to safeguard livestock and mitigate zoonotic risk.
Autoantibodies against metabolic regulators have been implicated in metabolic disorders; however, the clinical relevance of incretin-related autoantibodies in the development of hyperglycemia remains unclear. We investigated whether autoantibodies against glucose-dependent insulinotropic polypeptide (GIP) and glucagon-like peptide-1 (GLP-1) are associated with future deterioration of glycemic status in a prospective cohort. We analyzed 218 participants who underwent health checkups and were followed for a mean of 8.1 years. Individuals with diabetes or baseline HbA1c ≥ 6.5% were excluded. Baseline serum anti-GIP and anti-GLP-1 antibody (Ab) levels were measured using AlphaLISA. Incident diabetes-range glycemia was defined as fasting plasma glucose ≥ 126 mg/dL or HbA1c ≥ 6.5% without clinical confirmation, to avoid overestimation of incident diabetes in this cohort-based setting. Predictive performance was evaluated using receiver operating characteristic (ROC) analyses. To avoid overadjustment, HbA1c was excluded from the primary prediction models. During follow-up, 21 participants developed diabetes-range glycemia. Baseline anti-GIP Ab levels were significantly higher in individuals who developed diabetes-range glycemia, whereas anti-GLP-1 Ab levels showed no association with risk. Anti-GIP Ab levels alone demonstrated modest but significant discriminative ability (AUC = 0.656). Although body mass index was a strong predictor (AUC = 0.799), adding anti-GIP Ab levels modestly improved model performance (AUC = 0.819). Elevated baseline anti-GIP Ab levels were associated with the future development of diabetes-range glycemia and provided complementary predictive information beyond conventional metabolic risk factors. 
However, given the modest incremental improvement beyond BMI and the limited number of incident cases, the predictive contribution of anti-GIP antibody levels should be interpreted as exploratory and hypothesis-generating and requires validation in larger independent cohorts. These findings suggest that GIP-related immune responses may represent a distinct immunometabolic component involved in early glycemic deterioration.
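The reported AUCs have a simple rank-based interpretation: the probability that a randomly chosen case has a higher anti-GIP antibody level than a randomly chosen non-case (the Mann-Whitney form of the ROC AUC). A self-contained sketch with invented antibody levels:

```python
def auc_mann_whitney(scores_pos, scores_neg):
    """AUC as the probability that a randomly chosen positive (case)
    outranks a randomly chosen negative (non-case); ties count 1/2.
    This is the quantity summarised by the ROC AUCs reported above."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Invented antibody levels: cases tend to score higher than non-cases.
cases = [2.1, 3.4, 1.9]
controls = [1.0, 1.9, 2.5, 0.8]
auc = auc_mann_whitney(cases, controls)
```

An AUC of 0.656, as reported for anti-GIP antibody alone, means the marker correctly ranks a case above a non-case about two-thirds of the time, which is why it is described as modest.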
Food-evoked nostalgia affects emotional experiences and mental health, but its impact on older adults, a group for whom food memories have strong personal significance, is still not well understood. This study examined the mediating role of food-evoked nostalgia in the relationship between food satisfaction and psychological distress (depression, anxiety, and stress) among older adults. A cross-sectional study was conducted among 300 community-dwelling older adults (≥60 years) attending five primary healthcare centers in Abha City, Saudi Arabia. Data were collected through structured face-to-face interviews using validated Arabic versions of the Satisfaction with Food-Related Life (SWFL), State Functions of Nostalgia Scale (SFNS), and Depression Anxiety Stress Scale-21 (DASS-21). Statistical analyses included descriptive statistics, Pearson correlations, and the latent variable structural equation model (SEM) using SPSS and AMOS (version 26). Common method bias, discriminant validity, and multicollinearity were assessed using Harman's single-factor test, confirmatory factor analysis, and variance inflation factors, respectively. Participants reported high levels of nostalgia (M = 88.3, SD = 24.8) and food satisfaction (M = 16.2, SD = 4.0). Food-evoked nostalgia was positively correlated with food satisfaction (r = 0.97, p < 0.001) and negatively correlated with depression, anxiety, and stress (r ≈ -0.95, p < 0.001). The latent variable structural equation model indicated that food-evoked nostalgia was a significant partial mediator in the relationship between food satisfaction and emotional status. Food satisfaction was strongly positively associated with food-evoked nostalgia (β = 0.97), which in turn was negatively associated with depression (β = -0.64), anxiety (β = -0.63), and stress (β = -0.67). Food satisfaction also showed direct negative associations with depression (β = -0.32), anxiety (β = -0.33), and stress (β = -0.29). 
All paths were statistically significant (p < 0.001). Fit indices (χ2/df = 2.84, CFI = 0.94, IFI = 0.94, NFI = 0.92, RMSEA = 0.052; 90% CI: 0.048-0.056) indicated a good model fit. Food-evoked nostalgia is associated with lower levels of psychological distress and may function as a psychological resource linking food satisfaction to emotional outcomes. Incorporating culturally meaningful food experiences may help support mental health among older adults by reinforcing emotional connections, fostering positive nostalgic memories, and contributing to improved emotional wellbeing.
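Fit statistics like the RMSEA above are simple functions of the model χ2, its degrees of freedom, and the sample size. One common formula (software implementations vary slightly, e.g. using N rather than N − 1) is sqrt(max(0, χ2 − df) / (df(N − 1))). The values below are invented, chosen only so that χ2/df = 2.84:

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """One common RMSEA formula: sqrt(max(0, chi2 - df) / (df * (N - 1))).
    Packages differ slightly in the denominator, so treat this as a sketch."""
    return math.sqrt(max(0.0, chi2 - df) / (df * (n - 1)))

# Invented values with chi2/df = 2.84, as in the model above.
val = rmsea(chi2=284.0, df=100, n=681)  # about 0.052
```

This shows why χ2/df near 3 can still coexist with a "good" RMSEA below 0.06: the excess χ2 is spread over a large sample.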
To narratively review the role of surgical interventions in the management of leprosy-associated neuropathy in Pakistan, focusing on clinical indications, timing, and outcomes of nerve decompression, nerve abscess management, and reconstructive procedures. This review also aimed to highlight gaps in local evidence, compare Pakistani practices with global standards, and assess how surgical care can be integrated into national leprosy control efforts to reduce disability in the post-elimination era. A literature search was conducted (PubMed, Embase, Cochrane; 1980-2025) to provide a broad overview of the epidemiology, medical management, rehabilitation, and surgical treatment of leprosy-associated neuropathy. Studies describing surgical indications, techniques, outcomes, or rehabilitation were emphasized. Publications focused exclusively on medical therapy without neurological outcomes were excluded. Pakistani studies were prioritized, while international literature was included to contextualize local practice and supplement limited national data. Surgical services in Pakistan are largely limited to specialized centres such as MALC and AKUH. Nerve decompression, tendon transfers, and reconstructive procedures have the potential to restore function and prevent ulcers, with functional recovery rates comparable to those reported in other low-resource settings. In most of the reviewed articles, early surgical intervention combined with rehabilitation yielded the best outcomes. International models demonstrate that integration of surgical care into national programs is feasible and effective, even in resource-constrained environments. The reviewed evidence indicates that surgical intervention combined with rehabilitation offers optimal outcomes for leprosy-associated neuropathy. 
In Pakistan's post-elimination era, prioritizing disability prevention through expansion of surgical services, strengthening reconstructive training, and integrating rehabilitation into national leprosy programs is essential to reducing long-term disability and stigma.
Global food systems face the double challenge of ensuring human health and reducing environmental impacts. While the Healthy Eating Index (HEI) evaluates nutritional quality, the Planetary Health Diet (PHD) attempts to integrate ecological sustainability with healthy nutrition in its framework. This study assessed the agreement between the metric HEI-2015 (mHEI-2015) and two versions of a Planetary Health Diet Index (PHDI-2019 and its updated version PHDI-2025), as well as their associations with environmental indicators in a Bavarian population. Cross-sectional data from the 3rd Bavarian Food Consumption Survey (BVS III, 2021-2023, n = 1,100, 18-75 years) were used to calculate mHEI-2015, PHDI-2019, and PHDI-2025, with the latter newly operationalized to reflect the updated 2025 PHD framework. Dietary greenhouse gas emissions (GHGE), land use (LU), and water footprint (WFP) were estimated. Statistical analyses included weighted kappa (kw) for agreement, correlation analyses, and survey-weighted regression models adjusted for sociodemographic factors. Mean mHEI-2015 was 51.3 out of 100 points, PHDI-2019 19.3 out of 42 points, and PHDI-2025 20.6 out of 45 points. Regression analyses showed that female participants, individuals with higher education, and residents of large cities (≥500,000 inhabitants) consistently scored higher across all indices (all p ≤ 0.047), whereas age was positively associated only with mHEI-2015 (β = 1.9 per 10 years; p < 0.001). Agreement between PHDI-2025 and PHDI-2019 was nearly perfect (kw = 0.84, p < 0.001). Agreement between PHDI-2025 and mHEI-2015 was modest (kw = 0.38, 95% CI 0.35-0.42; r = 0.58; p < 0.001). Across PHDI-2025 quintiles, higher adherence was associated with lower GHGE (-12.5%; p-trend <0.001) and LU (-28.6%; p-trend <0.001), while WFP was unchanged (p-trend = 0.146). 
Across quintiles, the mHEI-2015 showed reduced LU (-16.3%; p-trend = 0.008), increased WFP (+16.7%; p-trend <0.001), and no GHGE change (p-trend = 0.849). Health- and sustainability-focused diet indices aligned only partially in the Bavarian population. Both PHDI versions, but not mHEI-2015, were associated with lower GHGE and LU. PHDI-2025 closely mirrored PHDI-2019, indicating refinement rather than redefinition.
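The weighted kappa used to quantify agreement between index quintiles can be computed directly from the two ordinal classifications. A pure-Python sketch (the study's software may differ in weighting details, e.g. linear versus quadratic weights):

```python
from collections import Counter

def weighted_kappa(a, b, k, quadratic=True):
    """Weighted kappa (kw) for two ordinal classifications coded 0..k-1,
    e.g. quintile assignments from two diet indices. The weight scale
    cancels in the numerator/denominator ratio, so it is omitted."""
    n = len(a)
    obs = [[0.0] * k for _ in range(k)]
    for x, y in zip(a, b):
        obs[x][y] += 1
    ma, mb = Counter(a), Counter(b)  # marginal counts for expected agreement
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            d = (i - j) ** 2 if quadratic else abs(i - j)
            num += d * obs[i][j]                 # observed weighted disagreement
            den += d * ma[i] * mb[j] / n         # chance-expected disagreement
    return 1 - num / den

# Perfect agreement across quintiles yields kw = 1.
print(weighted_kappa([0, 1, 2, 3, 4], [0, 1, 2, 3, 4], k=5))  # 1.0
```

A kw of 0.38 between PHDI-2025 and mHEI-2015 thus means the two indices sort people into quintiles only modestly better than chance, whereas 0.84 between the two PHDI versions indicates near-identical rankings.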
Haemosporidian parasites of the genera Plasmodium, Haemoproteus, and Leucocytozoon are widespread vector-borne pathogens of birds, yet their tissue distribution and lineage diversity remain poorly understood in many host species. We investigated haemosporidian infections in hooded crows (Corvus cornix) from Northwestern Italy using a multi-organ molecular approach. Forty-seven individuals collected between 2010 and 2011 were screened by nested PCR targeting the mitochondrial cytochrome b gene, followed by sequencing. Overall infection prevalence was 97.9%, with Leucocytozoon spp. detected in all infected individuals, followed by Plasmodium spp. (52.2%) and Haemoproteus spp. (17.4%). Five Leucocytozoon lineages were identified, with lineage COCOR09 being dominant and widely distributed across organs (heart, lung, liver, kidney, spleen, skeletal muscle and brain), while other lineages showed more restricted tissue patterns. Sequencing also revealed the Haemoproteus lineage CXPIP27 and three globally distributed Plasmodium lineages (LINN1, GRW06, and SGS1) detected for the first time in this host species. Mixed infections were frequent, occurring in over two-thirds of infected birds. These findings demonstrate extensive tissue dissemination, lineage-specific organ tropism, and high infection complexity in hooded crows, underscoring the importance of multi-organ, lineage-level approaches for understanding the ecology and epidemiology of avian malaria parasites.
Malnutrition remains a major public health challenge in low- and middle-income countries and disproportionately affects children under five. Eggs, given their high nutrient density and relative physical and economic accessibility, have been tested for their effect on improving nutritional outcomes in children under five. However, findings from trials of egg-based interventions on child growth have not been systematically pooled and synthesised. This meta-analysis therefore aimed to synthesise evidence on the impact of egg-based interventions on the nutritional status of children under five, as measured by weight-for-height z-score (WHZ), weight-for-age z-score (WAZ), and height-for-age z-score (HAZ). Randomised controlled trials published between 2013 and 2023 were identified through a comprehensive search of the PubMed/MEDLINE, Web of Science, CINAHL, Embase, Science Direct, Google Scholar, and African Index Medicus databases. Articles evaluating the effect of egg-based interventions against alternative diets, behaviour-change education, or no alternative intervention were included. The primary outcomes were WHZ, WAZ, and HAZ. Random-effects models were used to pool effect sizes (mean differences), and subgroup analyses and meta-regression explored sources of heterogeneity. Publication bias was assessed using funnel plots and Egger's test. Seven studies involving 3,673 children met the inclusion criteria. Egg-based interventions significantly improved WAZ (MD: 0.33; 95% CI: 0.11-0.55) and WHZ (MD: 0.30; 95% CI: 0.12-0.48). However, no significant effect was observed on HAZ (MD: 0.05; 95% CI: -0.05 to 0.14). These findings indicate that egg-based interventions can improve weight-related nutritional outcomes (WHZ and WAZ) among children under five in sub-Saharan Africa, but not linear growth (HAZ).
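The random-effects pooling step described above can be sketched in a few lines of Python using the DerSimonian-Laird estimator of between-study variance (a common default; the abstract does not specify which estimator was used, and the per-trial effect sizes below are hypothetical, not the seven included studies):

```python
import math

def dersimonian_laird(effects, ses):
    """Pool mean differences under a random-effects model (DerSimonian-Laird)."""
    w = [1.0 / se ** 2 for se in ses]                            # inverse-variance weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)    # fixed-effect estimate
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                                # between-study variance
    w_star = [1.0 / (se ** 2 + tau2) for se in ses]              # re-weighted
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se_pooled = math.sqrt(1.0 / sum(w_star))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)  # 95% CI
    return pooled, ci, tau2

# Hypothetical per-trial mean differences in WAZ and their standard errors
md, ci, tau2 = dersimonian_laird([0.25, 0.40, 0.30], [0.10, 0.12, 0.15])
```

When the Q statistic does not exceed its degrees of freedom, tau² is truncated at zero and the model collapses to the fixed-effect estimate, which is why heterogeneity exploration (subgroups, meta-regression) accompanies the pooled result.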
Metabolic syndrome (MetS) and type 2 diabetes mellitus (T2DM) are interrelated conditions, and their co-occurrence increases the risk of heart attack and death. The present study aimed to assess the efficacy of anthropometric indices in predicting the risk of MetS among T2DM patients in Iran. In this study, 400 T2DM subjects were included via convenience sampling. Anthropometric data, including weight, height, waist circumference, and hip circumference, were collected, along with biochemical and clinical data including fasting blood sugar, lipid profile components, and systolic and diastolic blood pressure. The performance of the anthropometric indices in predicting MetS was evaluated using receiver operating characteristic (ROC) curve analysis and estimation of area under the curve (AUC) values. The abdominal volume index (AVI) had the largest AUC in the total population; the same held in females, whereas in males the highest AUC belonged to the visceral adiposity index (VAI). All anthropometric indices significantly increased the odds of MetS, with p < 0.001 in all crude and adjusted models. Although AVI had the highest odds ratio in the crude model (21.28, 95% CI 12.26-36.92), the highest odds ratio in the adjusted models belonged to relative fat mass (RFM). All anthropometric indices used in the study were effective in predicting the odds of MetS. AVI and VAI showed the strongest associations with MetS in both genders. Given the cross-sectional design of the study, these results should be interpreted as associations rather than causal or predictive relationships.
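The ROC analysis behind such AUC comparisons reduces to a rank statistic: the AUC equals the probability that a randomly chosen case has a higher index value than a randomly chosen non-case (the Mann-Whitney interpretation). A minimal Python sketch with illustrative data only, not the study's measurements:

```python
def auc(scores, labels):
    """AUC as the probability that a positive outranks a negative (ties count 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical AVI values with MetS status (1 = MetS, 0 = no MetS)
avi = [14.2, 18.9, 21.5, 16.0, 23.1, 12.8]
mets = [0, 1, 1, 0, 1, 0]
print(auc(avi, mets))  # perfect separation in this toy example gives 1.0
```

An AUC of 0.5 corresponds to a non-informative index; comparing AUCs across indices (as done for AVI, VAI, and RFM above) ranks their discriminative ability on the same sample.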
Condyloma acuminata is a common sexually transmitted disease caused by human papillomavirus infection and is characterized by frequent recurrence despite available therapies. This study evaluated recovery, recurrence, and recurrence-free interval across therapeutic modalities among condyloma acuminata outpatients at Dr. Moewardi Regional General Hospital, Surakarta, Indonesia, from January 2020 to December 2024. Using a cross-sectional analysis of medical records, 132 eligible patients were included and analyzed with bivariate tests. Treatment modality was not significantly associated with recovery (p = 0.157), although recovery was highest with trichloroacetic acid (71.2%), followed by excision (66.7%), combination therapy (trichloroacetic acid + cryotherapy) (64.7%), and cryotherapy alone (33.3%). In bivariate analysis, type of therapy (p = 0.025) as well as type of condyloma (p < 0.001), sexual orientation (p = 0.019), and HIV status were associated with recurrence, with the highest recurrence observed after excision (42.4%) and the lowest after cryotherapy alone (11.1%). Mean recurrence-free interval varied across modalities (cryotherapy 8.00 ± 0.00; trichloroacetic acid (TCA) 15.92 ± 28.49; excision 18.93 ± 13.64; combination therapy 27.50 ± 36.67 weeks), with the longest interval observed in the combination group. However, statistical analysis revealed that only types of condyloma, sexual orientation, and HIV status were associated with the mean recurrence time. Overall, recovery and recurrence patterns differed descriptively across treatment modalities, with excision showing the highest recurrence and cryotherapy alone the lowest. Recurrence outcomes varied according to treatment modality; however, no significant association was observed between therapy type and recurrence-free interval.