The Nursery School Absenteeism Surveillance System (NSASSy) has operated since 2010, but assessing its effectiveness at controlling outbreaks has been difficult. This study was conducted to demonstrate the effect of NSASSy in reducing infectious disease incidence at participating nursery schools. In two cities, we acquired NSASSy information from nursery schools with more than 80 children. The study period extended from April 1, 2016 through March 2019 in one city, and from January 1, 2024 through May 20, 2025 in the other. We modeled the incidence of influenza, COVID-19, and symptoms by age class at each facility against the facility's rate of data entry to NSASSy, community outbreaks recorded in national official surveillance, and their interaction terms. The estimation procedure was a panel model with a random effect for facility. The relation of influenza incidence to community outbreaks was significant and positive. Interaction terms between data entry rates and community outbreaks were significant and negative in both cities. For COVID-19, no association was found. Regarding symptoms, the relations of all symptoms and of fever to data entry rates were significant and negative in both cities; for respiratory symptoms and diarrhea, the relation was significant and negative in one city. Results show that NSASSy mitigated the effects of community influenza outbreaks on outbreaks at nursery schools, not only before the COVID-19 pandemic but also after it. Moreover, NSASSy reduced the incidence of all symptoms.
Zoonotic diseases are common threats to global health. A large number of infectious diseases are transmitted from animals to humans. The current study aimed to assess the community's knowledge, attitudes, and practices (KAP) regarding common zoonotic diseases in the Arbaminch district. A cross-sectional survey was carried out between November 2024 and June 2025. A total of 384 participants were interviewed in the study. Participants residing in these areas were randomly chosen. Data were collected using a structured questionnaire. The collected data were analyzed using Stata 17, and the results were reported using descriptive statistics and the chi-square test. The findings of this study revealed that a majority (55%) of participants had good knowledge about zoonotic diseases. Respondents recognized several modes of transmission for zoonotic diseases, with animal bites (32.5%) being the most recognized, followed by direct contact (15.5%), ingestion of raw products (10%), and inhalation (10%). Regarding attitudes, 63.2% of respondents exhibited a positive attitude towards the importance of zoonotic disease prevention and control, and 67.4% of respondents followed relatively good hygiene and preventive behaviors. However, risky practices were still common. Knowledge scores showed a significant association with age. Attitudes of participants were significantly associated with education, age, occupation, and income. Similarly, practices were significantly associated with gender, education level, occupation, and income, with all associations being statistically significant (p < 0.05). The overall community knowledge, attitudes, and practices regarding zoonotic diseases were relatively good.
Despite decades of mass ivermectin distribution, onchocerciasis transmission persists in Cameroon. Comorbidities with non-communicable diseases (NCDs) and gaps in treatment uptake may contribute to the sustained transmission of this disease. This study assessed the prevalence of onchocerciasis, its comorbidity with selected NCDs, and determinants of continued transmission in Bafut Health District, Cameroon. A community-based cross-sectional study with health facility-based recruitment was conducted from June to July 2024 among 282 adults aged 30 years or older who had resided in Bafut for at least 5 years. Data on sociodemographic characteristics, NCD history, onchocerciasis knowledge, and ivermectin uptake were collected using a structured questionnaire. Laboratory investigations included skin snip microscopy, nodule palpation, blood pressure measurement, and venous blood tests for diabetes and rheumatoid arthritis. The relationship between categorical variables was analysed using the Chi-square test and logistic regression at the 5% significance level. Of the 282 participants examined, 43 (15.2%) had palpable nodules, 17 (6.0%) were positive on skin snip for Onchocerca volvulus, and 6 (2.1%) had both palpable nodules and positive skin snip results. Onchocerciasis infection showed significant associations with sex, age group, marital status, and educational level (p < 0.05). Hypertension (27.8%) and rheumatoid arthritis (25.9%) were the most common comorbidities among infected individuals. Ivermectin uptake was high, with 94.3% of participants reporting that they had taken ivermectin at least once during previous community-directed treatment with ivermectin (CDTI) rounds. In contrast, a small proportion (5.7%) declined due to illness or blurred vision. Participants with tertiary education had approximately five times higher odds of taking ivermectin compared to those with no formal education (aOR = 4.62, 95% CI: 1.18-18.12, p = 0.028). 
Similarly, individuals who had lived in the community for more than 10 years had five times higher odds of adhering to ivermectin treatment than recent residents (aOR = 5.03, 95% CI: 1.11-22.8, p = 0.036); non-adherence among recent residents was driven primarily by refusal of ivermectin, mainly because of side effects. Nearly half of the participants (48.9%) demonstrated poor knowledge of onchocerciasis. Onchocerciasis remains present in Bafut, with ongoing transmission. Infection was associated with socio-demographic and behavioral factors. Non-communicable diseases were observed among infected individuals; however, no causal relationship can be inferred. Strengthening ivermectin uptake and health education may improve control.
In treating infectious bone defects, bacterial eradication alone is insufficient, as impaired blood perfusion characterized by low fluid shear stress (FSS) hampers angiogenesis, thereby compromising osteogenesis and delaying bone repair. Herein, we engineered a low-FSS-activated, pro-angiogenic implant coating with antibacterial properties. This coating comprises a Fe3+-tannic acid (Fe3+-TA) chelation network as the adhesive sublayer, anchoring black phosphorus (BP) nanosheets preloaded with metformin (Met), and is further capped by an outer layer of Lactobacillus animalis-derived extracellular vesicles (BEVs). The Fe3+-TA and BP components synergistically provide photothermal and photodynamic antibacterial activity, while the BEV layer promotes M2 macrophage polarization and modulates the sustained release of Met and phosphate ions from BP degradation, fostering a pro-regenerative microenvironment. Simultaneously, the BP-mediated photodynamic effect exacerbates local oxygen consumption, amplifying the angiogenic potential of Met under hypoxia. The hypoxia-activated Met lowers the FSS threshold required to enable a laminar shear-protective endothelial phenotype, even under pathological low-FSS conditions. This process drives orderly angiogenesis, restores microvascular perfusion, and supports downstream osteogenesis. Overall, this bioinspired coating integrates "hypoxia activation, vascular guidance, and laminar-flow protection" to promote angiogenesis and osteogenesis, and is augmented by synergistic antimicrobial and immunomodulatory benefits, offering a promising strategy for treating infectious bone defects.
Bile acids (BAs), the main component of bile, play a key role in the digestion of lipids. However, recent studies have demonstrated that BAs can also act as signaling molecules, regulating metabolism by binding to BA receptors. Ferroptosis is an iron-dependent form of cell death characterized by lipid peroxidation, and it is closely associated with lipid and iron metabolism. Recent studies have shown that BA levels are significantly correlated with ferroptosis in certain diseases, including liver and gallbladder diseases, colitis, tumors, and infectious diseases. However, the effect of BAs on ferroptosis varies. Some BAs, such as deoxycholic acid (DCA), glycochenodeoxycholic acid (GCDCA), and ursodeoxycholic acid (UDCA), can trigger cellular oxidative stress, which, in turn, induces ferroptosis. Other BAs, such as taurolithocholic acid (TLCA) and tauroursodeoxycholic acid (TUDCA), by contrast, significantly inhibit ferroptosis, thereby attenuating cell and tissue damage. To our knowledge, this review is the first to systematically summarize the roles and mechanisms of BAs in regulating ferroptosis under different disease conditions. Potential therapeutic methods and clinical applications have also been proposed for targeting BA-mediated ferroptosis in various diseases, providing guidance for subsequent research on BAs and ferroptosis.
Previous studies have explored the relationships between dengue fever (DF) and its impact factors, using various models. However, few have considered the spatial heterogeneity of these relationships in simulating DF variations. This study analyzed monthly DF incidence and rate data across 34 provincial-level administrative divisions (PLADs) in China from 2004 to 2019, along with climatic and socioeconomic variables. Nine impact factors were included: six climatic variables - minimum temperature (Tmin), mean temperature (Tmean), maximum temperature (Tmax), relative humidity (RH), precipitation (PRCP), and the El Niño-Southern Oscillation (ENSO) - and three socioeconomic variables - population density (PD), per capita regional gross domestic product (pcGDP), and number of foreign visitor arrivals (NFVAs). We introduced a modeling system that incorporates the spatial heterogeneity of these factors and integrates four artificial intelligence (AI) algorithms: support vector machine (SVM), artificial neural network (ANN), random forest (RF), and gradient boosting machine (GBM). This system was implemented under a novel framework - grid-by-grid, multi-algorithms, optimal combination (GGMAOC) - to explore the spatial heterogeneity of both algorithms and impact factors. From 2004 to 2019, DF in China showed a significant upward trend, with an average annual increase of 1,618 cases. Among the impact factors, socioeconomic variables exhibited stronger associations, with correlation coefficients exceeding 0.77. In Yunnan, the ANN model performed best (DISO=0.18), with PDLag3 contributing over 31%. In Guangdong, the RF model was optimal (DISO=0.26), with TmaxLag2 contributing over 67%. Performance varied across PLADs, which highlights the spatial heterogeneity of both impact factors and algorithmic responses. Importantly, the proposed GGMAOC framework substantially outperformed the traditional stepwise regression approach. 
Our findings reveal that impact factor patterns vary across PLADs, emphasizing spatial heterogeneity and the need for PLAD-specific modeling. The GGMAOC framework identifies key impact factors and optimal models for DF dynamics, offering high predictive confidence and broad applicability to other diseases with spatially heterogeneous impact factors. Furthermore, the successful model construction in Guangdong and Yunnan and the identification of challenges in modeling Taiwan further support the necessity and innovation of adopting a spatially heterogeneous modeling framework in infectious disease modeling, as proposed in this study.
Dengue risk is increasingly shaped by climate change and rapid urbanization, yet comprehensive, multidimensional risk assessments grounded in a One Health perspective remain scare. We develop a geographically eXplainable artificial intelligence (GeoXAI) model to estimate dengue hazard across China in 2024 (current), 2050, and 2100 under different shared socioeconomic pathway (SSP) scenarios. A hazard-exposure-vulnerability framework is then used to assess dengue risk by integrating dengue hazard, human exposure, and social vulnerability. Here we show a northward expansion of high-hazard areas, with the minimum temperature in the coldest month being the dominant driver (27.2% contribution). Moreover, socioeconomic factors such as population density (3.8%) and urbanization level (2.7%) will further amplify dengue hazard. Current dengue risk assessments reveal high-risk clusters in Southwest China and megacities. Dengue risk exhibits spatially heterogeneous escalation in the future, with Southwest and Southeast China facing the steepest growth and Northwest China experiencing disproportionate increases. Compared to the current, dengue risk in SSP585-a high greenhouse gas emission scenario and limited climate policy interventions-increases by 6.01% (2050) and 8.21% (2100), representing the largest escalation among the three SSPs. Despite ongoing disease control efforts, our findings underscore the need to intensify integrated surveillance and multidimensional intervention strategies against escalating dengue risk in China, and offers lessons for other prevalent Aedes-borne diseases (e.g., chikungunya). Dengue is a vector-borne infectious disease mainly transmitted by mosquitoes, which poses a growing public health concern in China. Understanding where and why this risk is growing is essential for effective prevention. 
In this study, we developed a multi-dimensional risk assessment framework that combines three key factors: dengue hazard, human exposure, and social vulnerability. Our findings show that high-risk areas are mainly concentrated in southern and southwestern China, with clear spatial differences in risk distribution. We found that increased contact between humans and mosquitoes and rapid urbanization are driving dengue risk across urbanized regions under climate change. In the future, the overall risk intensity is expected to increase unevenly. While high-risk clusters will remain stable in the south, new risk hotspots are likely to emerge in northern areas. Our results show that fighting dengue requires strategies that combine climate adaptation with improved urban planning.
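A hazard-exposure-vulnerability framework of the kind described above is often operationalized as a composite index: normalize each component, then combine them multiplicatively so that risk is zero wherever any component is absent. The sketch below is purely illustrative - the abstract does not specify the aggregation or weighting actually used.

```python
def minmax(values):
    """Rescale a list of component scores to [0, 1]; constant inputs map to 0."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def risk_index(hazard, exposure, vulnerability):
    """Multiplicative hazard x exposure x vulnerability index per location.
    Each argument is a list of raw scores, one entry per spatial unit."""
    h, e, v = minmax(hazard), minmax(exposure), minmax(vulnerability)
    return [round(a * b * c, 4) for a, b, c in zip(h, e, v)]
```

A multiplicative form encodes the intuition that high hazard without people exposed (or without vulnerability) yields little realized risk; an additive or weighted variant is equally plausible for the published framework.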
Coccidioidomycosis is a fungal infection endemic to the southwestern United States, particularly Arizona. Fine needle aspiration biopsy (FNAB) is established as effective in diagnosing infectious diseases. However, existing literature evaluating FNAB of thoracic coccidioidomycosis remains limited. We present a large single-institution series of thoracic coccidioidomycosis cases diagnosed through FNAB. The Mayo Clinic Arizona Pathology database was searched for all FNAB cases with a diagnosis of coccidioidomycosis from January 1, 2013, to December 31, 2025. Electronic medical records were reviewed to tabulate demographics, clinical history, serology, and radiologic findings. All key cytologic, histologic, and special-stained slides were reviewed. One hundred and one FNAB samples were obtained from 100 patients: 37 cases combined endobronchial ultrasound-guided transbronchial needle aspirate (EBUS-TBNA) and robotic-assisted bronchoscopy (RAB), 31 cases RAB, 20 cases EBUS-TBNA, 12 cases percutaneous computed tomography-guided, and 1 case endoscopic ultrasound-guided. Coccidioides organisms were identified during rapid on-site evaluation in 34 cases, saving 18 patients from unnecessary procedures. Coccidioides organisms were identified in 94.1% (95/104) of cytology slides and in 67.1% and 66.7% of tissue biopsies and cell blocks, respectively. For the 6 cases without Coccidioides organisms on cytology slides, concurrent tissue biopsies or cell blocks with or without Grocott's methenamine silver stains helped confirm the diagnosis. Of the 41 patients with a malignancy history, one had both Coccidioides and malignancy in the same specimen. FNAB is accurate at diagnosing thoracic coccidioidomycosis when combined with EBUS-TBNA and RAB. Rapid on-site evaluation is critical during interventional procedures and can eliminate the need for unnecessary procedures.
Patients with Down syndrome (DS) are characterized by dysfunction of several organ systems, including liver and brain abnormalities, heart defects, gastrointestinal anomalies, and severe immune hypersensitivity. A person with DS is also susceptible to various inflammatory diseases, including hepatic autoimmune diseases. Cyclic guanosine monophosphate-adenosine monophosphate synthase (cGAS) is known to trigger the stimulator of interferon genes (STING) and downstream proinflammatory factors. In this work, we hypothesized that oxidative stress-associated DNA damage activates the cGAS-STING signaling pathway and promotes liver inflammation in DS. Here, we investigated the role of reactive oxygen species (ROS)-associated DNA damage and the cGAS-STING signaling pathway in the pathogenesis of hepatic inflammation in a DS model. Our results showed excessive ROS and DNA damage in DS fibroblasts and in DS mouse liver. Further, DS cells accumulate micronuclei that likely serve as a source of cytoplasmic DNA to stimulate cGAS-STING activation. In addition, RNA-seq analysis showed enhanced expression of key type I interferon factors of the cGAS-STING pathway in DS liver, together with inflammatory responses and elevated liver enzymes such as alanine transaminase (ALT), indicating hepatocellular liver injury in DS. These results suggest that endogenous DNA damage triggers an innate immune response that may contribute to upregulation of cGAS-STING signaling and exacerbate hepatic inflammation in DS.
The discovery of antibiotics and their subsequent therapeutic use revolutionized our ability to treat once deadly infectious diseases, and antibiotics have become one of the most commonly prescribed drug classes. Unfortunately, these compounds not only target pathogenic strains, but also non-pathogenic bacteria that fulfill important functions for the human host. As such, antibiotic treatment can cause severe collateral damage, resulting in dysbiosis, for example, in the human gut microbiome. Given the immense importance of the gut microbiome for human health, antibiotic-induced dysbiosis can cause a variety of detrimental health outcomes. In addition, antibiotic (over-)use causes selection of antibiotic-resistant strains, and the human gut microbiome has become a major reservoir for resistance determinants that can transfer to pathogenic isolates and cause hard-to-treat infections. In this review, we describe various adverse effects that antibiotic use has on the human gut microbiome, how we can approach this problem experimentally, and discuss pathways to mitigate antibiotic-induced collateral damage.
Escherichia coli is used as an indicator of fecal contamination in river water, yet its geographic variation associated with land use and the virulence potential of these strains remain insufficiently characterized. In particular, Shiga toxin (Stx)-producing E. coli (STEC) represents a highly virulent subset capable of causing severe human diseases, while its quantification and virulence assessment are hindered by low abundance in river water. Fixed-point sampling at 10 stations in the Oyodo River, Japan, revealed E. coli counts reaching up to 1.70 × 10⁴ CFU/100 mL and correlating positively with water quality parameters. The counts spiked in residential areas lacking sewer systems and adjacent livestock farms. Four stations where segment characteristics changed were investigated for STEC using the foam concentration method. STEC strains were detected at all locations, with a maximum of 3.98 cells/100 mL, and 13 isolates were recovered, with higher levels downstream of unsewered residences and livestock farms. Given the low infectious dose of STEC, this concentration indicates potential human infection risk. Whole-genome sequencing (WGS) revealed that three isolates (O103:H2 or O111:H8) harbored stx1a and/or stx2a with eae, a combination linked to severe human infections. Ten isolates, including the two stx2a- and eae-positive strains, exhibited multidrug resistance both genotypically and phenotypically. WGS of 219 non-STEC E. coli strains isolated across sampling periods revealed consistent shifts in phylogenetic diversity and antimicrobial resistance gene profiles after the river passed through unsewered residences and livestock farms, suggesting persistent contamination. Overall, combining efficient concentration strategies with genome-based analyses enables robust evaluation of STEC occurrence, virulence, and contamination sources in human-impacted river environments.
Alterations of the gut microbiome have been reported in central nervous system demyelinating diseases. While the gut microbiome in pediatric multiple sclerosis (MS) has been studied, the role of the gut microbiome in other pediatric-onset acquired demyelinating syndromes (ADS) remains unknown. We compared the gut microbiome composition between myelin oligodendrocyte glycoprotein antibody-positive (MOG+) and antibody-negative (MOG-) participants with pediatric-onset ADS. Participants aged ≤21 years enrolled in the Canadian Pediatric Demyelinating Disease Network microbiome study (2015-2018) with a single episode or relapsing non-MS, non-neuromyelitis optica spectrum disease attacks of demyelination with symptom onset <18 years were included. Stool sample-derived DNA underwent 16S rRNA (V4) sequencing. Serum MOG-IgG antibodies were tested within 30 days of first attack onset. Alpha-diversity (Shannon, Margalef's index, Chao1) and beta-diversity (weighted UniFrac) were analysed. Phylum/genus-level taxa were assessed using negative binomial models with false discovery rate correction. Rate ratios were sex- and age-adjusted (aRR). Forty-six participants (18 MOG+/28 MOG-) were included. Mean age at stool sample collection (MOG+/MOG-) was 14.7/17.2 years. Alpha-/beta-diversities did not differ between MOG+/MOG- participants (p > 0.3). At the phylum level, the relative abundance of Proteobacteria was lower in MOG+ than MOG- participants (aRR: 0.22; 95% CI: 0.07-0.69; q = 0.03). At the genus level, the relative abundance of Escherichia/Shigella was lower in MOG+ than MOG- participants (aRR: 0.01; 95% CI: 0.001-0.07; q = 0.001). While alpha-/beta-diversities did not differ between MOG+/MOG- participants, taxa-level differences were observed. Our findings suggest that the gut microbiome composition may differ by MOG serostatus among pediatric-onset ADS participants. Future work is warranted, utilizing larger cohorts and longitudinal follow-up.
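The q-values reported for the taxa-level comparisons come from a false discovery rate correction applied across the tested taxa. The abstract does not name the procedure; the Benjamini-Hochberg step-up method is the standard choice and can be sketched as follows (illustrative p-values, not the study's data):

```python
def benjamini_hochberg(pvals):
    """Benjamini-Hochberg FDR adjustment. Returns q-values in the same
    order as the input p-values. q[i] = min over j >= rank(i) of
    p_(j) * m / j, which enforces monotonicity of the adjusted values."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices, ascending p
    q = [0.0] * m
    running_min = 1.0
    for offset, i in enumerate(reversed(order)):      # walk from largest p down
        rank = m - offset                             # 1-based rank of pvals[i]
        running_min = min(running_min, pvals[i] * m / rank)
        q[i] = running_min
    return q
```

A taxon is then called significant when its q-value falls below the chosen FDR threshold (e.g., q < 0.05, consistent with the q = 0.03 and q = 0.001 results quoted above).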
Modelling approaches that consider system-wide delivery platforms rather than single diseases can be instrumental in economic evaluation and forward-looking policy formulation. This study develops a costing approach tailored to the Thanzi La Onse (TLO) model of Malawi's healthcare system, with general applicability to other health system models. We developed a mixed-method costing approach to estimate the total cost of healthcare delivery (excluding high-level administrative costs) in Malawi using the TLO model, from a healthcare provider perspective. Through iterative adjustments of key parameters, we aligned model-based estimates as closely as possible with real-world expenditure and budget data. Costs were projected for 2023-2030 under alternative scenarios of health system capacity. A comparison with expenditure and budget data suggests our costing method is broadly reliable for the conditions captured by the model, though some mismatches remain owing to data limitations and definitional inconsistencies. Under current system capacity, total healthcare delivery costs for 2023-2030 were estimated at 2.83 billion US dollars [95% uncertainty interval (UI), $2.80-$2.87 billion], excluding non-medical infrastructure and administrative costs, averaging $390.98 million [$385.92-$396.71 million] annually or $16.89 [$16.75-$17.08] per capita. Scenario analysis highlighted strong interdependencies within the health system. Improving consumable availability alone increased consumables costs by 4.63%, while expanding human resources for health (HRH) alone increased them by 1.43%. When both HRH and consumable availability were expanded together, consumable costs rose by 5.93%, a combined effect larger than either change alone, illustrating how bottlenecks in one component constrain the impact of improvements in another. Mixed-method costing using health system models is a feasible and robust method to estimate and forecast healthcare delivery costs. 
Clarifying assumptions and limitations can improve the utility of such models for economic analyses and evidence-based planning in the health sector.
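The scenario analysis above (consumables +4.63%, HRH +1.43%, both together +5.93%) can be probed with simple arithmetic: if the two levers acted independently, their joint effect would compound multiplicatively. A minimal sketch, using only the reported percentages:

```python
def combined_if_independent(*effects_pct):
    """Joint percentage effect if each lever acted independently and
    multiplicatively: (1 + a)(1 + b)... - 1, with inputs in percent."""
    total = 1.0
    for e in effects_pct:
        total *= 1.0 + e / 100.0
    return (total - 1.0) * 100.0

# Reported single-lever effects on consumables costs (percent over baseline):
expected = combined_if_independent(4.63, 1.43)  # joint effect if independent
reported = 5.93                                 # modeled joint effect
interaction = reported - expected               # deviation from independence
```

The reported joint effect (5.93%) exceeds either lever alone but falls slightly short of the independence benchmark, which is consistent with the abstract's point that capacity bottlenecks in one component shape the realized cost of improving another.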
The objective of this study was to conduct a comparative morphological analysis of the kidneys in the context of various infection pathways in models of acute obstructive pyelonephritis. The study investigated the structural condition of rabbit kidneys (n=40) in relation to the pathogenesis of acute pyelonephritis. The animals were randomly assigned to four groups: two experimental and two control groups. Experimental Groups I and II underwent urethral ligation to induce obstructive pyelonephritis. In Group I, a bacterial strain was introduced into the gastrointestinal tract to examine intestinal translocation, while in Group II, it was introduced into the urinary bladder to study ascending infection. Control Groups III and IV received a similar introduction of the infectious agent without urethral obstruction. On the third day, morphological examination was conducted using optical microscopy and a computerized microscope capable of digital microphotography. In Group I, inflammatory changes were detected in 60% of cases, while in Group II, they were found in 70% of cases. No morphological changes were observed in the control groups. The nature of the morphological changes in the kidneys in the model of acute obstructive pyelonephritis did not differ significantly between the experimental groups. This indicates the involvement of enterorenal bacterial translocation in the development of inflammatory processes in the kidneys in the presence of obstruction.
Fusobacterium spp. are Gram-negative, obligate anaerobic bacteria associated with a broad clinical spectrum, including head and neck infections; soft tissue infections; gastrointestinal and genitourinary infections; and bacteremia without an identified source. Clinical manifestations vary by species and site of colonization. F. necrophorum is linked to head and neck infections that may culminate in Lemierre syndrome, a life-threatening septic thrombophlebitis of the internal jugular vein. Although Fusobacterium bacteremia is uncommon, it is associated with substantial mortality. F. necrophorum bacteremia may occur as a complication of head and neck infection in younger, healthy individuals, whereas F. nucleatum bacteremia is reported predominantly in older patients with malignancy, secondary to an abdominal source or without an identified origin. Fusobacterium isolates are usually susceptible to penicillins, cephalosporins, aminopenicillins with β-lactamase inhibitors, carbapenems, and metronidazole, while resistance to clindamycin and moxifloxacin has been increasingly reported. Because susceptibility testing is not routinely performed and susceptibility data remain limited, severe infections are commonly treated with a β-lactam/β-lactamase inhibitor, a carbapenem, or a β-lactam in combination with metronidazole. This review provides an overview of the microbiology, clinical spectrum, and treatment of Fusobacterium spp.
Cornelia de Lange syndrome is a rare congenital disorder marked by considerable clinical variability, including intellectual disability, growth retardation, distinctive facial features, limb abnormalities, and multisystem involvement. The condition is primarily linked to mutations in genes encoding components of the cohesin complex, which is essential for chromosomal stability and gene regulation. We report a mild case of Cornelia de Lange syndrome caused by a de novo mutation in an Iranian family. We investigated a 19-year-old Iranian male presenting with developmental delay, borderline intellectual disability, dysmorphic facial features, and multisystem involvement. Whole-exome sequencing was performed to identify causative variants. A de novo heterozygous variant affecting the start codon of NIPBL (NM_133433.4:c.2T>A; NP_597677.2:p.Met1Lys) was identified. This variant was absent from population databases and predicted to disrupt normal translation initiation. Sanger sequencing and co-segregation analysis confirmed the genetic findings. In silico tools and population databases were utilized to assess variant pathogenicity. Clinically, the patient exhibited classical Cornelia de Lange syndrome features with relatively mild intellectual impairment compared with typical loss-of-function cases, consistent with the hypothesized use of alternative start sites. This case links a known NIPBL start-loss variant to a relatively mild clinical presentation, adding genotype-phenotype evidence for this variant. It suggests a possible role for downstream translation initiation as a modifier of disease severity, although further functional validation is required. Comprehensive genetic analysis remains essential for accurate diagnosis, prognosis, and counseling in patients with Cornelia de Lange syndrome.
In the context of the fourth wave of the opioid overdose crisis, defined by fentanyl and stimulant co-use, we aimed to explore whether there is a relationship between exposure to xylazine in the fentanyl supply and patterns of crack/cocaine use. This mixed methods study used data from four Connecticut-based sources: (1) the Office of the Chief Medical Examiner for fatal overdoses, 2019-2023, (2) community-based drug checking data, 2023-2024, (3) a structured survey of people who use drugs, 2024, and (4) one-on-one in-depth interviews, 2024. Xylazine was detected in 19.2% of overdose fatalities involving fentanyl or a fentanyl analog (n = 5,849). Those with xylazine detected had a higher proportion of cocaine simultaneously detected, though this was not statistically significant (54.0% vs. 51.1%, p = 0.085). Cocaine detection rose over time in the fatality data, regardless of xylazine detection. There were 1,048 drug samples submitted to harm reduction organizations for drug checking. Using Fourier Transform Infrared Spectroscopy, 573 of the samples submitted tested positive for fentanyl or a fentanyl analog; 45.2% of these also tested positive for xylazine. Of the 1,048 samples, 276 tested positive for cocaine. Among the survey sample (n = 88), 62.5% of participants self-reported xylazine exposure. Those who reported lifetime xylazine exposure also reported higher past-year crack/cocaine use, though this was not statistically significant (89.1% vs. 81.8%, p = 0.336). Among our interview sample (n = 31), three themes emerged: (1) participants felt there was an inevitability of polysubstance use due to an increasingly volatile drug supply, (2) participants increased their crack/cocaine use in response to perceived exposure to xylazine in their fentanyl supply, and (3) participants' drug patterning changed due to xylazine's presence (e.g., timing, drug sequencing, etc.). 
Two meta-inferences were identified: (1) overdose fatality data likely underestimate the "true" prevalence of xylazine in the street supply, and (2) people who use drugs are using crack/cocaine in response to xylazine in their local fentanyl supply. Xylazine exposure has likely contributed to an adaptive and increasing use of crack/cocaine among people who use drugs in Connecticut. It is imperative that practitioners and policymakers are prepared to manage this triad of opioid-stimulant-sedative polysubstance use.
To evaluate the clinical and economic impact of universal screening for cytomegalovirus (CMV) in pregnant women in Italy, with valacyclovir (VCV) therapy in the case of maternal primary CMV infection, compared with no screening. We developed a decision-analytic model using a deterministic decision tree and compared the no-screening strategy (Scenario 1) with universal screening until 13 + 6 weeks' gestation (Scenario 2) and universal screening until 23 + 6 weeks' gestation (Scenario 3), as recommended by the Italian National Health Service. The model was applied in a hypothetical population of 400 000 pregnant women, representative of the annual number of women giving birth in Italy. Only women susceptible to primary CMV infection were considered, in whom CMV screening by serological testing (IgG/IgM testing ± IgG avidity), followed by VCV treatment (8 g/day) in the case of primary CMV infection, is recommended. Outcomes included the numbers of primary maternal CMV infections diagnosed, fetal congenital CMV (cCMV) infections, terminations of pregnancy (TOPs) and symptomatic and asymptomatic neonatal cCMV infections, and the cost per symptomatic cCMV case avoided (in Euros (€)) from the perspective of the Italian National Health Service. Universal screening until 13 + 6 weeks' gestation would identify 910 maternal primary CMV infections. Compared with no screening, it would prevent 92% of symptomatic cCMV infections (183 vs 15 cases) and 70% of TOPs (33 vs 10 cases). Extending the universal screening period to 23 + 6 weeks' gestation would result in 280 additional diagnoses of maternal primary CMV infection and a further 2% and 9% reduction in symptomatic cCMV infections and TOPs, respectively. Both screening strategies would increase costs by approximately €7 million compared with Scenario 1, with a cost per symptomatic cCMV case avoided of ~ €45 500 for Scenario 2 and ~ €44 400 for Scenario 3. 
Universal serological CMV screening in pregnancy until 24 weeks' gestation, with VCV treatment in the case of maternal primary infection, substantially reduces the burden of cCMV-related disabilities and appears economically justifiable in the Italian healthcare context. These findings may inform policy decisions in countries with a similar CMV seroprevalence and National Health Service. © 2026 The Author(s). Ultrasound in Obstetrics & Gynecology published by John Wiley & Sons Ltd on behalf of International Society of Ultrasound in Obstetrics and Gynecology.
Tanzania has adopted artificial intelligence (AI)-assisted chest X-ray screening for tuberculosis (TB), including the use of CAD4TB version 6, which is registered by the Tanzania Medicines and Medical Devices Authority (TMDA). While GeneXpert, the practical reference standard, remains the primary bacteriological confirmatory test in routine practice, there is currently no established national threshold for CAD4TB use in either active case finding (ACF) or passive case finding (PCF) settings. This study evaluates the implementation and operational use of CAD4TB version 6 within mobile TB screening units in Tanzania and highlights challenges affecting its effective use. We conducted a retrospective analysis of screening data from 11,923 individuals collected from mobile clinics equipped with digital X-ray, CAD4TB version 6, and GeneXpert systems. Comparisons were made between manual chest X-ray interpretation, CAD4TB scores, and GeneXpert results within the subset of individuals who underwent confirmatory testing. The findings reveal substantial inconsistencies in screening workflows, including non-uniform use of CAD4TB prior to GeneXpert testing, missing radiological records, and deviations from intended protocols across sites. Descriptive analysis showed that CAD4TB scores generally aligned with GeneXpert-positive cases within the tested subset; however, due to selective application of GeneXpert and incomplete data, these observations cannot be interpreted as measures of diagnostic accuracy. This study should be interpreted as an implementation and operational assessment of AI-assisted TB screening rather than a diagnostic accuracy or threshold-setting study. The findings highlight important gaps in protocol adherence, data completeness, and workflow standardization, underscoring the need for prospective, protocol-driven studies to establish validated national thresholds for CAD4TB use in Tanzania.
Central venous catheters (CVCs) are widely used but carry the risk of central line-associated bloodstream infections (CLABSIs), which threaten patient safety. While previous CLABSI studies have primarily focused on ICUs, we conducted a hospital-wide prospective investigation using an integrated information system. This study tracked CLABSI outcomes and associated healthcare costs across all patients with CVCs, aiming to evaluate the burden of CLABSI from the patient perspective, with particular attention to differential impacts among patient subgroups. This study employed a case-control design. Patients newly diagnosed with CLABSI between January 2023 and June 2024 were enrolled as cases, while contemporaneous catheterised patients without infection were matched 1:1 as controls using propensity score matching (PSM). We adjusted for potential confounders, including demographics, comorbidities, and treatment modalities, to assess the impact of CLABSI on length of stay (LOS), in-hospital mortality, and healthcare economic burden. Stratified subgroup analyses were conducted based on the patients' departments. Among 50,862 CVC patients, the incidence of CLABSI was 0.507/1000 catheter-days, predominantly caused by gram-negative bacteria. Propensity-matched analysis (n=205 pairs) revealed that CLABSI significantly prolonged hospitalisation by 20 days (34 vs. 14 days, P<0.001), increased mortality risk by 88% (22.93% vs. 12.20%, P=0.004), and elevated direct medical costs by $13,962 (P<0.001). Department-specific analysis demonstrated that the greatest economic burden occurred in ICU patients ($29,211) and internal medicine patients ($10,781), with all differences being statistically significant (P<0.05). CLABSI significantly increased LOS and costs across all departments, highlighting the need for hospital-wide prevention strategies.