Effective cleaning and disinfection (C&D) are crucial for controlling Listeria monocytogenes in food processing environments (FPE), yet studies examining the effect of C&D agents and practices in the FPE are scarce. The aim of the present study was to collect practical experiences from food processing facilities and compare them with guidelines for the control of L. monocytogenes. The experiences of 130 food processors from 15 countries were collected through a web-based survey, and 12 manufacturers of C&D agents from six countries were interviewed. Most of the food processors (64%) reported detection of Listeria spp. in their FPE at least once in the last five years, with 13% reporting repeated detection in the same sampling location over a period of more than three months. Alkaline cleaning agents (72%) and disinfectants based on peracetic acid (48%) or quaternary ammonium compounds (QAC; 36%) were the most commonly used products. Large variations were found in the composition of C&D agents, both in additives and in the concentrations of biocidal agents. For drains and floors, changing C&D practices (frequency, concentration, exposure time) was reported to eliminate recurring L. monocytogenes by 12 of 27 respondents. For equipment, upgrading or replacing parts was reported as successful by 6 of 20 respondents. No association was found between the active ingredient of the C&D agents used and elimination of Listeria spp. Industry guidelines should be kept updated to support hygiene best practice and stricter requirements for environmentally friendly, low-toxicity C&D agents.
Although one of the largest listeriosis outbreaks on record, caused by a ready-to-eat (RTE) meat product, was reported in South Africa, very few data on L. monocytogenes in foods and food environments are available from other African countries to date. The aims of this study were to document the presence of L. monocytogenes in RTE products and food environments in Namibia, Sudan, and Zambia, to characterize the isolates genomically, and to evaluate genomic similarity using data from the database of the Italian National Reference Laboratory for L. monocytogenes. A total of 768 samples, including RTE meat products (n = 405), environmental swabs (n = 228), and frozen chicken (n = 135), were collected and tested for Listeria spp. and L. monocytogenes by African partners. Listeria spp. were observed mainly in Zambia (38.7%; n = 41/106), followed by Sudan (5.0%; n = 24/480) and Namibia (3.3%; n = 6/182), mostly in RTE meat products (10.1%; n = 41/405). MALDI-TOF confirmed 14 of 71 strains as L. monocytogenes (n = 12 from RTE meat products; n = 2 from environments), and MLST identified six CCs: CC9 (n = 7), CC1 (n = 3), CC3 (n = 1), CC37 (n = 1), CC121 (n = 1), and CC31 (n = 1). A complete Listeria Pathogenicity Island 3 (LIPI-3) was observed in the CC1 and CC3 strains, in addition to LIPI-1, which was identified in all 14 strains. Several resistance factors were detected, including stress survival islets (SSI-1 and SSI-2), Tn6188_qac, cadA, and bcr genes. Furthermore, three cgMLST clusters were detected: two of CC9 from Zambia and one of CC1 from Sudan, all related to RTE foods. This study confirmed the presence of L. monocytogenes in African RTE meat products and food environments, posing a public health concern for consumers, mainly linked to the presence of the CC1 strain, a known hypervirulent clone. Moreover, the presence of L. monocytogenes strains harboring several resistance factors, such as the CC9 clone, could help these strains adapt, survive, and persist.
Fresh produce and food-contact tableware are important transmission vehicles for Group A rotavirus (RVA), and washing their surfaces before consumption is a key control measure; rinsing with municipal tap water is the most commonly used method for this purpose. In this study, specific primers and a probe targeting the VP6 gene of RVA were designed to establish a triplex reverse transcription quantitative polymerase chain reaction (RT-qPCR) method for detecting RVA. The method showed no cross-reactivity with other enteric viruses, and the limit of detection (LOD) was 48.83 RNA copies/mL. This method was used to evaluate the effectiveness of municipal tap water (Xining, Qinghai) rinsing in removing RVA from 28 types of food and tableware. These items were artificially contaminated with fecal material from RVA-positive patients to simulate natural contamination. After rinsing for 10 seconds to 3 minutes with municipal tap water at 10 ± 1 °C and a flow rate of 0.9 ± 0.1 L/min, the RVA concentration decreased by 0.36 to 2.60 log10 units, and RVA remained detectable in some samples even after 3 minutes of rinsing. The results indicate that rinsing with municipal tap water can only partially reduce the RVA load on sample surfaces. For samples with a reduction of less than 1 log10, extending the rinsing time, increasing the water flow rate and temperature, or supplementing with physical scrubbing may further reduce the risk of foodborne transmission.
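For reference, the 0.36-2.60 log10 reductions reported above are simply differences of base-10 logarithms of concentration before and after rinsing. A one-line illustration with hypothetical RT-qPCR values (assumed, not from the study):

```python
import numpy as np

def log10_reduction(c_before, c_after):
    """Log10 reduction between pre- and post-rinse RVA concentrations
    (e.g., RNA copies/mL from RT-qPCR quantification)."""
    return np.log10(c_before) - np.log10(c_after)

# Hypothetical pre- and post-rinse concentrations (RNA copies/mL)
print(f"{log10_reduction(1.0e6, 2.5e3):.2f} log10 units")  # 2.60
```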
Given increasing food safety concerns in urbanizing China, this study investigated the microbial contamination of retail foods in Pingliang, a typical inland city in Northwest China (2020-2024), focusing specifically on Cronobacter spp. contamination in spices and on antimicrobial resistance in foodborne Salmonella. A total of 784 samples across 17 categories were collected and tested by national standard methods, and all Salmonella isolates underwent serotyping and antimicrobial susceptibility testing. The overall pathogen detection rate was 13.65%, rising significantly from 5.93% (2020) to 24.00% (2024) (P < 0.001). High-risk categories included spices and powders (37.70%), aquatic products (17.14%), meat products (14.34%), and catering foods (13.14%). Cronobacter sakazakii was detected in 18.03% of spice samples, with a concurrent mold contamination rate of 29.51%. Salmonella was predominantly found in meat products (7.35%), which accounted for 71.43% of all isolates. Notably, Salmonella was more prevalent in aquatic products (10.00%) than Vibrio parahaemolyticus (1.43%). The predominant serotypes were Typhimurium (21.43%) and Enteritidis (14.29%). Resistance rates exceeded 60% for ampicillin, cefazolin, nalidixic acid, and tetracycline, with a multidrug resistance rate of 83.33%. Resistance to carbapenems, polymyxins, and a novel β-lactam/β-lactamase inhibitor combination was detected, whereas all isolates remained fully susceptible to tigecycline. This study reveals rapidly increasing foodborne contamination in this inland Chinese city, highlighting spices as an underestimated Cronobacter sakazakii reservoir and widespread multidrug resistance in Salmonella, including resistance to last-resort drugs. These findings underscore the urgent need for routine Cronobacter monitoring in spices and for One Health interventions to curb resistance dissemination, and they provide critical regional data to support global efforts to monitor Cronobacter in dry matrices and combat antimicrobial resistance in Salmonella.
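Whether a rise from 5.93% to 24.00% is statistically significant can be checked with a two-proportion z-test. The sketch below assumes hypothetical yearly denominators, since only the rates are reported above:

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical yearly denominators; only the rates (5.93% and 24.00%)
# are given in the text, so counts of 9/152 and 36/150 are assumed.
stat, p = proportions_ztest(count=[9, 36], nobs=[152, 150])
print(f"z = {stat:.2f}, P = {p:.1e}")  # significant at any conventional level
```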
The persistence of foodborne pathogens on food contact surfaces (FCSs) in low-moisture food processing facilities poses significant risks for cross-contamination and foodborne illness outbreaks. This study evaluated the survival of sessile and planktonic Salmonella enterica and Cronobacter sakazakii on three common FCS materials: stainless steel (SS), high-density polyethylene (HDPE), and nitrile rubber (Buna-N). Cocktails of both pathogens were prepared as either sessile (agar-grown) or planktonic (broth-grown) cells, inoculated onto coupons at 9 log CFU/coupon, dried for 2 h, and then stored for up to 90 days at 25 °C and 33 or 53% relative humidity (RH). The Weibull model was used to describe the survival of both pathogens. For S. enterica, sessile populations declined by >2 log CFU/coupon after 60 days at 33% RH and after 30 days at 53% RH, with no further reduction through 90 days. Planktonic S. enterica declined more rapidly, particularly at 53% RH, where populations were <1 log CFU/coupon after 90 days on all FCSs. C. sakazakii exhibited greater overall desiccation resistance than S. enterica, with sessile populations declining by only 1 log CFU/coupon after 30 days at both RH conditions and remaining stable thereafter. Planktonic C. sakazakii declined by ca. 3 log CFU/coupon over 90 days at 33% RH. Weibull analysis revealed that 53% RH significantly reduced the time to first decimal reduction (δ) for both pathogens compared to 33% RH. Surface material had minimal impact on overall survival patterns, suggesting that RH and cell state are more critical determinants of pathogen persistence. This study determined that both S. enterica and C. sakazakii can persist on FCSs for extended periods under low-RH conditions, with sessile cells showing greater survival owing to enhanced desiccation resistance. These findings emphasize the importance of environmental controls in low-moisture food processing facilities to prevent pathogen establishment and cross-contamination events.
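The Weibull survival model used here is commonly parameterized as log10(N(t)/N0) = -(t/δ)^p, where δ is the time to the first decimal (1-log) reduction and p is the shape parameter. A minimal fitting sketch with hypothetical coupon survival data (not the study's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_log_survival(t, delta, p):
    """Weibull survival model: log10(N(t)/N0) = -(t/delta)**p, where delta
    is the time to the first decimal reduction (days) and p is the shape."""
    return -(t / delta) ** p

# Hypothetical log-survival ratios for a pathogen on a coupon over 90 days
t = np.array([0.0, 7, 14, 30, 60, 90])
log_ratio = np.array([0.0, -0.6, -1.0, -1.6, -2.1, -2.2])

(delta, p), _ = curve_fit(weibull_log_survival, t, log_ratio,
                          p0=(10.0, 1.0), bounds=([0.1, 0.1], [200.0, 5.0]))
print(f"delta = {delta:.1f} d, p = {p:.2f}")  # delta: time to a 1-log decline
```

A shape parameter p < 1 (an upwardly concave curve) indicates a tailing, increasingly resistant subpopulation, which is why δ rather than a single log-linear rate is compared across RH conditions.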
Pecan shells have antimicrobial properties but are generally viewed as a waste product and are often discarded. Poultry manure is often used as a soil amendment but has been shown to promote the survival of foodborne pathogens in soil. The objective of this research was first to compare the die-off of Salmonella in inoculated ground pecan shells and poultry manure, and then to evaluate Salmonella survival in soils amended with pecan shells, poultry manure, or a mixture of both, compared with unamended soil. Inoculated ground pecan shells and poultry manure were enumerated at 0, 1, 3, 7, 10, 14, 21, and 28 days postinoculation. Soil treatments (unamended, pecan shell amended, poultry manure amended, and mixed) were inoculated with Salmonella and enumerated at 0, 1, 3, 7, 10, 14, 21, 28, 42, and 56 days postinoculation. Die-off models were generated using GInaFiT. More rapid Salmonella die-off was observed in poultry manure than in ground pecan shells, with reductions of 5.28 and 1.70 log by day 14, respectively. The die-off data were best described by a Weibull model, with shape parameter (p) values of 1.63 ± 0.12 in poultry manure and 0.39 ± 0.02 in ground pecan shells. In soil, a higher initial Salmonella die-off was observed in the pecan shell amended treatment; however, by day 56, the greatest reduction had occurred in the unamended soil (5.09 log CFU/g), followed by the pecan shell amended (3.94 log CFU/g), mixed (2.85 log CFU/g), and poultry manure amended (2.81 log CFU/g) soils. A biphasic model best fit the soil treatment data. The results of this study show that pecan shells could be a potential soil amendment that promotes soil health and provides soil nutrients while, unlike poultry manure, not promoting the survival of Salmonella in soil.
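The biphasic model GInaFiT fits is typically the Cerf two-population form, log10 N(t) = log10 N0 + log10(f·e^(-k1·t) + (1-f)·e^(-k2·t)), where a sensitive fraction f dies quickly at rate k1 and a resistant fraction (1-f) dies slowly at rate k2. A fitting sketch with hypothetical soil counts (assumptions, not the study's data):

```python
import numpy as np
from scipy.optimize import curve_fit

def biphasic_log10(t, logN0, f, k1, k2):
    """Cerf two-population model: a sensitive fraction f dying at rate k1
    (1/day) and a resistant fraction (1 - f) dying at rate k2."""
    return logN0 + np.log10(f * np.exp(-k1 * t) + (1 - f) * np.exp(-k2 * t))

# Hypothetical Salmonella counts in amended soil (log CFU/g)
t = np.array([0.0, 1, 3, 7, 14, 28, 42, 56])
logN = np.array([8.0, 7.2, 6.3, 5.6, 5.2, 4.9, 4.6, 4.3])

popt, _ = curve_fit(biphasic_log10, t, logN, p0=(8.0, 0.99, 1.0, 0.01),
                    bounds=([0, 0, 0, 0], [12, 1, 10, 1]))
print(dict(zip(["logN0", "f", "k1", "k2"], np.round(popt, 3))))
```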
Foodborne pathogens pose a persistent risk to public health, with domestic environments representing a major but often underestimated source of contamination. In this study, we investigated the survival, proliferation, and transfer potential of Salmonella Enteritidis, Escherichia coli, and Staphylococcus aureus in kitchen sponges harboring an established core microbiota. Using culture-based, metagenomic, and fluorescence in situ hybridization approaches in combination with confocal laser scanning microscopy, we examined pathogen persistence, desiccation tolerance, cross-contamination potential, and spatial microbial organization over 14 days. All three pathogens persisted within the sponge matrix for at least 2 weeks, even at very low initial populations (approximately 2.5 log10 colony-forming units (CFU) per sponge section). Escherichia coli and Salmonella Enteritidis rapidly established stable populations of approximately 9 log CFU per sponge section, whereas Staphylococcus aureus showed limited growth, reaching approximately 4 log CFU per sponge section, indicating species-specific interactions with the resident microbiota. Notably, pathogen populations remained stable after 3 days of desiccation, confirming the role of sponges as long-term microbial reservoirs. Contact between colonized sponges and surfaces under mild pressure transferred up to 5 log CFU to the contacted surfaces, highlighting realistic domestic transmission pathways. Sensory changes such as odor or discoloration did not correlate with microbial load, indicating that visual assessment is an unreliable basis for sponge replacement decisions. These results underscore the role of kitchen sponges as critical microbial reservoirs in households and emphasize the need for regular sponge replacement or the use of alternative cleaning utensils. The standardized sponge model developed in this study provides a valuable platform for evaluating sanitation strategies and for understanding microbial interactions relevant to domestic hygiene and public health.
The occurrence of extended-spectrum cephalosporin (ESC)-resistant Escherichia coli in poultry could threaten human health. This study aimed to evaluate the environmental fitness (the ability to grow at low temperatures and form biofilms) and pathogenicity potential of 30 ESC-resistant E. coli isolates from the Danish poultry production chain and one human case, to understand the risk of foodborne transmission. The whole-genome-sequenced isolates were selected from among 162 isolates to represent different origins, sequence types (STs), and ESC-resistance genes, and were characterized in terms of growth at low temperatures (4, 7, 10, and 14 °C), resistance to antimicrobials, biofilm formation, invasion of human intestinal epithelial cells, and possession of antibiotic resistance and virulence-associated genes. All isolates exhibited resistance to two or more antibiotics, with 47% resistant to three or more classes. None of the isolates grew at 4 or 7 °C, while all grew at 10 and 14 °C. Biofilm formation at 15 °C and 37 °C was observed in all isolates, as determined by the crystal violet assay. Testing of metabolic activity in biofilms revealed that ST155 and ST162 isolates exhibited the highest activity at both 15 °C and 37 °C. Invasion rates varied, with ST429 and ST162 scoring highest. Two ST155 isolates were strong biofilm producers but lacked the ability to invade human INT-407 cells. This study revealed that three STs (ST155, ST162, and ST429) showed the highest growth rates at 10 °C, biofilm formation, and/or invasion rates, indicating environmental fitness and pathogenicity potential. Future research into their role in the human disease burden is needed before these STs are included in risk-based surveillance programs for antibiotic-resistant E. coli in poultry production.
The widespread and often unregulated application of antibiotics in livestock production has resulted in the presence of antibiotic residues in animal-derived foods, including meat, milk, and eggs, raising significant food safety and public health concerns. These residues are associated with antimicrobial resistance (AMR), allergic reactions, and disruption of the human gut microbiota. The improper use of antibiotics and noncompliance with withdrawal periods create selective pressure on bacteria, promoting the development of resistant strains that may be transmitted to humans via the food chain or environmental routes. The persistence of residues is attributed to inadequate veterinary oversight, environmental contamination, and insufficient enforcement of food safety standards, especially in developing regions. This review examines advanced detection technologies, such as chromatographic, immunological, and biosensor-based methods, in conjunction with emerging innovations for rapid, multiresidue screening. It underscores the necessity of coordinated global regulations, efficient surveillance, and advocacy of responsible antibiotic use within a One Health framework. Enhancing international collaboration, refining monitoring systems, and promoting sustainable, residue-free livestock production are critical for safeguarding public health and ensuring a secure and resilient food supply.
In response to observations of Shiga toxin-producing Escherichia coli (STEC) cases reporting an epidemiological association with salad consumption, a survey was carried out from April 2023 to March 2024 in England and Northern Ireland. The aim of this study was to examine salads for foodborne pathogens, including Salmonella spp., STEC, Listeria monocytogenes, Bacillus cereus, and coagulase-positive staphylococci, and to determine levels of indicator bacteria, including generic E. coli and Listeria species. Samples of ready-to-eat (RTE) salads and salad components were collected by local authorities and transported to the laboratory for microbiological examination. During the survey, 2,495 samples were tested. S. Typhimurium was detected in one sample (0.04%), with no link to human cases. stx DNA (the genes encoding Shiga toxin) was detected by PCR in 26 samples, and STEC was culture-confirmed in 3/2,495 samples (0.1%), with no link to human cases. L. monocytogenes was not detected in any sample. Coagulase-positive staphylococci were detected at levels between 2.0 and 4.0 log CFU/g in 31 (1%) samples, with none at a level greater than 4.0 log CFU/g. Presumptive Bacillus cereus at greater than 5.0 log CFU/g was found in 11 (0.4%) samples, and E. coli at greater than 2.0 log CFU/g in 33 (1%) samples. Overall, 2% of all samples were considered to be of unsatisfactory microbiological quality. These results demonstrate the importance of ongoing monitoring of salad products to ensure food safety, along with clear labelling specifying whether a product is RTE or requires further processing and/or washing prior to consumption. However, interpreting results for action is complicated by the challenges of culturing STEC, which limits the usefulness of current monitoring methods.
Staphylococcal enterotoxin B (SEB), a highly potent and stable biotoxin, is a leading cause of food poisoning and poses a significant threat to food safety and public health. While lateral flow immunoassay (LFIA) is valued for its low cost and rapid on-site detection capability, it faces challenges in detecting SEB because staphylococcal protein A binds nonspecifically to the Fc region of conventional antibodies. To overcome this limitation, a series of paired nanobodies against SEB were selected from an immunized phage-display nanobody library. The screening was accomplished using an efficient one-step sandwich chemiluminescent immunoassay that strategically employed Avi-tagged and alkaline phosphatase-fused nanobodies (Nb-Avi and ALP-Nb). Based on the optimal nanobody pair (Nb9A and Nb9C), a sensitive time-resolved fluorescence lateral flow immunoassay was developed for rapid SEB detection; this involved selecting nitrocellulose membrane-compatible nanobody-fusion proteins and establishing an advanced site-directed strategy for labeling nanoparticles with nanobodies. The resulting test strip achieves detection within 15 min, with a remarkably low detection limit of 0.034 ng/mL and a linear range of 0.10-6.3 ng/mL, significantly outperforming most existing methods. Because nanobodies completely lack an Fc fragment, they confer strong resistance to protein A interference in the immunochromatographic system. In practical sample analyses, the method demonstrated consistent recovery rates of 84.40%-114.21%, along with excellent specificity and stability. This study presents an innovative lateral flow immunoassay that uses paired nanobodies to overcome key challenges in achieving highly sensitive and rapid SEB detection. The assay enables rapid, ultrasensitive, and specific detection of SEB with high reliability in complex samples, offering a robust, field-deployable solution for on-site food safety monitoring and public health protection, as well as a novel strategic framework for detecting other protein toxins.
Listeria monocytogenes is a persistent foodborne pathogen capable of surviving the sanitation and low-temperature conditions commonly encountered in food processing environments. Under environmental stress, this bacterium can enter a viable but nonculturable (VBNC) state, characterized by retained viability but loss of culturability, making it undetectable by conventional microbiological methods. This study aimed to determine whether VBNC L. monocytogenes populations originating from biofilms can be transferred to a ready-to-eat (RTE) seafood product and subsequently resuscitate during refrigerated storage. Biofilms were formed for 24 h on stainless steel coupons at 8 °C and exposed to a quaternary ammonium-based disinfectant (QA). Total, viable, and cultivable populations were quantified using qPCR, PMA-qPCR, and plate counts, respectively. QA treatment significantly reduced cultivable populations (3.0-log reduction) while leaving viable populations high (approximately 5.2 log GE/cm²), suggesting VBNC induction. Treated and untreated biofilms were then brought into contact with sterile smoked-salted herring pieces to simulate cross-contamination. L. monocytogenes populations were transferred in all cases, with viable populations reaching approximately 4-6 log GE/g for untreated biofilms and around 3 log GE/g for QA-treated biofilms, in which most of the population was in a VBNC state. During vacuum-packed refrigerated storage (7 days at 4 °C followed by 14 days at 8 °C), populations on herring surfaces progressively regained culturability, with cultivable populations increasing from low initial levels to 5-7 log CFU/g by the end of the product shelf life (21 days). These findings show that VBNC L. monocytogenes populations present on food-contact surfaces can be transferred to RTE seafood and regain culturability during storage, highlighting a potential contamination pathway and supporting the need to consider VBNC populations in hygiene monitoring and risk assessment strategies for vacuum-packed foods.
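One way to read the gap between the PMA-qPCR and plate-count results is to estimate the VBNC subpopulation as viable minus culturable, subtracting in linear space. The sketch below uses the abstract's approximate figures and assumes, for illustration only, that one genome equivalent (GE) corresponds to one cell and that the post-treatment cultivable population is about 2.2 log CFU/cm²:

```python
import numpy as np

def vbnc_log(viable_log_ge, culturable_log_cfu):
    """Estimate the VBNC subpopulation: viable cells (PMA-qPCR, log GE)
    minus culturable cells (plate count, log CFU), subtracted in linear
    space and converted back to log10. Assumes 1 GE ~ 1 cell."""
    linear = 10 ** viable_log_ge - 10 ** culturable_log_cfu
    return np.log10(linear) if linear > 0 else float("-inf")

# Approximate figures for the QA-treated biofilm: ~5.2 log GE/cm2 viable,
# culturable assumed ~2.2 log CFU/cm2 after the reported 3.0-log reduction
print(f"VBNC ~ {vbnc_log(5.2, 2.2):.2f} log GE/cm2")  # ~5.20: VBNC dominates
```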
Food safety messages and labels are important tools for communicating risks associated with food and promoting preventive behaviors when handling food. Given the implication of flour in recent foodborne outbreaks and recalls, efforts have been made to communicate risk to consumers; however, there is limited empirical evidence on the potential of these messages to effect behavioral change. This study used the UNC Perceived Message Effectiveness (UNC PME) scale to evaluate the effectiveness of three flour safety messages: a control message (the current predominant message on flour packages), a loss-framed message (emphasizing the negative consequences of unsafe flour handling), and a neutral message (an informational-only message). The study also tested consumers' preferences for five features of a flour safety label: message type, signal marker, label shape, icon, and background color. A national sample of US wheat flour consumers (N = 1,212) participated in an online survey. Each participant was presented with the three flour safety messages and asked to rate each on the UNC PME scale to determine its potential to effect behavioral change. Participants were also asked about their preferences for various label features, with the intent of creating a flour safety label. The loss-framed (PME = 4.09 ± 0.03) and neutral (PME = 4.04 ± 0.03) messages were perceived as significantly more effective than the control message (PME = 3.91 ± 0.03). The loss-framed message was also preferred over the other message types (by 56.6% of the study population). Further analysis of flour safety label features revealed that 62% of participants preferred an octagonal shape, and "warning" was the preferred signal marker on the label. This study provides evidence-based guidance for developing flour safety messages and labels that effectively communicate risks to consumers.
Vegetables are exceptionally nutrient-dense foods essential to our diet and are often preferred uncooked for maximum benefit. In Bangladesh, they are predominantly sold in outdoor markets. However, poor handling practices can lead to contamination with pathogenic microorganisms, increasing the risk of severe foodborne infections. This study aimed to quantify the bacterial load of fresh vegetables and evaluate the impact of the antimicrobial agent acetic acid on the microbial quality of fresh produce available in local markets. Following a completely randomized design, twenty samples were subjected to three treatments: unwashed (T1), washed with distilled water (T2), and treated with food-grade 5% acetic acid (T3). The highest and lowest total plate count (TPC) values were 2.74 × 10¹⁰ and 4.80 × 10⁹ cfu/g for T1, 1.59 × 10¹⁰ and 2.10 × 10⁹ cfu/g for T2, and 8.93 × 10⁹ and 6.67 × 10 cfu/g for T3, a highly significant (P < 0.0001) difference between T1 and T3. Total coliform counts also differed highly significantly (P < 0.0001) among T1, T2, and T3. The total Salmonella count showed a highly significant difference between T1 and T3, as did the total yeast and mold count (P < 0.0001). The study concluded that fresh vegetables carry hazardous levels of pathogens capable of causing cross-contamination and subsequent foodborne illness. The application of food-grade 5% acetic acid significantly reduced the microbial population on fresh produce, thereby decreasing the risk of health hazards.
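Comparisons like the T1 versus T3 contrast are typically run on log10-transformed plate counts. A minimal sketch follows; the replicate values are invented for illustration, since only the per-treatment extremes are reported above:

```python
import numpy as np
from scipy import stats

# Hypothetical replicate TPC values (cfu/g); only the extremes per
# treatment are reported in the text, so these replicates are invented.
t1 = np.array([2.74e10, 1.10e10, 4.80e9])   # unwashed
t2 = np.array([1.59e10, 6.00e9, 2.10e9])    # distilled-water wash
t3 = np.array([8.93e9, 5.00e8, 6.67e1])     # 5% acetic acid

# Plate counts are compared on the log10 scale to stabilize variance
f_stat, p_val = stats.f_oneway(np.log10(t1), np.log10(t2), np.log10(t3))
print(f"one-way ANOVA on log10 TPC: F = {f_stat:.2f}, P = {p_val:.4f}")
```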
Depressive symptoms in displacement settings are often framed as consequences of refugee status, which can obscure the shared structural conditions emphasized in the Social Determinants of Health (SDoH) framework that shape mental health for refugees and host communities alike, and can limit the identification of practical intervention targets. This study examines the relative contributions of health, socioeconomic, protection, and contextual factors to depressive symptom severity among adults living in displacement-affected settings in South Sudan. We analyzed nationally representative data from 3,055 adults (2,066 refugees, mean displacement duration 11.15 years; 989 host community members) from the 2023 Forced Displacement Survey. Depressive symptom severity was measured using the PHQ-9. We compared Elastic Net regression, Random Forests, and Extreme Gradient Boosting (XGBoost) using 10 × 5 nested cross-validation. The best-performing model was interpreted using SHapley Additive exPlanations (SHAP) to estimate the marginal contribution of each predictor in PHQ units. Mean depressive symptom severity was low to moderate overall (M = 4.43, SD = 5.00) and did not differ by population type (p = 0.783). XGBoost showed the highest predictive performance [Root Mean Squared Error (RMSE) = 4.47, R² = 0.247], significantly outperforming Elastic Net regression (p = 0.006). Model explanations identified self-rated health status as the dominant predictor (19.3% of total importance), followed by perceived community violence (11.1%) and perceived poverty (9.6%). Age (9.3%), discrimination (9.2%), food insecurity (8.6%), and citizenship (8.2%, pooled model only) contributed at moderate levels, whereas social protection (3.4%) and remittances (0.7%) contributed minimally. Predictor profiles were broadly similar across the refugee and host models, with differences primarily in magnitude rather than rank ordering. Depressive symptoms in South Sudan appear to be structured primarily by health, material hardship, and protection-related gradients rather than by refugee status per se. The findings support integrated, area-based public health responses that link mental health support with primary health care access, poverty-oriented assistance, and protection and safety interventions, rather than programming organized primarily around legal status distinctions.
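A model-comparison pipeline of this shape can be sketched in a few lines. The version below is a hedged reconstruction, not the authors' code: it assumes a 10-fold outer / 5-fold inner split for the "10 × 5" nested cross-validation, uses synthetic stand-in data (the survey data are not reproduced here), and an illustrative hyperparameter grid:

```python
import numpy as np
import shap
import xgboost as xgb
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score

# Stand-in data; the survey's predictors and PHQ-9 scores are not public here
X, y = make_regression(n_samples=500, n_features=12, noise=10.0, random_state=0)

# Nested cross-validation: the inner loop tunes, the outer loop estimates RMSE
inner = KFold(n_splits=5, shuffle=True, random_state=0)
outer = KFold(n_splits=10, shuffle=True, random_state=0)
search = GridSearchCV(
    xgb.XGBRegressor(n_estimators=200, random_state=0),
    param_grid={"max_depth": [2, 3, 4], "learning_rate": [0.05, 0.1]},
    scoring="neg_root_mean_squared_error", cv=inner,
)
rmse = -cross_val_score(search, X, y, scoring="neg_root_mean_squared_error", cv=outer)
print(f"outer-loop RMSE: {rmse.mean():.2f} +/- {rmse.std():.2f}")

# Mean |SHAP| values from a final fit give per-predictor contributions
# in outcome units (here, PHQ points), normalized to shares of importance
final = search.fit(X, y).best_estimator_
importance = np.abs(shap.TreeExplainer(final).shap_values(X)).mean(axis=0)
print("share of total importance:", importance / importance.sum())
```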
Given the abundance of retail products containing gluten-containing cereal grains (wheat, barley, and rye), strict avoidance of gluten can be challenging, and consumers with a gluten sensitivity look for gluten-free equivalent products. These products are typically derived from naturally gluten-free cereal grains and legumes such as rice, corn, buckwheat, sorghum, and oats. In this study, we conducted a survey to determine the extent of gluten cross-contamination in products sold in Canada that are made from inherently gluten-free ingredients, as a function of the product's gluten-related labelling claim. A total of 657 samples were collected and analyzed for gluten content using the RIDASCREEN® R-7001 gliadin ELISA. The results showed that 30 of the 657 samples (4.6%) were contaminated above the CODEX-recommended level for a gluten-free product statement (20 mg/kg). Of the 270 samples displaying a gluten-free claim, only one (0.4%) tested above 20 mg/kg gluten. Of the 239 samples without any gluten statement, again only one (0.4%) tested above 20 mg/kg. The remaining 148 samples carried a precautionary statement for gluten, and 28 of these (18.9%) tested above 20 mg/kg gluten. The majority of the samples that tested above the CODEX-recommended level (19/30) had oats listed as one of the ingredients.
A long-term, stable cropland ecosystem is crucial for the sustainable utilisation of land resources and the reliable guarantee of food security. However, oasis-irrigated agriculture in arid and semi-arid regions continues to face ecological risks, including water constraints, soil salinisation, and desertification, as well as persistent pressures arising from irrational land-use practices, which limit the health and stability of cropland systems. This study focuses on the northern slope of the Tianshan Mountains (NSTM) in Xinjiang and, from the perspective of social-ecological systems, develops a two-dimensional evaluation framework integrating ecological stability and utilisation stability. Cropland ecological stability (CES) and cropland utilisation stability (CUS) over the past two decades were comprehensively assessed, and a spatio-temporal, two-dimensional risk matrix was further constructed to identify patterns of cropland system destabilisation risk, thereby providing a scientific basis for differentiated cropland management decisions. The results indicate that, from 2000 to 2020, CES in the NSTM exhibited a consistently increasing trend with pronounced spatial heterogeneity, and unstable cropland was mainly concentrated in highly fragmented, marginal areas. Overall, CUS in the NSTM remained relatively high during 2000-2020, but unstable utilisation was still concentrated in marginal zones, indicating the persistence of land-use pressure. The integrated risk assessment shows that low-risk cropland accounts for the majority of the region and is widely distributed across oasis plains and central areas with concentrated water systems, whereas medium- and high-risk areas are mainly distributed along oasis margins, suggesting that cropland systems in these areas face significant management and regulatory pressures. This study provides an integrated framework for systematically identifying cropland destabilisation risks in arid oasis regions and offers theoretical and decision-making support for the implementation of targeted regulatory measures and sustainable cropland protection strategies.
Chicken is a major source of foodborne salmonellosis in the United States. The number of human illnesses caused by, and the prevalence in chicken of, a multidrug-resistant strain of Salmonella enterica serotype Infantis called REPJFX01 have increased during the last decade. REPJFX01 isolates are often resistant to antibiotics used for treatment, including ampicillin, ceftriaxone, ciprofloxacin, and trimethoprim-sulfamethoxazole. Using data from the U.S. Centers for Disease Control and Prevention's PulseNet system and the U.S. Department of Agriculture's Food Safety and Inspection Service for 2003-2023, we examined trends in REPJFX01 isolation from humans and chicken carcasses to assess whether they coincide. We used penalized B-spline generalized additive models to quantify trends in human infection burden and in REPJFX01 prevalence on chicken carcasses, and a time-lagged rank-order cross-correlation to characterize the association between the two time series. A significant increasing annual trend in the incidence of human infections and in chicken carcass prevalence began in 2016. A 0-year lag in chicken carcass REPJFX01 prevalence maximized the rank-order cross-correlation with REPJFX01 infection burden in humans. REPJFX01 trends in humans and chickens thus appear coincident (occurring over the same time period), suggesting that chickens may be an important reservoir for REPJFX01. Efforts to prevent Salmonella contamination of chicken and to strengthen safe handling and cooking of chicken should be prioritized.
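The lag analysis amounts to ranking both annual series and computing their correlation at successive offsets of the chicken series. A minimal sketch with synthetic series constructed to be coincident (the real PulseNet/FSIS counts are not reproduced here):

```python
import numpy as np
from scipy.stats import spearmanr

def lagged_rank_xcorr(human, chicken, max_lag=5):
    """Spearman rank cross-correlation between annual human infection
    burden and chicken-carcass prevalence, lagging the chicken series
    by 0..max_lag years (human[t] vs. chicken[t - lag])."""
    n = len(human)
    return {lag: spearmanr(human[lag:], chicken[:n - lag]).correlation
            for lag in range(max_lag + 1)}

# Hypothetical annual series for 2003-2023 (21 values each)
rng = np.random.default_rng(1)
chicken = np.cumsum(rng.normal(0.2, 1.0, 21))
human = chicken + rng.normal(0.0, 0.5, 21)  # coincident by construction
lag, rho = max(lagged_rank_xcorr(human, chicken).items(), key=lambda kv: kv[1])
print(f"max rank correlation at lag {lag} y (rho = {rho:.2f})")  # expect lag 0
```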
The goal of this study was to determine the probability of finding Clostridium perfringens in raw and finished ready-to-eat (RTE) meat products, which has practical implications for food safety. In 2001, the Food Safety and Inspection Service (FSIS) reported that levels of C. perfringens exceeded 4 log CFU/g in a limited number of raw meat samples. This was the basis for the current 9 CFR 318.17(a)(2) rule limiting permissible growth of C. perfringens during chilling (stabilization) of thermally processed products to <1 log CFU/g. Data on raw materials (chicken), not-ready-to-eat (NRTE) bacon, and cooked products, submitted by three US commercial meat processors, were gathered. The cooked RTE product data comprised 3,141 products not associated with process deviations, while data from 2,886 products originated from process deviations determined not to meet the "safe harbors" outlined in USDA FSIS Appendix B. Although five NRTE bacon samples exceeded 4 log CFU/g C. perfringens, none of the cooked samples (regardless of association with process deviations) exceeded 2 log CFU/g, suggesting that the high initial levels in the raw product were vegetative cells that were eliminated during cooking and therefore would not grow during cooling. The probability of counts ranging from 2 to 3 log CFU/g was estimated at 0.0012 for nondeviation RTE samples and 0.0013 for deviation RTE samples. The data further suggest that C. perfringens growing to populations causing illness (8 log CFU per serving, assuming a 100-g serving) during stabilization of commercial meat products is highly unlikely, even with cooling-related deviations.
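The reported probabilities (0.0012 and 0.0013) are on the order of a few positives among roughly 3,000 samples. One hedged way to produce an estimate of this kind is a beta-binomial posterior; the counts below are illustrative assumptions, not the study's raw data, and the abstract does not state how its estimates were derived:

```python
from scipy import stats

def band_probability(k, n):
    """Posterior estimate of the probability that a cooked RTE sample
    carries 2-3 log CFU/g C. perfringens, given k such samples out of
    n tested, under a Jeffreys Beta(0.5, 0.5) prior."""
    post = stats.beta(k + 0.5, n - k + 0.5)
    return post.mean(), post.ppf([0.025, 0.975])

# Illustrative counts (assumed; the abstract reports only the probabilities)
mean, ci = band_probability(4, 3141)
print(f"P(2-3 log CFU/g) ~ {mean:.4f}, 95% CrI {ci[0]:.4f}-{ci[1]:.4f}")
```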
Dithiocarbamates (DTCs), widely used fungicidal pesticides, are conventionally detected via a cumbersome 2-h derivatization that converts them into CS2 for gas chromatography, a process susceptible to leakage and poor reproducibility. Herein, we propose a novel, rapid detection method that improves throughput and simplifies operation. Solid samples are introduced directly into a tubular furnace without pretreatment or reagents. DTCs are pyrolyzed within 30 s to release characteristic CS2 gas, which is then excited by a point-discharge microplasma and detected in real time at 257.56 nm using a handheld optical emission spectrometer. The method exhibits good linearity (R² > 0.99) over the range of 0.01-0.1 mg, with a detection limit of 3.5 μg. Matrix effects are corrected using matrix-matched calibration, yielding spiked recoveries of 87-107% with RSDs of 2.3-7.6%. Operating at atmospheric pressure, the system offers a rapid and reliable approach for DTC detection in solid food samples.
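The reported figures (R² > 0.99 linearity, 3.5 μg detection limit) correspond to a standard calibration workflow: fit emission signal against DTC mass, then apply the 3σ criterion to blank measurements. A sketch with hypothetical intensities (assumed values, not the paper's data):

```python
import numpy as np

# Hypothetical calibration: DTC mass (mg) vs. CS2 emission intensity at 257.56 nm
mass = np.array([0.01, 0.02, 0.04, 0.06, 0.08, 0.10])
signal = np.array([120, 235, 480, 710, 950, 1180])

slope, intercept = np.polyfit(mass, signal, 1)
r2 = np.corrcoef(mass, signal)[0, 1] ** 2
print(f"slope = {slope:.0f} counts/mg, R2 = {r2:.4f}")

# LOD from the 3-sigma criterion: 3 x standard deviation of blank signals / slope
blanks = np.array([12, 25, 8, 30, 15, 22, 5, 28, 18, 10])
lod_mg = 3 * blanks.std(ddof=1) / slope
print(f"LOD ~ {lod_mg * 1000:.1f} ug")
```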