Found 20 results
Microbiology Society journals contain high-quality research papers and topical review articles. We are a not-for-profit publisher and we support and invest in the microbiology community, to the benefit of everyone. This supports our principal goal to develop, expand and strengthen the networks available to our members so that they can generate new knowledge about microbes and ensure that it is shared with other communities.
The Industry 5.0 transition highlights EU efforts to design intelligent devices that can work alongside humans to enhance human capabilities, a vision in which user preferences and the need to feel safe while collaborating with such systems take priority. This demands a human-centric research vision and requires a societal and educational shift in how we perceive technological advancements. To better understand this perspective, we conducted a systematic literature review focusing on how trust and trustworthiness can be key aspects of supporting this move towards Industry 5.0. This review aims to give an overview of the most common methodologies and measurements and to collect insights about barriers and facilitators for fostering trustworthy HRI. After a rigorous quality assessment following the Systematic Reviews and Meta-Analyses guidelines, using rigorous inclusion criteria and screening by at least two reviewers, 34 articles were included in the review. The findings underscore the significance of trust and safety as foundational elements for promoting secure and trustworthy human-machine cooperation, and confirm that almost 30% of the reviewed articles do not present a defi
Data valuation and data monetisation are complex subjects but essential to most organisations today. Unfortunately, they still lack standard procedures and frameworks for organisations to follow. In this survey, we introduce the reader to the concepts by providing the definitions and the background required to better understand data, monetisation strategies, and finally metrics and KPIs used in these strategies. We have conducted a systematic literature review on metrics and KPIs used in data valuation and monetisation, in every aspect of an organisation's business, and by a variety of stakeholders. We provide an expansive list of such metrics and KPIs with 162 references. We then categorise all the metrics and KPIs found into a large taxonomy, following the Balanced Scorecard (BSC) approach with further subclustering to cover every aspect of an organisation's business. This taxonomy will help every level of data management understand the complex landscape of the domain. We also discuss the difficulty in creating a standard framework for data valuation and data monetisation and the major challenges the domain is currently facing.
This paper systematically reviews the research progress and application prospects of machine learning technologies in the field of polymer materials. Machine learning methods are developing rapidly in polymer material research; although they have significantly accelerated material prediction and design, their complexity has also made them difficult for researchers from traditional fields to understand and apply. In response to these issues, this paper first analyzes the inherent challenges in the research and development of polymer materials, including structural complexity and the limitations of traditional trial-and-error methods. To address these problems, it introduces key enabling techniques such as molecular descriptors and feature representation, and data standardization and cleaning, and surveys a number of high-quality polymer databases. Subsequently, it elaborates on the key role of machine learning in polymer property prediction and material design, covering the specific applications of algorithms such as traditional machine learning, deep learning, and transfer learning; further, it expounds on data-driven design strategies, such as r
BACKGROUND: Antimicrobial resistance (AMR) poses a major threat to human health around the world. Previous publications have estimated the effect of AMR on incidence, deaths, hospital length of stay, and health-care costs for specific pathogen-drug combinations in select locations. To our knowledge, this study presents the most comprehensive estimates of AMR burden to date. METHODS: We estimated deaths and disability-adjusted life-years (DALYs) attributable to and associated with bacterial AMR for 23 pathogens and 88 pathogen-drug combinations in 204 countries and territories in 2019. We obtained data from systematic literature reviews, hospital systems, surveillance systems, and other sources, covering 471 million individual records or isolates and 7585 study-location-years. We used predictive statistical modelling to produce estimates of AMR burden for all locations, including for locations with no data. Our approach can be divided into five broad components: number of deaths where infection played a role, proportion of infectious deaths attributable to a given infectious syndrome, proportion of infectious syndrome deaths attributable to a given pathogen, the percentage of a given pathogen resistant to an antibiotic of interest, and the excess risk of death or duration of an infection associated with this resistance. Using these components, we estimated disease burden based on two counterfactuals: deaths attributable to AMR (based on an alternative scenario in which all drug-resistant infections were replaced by drug-susceptible infections), and deaths associated with AMR (based on an alternative scenario in which all drug-resistant infections were replaced by no infection). We generated 95% uncertainty intervals (UIs) for final estimates as the 25th and 975th ordered values across 1000 posterior draws, and models were cross-validated for out-of-sample predictive validity. We present final estimates aggregated to the global and regional level. 
FINDINGS: On the basis of our predictive statistical models, there were an estimated 4·95 million (3·62-6·57) deaths associated with bacterial AMR in 2019, including 1·27 million (95% UI 0·911-1·71) deaths attributable to bacterial AMR. At the regional level, we estimated the all-age death rate attributable to resistance to be highest in western sub-Saharan Africa, at 27·3 deaths per 100 000 (20·9-35·3), and lowest in Australasia, at 6·5 deaths (4·3-9·4) per 100 000. Lower respiratory infections accounted for more than 1·5 million deaths associated with resistance in 2019, making it the most burdensome infectious syndrome. The six leading pathogens for deaths associated with resistance (Escherichia coli, followed by Staphylococcus aureus, Klebsiella pneumoniae, Streptococcus pneumoniae, Acinetobacter baumannii, and Pseudomonas aeruginosa) were responsible for 929 000 (660 000-1 270 000) deaths attributable to AMR and 3·57 million (2·62-4·78) deaths associated with AMR in 2019. One pathogen-drug combination, meticillin-resistant S aureus, caused more than 100 000 deaths attributable to AMR in 2019, while six more each caused 50 000-100 000 deaths: multidrug-resistant excluding extensively drug-resistant tuberculosis, third-generation cephalosporin-resistant E coli, carbapenem-resistant A baumannii, fluoroquinolone-resistant E coli, carbapenem-resistant K pneumoniae, and third-generation cephalosporin-resistant K pneumoniae. INTERPRETATION: To our knowledge, this study provides the first comprehensive assessment of the global burden of AMR, as well as an evaluation of the availability of data. AMR is a leading cause of death around the world, with the highest burdens in low-resource settings. 
Understanding the burden of AMR and the leading pathogen-drug combinations contributing to it is crucial to making informed and location-specific policy decisions, particularly about infection prevention and control programmes, access to essential antibiotics, and research and development of new vaccines and antibiotics. There are serious data gaps in many low-income settings, emphasising the need to expand microbiology laboratory capacity and data collection systems to improve our understanding of this important human health threat. FUNDING: Bill & Melinda Gates Foundation, Wellcome Trust, and Department of Health and Social Care using UK aid funding managed by the Fleming Fund.
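The uncertainty-interval rule described in the Methods above (a 95% UI taken as the 25th and 975th ordered values across 1000 posterior draws) can be sketched in a few lines. This is a minimal illustration with synthetic, normally distributed draws standing in for the study's posterior; the numbers are placeholders, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical posterior draws for a burden estimate (illustrative values only).
draws = rng.normal(loc=1.27e6, scale=2.0e5, size=1000)

# 95% UI as the 25th and 975th ordered values across the 1000 draws,
# i.e. the empirical 2.5th and 97.5th percentiles.
ordered = np.sort(draws)
lower, upper = ordered[24], ordered[974]  # 0-indexed positions 25 and 975
print(f"Point estimate: {draws.mean():.3g}, 95% UI: ({lower:.3g}, {upper:.3g})")
```

Repeating this over each location-pathogen-drug cell is what yields interval statements such as "1·27 million (95% UI 0·911-1·71)".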
Game theory is a foundational framework for analyzing strategic interactions, and its intersection with large language models (LLMs) is a rapidly growing field. However, existing surveys mainly focus narrowly on using game theory to evaluate LLM behavior. This paper provides the first comprehensive survey of the bidirectional relationship between Game Theory and LLMs. We propose a novel taxonomy that categorizes the research in this intersection into four distinct perspectives: (1) evaluating LLMs in game-based scenarios; (2) improving LLMs using game-theoretic concepts for better interpretability and alignment; (3) modeling the competitive landscape of LLM development and its societal impact; and (4) leveraging LLMs to advance game models and to solve corresponding game theory problems. Furthermore, we identify key challenges and outline future research directions. By systematically investigating this interdisciplinary landscape, our survey highlights the mutual influence of game theory and LLMs, fostering progress at the intersection of these fields.
Advancements in artificial intelligence (AI) have transformed many scientific fields, with microbiology and microbiome research now experiencing significant breakthroughs through machine learning applications. This review provides a comprehensive overview of AI-driven approaches tailored for microbiology and microbiome studies, emphasizing both technical advancements and biological insights. We begin with an introduction to foundational AI techniques, including primary machine learning paradigms and various deep learning architectures, and offer guidance on choosing between traditional machine learning and sophisticated deep learning methods based on specific research goals. The primary section on application scenarios spans diverse research areas, from taxonomic profiling, functional annotation & prediction, microbe-X interactions, microbial ecology, metabolic modeling, precision nutrition, clinical microbiology, to prevention & therapeutics. Finally, we discuss challenges in this field and highlight some recent breakthroughs. Together, this review underscores AI's transformative role in microbiology and microbiome research, paving the way for innovative methodologies an
Relative distances between a high-redshift sample of Type Ia supernovae (SNe Ia), anchored to a low-redshift sample, have been instrumental in drawing insights on the nature of the dark energy driving the accelerated expansion of the universe. A combination (hereafter called SBC) of the SNe Ia with baryon acoustic oscillations (BAO) from the Dark Energy Spectroscopic Instrument (DESI) and the cosmic microwave background (CMB) recently indicated deviations from the standard interpretation of dark energy as a cosmological constant. In this paper, we analyse various systematic uncertainties in the distance measurement of SNe Ia and their impact on the inferred dark energy properties in the canonical Chevallier-Polarski-Linder (CPL) model. We model systematic effects such as photometric calibration, progenitor and dust evolution, and uncertainty in the galactic extinction law. We find that all the dominant systematic errors shift the dark energy inference towards the DESI 2024 results from an underlying ΛCDM cosmology. A small change in the calibration, together with a change in the Milky Way dust, can give rise to systematic-driven shifts in the $w_0$-$w_a$ constraints, comparable to the deviation rep
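For reference, the CPL parametrization named above is the standard two-parameter expansion of the dark-energy equation of state in the scale factor $a$ (this is the textbook form, not a result of the paper):

```latex
w(a) = w_0 + w_a\,(1 - a) = w_0 + w_a\,\frac{z}{1+z}
```

so that a cosmological constant (ΛCDM) corresponds to the point $(w_0, w_a) = (-1, 0)$, and the DESI-driven deviations discussed above are shifts away from this point in the $w_0$-$w_a$ plane.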
Microorganisms are ubiquitous in nature, and microbial activities are closely intertwined with the entire life cycle system and human life. Developing novel technologies for the detection, characterization and manipulation of microorganisms promotes their applications in clinical, environmental and industrial areas. Over the last two decades, terahertz (THz) technology has emerged as a new optical tool for microbiology. Its great potential originates from the unique advantages of THz waves, including high sensitivity to water and inter-/intra-molecular motions, a non-invasive and label-free detection scheme, and low photon energy. THz waves have been utilized as a stimulus to alter microbial functions, or as a sensing approach for quantitative measurement and qualitative differentiation. This review focuses on recent research progress of THz technology applied in the field of microbiology, covering two major areas: THz biological effects and microbial detection applications. At the end of this paper, we summarize the research progress and discuss the challenges currently faced by THz technology in microbiology, along with potential solutions. We also
Context: Microservices running on and powered by Edge Computing have been gaining much attention in industry and academia. Since 2014, when Martin Fowler popularized the term Microservice, many studies have been published relating these subjects, exploring how the Edge's low-latency feature could be combined with the high throughput of the distributed Microservices paradigm. Objective: To identify how Microservices work together with Edge Computing and what advantages they gain when running on the Edge. Method: To better understand this relationship, we first identified its key concepts: architecture approaches and features, microservice composition (orchestration/choreography), and offloading. Afterward, we conducted a Systematic Literature Review (SLR) as the survey method. Results: We reviewed 111 selected studies and built a taxonomy of Microservices on Edge Computing demonstrating their current architecture approaches and features, composition, and offloading modes. Moreover, we identify research gaps and trends. Conclusion: This paper is a step forward in helping researchers and professionals get a general overview of how Microservices and Edge have
Outliers have been widely observed in Large Language Models (LLMs), significantly impacting model performance and posing challenges for model compression. Understanding the functionality and formation mechanisms of these outliers is critically important. Existing works, however, largely focus on reducing the impact of outliers from an algorithmic perspective, lacking an in-depth investigation into their causes and roles. In this work, we provide a detailed analysis of the formation process, underlying causes, and functions of outliers in LLMs. We define and categorize three types of outliers (activation outliers, weight outliers, and attention outliers) and analyze their distributions across different dimensions, uncovering inherent connections between their occurrences and their ultimate influence on the attention mechanism. Based on these observations, we hypothesize and explore the mechanisms by which these outliers arise and function, demonstrating through theoretical derivations and experiments that they emerge due to the self-attention mechanism's softmax operation. These outliers act as implicit context-aware scaling factors within the attention mechanism. As these outliers st
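A minimal numerical sketch of the softmax step referenced above (illustrative values, not the paper's experiments): one attention logit that is much larger than the rest captures almost all of the probability mass after softmax, which is the kind of extreme value the abstract refers to as an attention outlier.

```python
import numpy as np

def softmax(x):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Hypothetical attention logits: the last entry dwarfs the others.
logits = np.array([1.0, 2.0, 1.5, 12.0])
weights = softmax(logits)
print(weights)  # the outlier logit receives nearly all of the attention mass
```

Because softmax is exponential in its inputs, even modest gaps between logits translate into near-winner-take-all attention weights, which is why extreme values in the inputs to softmax have an outsized effect on the attention pattern.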
Mathematical models are increasingly a part of microbiological research. Here, we share our perspective on how modeling advances the discipline by: (i) enforcing logical consistency, (ii) enabling quantitative prediction, (iii) extracting hidden parameters from data, and (iv) generating intuitive understanding. We map a spectrum of modeling frameworks, from whole-cell simulations to minimal logistic growth equations, and provide interactive examples for some common frameworks. Building on this overview, we outline pragmatic criteria for choosing an appropriate level of description to capture phenomena of interest. Finally, we present a case study in modeling of microbial ecosystems from our own work to illustrate how mechanistic modeling can yield generalizable intuition. This perspective aims to be an introductory roadmap for integrating mathematical modeling into experimental microbiology.
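At the "minimal" end of the spectrum described above sits the logistic growth equation, dN/dt = r N (1 - N/K). As a small self-contained sketch (parameter values are illustrative, not taken from the perspective piece), it can be integrated with forward Euler in a few lines:

```python
import numpy as np

def logistic_growth(n0, r, k, t_max, dt=0.01):
    """Forward-Euler integration of dN/dt = r*N*(1 - N/K)."""
    steps = int(t_max / dt)
    n = n0
    trajectory = [n]
    for _ in range(steps):
        n += dt * r * n * (1 - n / k)
        trajectory.append(n)
    return np.array(trajectory)

# Illustrative parameters: initial density 1e3 cells/mL,
# growth rate 1.0 per hour, carrying capacity 1e9 cells/mL.
traj = logistic_growth(n0=1e3, r=1.0, k=1e9, t_max=30.0)
print(traj[-1])  # saturates near the carrying capacity K
```

Even this two-parameter model already illustrates the perspective's points (ii) and (iii): it makes a quantitative prediction of the growth curve, and fitting it to optical-density data extracts the hidden parameters r and K.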
A parametric study of an underwater pulsed plasma discharge in a pin-to-pin electrode configuration has been performed. The influence of two parameters is reported: the water conductivity (from 50 to 500 μS/cm) and the applied voltage (from 6 to 16 kV). Two complementary diagnostics, time-resolved refractive-index-based techniques and electrical measurements, have been performed in order to study the discharge propagation and breakdown phenomena in water as a function of these two parameters. A single high-voltage pulse of duration between 100 μs and 1 ms is applied between two 100 μm diameter platinum tips separated by 2 mm and immersed in the aqueous solution. This work, which provides valuable results complementary to those of paper [1], is of great interest for better understanding the mechanisms of initiation and propagation of pin-to-pin discharge in water. For low conductivity (from 50 to 100 μS/cm), the results have confirmed two regimes of discharge (cathode and anode), and increasing the applied voltage first makes breakdown more achievable and then favors the appearance of the anode regime. For 500 μS/cm, the results have highlighted the cathode regime for low applied voltage but a mixed
Given the increasing demands in computer programming education and the rapid advancement of large language models (LLMs), LLMs are playing a critical role in programming education. This study provides a systematic review of selected empirical studies on LLMs in computer programming education published from 2023 to March 2024. The data for this review were collected from the Web of Science (SCI/SSCI), SCOPUS, and EBSCOhost databases, as well as three conference proceedings specialized in computer programming education. In total, 42 studies met the selection criteria and were reviewed using methods including bibliometric analysis, thematic analysis, and structural topic modeling. This study offers an overview of the current state of LLMs in computer programming education research. It outlines LLMs' applications, benefits, limitations, concerns, and implications for future research and practice, establishing connections between LLMs and their practical use in computer programming education. This review also provides examples and valuable insights for instructional designers, instructors, and learners. Additionally, a conceptual framework is proposed to guide education practitioners in integra
Context: Blockchain and AI are increasingly explored to enhance trustworthiness in software engineering (SE), particularly in supporting software evolution tasks. Method: We conducted a systematic literature review (SLR) using a predefined protocol with clear eligibility criteria to ensure transparency, reproducibility, and minimized bias, synthesizing research on blockchain-enabled trust in AI-driven SE tools and processes. Results: Most studies focus on integrating AI in SE, with only 31% explicitly addressing trustworthiness. Our review highlights six recent studies exploring blockchain-based approaches to reinforce reliability, transparency, and accountability in AI-assisted SE tasks. Conclusion: Blockchain enhances trust by ensuring data immutability, model transparency, and lifecycle accountability, including federated learning with blockchain consensus and private data verification. However, inconsistent definitions of trust and limited real-world testing remain major challenges. Future work must develop measurable, reproducible trust frameworks to enable reliable, secure, and compliant AI-driven SE ecosystems, including applications involving large language models.
In a recent article [Phys. Rev. Applied 6, 014017 (2016)], Chyba and Hand propose a new scheme to generate electric power continuously at the expense of Earth's kinetic energy of rotation, by using an appropriately shaped cylindrical shell of a well chosen conducting ferrite, rigidly attached to the Earth. No experimental confirmation is reported for the new prediction. In the present Refutation, I first use today's standard electromagnetism and essentially the same model as Chyba and Hand to show in a very simple way that no device of the proposed type can produce continuous electric power, whatever its configuration or size, in agreement with widespread expectation. Next, I show that the prediction of non-zero continuous power by Chyba and Hand results from a confusion of frames of reference at a critical step of their derivation. When the confusion is clarified, the prediction becomes exactly zero and the article under discussion appears as pointless. At the end, I comment about the persistent invocation by Chyba and Hand of the misleading legacy notion that quasi-static magnetic fields have an intrinsic velocity, and other questionable concepts.
The study of microorganisms, or microbiology, has developed significantly since its inception and is currently a key field of the biological sciences, with a huge impact on modern society and scientific research. Over the centuries, this discipline has undergone significant changes, shaping our understanding of infectious diseases and food safety, starting from the simplest observations of microscopic organisms such as bacteria, viruses, fungi and protozoa and ending with modern molecular and genomic research methods. This article describes a brief historical path of microbiology's development. The heuristic, morphological, physiological, immunological, and molecular genetic stages are the main periods into which the development of this science is traditionally divided, despite the lack of clear and precise boundaries between them.
Diffuse Reflectance Spectroscopy has demonstrated a strong aptitude for identifying and differentiating biological tissues. However, the broadband and smooth nature of these signals requires algorithmic processing, as they are often difficult for the human eye to distinguish. The implementation of machine learning models for this task has demonstrated high levels of diagnostic accuracy and has led to a wide range of proposed methodologies for applications in various illnesses and conditions. In this systematic review, we summarise the state of the art of these applications, highlight current gaps in research and identify future directions. The review was conducted in accordance with the PRISMA guidelines. 77 studies were retrieved and an in-depth analysis was conducted. It is concluded that diffuse reflectance spectroscopy and machine learning have strong potential for tissue differentiation in clinical applications, but more rigorous sample stratification, in tandem with in-vivo validation and explainable algorithm development, is required going forward.
Software documentation is essential for program comprehension, developer onboarding, code review, and long-term maintenance. Yet producing quality documentation manually is time-consuming and frequently yields incomplete or inconsistent results. Large language models (LLMs) offer a promising solution by automatically generating natural language descriptions from source code, helping developers understand code more efficiently, facilitating maintenance, and supporting downstream activities such as defect localization and commit message generation. However, the effectiveness of LLMs in documentation tasks critically depends on how they are prompted. Properly structured instructions can substantially improve model performance, making prompt engineering (the design of input prompts to guide model behavior) a foundational technique in LLM-based software engineering. Approaches such as few-shot prompting, chain-of-thought reasoning, retrieval-augmented generation, and zero-shot learning show promise for code summarization, yet current research remains fragmented. There is limited understanding of which prompting strategies work best, for which models, and under what conditions. Moreover, e
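As a concrete illustration of the few-shot prompting strategy named above, a code-summarization prompt typically prepends a handful of input-output pairs before the target code. The template and example pairs below are hypothetical placeholders, not drawn from any study in the review:

```python
# A minimal few-shot prompt template for code summarization.
# The two worked examples are hypothetical demonstrations.
FEW_SHOT_PROMPT = """Summarize each function in one sentence.

Code:
def add(a, b):
    return a + b
Summary: Returns the sum of its two arguments.

Code:
def is_even(n):
    return n % 2 == 0
Summary: Checks whether a number is even.

Code:
{code}
Summary:"""

def build_prompt(code: str) -> str:
    """Fill the template with the code to be summarized."""
    return FEW_SHOT_PROMPT.format(code=code)

print(build_prompt("def square(x):\n    return x * x"))
```

The model then continues the text after the final "Summary:"; zero-shot prompting would omit the worked examples, and chain-of-thought variants would ask the model to reason step by step before the summary.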
Accurate, non-invasive flow measurement is imperative for efficient water resource management and leak detection in distribution systems. Despite the advent of diverse external sensing technologies, a paucity of consolidated evidence exists regarding their comparative performance, energy efficiency, and applicability in varied operational contexts. The document delineates the protocol for a systematic literature review (SLR) that aims to identify, evaluate, and synthesize the extant evidence on non-invasive flow monitoring techniques for piped networks. Adhering to the Kitchenham methodology, the review will investigate the accuracy, precision, and energy consumption of prevailing solutions, such as ultrasonic and accelerometer-based systems. The analysis will also assess the impact of signal processing and machine learning (ML) algorithms on enhancing system capabilities. The objective of this study is to map the state-of-the-art, identify key research gaps, and provide an empirical foundation to direct future research toward operational deployment.