Recent advances have shown the effectiveness of self-evolving LLM agents on tasks such as program repair and scientific discovery. In this paradigm, a planner LLM synthesizes an agent program that invokes parametric models, including LLMs, which are then tuned per task to improve performance. However, existing self-evolving agent frameworks provide no formal guarantees of safety or correctness. Because such programs are often executed autonomously on unseen inputs, this lack of guarantees raises reliability and security concerns. We formulate agentic code generation as a constrained learning problem, combining hard formal specifications with soft objectives capturing task utility. We introduce Formally Guarded Generative Models (FGGM), which allow the planner LLM to specify a formal output contract for each generative model call using first-order logic. Each FGGM call wraps the underlying model in a rejection sampler with a verified fallback, ensuring every returned output satisfies the contract for any input and parameter setting. Building on FGGM, we present SEVerA (Self-Evolving Verified Agents), a three-stage framework: Search synthesizes candidate parametric programs containin
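The FGGM mechanism described above can be sketched as a rejection sampler with a verified fallback. This is a minimal illustrative sketch assuming the interface implied by the abstract: `contract` is a decidable predicate standing in for the first-order-logic output contract, and all names (`fggm_call`, `fallback`, `max_tries`) are my own, not the paper's API.

```python
# Minimal FGGM-style sketch (illustrative names, not the paper's API):
# sample from the model, accept only contract-satisfying outputs, and
# fall back to a verified generator so the contract always holds.
import itertools
from typing import Callable

def fggm_call(model: Callable[[str], str],
              contract: Callable[[str], bool],
              fallback: Callable[[str], str],
              prompt: str,
              max_tries: int = 5) -> str:
    """Rejection-sample from `model`; fall back to a verified generator."""
    for _ in range(max_tries):
        out = model(prompt)
        if contract(out):            # accept only contract-satisfying outputs
            return out
    out = fallback(prompt)           # fallback is assumed verified against the contract
    assert contract(out)
    return out

# Toy demo: the contract requires a digit string; the model misbehaves twice.
attempts = itertools.chain(["oops", "???"], itertools.repeat("42"))
result = fggm_call(lambda p: next(attempts), str.isdigit, lambda p: "0",
                   "how many tests passed?")
print(result)  # every returned output satisfies the contract
```

Because the fallback is itself verified, the wrapper returns a contract-satisfying output for any input and parameter setting, at the cost of possibly discarding model samples.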
Recent advances in large language models have sparked growing interest in AI agents capable of solving complex, real-world tasks. However, most existing agent systems rely on manually crafted configurations that remain static after deployment, limiting their ability to adapt to dynamic and evolving environments. To this end, recent research has explored agent evolution techniques that aim to automatically enhance agent systems based on interaction data and environmental feedback. This emerging direction lays the foundation for self-evolving AI agents, which bridge the static capabilities of foundation models with the continuous adaptability required by lifelong agentic systems. In this survey, we provide a comprehensive review of existing techniques for self-evolving agentic systems. Specifically, we first introduce a unified conceptual framework that abstracts the feedback loop underlying the design of self-evolving agentic systems. The framework highlights four key components: System Inputs, Agent System, Environment, and Optimisers, serving as a foundation for understanding and comparing different strategies. Based on this framework, we systematically review a wide range of self
Knowledge utilization is a critical aspect of LLMs, and understanding how they adapt to evolving knowledge is essential for their effective deployment. However, existing benchmarks are predominantly static, failing to capture the evolving nature of LLMs and knowledge, leading to inaccuracies and vulnerabilities such as contamination. In this paper, we introduce EvoWiki, an evolving dataset designed to reflect knowledge evolution by categorizing information into stable, evolved, and uncharted states. EvoWiki is fully auto-updatable, enabling precise evaluation of continuously changing knowledge and newly released LLMs. Through experiments with Retrieval-Augmented Generation (RAG) and Continual Learning (CL), we evaluate how effectively LLMs adapt to evolving knowledge. Our results indicate that current models often struggle with evolved knowledge, frequently providing outdated or incorrect responses. Moreover, the dataset highlights a synergistic effect between RAG and CL, demonstrating their potential to better adapt to evolving knowledge. EvoWiki provides a robust benchmark for advancing future research on the knowledge evolution capabilities of large language models.
In this paper, we prove that the spatially semi-discrete evolving finite element method for parabolic equations on a given evolving hypersurface of arbitrary dimension preserves maximal $L^p$-regularity at the discrete level. We first establish the results on a stationary surface and then extend them, via a perturbation argument, to the case where the underlying surface is evolving under a prescribed velocity field. The proof combines techniques from the evolving finite element method, properties of Green's functions on (discretised) closed surfaces, and local energy estimates for finite element methods.
Recent results from the Dark Energy Spectroscopic Instrument (DESI) collaboration have been interpreted as evidence for evolving dark energy. However, this interpretation is strongly dependent on which Type Ia supernova (SN) sample is combined with DESI measurements of baryon acoustic oscillations (BAO) and observations of the cosmic microwave background (CMB) radiation. The strength of the evidence for evolving dark energy ranges from ~3.9 sigma for the Dark Energy Survey 5 year (DES5Y) SN sample to ~2.5 sigma for the Pantheon+ sample. The cosmology inferred from the Pantheon+ sample alone is consistent with the Planck LCDM model and shows no preference for evolving dark energy. In contrast, the DES5Y SN sample favours evolving dark energy and is discrepant with the Planck LCDM model at about the 3 sigma level. Given these differences, it is important to question whether they are caused by systematics in the SN compilations. A comparison of SN common to both the DES5Y and Pantheon+ compilations shows evidence for an offset of ~0.04 mag. between low and high redshifts. Systematics of this order can bring the DES5Y sample into good agreement with the Planck LCDM cosmology and Pantheon+. I c
We show that the latest empirical constraints on cosmology, from a combination of DESI, CMB and supernova data, can be accounted for if a small component of dark matter has an evolving and oscillating equation of state within $-1<w<1$. From a fundamental physics perspective, this interpretation is more appealing than an evolving phantom dark energy with $w<-1$, which violates the null energy condition.
We introduce a modified incremental learning algorithm for evolving Granular Neural Network Classifiers (eGNN-C+). We use double-boundary hyper-boxes to represent granules, and customize the adaptation procedures to enhance the robustness of outer boxes for data coverage and noise suppression, while ensuring that inner boxes remain flexible to capture drifts. The classifier evolves from scratch, incorporates new classes on the fly, and performs local incremental feature weighting. As an application, we focus on the classification of emotion-related patterns within electroencephalogram (EEG) signals. Emotion recognition is crucial for enhancing the realism and interactivity of computer systems. We extract features from the Fourier spectrum of EEG signals obtained from 28 individuals engaged in playing computer games -- a public dataset. Each game elicits a different predominant emotion: boredom, calmness, horror, or joy. We analyze individual electrodes, time window lengths, and frequency bands to assess the accuracy and interpretability of resulting user-independent neural models. The findings indicate that both brain hemispheres assist classification, especially electrodes on the
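The double-boundary hyperbox idea above can be illustrated with a small sketch. The class and method names below are my own, not the eGNN-C+ implementation: the outer box expands with a safety margin for robust data coverage and noise suppression, while the inner box tracks the raw sample range so it stays flexible when the distribution drifts.

```python
# Illustrative double-boundary hyperbox granule (names are assumptions,
# not the eGNN-C+ API). Outer box: robust coverage; inner box: tight core.
class Granule:
    def __init__(self, x):
        # both boxes start as the degenerate box around the first sample
        self.inner_lo = list(x); self.inner_hi = list(x)
        self.outer_lo = list(x); self.outer_hi = list(x)

    def covers(self, x):
        """Outer-box membership: does this granule account for sample x?"""
        return all(lo <= xi <= hi
                   for xi, lo, hi in zip(x, self.outer_lo, self.outer_hi))

    def update(self, x, margin=0.5):
        # inner box tracks the raw sample range (flexible, drift-sensitive);
        # outer box expands to cover x plus a margin (noise suppression)
        for i, xi in enumerate(x):
            self.inner_lo[i] = min(self.inner_lo[i], xi)
            self.inner_hi[i] = max(self.inner_hi[i], xi)
            self.outer_lo[i] = min(self.outer_lo[i], xi - margin)
            self.outer_hi[i] = max(self.outer_hi[i], xi + margin)

g = Granule([0.0, 0.0])
g.update([1.0, 1.0])
print(g.covers([0.5, 0.5]), g.covers([5.0, 5.0]))
```

A classifier built from such granules would route a new sample to the granules whose outer boxes cover it, creating a fresh granule (and possibly a new class) on the fly when none do.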
We consider the governing equations for the motion of compressible fluid on an evolving surface from both energetic and thermodynamic points of view. We employ our energetic variational approaches to derive the momentum equation of our compressible fluid systems on the evolving surface. Applying the first law of thermodynamics and the Gibbs equation, we investigate the internal energy, enthalpy, entropy, and free energy of the fluid on the evolving surface. We also study conservative forms and conservation laws of our compressible fluid systems on the evolving surface. Moreover, we derive the generalized heat and diffusion systems on an evolving surface from an energetic point of view. This paper gives a mathematical validity of the surface stress tensor determined by the Boussinesq-Scriven law. Using a flow map on an evolving surface and applying the Riemannian metric induced by the flow map are key ideas to analyze fluid flow on the evolving surface.
The study of the evolution of networks has received increased interest with the recent discovery that many real-world networks have much in common, in particular the manner in which they evolve. By adding a dimension of time to graph analysis, evolving graphs present opportunities and challenges to extract valuable information. This paper introduces the Evolving Graph Markup Language (EGML), an XML application for representing evolving graphs and related results. Along with EGML, a software tool is provided for the study of evolving graphs. New evolving graph drawing techniques based on the force-directed graph layout algorithm are also explored. Our evolving graph techniques reduce vertex movements between graph instances, so that an evolving graph can be viewed with smooth transitions.
In this paper, we study a class of stochastic processes, called evolving network Markov chains, in evolving networks. Our approach is to transform the degree distribution problem of an evolving network into a corresponding problem for evolving network Markov chains. We investigate the evolving network Markov chains, thereby obtaining some exact formulas as well as a precise criterion for determining whether the steady degree distribution of the evolving network follows a power law or not. With this new method, we finally obtain a rigorous, exact and unified solution for the steady degree distribution of the evolving network.
The aim of this paper is to develop a numerical scheme to approximate evolving interface problems for parabolic equations based on the abstract evolving finite element framework proposed in (C M Elliott, T Ranner, IMA J Num Anal, 41:3, 2021, doi:10.1093/imanum/draa062). An appropriate weak formulation of the problem is derived for the use of evolving finite elements designed to accommodate a moving interface. Optimal order error bounds are proved for arbitrary order evolving isoparametric finite elements. The paper concludes with numerical results for a model problem verifying the orders of convergence.
We study event horizon candidates for slowly evolving dynamical black holes in General Relativity and Einstein-Gauss-Bonnet (EGB) gravity. Such a horizon candidate has been termed a slowly evolving null surface (SENS). It signifies a near-equilibrium state of a dynamic black hole. We demonstrate the time evolution of such surfaces for three different metrics. First, we locate such a surface for a charged Vaidya metric and show that the parameter space of the black hole gets constrained to allow a physically admissible slowly evolving null surface. We then consider a supertranslated Vaidya solution that contains a non-spherical horizon and study the properties of the SENS. This spacetime generates a non-vanishing shear at the SENS due to the presence of the supertranslation field. The SENS for a spherically symmetric Vaidya-like solution in EGB gravity yields a bound on the accretion rate that depends on the size of the horizon. We also show that the first and second laws of black hole mechanics can be established for these slowly evolving surfaces.
In order to explore the impact of a periodically evolving domain on the transmission of disease, we study an SIS reaction-diffusion model with a logistic term on a periodically evolving domain. The basic reproduction number ${\mathcal{R}}_0$ is given by the next generation infection operator, and relies on the evolving rate of the periodically evolving domain, the diffusion coefficient of infected individuals $d_I$ and the size of the space. The monotonicity of ${\mathcal{R}}_0$ with respect to $d_I$, the evolving rate $\rho(t)$ and the interval length $L$ is derived, and the asymptotic properties of ${\mathcal{R}}_0$ when $d_I$ or $L$ is small enough or large enough in one-dimensional space are discussed. ${\mathcal{R}}_0$ as a threshold can be used to characterize whether the disease-free equilibrium is stable or not. Our theoretical results and numerical simulations indicate that a small evolving rate, small diffusion of infected individuals and a small interval length have a positive impact on the prevention and control of disease.
Frequent pattern mining is a key area of study that gives insights into the structure and dynamics of evolving networks, such as social or road networks. However, not only does a network evolve, but often the way that it evolves, itself evolves. Thus, knowing, in addition to patterns' frequencies, for how long and how regularly they have occurred---i.e., their persistence---can add to our understanding of evolving networks. In this work, we propose the problem of mining activity that persists through time in continually evolving networks---i.e., activity that repeatedly and consistently occurs. We extend the notion of temporal motifs to capture activity among specific nodes, in what we call activity snippets, which are small sequences of edge-updates that reoccur. We propose axioms and properties that a measure of persistence should satisfy, and develop such a persistence measure. We also propose PENminer, an efficient framework for mining activity snippets' Persistence in Evolving Networks, and design both offline and streaming algorithms. We apply PENminer to numerous real, large-scale evolving networks and edge streams, and find activity that is surprisingly regular over a long
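A toy persistence measure in the spirit described above can make the idea concrete. The function name and the exact formula below are illustrative assumptions, not PENminer's actual definition: activity that occurs both often and regularly across the stream's lifetime should outscore an equally frequent short burst.

```python
# Illustrative persistence score for a recurring activity snippet
# (assumed formula, not PENminer's): frequency weighted by how many
# time windows of the stream the snippet actually appeared in.
def persistence(timestamps, t_start, t_end, bins=10):
    width = (t_end - t_start) / bins
    occupied = {int((t - t_start) // width)
                for t in timestamps if t_start <= t < t_end}
    frequency = len(timestamps)          # how often the snippet occurred
    regularity = len(occupied) / bins    # fraction of windows it appeared in
    return frequency * regularity

regular = [float(t) for t in range(10)]  # one occurrence per window
burst = [0.1 * t for t in range(10)]     # ten occurrences in one window
print(persistence(regular, 0, 10), persistence(burst, 0, 10))
```

Both series contain ten occurrences, but the regular one spreads across every window and therefore scores higher, which is the distinction a persistence measure is meant to capture.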
Domain generalization aims to learn a predictive model from multiple different but related source tasks that can generalize well to a target task without the need to access any target data. Existing domain generalization methods ignore the relationship between tasks, implicitly assuming that all the tasks are sampled from a stationary environment. Therefore, they can fail when deployed in an evolving environment. To this end, we formulate and study the \emph{evolving domain generalization} (EDG) scenario, which exploits not only the source data but also their evolving pattern to generate a model for the unseen task. Our theoretical result reveals the benefits of modeling the relation between two consecutive tasks by learning a globally consistent directional mapping function. In practice, our analysis also suggests solving the EDG problem in a meta-learning manner, which leads to \emph{directional prototypical network}, the first method for the EDG problem. Empirical evaluation of both synthetic and real-world data sets validates the effectiveness of our approach.
We develop a unified theory for continuous in time finite element discretisations of partial differential equations posed in evolving domains, including the consideration of equations posed on evolving surfaces and bulk domains as well as coupled surface-bulk systems. We use an abstract variational setting with time-dependent function spaces and abstract time-dependent finite element spaces. Optimal a priori bounds are shown under usual assumptions on perturbations of bilinear forms and approximation properties of the abstract finite element spaces. The abstract theory is applied to evolving finite elements in both flat and curved spaces. Evolving bulk and surface isoparametric finite element spaces defined on evolving triangulations are defined and developed. These spaces are used to define approximations to parabolic equations in general domains for which the abstract theory is shown to apply. Numerical experiments are described which confirm the rates of convergence.
Generalizing previous works on the evolving fuzzy two-sphere, I discuss the evolving fuzzy CP^n by studying scalar field theory on it. The space-time geometry is obtained in the continuum limit, and is shown to saturate locally the cosmic holographic principle. I also discuss the evolving lattice n-simplex obtained by `compactifying' fuzzy CP^n. It is argued that an evolving lattice n-simplex does not approach a continuum space-time but decompactifies into an evolving fuzzy CP^n.
We present an abstract framework for treating the theory of well-posedness of solutions to abstract parabolic partial differential equations on evolving Hilbert spaces. This theory is applicable to variational formulations of PDEs on evolving spatial domains including moving hypersurfaces. We formulate an appropriate time derivative on evolving spaces called the material derivative and define a weak material derivative in analogy with the usual time derivative in fixed domain problems; our setting is abstract and not restricted to evolving domains or surfaces. Then we show well-posedness to a certain class of parabolic PDEs under some assumptions on the parabolic operator and the data.
In many realistic networks, the edges representing the interactions between the nodes are time-varying. There is growing evidence that the complex network that models the dynamics of the human brain has time-varying interconnections, i.e., the network is evolving. Based on this evidence, we construct a patient and data specific evolving network model (comprising discrete-time dynamical systems) in which epileptic seizures or their terminations in the brain are also determined by the nature of the time-varying interconnections between the nodes. A novel and unique feature of our methodology is that the evolving network model remembers the data from which it was conceived, in the sense that it evolves to almost regenerate the patient data even upon presenting an arbitrary initial condition to it. We illustrate a potential utility of our methodology by constructing an evolving network from clinical data that aids in identifying an approximate seizure focus -- nodes in such a theoretically determined seizure focus are outgoing hubs that apparently act as spreaders of seizures. We also point out the efficacy of removal of such spreaders in limiting seizures.
This chapter will go through the important nuclear reactions in stellar evolution and explosions, passing through the individual stellar burning stages and also explosive burning conditions. Following the changes in the composition of nuclear abundances requires knowledge of the relevant nuclear reaction rates. For light nuclei (entering in early stellar burning stages) the resonance density is generally quite low and the reactions are determined by individual resonances, which are best obtained from experiments. For intermediate mass and heavy nuclei the level density is typically sufficient to apply statistical model approaches. For this reason, while we discuss all burning stages and explosive burning, focusing on the reactions of importance, we will for light nuclei refer to the chapters by M. Wiescher, deBoer & Reifarth (Experimental Nuclear Astrophysics) and P. Descouvement (Theoretical Studies of Low-Energy Nuclear Reactions), which display many examples, experimental methods utilized, and theoretical approaches to predicting nuclear reaction rates for light nuclei. For nuclei with sufficiently high level densities we discuss statistical model methods used in presen