There is growing recognition among financial institutions, financial regulators, and policy makers of the importance of addressing nature-related risks and opportunities. Assessing nature-related risks for financial institutions is challenging due to the large volume of heterogeneous data available on nature, the complexity of investment value chains, and the relationship of their various components to nature. The dual problem of scaling data analytics and analysing complex systems can be addressed using Artificial Intelligence (AI). We address issues such as plugging existing data gaps with discovered data, data estimation under uncertainty, time series analysis, and (near) real-time updates. This report presents potential AI solutions for models of two distinct use cases, the Brazil Beef Supply Use Case and the Water Utility Use Case. Together, these two use cases cover a broad perspective within sustainable finance. The Brazilian cattle farming use case is an example of greening finance - integrating nature-related considerations into mainstream financial decision-making to transition investments away from sectors with poor historical track records and unsustainable operations. The d
Standard Virtual Element Methods (VEM) are based on polynomial projections and require a stabilization term to evaluate the contribution of the non-polynomial component of the discrete space. However, the stabilization term is not uniquely defined by the underlying variational formulation and is typically introduced in an ad hoc manner, potentially affecting the numerical response. Stabilization-free and self-stabilized formulations have been proposed to overcome this issue, although their theoretical analysis is still less mature. This paper provides an in-depth numerical investigation into different stabilized and self-stabilized formulations for the p-version of VEM. The results show that self-stabilized and stabilization-free formulations achieve optimal accuracy while suffering from worse conditioning. Moreover, a new projection operator, which explicitly accounts for variable coefficients, is introduced within the framework of standard virtual element spaces. Numerical results show that this new approach is more robust than the existing ones for large values of p.
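For readers unfamiliar with where the stabilization enters, the standard VEM elemental bilinear form splits into a computable polynomial-projection part and a stabilizing part acting on the non-polynomial remainder (generic notation, not tied to any one of the formulations compared in this paper):

$$a_h^E(u_h, v_h) = a^E\big(\Pi^{\nabla}_p u_h,\, \Pi^{\nabla}_p v_h\big) + S^E\big((I - \Pi^{\nabla}_p) u_h,\, (I - \Pi^{\nabla}_p) v_h\big),$$

where $\Pi^{\nabla}_p$ is the energy projection onto polynomials of degree $p$ and $S^E$ is any symmetric positive definite bilinear form that scales like $a^E$ on the kernel of the projection. Stabilization-free and self-stabilized formulations aim to remove or absorb the ad hoc term $S^E$, typically by enriching the projection space.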
Geometrical methods in quantum information are very promising, both for providing technical tools and for building intuition about difficult control or optimization problems. Moreover, they are of fundamental importance in connecting purely geometrical theories, such as general relativity, to quantum mechanics, as in the AdS/CFT correspondence. In this paper, we first survey the most important settings in which geometrical methods have proven useful to quantum information theory. Then, we lay down a general framework for an action principle for quantum resources such as entanglement, coherence, and anti-flatness. We discuss the case of a two-qubit system.
Based on a nonsmooth coherence condition, we construct and prove the convergence of a forward-backward splitting method that alternates between steps on a fine and a coarse grid. Our focus is on total variation regularised inverse imaging problems, specifically their dual problems, for which we develop in detail the relevant coarse-grid problems. We demonstrate the performance of our method on total variation denoising and magnetic resonance imaging.
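As background for the construction, a minimal single-grid forward-backward (proximal gradient) iteration is sketched below in Python; the paper's contribution is to alternate such steps between a fine and a coarse grid, which this toy sketch does not attempt (the denoising example and step size are illustrative assumptions):

    import numpy as np

    def forward_backward(grad_f, prox_g, x0, step, iters=500):
        # Basic forward-backward splitting for min_x f(x) + g(x), f smooth:
        # x+ = prox_{step*g}(x - step*grad_f(x))
        x = x0.copy()
        for _ in range(iters):
            x = prox_g(x - step * grad_f(x), step)
        return x

    # Toy problem: min_x 0.5*||x - b||^2 + lam*||x||_1
    # (soft-thresholding is the prox of the l1 norm)
    rng = np.random.default_rng(0)
    b = rng.normal(size=100)
    lam = 0.5
    grad_f = lambda x: x - b
    prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)
    x = forward_backward(grad_f, prox_g, np.zeros_like(b), step=1.0)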
Recently developed methods for video analysis, especially models for pose estimation and behavior classification, are transforming behavioral quantification to be more precise, scalable, and reproducible in fields such as neuroscience and ethology. These tools overcome long-standing limitations of manual scoring of video frames and traditional "center of mass" tracking algorithms to enable video analysis at scale. The expansion of open-source tools for video acquisition and analysis has led to new experimental approaches to understand behavior. Here, we review currently available open-source tools for video analysis and discuss how to set up these methods for labs new to video recording. We also discuss best practices for developing and using video analysis methods, including community-wide standards and critical needs for the open sharing of datasets and code, more widespread comparisons of video analysis methods, and better documentation for these methods, especially for new users. We encourage broader adoption and continued development of these tools, which have tremendous potential for accelerating scientific progress in understanding the brain and behavior.
From flocking birds to schooling fish, organisms interact to form collective dynamics across the natural world. Self-organization is present at smaller scales as well: cells interact and move during development to produce patterns in fish skin, and wound healing relies on cell migration. Across these examples, scientists are interested in shedding light on the individual behaviors informing spatial group dynamics and in predicting the patterns that will emerge under altered agent interactions. One challenge to these goals is that images of self-organization -- whether empirical or generated by models -- are qualitative. To get around this, there are many methods for transforming qualitative pattern data into quantitative information. In this tutorial chapter, I survey some methods for quantifying self-organization, including order parameters, pair correlation functions, and techniques from topological data analysis. I also discuss some places that I see as especially promising for quantitative data, modeling, and data-driven approaches to continue meeting in the future.
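As a concrete instance of the order parameters mentioned above, a sketch of the standard polar order parameter for agent headings follows (an illustrative example, not code from the chapter):

    import numpy as np

    def polar_order(angles):
        # Polar order parameter: ~1 for aligned headings, ~0 for disordered ones.
        return np.abs(np.mean(np.exp(1j * np.asarray(angles))))

    rng = np.random.default_rng(1)
    print(polar_order(rng.normal(0.0, 0.1, size=500)))       # near 1: aligned group
    print(polar_order(rng.uniform(0, 2 * np.pi, size=500)))  # near 0: disordered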
Despite increasingly realistic image quality, recent 3D image generative models often operate on 3D volumes of fixed extent with limited camera motions. We investigate the task of unconditionally synthesizing unbounded nature scenes, enabling arbitrarily large camera motion while maintaining a persistent 3D world model. Our scene representation consists of an extendable, planar scene layout grid, which can be rendered from arbitrary camera poses via a 3D decoder and volume rendering, and a panoramic skydome. Based on this representation, we learn a generative world model solely from single-view internet photos. Our method enables simulating long flights through 3D landscapes, while maintaining global scene consistency--for instance, returning to the starting point yields the same view of the scene. Our approach enables scene extrapolation beyond the fixed bounds of current 3D generative models, while also supporting a persistent, camera-independent world representation that stands in contrast to auto-regressive 3D prediction models. Our project page: https://chail.github.io/persistent-nature/.
The finite difference time domain method is one of the simplest and most popular methods in computational electromagnetics. This work considers two possible ways of generalising it to a meshless setting by employing local radial basis function interpolation. The resulting methods remain fully explicit and are convergent if properly chosen hyperviscosity terms are added to the update equations. We demonstrate that increasing the stencil size of the approximation has a desirable effect on numerical dispersion. Furthermore, our proposed methods can exhibit a decreased dispersion anisotropy compared to the finite difference time domain method.
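For orientation, the classical update that the meshless radial-basis-function variants generalize is the standard 1D Yee/FDTD scheme, sketched here in normalized units (grid sizes, Courant number, and source are illustrative choices):

    import numpy as np

    nx, nt = 200, 400
    courant = 0.5                 # S = c*dt/dx; S <= 1 for stability in 1D
    Ez = np.zeros(nx)             # electric field on integer grid points
    Hy = np.zeros(nx - 1)         # magnetic field on staggered half points
    for n in range(nt):
        Hy += courant * np.diff(Ez)                     # H update from curl of E
        Ez[1:-1] += courant * np.diff(Hy)               # E update from curl of H
        Ez[nx // 2] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source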
Models that account for imperfect detection are important for reliable estimation of occupancy and abundance. Single-visit methods have been proposed as an alternative to multiple-visit methods to relax the assumption of a closed population. Knape and Korner-Nievergelt (2015) showed that, under certain models for the probability of detection, single-visit methods are statistically non-identifiable, leading to biased population estimates. There is a close relationship between estimation of the resource selection probability function (RSPF) using weighted distributions and single-visit methods for occupancy and abundance estimation. We explain the precise mathematical conditions needed for RSPF estimation as stated in Lele and Keim (2006). The same conditions, which remained unstated in our papers on single-visit methodology, are needed for single-visit methods to work. We show that the class of admissible models is quite broad and does not excessively restrict the application of the RSPF or the single-visit methodology. To complement the work by Knape and Korner-Nievergelt, we study the performance of multiple-visit methods under the scaled logistic detection function and a much wider set of situations. In general, under the scaled log
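In the weighted-distribution formulation underlying this connection, the covariate distribution at used (or detected) sites is (standard notation following Lele and Keim, 2006):

$$f^u(x) = \frac{\pi(x;\beta)\, f(x)}{\int \pi(z;\beta)\, f(z)\, dz},$$

where $f$ is the covariate distribution of available sites and $\pi(\cdot;\beta)$ is the RSPF. Because the weight is normalized, $\beta$ is identifiable only if $\pi$ cannot absorb a multiplicative constant into a reparametrization; for example, the log link $\pi(x;\beta) = e^{\beta^\top x}$ fails this condition (it is estimable only up to proportionality), while bounded links such as the full logistic with intercept can satisfy it.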
In the past three decades, many stars orbiting the supermassive black hole (SMBH) at the Galactic Centre (Sgr A*) have been identified. Their orbits place stringent constraints on the mass of the SMBH. In particular, the star S2 has completed at least one period since the first detection of its position, which provides rich information for examining the properties of the SMBH and the astrophysical environment surrounding it. Here, we report an interesting phenomenon: if a significant amount of dark matter or stellar mass is distributed around the SMBH, the precession of the S2 stellar orbit could be slowed down by as much as 27\% compared with the case without dark matter surrounding the SMBH, assuming the optimal dark matter scenario. We anticipate that future high-quality observational data on the S2 stellar orbit, or other stellar orbits, can help reveal the actual mass distribution near the SMBH and the nature of dark matter.
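For context, the relativistic precession being partially cancelled is the standard Schwarzschild (prograde) advance per orbital period,

$$\Delta\phi_{\rm GR} = \frac{6\pi G M}{c^2\, a\,(1 - e^2)},$$

for semi-major axis $a$ and eccentricity $e$; an extended mass distribution enclosed within the orbit adds a retrograde Newtonian precession, which is the origin of the slow-down discussed here.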
Two blind source separation methods (Independent Component Analysis and Non-negative Matrix Factorization), developed initially for signal processing in engineering, have recently found a number of applications in the analysis of large-scale data in molecular biology. In this short review, we present the common idea behind these methods, describe ways of implementing and applying them, and point out their advantages compared to more traditional statistical approaches. We focus more specifically on the analysis of gene expression in cancer. The review concludes by listing available software implementations of the methods described.
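For readers who want a concrete starting point, both decompositions are available off the shelf; a minimal sketch with scikit-learn on a toy expression-like matrix follows (matrix shapes and the number of components are illustrative assumptions):

    import numpy as np
    from sklearn.decomposition import FastICA, NMF

    # Toy expression-like matrix: rows = genes, columns = samples.
    rng = np.random.default_rng(0)
    X = np.abs(rng.normal(size=(1000, 40)))

    # ICA: decompose expression into statistically independent components.
    ica = FastICA(n_components=5, random_state=0)
    S = ica.fit_transform(X)   # gene-wise component weights (1000 x 5)
    A = ica.mixing_            # sample-wise mixing matrix   (40 x 5)

    # NMF: parts-based factorization X ~ W @ H with non-negative factors.
    nmf = NMF(n_components=5, init="nndsvda", random_state=0, max_iter=500)
    W = nmf.fit_transform(X)   # gene "metagenes"  (1000 x 5)
    H = nmf.components_        # sample weights    (5 x 40)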
The problem of the Nature of Time is twofold: whether time is a fundamental quantity of Nature, and how the clock time of metrology emerges in the experimental description of dynamics. This work strongly supports the fundamental timelessness of Nature. However, the correct view that physics is described by relations between variables does not address the second problem, of how time emerges at the macroscopic scale from a timeless framework. In this work, ordinary Hamiltonian dynamics is recast in a timeless formalism capable of providing a definition of parameter time on the basis of the generalized coordinates alone, together with the invariance of the Hamiltonian on trajectories and a variational principle. Next, by relaxing the assumption that real clocks are periodic to mere cyclicity in phase space, the second problem is addressed. Physical systems, if complex enough, can be separated into a subsystem whose dynamics is to be described and another, cyclic subsystem that behaves as a clock. The dynamics of the first is mapped onto the states of the second, cyclic subsystem, which provides a discrete approximation of the parameter time. Useful clocks fulfill a stability
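One well-known variational principle of the timeless kind invoked here is Jacobi's principle, stated below for definiteness (the abstract does not specify the paper's exact formalism, so this is offered only as a standard example): trajectories in configuration space with metric $g_{ij}$ and potential $V$ extremize

$$\delta \int \sqrt{2\big(E - V(q)\big)\; g_{ij}\, dq^i\, dq^j} = 0,$$

with no time parameter appearing; a parameter time can then be read off along a solution from $dt = \sqrt{g_{ij}\, dq^i\, dq^j \,/\, 2\big(E - V(q)\big)}$.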
In this paper, we consider gradient methods for minimizing smooth convex functions, which employ the information obtained at previous iterations in order to accelerate convergence towards the optimal solution. This information is used in the form of a piecewise-linear model of the objective function, which provides much better prediction ability than the standard linear model. To the best of our knowledge, this approach has never really been applied in convex minimization to differentiable functions, in view of the high complexity of the corresponding auxiliary problems. However, we show that all necessary computations can be done very efficiently. Consequently, we obtain new optimization methods that outperform the usual gradient methods both in the number of oracle calls and in computational time. Our theoretical conclusions are confirmed by preliminary computational experiments.
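The piecewise-linear model in question aggregates the linearizations collected at previous iterates; in generic form (the paper's exact update rule may differ),

$$\ell_k(x) = \max_{0 \le i \le k} \Big\{ f(x_i) + \langle \nabla f(x_i),\, x - x_i \rangle \Big\}, \qquad x_{k+1} = \arg\min_x \Big\{ \ell_k(x) + \tfrac{L}{2}\|x - x_k\|^2 \Big\},$$

so each step requires solving a small quadratic program over the bundle of past gradients rather than taking a single gradient step; the claim above is that this auxiliary problem can be solved very cheaply.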
The possible topological nature of Kondo and mixed valence insulators has been a recent topic of interest in condensed matter physics. Attention has focused on SmB6, which has long been known to exhibit a low-temperature transport anomaly whose origin is of independent interest. We argue that it is possible to resolve the topological nature of the surface states by uniquely accessing the surface electronic structure in the low-temperature anomalous transport regime, combining state-of-the-art laser- and synchrotron-based angle-resolved photoemission spectroscopy (ARPES) with or without spin resolution. A combination of low temperature and ultra-high resolution (laser), which is lacking in previous ARPES studies of this compound, is the key to resolving the possible existence of a topological surface state in SmB6. Here we outline an experimental algorithm to systematically explore the topological versus trivial or mixed (topological and trivial surface state admixture, as in the first 3D TI Bi$_{1-x}$Sb$_x$) nature of the surface states in Kondo and mixed valence insulators. We conclude based on this methodology that the observed topology of the surface Fermi surface in our low temperature
Sharing diverse genomic and other biomedical datasets is critical to advance scientific discoveries and their equitable translation to improve human health. However, data sharing remains challenging in the context of legacy datasets, evolving policies, multi-institutional consortium science, and international stakeholders. The NIH-funded Polygenic Risk Methods in Diverse Populations (PRIMED) Consortium was established to improve the performance of polygenic risk estimates for a broad range of health and disease outcomes with global impacts. Improving polygenic risk score performance across genetically diverse populations requires access to large, diverse cohorts. We report on the design and implementation of data sharing policies and procedures developed in PRIMED to aggregate and analyze data from multiple, heterogeneous sources while adhering to existing data sharing policies for each integrated dataset. We describe two primary data sharing mechanisms, coordinated dbGaP applications and a Consortium Data Sharing Agreement, and provide alternatives for when individual-level data cannot be shared within the Consortium (e.g., federated analyses). We also describe technical implem
Ultra-weak photon emission (UPE) from living systems is widely hypothesized to reflect underlying self-organization and long-range coordination in biological dynamics. However, distinguishing biologically driven correlations from trivial stochastic or instrumental effects requires a robust, multi-method framework. In this work, we establish and benchmark a comprehensive analysis pipeline for photon-count time series, combining Distribution Entropy Analysis, Rényi entropy, Detrended Fluctuation Analysis, its generalization Multifractal Detrended Fluctuation Analysis, and tail-statistics characterization. Surrogate signals constructed from Poisson processes, Fractional Gaussian Noise, and Renewal Processes with power-law waiting times are used to validate sensitivity to memory, intermittency, and multifractality. Across all methods, a coherent hierarchy of dynamical regimes is recovered, demonstrating internal methodological consistency. Application to experimental dark-count data and attenuated coherent-laser emission confirms Poisson-like behavior, establishing an essential statistical baseline for UPE studies. The combined results show that this multi-resolution approach reliab
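Of the methods benchmarked above, Detrended Fluctuation Analysis is the easiest to sketch compactly; a minimal first-order DFA in Python follows (window sizes and the Poisson surrogate are illustrative, not the paper's pipeline):

    import numpy as np

    def dfa(x, scales):
        # First-order DFA: fluctuation function F(s) per window size s.
        # The slope of log F(s) vs log s estimates the exponent alpha
        # (alpha ~ 0.5 for uncorrelated noise, > 0.5 for persistence).
        y = np.cumsum(x - np.mean(x))          # integrated profile
        F = []
        for s in scales:
            n = len(y) // s
            segs = y[:n * s].reshape(n, s)
            t = np.arange(s)
            rms = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
                   for seg in segs]            # linear detrend per window
            F.append(np.sqrt(np.mean(rms)))
        return np.array(F)

    # Poisson surrogate baseline: alpha should come out near 0.5.
    counts = np.random.default_rng(0).poisson(5.0, size=4096).astype(float)
    scales = np.array([16, 32, 64, 128, 256])
    alpha = np.polyfit(np.log(scales), np.log(dfa(counts, scales)), 1)[0]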
The Great Divide in metaphysical debates about laws of nature is between Humeans, who think that laws merely describe the distribution of matter, and non-Humeans, who think that laws govern it. The metaphysics can place demands on the proper formulations of physical theories. It is sometimes assumed that the governing view requires a fundamental or intrinsic direction of time: to govern, laws must be dynamical, producing later states of the world from earlier ones, in accord with the fundamental direction of time in the universe. In this paper, we propose a minimal primitivism about laws of nature (MinP) according to which there is no such requirement. On our view, laws govern by constraining the physical possibilities. Our view captures the essence of the governing view without taking on extraneous commitments about the direction of time or dynamic production. Moreover, as a version of primitivism, our view requires no reduction or analysis of laws in terms of universals, powers, or dispositions. Our view accommodates several potential candidates for fundamental laws, including the principle of least action, the Past Hypothesis, the Einstein equation of general relativity, and even
We present a kernel compensation method for the Maxwell eigenproblem for photonic crystals, designed to avoid the infinite-dimensional kernels that cause many difficulties in the calculation of energy gaps. The quasi-periodic problem is first transformed into a periodic one on the cube by the Floquet-Bloch theory. Then, a compensation operator is introduced into Maxwell's equations via a shifted curl operator. The discrete problem depends on a compatible discretization of the de Rham complex, which is implemented here by the mimetic finite difference method. We prove that the compensation term exactly fills the kernel of the original problem and avoids spurious eigenvalues. We also propose an efficient preconditioner, together with FFT and multigrid solvers that allow parallel computing. Numerical experiments for different three-dimensional lattices are performed to validate the accuracy and effectiveness of the method.
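The kernel being compensated is the standard one of the curl-curl operator: the underlying problem is the Maxwell eigenproblem

$$\nabla \times \big( \mu^{-1} \nabla \times \mathbf{E} \big) = \omega^2 \varepsilon\, \mathbf{E}, \qquad \nabla \cdot (\varepsilon\, \mathbf{E}) = 0,$$

and every gradient field $\mathbf{E} = \nabla \phi$ is annihilated by the curl, producing the infinite-dimensional zero eigenspace that pollutes discrete spectra; the compensation term is designed to fill exactly this kernel.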
Upon mechanical loading, granular materials yield and undergo plastic deformation. The nature of this plastic deformation is essential for developing macroscopic constitutive models and for understanding shear band formation. However, we still do not fully understand the microscopic nature of plastic deformation in disordered granular materials. Here we used synchrotron X-ray tomography to track the structural evolution of three-dimensional granular materials under shear. We establish that highly distorted coplanar tetrahedra are the structural defects responsible for microscopic plasticity in disordered granular packings. The elementary plastic events occur through flip events, which correspond to a neighbor-switching process among these coplanar tetrahedra (or, equivalently, to the rotational motion of 4-ring disclinations). These events are discrete in space and possess specific orientations relative to the principal stress direction.
This review outlines concepts of mathematical statistics, elements of probability theory, hypothesis tests, and point estimation for use in the analysis of modern astronomical data. Least squares, maximum likelihood, and Bayesian approaches to statistical inference are treated. Resampling methods, particularly the bootstrap, provide valuable procedures when the distribution functions of statistics are not known. Several approaches to model selection and goodness of fit are considered. Applied statistics relevant to astronomical research is briefly discussed: nonparametric methods for use when little is known about the behavior of the astronomical populations or processes; data smoothing with kernel density estimation and nonparametric regression; unsupervised clustering and supervised classification procedures for multivariate problems; survival analysis for astronomical datasets with nondetections; time- and frequency-domain time series analysis for light curves; and spatial statistics to interpret the spatial distributions of points in low dimensions. Two types of resources are presented: about 40 recommended texts and monographs in various fields of statistics, and the public do
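Since the bootstrap is singled out above, a minimal nonparametric bootstrap confidence interval in Python may be useful as a template (the statistic and the lognormal toy sample are illustrative assumptions):

    import numpy as np

    def bootstrap_ci(data, stat, n_boot=5000, alpha=0.05, seed=0):
        # Resample with replacement, then take percentile confidence limits.
        rng = np.random.default_rng(seed)
        reps = np.array([stat(rng.choice(data, size=len(data), replace=True))
                         for _ in range(n_boot)])
        return np.quantile(reps, [alpha / 2, 1 - alpha / 2])

    # e.g., 95% CI for the median of a skewed flux-like sample
    fluxes = np.random.default_rng(1).lognormal(mean=0.0, sigma=1.0, size=200)
    lo, hi = bootstrap_ci(fluxes, np.median)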