We report our experience in applying runtime monitoring to a FluxCD-based continuous deployment (CD) process. Our target system consists of GitHub Actions, GitHub Container Registry (GHCR), FluxCD, and an application running on Kubernetes. We monitored its logs using SyMon. In our setting, we regard a deployment update as detected when FluxCD's polling log reports that the latest image tag has been resolved. Through the case study, we found that FluxCD did not always detect a new image within five minutes after it was pushed to GHCR, whereas it always did so within ten minutes in the collected logs. Moreover, our results show that SyMon is fast enough for near-real-time monitoring in our setting.
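To make the measured property concrete, here is a minimal sketch (with hypothetical tags and timestamps of our own; an illustration of the latency check only, not SyMon or its specification) of verifying the observed ten-minute detection bound from paired push/resolution times:

```python
from datetime import datetime, timedelta

# Hypothetical records: (image tag, time pushed to GHCR, time FluxCD's
# polling log first resolves that tag). Real values would be extracted
# from GHCR events and the FluxCD image-automation controller logs.
events = [
    ("v1.4.0", "2024-05-01T10:00:05", "2024-05-01T10:06:12"),
    ("v1.4.1", "2024-05-01T11:30:00", "2024-05-01T11:38:47"),
]

BOUND = timedelta(minutes=10)  # the bound observed in the case study

for tag, pushed, resolved in events:
    latency = datetime.fromisoformat(resolved) - datetime.fromisoformat(pushed)
    print(f"{tag}: detected after {latency} (within bound: {latency <= BOUND})")
```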
Pollux is a candidate European instrument contribution to the Habitable Worlds Observatory (HWO), designed to advance our understanding of the formation and evolution of cosmic structures in the universe, and specifically to search for signs of life on extrasolar planets. This high-resolution spectrograph (R\,$>$\,40,000) with polarimetric capabilities offers nearly continuous and simultaneous coverage from the FUV ($\sim$100\,nm) to the NIR ($\sim$1.9\,$\micron$), making it a versatile tool for a wide range of scientific investigations from solar system studies to cosmology. Several Solar System ocean worlds have been a focal point for the scientific community seeking to understand the conditions of their internal saline oceans, as well as the possible emergence of life beyond Earth. The ocean world science case will leverage Pollux's UV spectropolarimetric capabilities to investigate surface reflectance and composition, characterize airglow emissions in the environments of giant-planet moons, and constrain the microphysical properties of atmospheric aerosols.
We report results of the CASE 2022 Shared Task 1 on Multilingual Protest Event Detection. This task is a continuation of CASE 2021, which consists of four subtasks: i) document classification, ii) sentence classification, iii) event sentence coreference identification, and iv) event extraction. The CASE 2022 extension expands the test data with more data in previously available languages, namely English, Hindi, Portuguese, and Spanish, and adds new test data in Mandarin, Turkish, and Urdu for Subtask 1, document classification. The training data from CASE 2021 in English, Portuguese, and Spanish were utilized. Therefore, predicting document labels in Hindi, Mandarin, Turkish, and Urdu occurs in a zero-shot setting. The CASE 2022 workshop also accepts reports on systems developed for predicting the test data of CASE 2021. We observe that the best systems submitted by CASE 2022 participants achieve between 79.71 and 84.06 F1-macro for the new languages in a zero-shot setting. The winning approaches mainly ensemble models and merge data in multiple languages. The best two submissions on CASE 2021 data outperform submissions from last year for Subtask 1 and Subtask 2.
The recently introduced Genetic Column Generation (GenCol) algorithm has been numerically observed to efficiently and accurately compute high-dimensional optimal transport plans for general multi-marginal problems, but theoretical results on the algorithm have hitherto been lacking. The algorithm solves the OT linear program on a dynamically updated low-dimensional submanifold consisting of sparse plans. The submanifold dimension exceeds the sparse support of optimal plans only by a fixed factor $β$. Here we prove that for $β\geq 2$ and in the two-marginal case, GenCol always converges to an exact solution, for arbitrary costs and marginals. The proof relies on the concept of c-cyclical monotonicity. As an offshoot, GenCol rigorously reduces the data complexity of numerically solving two-marginal OT problems from $O(\ell^2)$ to $O(\ell)$ without any loss in accuracy, where $\ell$ is the number of discretization points for a single marginal. At the end of the paper we also present some insights into the convergence behavior in the multi-marginal case.
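For reference, the notion the convergence proof builds on is the standard one from optimal transport (stated here in generic textbook notation, not necessarily the paper's):

```latex
% A set $\Gamma \subseteq X \times Y$ is c-cyclically monotone if for every
% finite collection $(x_1,y_1),\dots,(x_k,y_k) \in \Gamma$ and every
% permutation $\sigma$ of $\{1,\dots,k\}$,
\[
  \sum_{i=1}^{k} c(x_i, y_i) \;\le\; \sum_{i=1}^{k} c(x_i, y_{\sigma(i)}).
\]
% Under mild assumptions on the cost, supports of optimal two-marginal plans
% are c-cyclically monotone; this is the structure the argument exploits.
```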
Knowledge-intensive processes (KiPs) are becoming increasingly important for organizations with the rise of the knowledge society. Due to their unpredictable and emergent characteristics, workflow management solutions are not suitable for supporting KiPs. Various case-management-related approaches have been proposed by researchers and practitioners to support the characteristics of KiPs. In this paper we provide a comprehensive list of available definitions of case management and its variants, e.g. case handling, adaptive case management, dynamic case management, and production case management. For every definition we present the explicit definition, paragraphs that best describe and summarize the case management approach, or extracted passages that define the term in the referenced publication. All of these definitions are compared against the characteristics of KiPs in order to gain a better understanding of the domain.
There have been theory-based endeavours that directly engage with AI and ML in the landscape discipline. By presenting a case that uses machine learning techniques to predict variables in a coastal environment, this paper provides empirical evidence of the forthcoming cybernetic environment, in which designers are conceptualized not as authors but as choreographers, catalyst agents, and conductors among many other intelligent agents. Drawing ideas from posthumanism, this paper argues that, to truly understand the cybernetic environment, we have to take on posthumanist ethics and overcome human exceptionalism.
Recently, a groundswell of research has identified the use of counterfactual explanations as a potentially significant solution to the Explainable AI (XAI) problem. It is argued that (a) technically, these counterfactual cases can be generated by permuting problem-features until a class change is found, (b) psychologically, they are much more causally informative than factual explanations, and (c) legally, they are GDPR-compliant. However, there are issues with finding good counterfactuals using current techniques (e.g., sparsity and plausibility). We show that many commonly used datasets appear to have few good counterfactuals for explanation purposes. So, we propose a new case-based approach for generating counterfactuals, using novel ideas about the counterfactual potential and explanatory coverage of a case-base. The new technique reuses patterns of good counterfactuals, present in a case-base, to generate analogous counterfactuals that can explain new problems and their solutions. Several experiments show how this technique can improve the counterfactual potential and explanatory coverage of case-bases that were previously found wanting.
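As a rough illustration of reusing difference patterns, here is a toy sketch (our own simplified code with made-up data and helper names; the paper's actual technique is richer): it collects pairs of cases from the case-base that lie in different classes yet differ in few features, then adapts the nearest such pair to explain a new query:

```python
import numpy as np

# Toy case base: one row per case; y holds the class labels.
X = np.array([[1.0, 5.0, 3.0],
              [1.0, 5.0, 9.0],
              [4.0, 2.0, 3.0]])
y = np.array([0, 1, 0])

def good_pairs(X, y, max_diff=2):
    """Pairs of cases with different classes that differ in few features
    (a sparsity criterion for a 'good' counterfactual pair)."""
    pairs = []
    for i in range(len(X)):
        for j in range(len(X)):
            if y[i] != y[j]:
                diff = np.flatnonzero(X[i] != X[j])
                if len(diff) <= max_diff:
                    pairs.append((i, j, diff))
    return pairs

def explain(query, X, y, pairs):
    """Reuse the difference pattern of the pair nearest to the query:
    copy the counterfactual side's values on the differing features."""
    i, j, diff = min(pairs, key=lambda p: np.linalg.norm(query - X[p[0]]))
    cf = query.copy()
    cf[diff] = X[j, diff]  # analogous counterfactual for the query
    return cf, y[j]

cf, label = explain(np.array([1.5, 5.0, 3.5]), X, y, good_pairs(X, y))
print(cf, "-> class", label)
```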
We present a model-independent study of boundary states in the Cardy case that covers all conformal field theories for which the representation category of the chiral algebra is a - not necessarily semisimple - modular tensor category. This class, which we call finite CFTs, includes all rational theories, but goes much beyond these, and in particular comprises many logarithmic conformal field theories. We show that the following two postulates for a Cardy case are compatible beyond rational CFT and lead to a universal description of boundary states that realizes a standard mathematical setup: First, for bulk fields, the pairing of left and right movers is given by (a coend involving) charge conjugation; and second, the boundary conditions are given by the objects of the category of chiral data. For rational theories our proposal reproduces the familiar result for the boundary states of the Cardy case. Further, with the help of sewing we compute annulus amplitudes. Our results show in particular that these possess an interpretation as partition functions, a constraint that for generic finite CFTs is much more restrictive than for rational ones.
In this paper I will explain a rigidity conjecture that intertwines the deep diagonal pentagram maps and Poncelet polygons. I will also establish a simple case of the conjecture, the one involving the $3$-diagonal map on a convex $8$-gon with $4$-fold rotational symmetry. This case involves a textbook analysis of a pencil of elliptic curves.
In this paper, we study the Empirical Risk Minimization (ERM) problem in the non-interactive local model of differential privacy. In the case of constant or low dimensionality ($p\ll n$), we first show that if the ERM loss function is $(\infty, T)$-smooth, then the sample complexity needed to achieve error $α$ can avoid an exponential dependence on the dimensionality $p$ with base $1/α$ (i.e., a factor of $α^{-p}$), which answers a question in [smith 2017 interaction]. Our approach is based on polynomial approximation. We then propose player-efficient algorithms with $1$-bit communication complexity and $O(1)$ computation cost per player; the error bound is asymptotically the same as the original one. Under additional assumptions, we also give a server-efficient algorithm. Next, we consider the high-dimensional case ($n\ll p$) and show that if the loss function is a convex generalized linear function, then we can obtain an error bound that depends on the Gaussian width of the underlying constrained set instead of on $p$, which is lower than the bound in [smith 2017 interaction].
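For intuition about what a $1$-bit-per-player protocol can look like, here is a classic randomized-response sketch for mean estimation under local differential privacy (a standard illustrative mechanism, not the algorithm proposed in the paper):

```python
import numpy as np

eps = 1.0      # per-player privacy budget
n = 100_000
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, n)            # each player's private value in [0, 1]

p = np.exp(eps) / (np.exp(eps) + 1)  # keep-probability for eps-DP
bits = rng.random(n) < x             # unbiased 1-bit encoding: E[bit] = x_i
flip = rng.random(n) > p
sent = np.where(flip, ~bits, bits)   # each player transmits ONE bit

# Server debiasing: E[mean(sent)] = (2p - 1) * mean(x) + (1 - p)
est = (sent.mean() - (1 - p)) / (2 * p - 1)
print(f"true mean {x.mean():.4f}, private estimate {est:.4f}")
```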
We compactify the classical moduli variety $A_g$ of principally polarized abelian varieties of complex dimension $g$ by attaching, in an explicit manner, the moduli of flat tori of real dimension at most $g$. Equivalently, we explicitly determine the Gromov-Hausdorff limits of principally polarized abelian varieties. This work is analogous to the first paper of our series (available at arXiv:1406.7772v2), which compactified the moduli of curves by attaching the moduli of metrized graphs. We then also explicitly specify the Gromov-Hausdorff limits along holomorphic families of abelian varieties and show that they form special non-trivial subsets of the whole boundary. We also carry this out for the case of algebraic curves and observe a crucial difference from the case of abelian varieties.
With the advent of the WWW and the rapid growth of technology and software development, testing software has become a major concern. Due to the importance of the testing phase in the software development life cycle, testing has been divided into graphical user interface (GUI) based testing, logical testing, integration testing, etc. GUI testing has become very important as the GUI provides a more sophisticated way to interact with the software, and the complexity of testing GUIs has increased over time. Testing needs to be performed in a way that provides effectiveness, efficiency, an increased fault detection rate, and good path coverage. To cover all use cases and to test all possible (success/failure) scenarios, the length of the test sequence is considered important. The intent of this paper is to study techniques used for test case generation and the testing process for various GUI-based software applications.
This paper presents the case for Data Plane Timestamping (DPT). We argue that in the unique environment of Software-Defined Networks (SDN), attaching a timestamp to the header of every packet is a powerful feature that can be leveraged by diverse SDN applications. We analyze three key use cases that demonstrate the advantages of using DPT, and show that SDN applications can benefit even from using as little as one bit for the timestamp field.
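To see why even one bit can help, consider an alternate-marking-style use (our own toy illustration, not necessarily one of the paper's three use cases): the sender toggles the bit every fixed period, so the receiver can group packets into "colour" blocks and compare per-block counts against the sender's to localize loss to a time interval.

```python
import itertools

T = 5.0  # marking period in seconds: the 1-bit 'timestamp' flips every T

def colour(ts):
    """One-bit timestamp: parity of the T-second interval containing ts."""
    return int(ts // T) % 2

# Toy packet timestamps; the receiver groups consecutive packets by colour.
sends = [0.2, 1.1, 4.9, 5.3, 6.7, 10.4, 11.0]
blocks = [(bit, len(list(pkts)))
          for bit, pkts in itertools.groupby(sends, key=colour)]
print(blocks)  # [(0, 3), (1, 2), (0, 2)]: packet counts per colour block
```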
We consider continuous-state branching processes that are perturbed by a Brownian motion. These processes are constructed as the unique strong solution of a stochastic differential equation. Their long-term extinction and explosion behaviours are studied. In the stable case, the extinction and explosion probabilities are given explicitly. We find three regimes for the asymptotic behaviour of the explosion probability and, as in the case of branching processes in random environment, five regimes for the asymptotic behaviour of the extinction probability. In the supercritical regime, we study the process conditioned on eventual extinction, where three regimes for the asymptotic behaviour of the extinction probability appear. Finally, we also study the process conditioned on non-extinction and the process with immigration.
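As a point of reference, a representative special case of such an equation is the Feller diffusion in a Brownian environment (shown here in generic notation as a hedged illustration; the paper's SDE is more general and its notation may differ):

```latex
\[
  Z_t = Z_0 + a\int_0^t Z_s \,\mathrm{d}s
        + \int_0^t \sqrt{2\gamma^2 Z_s}\,\mathrm{d}B_s
        + \sigma \int_0^t Z_s \,\mathrm{d}W_s ,
\]
% where $B$ drives the branching fluctuations and the independent Brownian
% motion $W$ is the environmental perturbation.
```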
Reliable, high-intensity operation of the Fermilab Accelerator Complex is critical to the success of the Long-Baseline Neutrino Facility and Deep Underground Neutrino Experiment. We describe the requirements and infrastructure necessary to support routine use of artificial intelligence and machine learning (AI/ML) in the accelerator control system. Three capabilities are identified: a machine learning operations (MLOps) framework standardizing the lifecycle of AI/ML automation from data management through deployment and monitoring; a data quality framework defining and enforcing standards required to build trustworthy AI/ML applications; and workflow integration with large language models to assist physicists, engineers, and operators with information retrieval, code development, and routine analysis. Use cases spanning beam diagnostics, beam control, and support system automation illustrate the technical requirements across the complex.
Formally self-adjoint, conformally covariant, polydifferential operators provide a general framework for studying variational problems, such as prescribing the scalar, $Q$-, or $σ_2$-curvatures, within a conformal class. We describe recent progress on Yamabe problems for such operators, including uniqueness results on the sphere and nonuniqueness results in general. We also highlight a number of open questions related to these operators, some of which constitute a possible blueprint for the general solution of the Yamabe problem for polydifferential operators.
We construct a large family of conformally covariant tridifferential operators as tangential operators in the Fefferman--Graham ambient space. Our construction is analogous to the linear and bilinear constructions of Graham--Jenne--Mason--Sparling and Case--Lin--Yuan, respectively. We also show that the symmetrizations of our ambient operators are formally self-adjoint when acting on densities of the correct weight.
Reverse engineering (RE) of x86 binaries is indispensable for malware and firmware analysis, but remains slow due to stripped metadata and adversarial obfuscation. Large Language Models (LLMs) offer potential for improving RE efficiency through automated comprehension and commenting, but cloud-hosted, closed-weight models pose privacy and security risks and cannot be used in closed-network facilities. We evaluate parameter-efficient fine-tuned local LLMs for assisting with x86 RE tasks in these settings. Eight open-weight models across the CodeLlama, Qwen2.5-Coder, and CodeGemma series are fine-tuned on a custom curated dataset of 5,981 x86 assembly examples. We evaluate them quantitatively and identify the fine-tuned Qwen2.5-Coder-7B as the top performer, which we name REx86. REx86 reduces test-set cross-entropy loss by 64.2% and improves semantic cosine similarity against ground truth by 20.3% over its base model. In a limited user case study (n=43), REx86 significantly enhanced line-level code understanding (p = 0.031) and increased the correct-solve rate from 31% to 53% (p = 0.189), though the latter did not reach statistical significance. Qualitative analysis shows more accurate generated comments.
We show that there are infinitely many pairwise nonhomothetic, complete, periodic metrics with constant scalar curvature that are conformal to the round metric on $S^n\setminus S^k$, where $k < \frac{n-2}{2}$. These metrics are obtained by pulling back Yamabe metrics defined on products of $S^{n-k-1}$ and compact hyperbolic $(k+1)$-manifolds. Our main result proves that these solutions are generically distinct up to homothety. The core of our argument relies on classical rigidity theorems due to Obata and Ferrand, which characterize the round sphere by its conformal group.
This study presents a comparative analysis of Monte Carlo (MC) and quasi-Monte Carlo (QMC) methods in the context of derivative pricing, emphasizing convergence rates and the curse of dimensionality. After a concise overview of traditional Monte Carlo techniques for evaluating expectations of derivative securities, the paper introduces quasi-Monte Carlo methods, which leverage low-discrepancy sequences to achieve more uniformly distributed sample points without relying on randomness. Theoretical insights highlight that QMC methods can attain superior convergence rates of $O(1/n^{1-ε})$ for any $ε>0$, compared to the standard MC rate of $O(1/\sqrt{n})$. Numerical experiments are conducted on two types of options: geometric basket call options and Asian call options. For the geometric basket options, a five-dimensional setting under the Black-Scholes framework is utilized, comparing the performance of Sobol' and Faure low-discrepancy sequences against standard Monte Carlo simulations. Results demonstrate a significant reduction in root mean square error for QMC methods as the number of sample points increases. Similarly, for Asian call options, incorporating a Brownian bridge construction further improves the accuracy of the QMC estimates.
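A minimal sketch of such a comparison (using our own toy parameters: five independent assets with arbitrary strike and volatility, not the paper's exact setup) with SciPy's scrambled Sobol' generator:

```python
import numpy as np
from scipy.stats import norm, qmc

# Toy Black-Scholes parameters for a geometric-average basket call.
d, S0, K, r, sigma, T = 5, 100.0, 100.0, 0.05, 0.2, 1.0
n = 2**12  # powers of two keep Sobol' sequences balanced

def price(z):
    """Discounted mean payoff given standard-normal draws z of shape (n, d)."""
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    basket = ST.prod(axis=1) ** (1.0 / d)          # geometric mean of assets
    return np.exp(-r * T) * np.maximum(basket - K, 0.0).mean()

mc = price(np.random.default_rng(0).standard_normal((n, d)))  # plain MC

u = qmc.Sobol(d, scramble=True, seed=0).random(n)  # low-discrepancy points
qmc_est = price(norm.ppf(u))                       # map U(0,1)^d to N(0,1)^d

print(f"MC: {mc:.4f}  QMC (Sobol'): {qmc_est:.4f}")
```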