Found 20 results in total
The renowned difference-in-differences (DiD) estimator relies on the assumption of 'parallel trends,' which does not hold in many practical applications. To address this issue, the econometrics literature has turned to the triple difference estimator. Both DiD and triple difference are limited to estimating average effects. An alternative avenue is offered by the changes-in-changes (CiC) estimator, which provides an estimate of the entire counterfactual distribution at the cost of relying on (stronger) distributional assumptions. In this work, we extend the triple difference estimator to accommodate the CiC framework, presenting the `triple changes estimator' and its identification assumptions, thereby expanding the scope of the CiC paradigm. Subsequently, we empirically evaluate the proposed framework and apply it to a study examining the impact of Medicaid expansion on children's preventive care.
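As a toy illustration of the estimators discussed above (our sketch, not the paper's code; all numbers are invented example values), the 2x2 DiD and the triple difference reduce to simple contrasts of group-period means:

```python
# Hedged sketch: DiD and triple-difference point estimates from
# group-period mean outcomes. The mean values below are made up.

def did(treated_post, treated_pre, control_post, control_pre):
    """Classic 2x2 DiD: change in treated group minus change in control group."""
    return (treated_post - treated_pre) - (control_post - control_pre)

def triple_diff(did_eligible, did_ineligible):
    """Triple difference: DiD in the eligible group minus DiD in a
    comparison (ineligible) group, differencing out common shocks."""
    return did_eligible - did_ineligible

# Hypothetical mean outcomes (e.g., preventive-care visit rates):
did_elig = did(0.62, 0.50, 0.55, 0.52)    # 0.12 - 0.03 = 0.09
did_inelig = did(0.58, 0.54, 0.57, 0.54)  # 0.04 - 0.03 = 0.01
print(triple_diff(did_elig, did_inelig))  # about 0.08
```

The CiC estimator generalizes this by comparing entire outcome distributions rather than means; the scalar contrasts above only recover the average effect.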
Open quantum systems are powerful effective descriptions of quantum systems interacting with their environments. Studying changes of Fock state probabilities can be intricate in this context, since the prevailing description of open quantum dynamics is by master equations for the systems' reduced density matrices, which usually requires solving a set of complicated coupled differential equations. In this article, we show that such problems can be circumvented by employing a recently developed path integral-based method for directly computing reduced density matrices in scalar quantum field theory. For this purpose, we consider a real scalar field $φ$ as an open system interacting via a $λχ^2φ^2$-term with an environment comprising another real scalar field $χ$ that has a finite temperature. In particular, we investigate how the probabilities for observing the vacuum or two-particle states change over time if these Fock states were initially correlated. Subsequently, we apply our resulting expressions to a neutrino toy model. We show that, within our model, lighter neutrino masses would lead to a stronger distortion of the observable number of particles due to t…
We investigate the high-energy emission activities of two bright gamma-ray pulsars, PSR~J2021+4026 and Vela. For PSR~J2021+4026, state changes in the gamma-ray flux and spin-down rate have been observed. We report that the long-term evolution of the gamma-ray flux and timing behavior of PSR~J2021+4026 suggests a new gamma-ray flux recovery at around MJD~58910 and a flux decrease around MJD~59500. During this epoch, the staying time, the gamma-ray flux difference, and the spin-down rate are smaller than in previous epochs of the same state. The waiting timescale of the quasi-periodic state changes is similar to that of the glitch events of the Vela pulsar. For the Vela pulsar, the radio pulse was quenched on a timescale of $\sim0.2$~s after the 2016 glitch, and the glitch may have disturbed the structure of the magnetosphere. Nevertheless, we did not find any evidence for a long-term change in the gamma-ray emission properties using years of $Fermi$-LAT data, and therefore no evidence for a long-term structural change of the magnetosphere. We also searched for photons above 100~GeV using 15 years of $Fermi$-LAT data and found none. Our results provide additional information for the rela…
Flow-level network measurement is critical to many network applications. Among various measurement tasks, packet loss detection and heavy-hitter detection are the two most important, which we call the two key tasks. In practice, the two key tasks are often required at the same time, but existing works seldom handle both. In this paper, we design ChameleMon to support the two key tasks simultaneously. One key novelty of ChameleMon is to shift measurement attention as the network state changes, through two dimensions of dynamics: 1) dynamically allocating memory between the two key tasks; 2) dynamically monitoring the flows of importance. To realize this design, we propose a key technique that leverages Fermat's little theorem to devise a flexible data structure, namely FermatSketch. FermatSketch is dividable, additive, and subtractive, supporting the two key tasks. We have fully implemented a ChameleMon prototype on a testbed with a Fat-tree topology. We conduct extensive experiments, and the results show that ChameleMon supports the two key tasks with low memory/bandwidth overhead and, more importantly, can automatically shift measurement attention as the network state changes.
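A minimal sketch of the Fermat's-little-theorem idea behind FermatSketch (our simplified single-bucket reading, not the authors' code): each bucket keeps a packet count and an ID-sum modulo a prime p; buckets are additive and subtractive, and a "pure" bucket holding a single flow can be decoded because the modular inverse of the count is count^(p-2) mod p.

```python
# Hedged sketch of an invertible counting bucket. Field names and the
# single-bucket structure are our simplification of the FermatSketch idea.

P = 2_147_483_647  # a Mersenne prime, large enough for 31-bit flow IDs

class Bucket:
    def __init__(self):
        self.count, self.id_sum = 0, 0

    def insert(self, flow_id, n=1):
        self.count = (self.count + n) % P
        self.id_sum = (self.id_sum + n * flow_id) % P

    def subtract(self, other):
        # Subtractive property: upstream minus downstream buckets
        # leaves only the lost packets.
        self.count = (self.count - other.count) % P
        self.id_sum = (self.id_sum - other.id_sum) % P

    def decode(self):
        # Assumes the bucket is "pure" (one flow): recover its ID via
        # Fermat's little theorem, inv(count) = count^(P-2) mod P.
        inv = pow(self.count, P - 2, P)
        return (self.id_sum * inv) % P

# Usage: detect the flow that lost packets between two measurement points.
up, down = Bucket(), Bucket()
for fid, sent, recv in [(12345, 10, 10), (777, 8, 5)]:
    up.insert(fid, sent)
    down.insert(fid, recv)
up.subtract(down)             # now holds only flow 777's 3 lost packets
print(up.count, up.decode())  # 3 777
```

The real sketch hashes each flow into several buckets of a counter array so that pure buckets can be peeled off iteratively; the arithmetic per bucket is as above.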
Observations of Cepheids in the Large Magellanic Cloud, made over the last several decades, allow us to search for evolutionary period changes. None of the Cepheids in our sample of 378 stars has stopped pulsating, and none of them showed a large period change that could indicate mode switching. However, for Cepheids with log P > 0.9 we found significant period changes, both positive and negative. A comparison between the observed period changes and theoretical predictions shows moderate agreement with some models (Bono et al. 2000) and very large disagreement with others (Alibert et al. 1999). The large differences between the models are likely caused by the very high sensitivity of stellar evolution during the core helium burning phase to even small changes in the input physics, as discovered by Lauterborn, Refsdal and Weigert (1971).
The recent suggestion that dramatic changes may occur in the lifetime of alpha and beta decay when the activity, in a pure metal host, is cooled to a few Kelvin is examined in the light of published low temperature nuclear orientation (LTNO) experiments, with emphasis here on alpha decay. In LTNO, observations are made of the anisotropy of radioactive emissions with respect to an axis of orientation. Correction of data for decay of metallic samples held at temperatures at and below 1 Kelvin for periods of days and longer has been a routine element of LTNO experiments for many years. No evidence for any change of half-life on cooling has been found, with an upper limit of less than 1%, in striking contrast to the predicted changes, for alpha decay, of several orders of magnitude. The proposal that such dramatic changes might alleviate problems of disposal of long-lived radioactive waste is shown to be unrealistic.
We characterize cutting arcs on fiber surfaces that produce new fiber surfaces, and the changes in monodromy resulting from such cuts. As a corollary, we characterize band surgeries between fibered links and introduce an operation called Generalized Hopf banding. We further characterize generalized crossing changes between fibered links, and the resulting changes in monodromy.
Comparison of the old observations of Cepheids in the Small Magellanic Cloud from the Harvard data archive with the recent OGLE and ASAS observations allows an estimate of their period changes. All 557 matched Cepheids are still pulsating in the same mode. One of the Harvard Cepheids, HV 11289, has been tentatively matched to a star which is now apparently constant. Cepheids with log P > 0.8 show significant period changes, both positive and negative. We found that for many stars these changes are significantly smaller than predicted by recent model calculations. Unfortunately, there are no models available for Cepheids with periods longer than approximately 80 days, while there are observed Cepheids with periods up to 210 days.
Manuscripts have a complex development process with multiple influencing factors. Reconstructing this process is difficult without large-scale, comparable data on different versions of manuscripts. Preprints are increasingly available and may provide access to the earliest manuscript versions. Here, we matched 6,024 preprint-publication pairs across multiple fields and examined changes in their reference lists between the manuscript versions as one aspect of manuscripts' development. We also qualitatively analysed the context of references to investigate the potential reasons for changes. We found that 90 percent of references were unchanged between versions and 8 percent were newly added. We found that manuscripts in the natural and medical sciences undergo more extensive reframing of the literature while changes in engineering mostly focused on methodological details. Our qualitative analysis suggests that peer review increases the methodological soundness of scientific claims, improves the communication of findings, and ensures appropriate credit for previous research.
How do you use imaging to analyse the development of the heart, which not only changes shape but also undergoes constant, high-speed, quasi-periodic changes? We have integrated ideas from prospective and retrospective optical gating to capture long-term, phase-locked developmental time-lapse videos. In this paper we demonstrate the success of this approach over a key developmental time period: heart looping, where large changes in heart shape prevent previous prospective gating approaches from capturing phase-locked videos. We use the comparison with other approaches to in vivo heart imaging to highlight the importance of collecting the most appropriate data for the biological question.
We have detected the four 18cm OH lines from the $z \sim 0.765$ gravitational lens toward PMN J0134-0931. The 1612 and 1720 MHz lines are in conjugate absorption and emission, providing a laboratory to test the evolution of fundamental constants over a large lookback time. We compare the HI and OH main line absorption redshifts of the different components in the $z \sim 0.765$ absorber and the $z \sim 0.685$ lens toward B0218+357 to place stringent constraints on changes in $F \equiv g_p [α^2/μ]^{1.57}$. We obtain $[ΔF/F] = (0.44 \pm 0.36^{\rm stat} \pm 1.0^{\rm syst}) \times 10^{-5}$, consistent with no evolution over the redshift range $0 < z < 0.7$. The measurements have a $2 σ$ sensitivity of $[Δα/α] < 6.7 \times 10^{-6}$ or $[Δμ/μ] < 1.4 \times 10^{-5}$ to fractional changes in $α$ and $μ$ over a period of $\sim 6.5$ Gyr, half the age of the Universe. These are among the most sensitive current constraints on changes in $μ$.
We study time-changes of unipotent flows on finite volume quotients of semisimple linear groups, generalising previous work by Ratner on time-changes of horocycle flows. Any measurable isomorphism between time-changes of unipotent flows gives rise to a non-trivial joining supported on its graph. Under a spectral gap assumption on the groups, we show the following rigidity result: either the only limit point of this graph joining under the action of a one-parameter renormalising subgroup is the trivial joining, or the isomorphism is "affine", namely it is obtained by composing an algebraic isomorphism with a (non-constant) translation along the centraliser.
This study describes a method to quantify potential gait changes in human subjects. Microsoft Kinect devices were used to provide and track the coordinates of fifteen different joints of each subject over time. Three male subjects walked a 10-foot path multiple times with and without motion-restricting devices. Their walking patterns were recorded via two Kinect devices through the frontal and sagittal planes. A modified sample entropy (SE) value was computed to quantify the variability of the time series for each joint. The SE values with and without motion-restricting devices were used to compare the changes in each joint. The preliminary results of the experiments show that the proposed quantification method can detect differences in walking patterns with and without motion-restricting devices. The proposed method has the potential to be applied to track personal progress in physical therapy sessions.
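For reference, the standard (unmodified) sample entropy underlying the SE measure above can be sketched as follows; the exact modification used in the study is not reproduced here.

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Standard SampEn of a 1-D series: -ln(A/B), where B counts template
    pairs of length m within Chebyshev tolerance r, and A does the same
    for length m+1 (same set of starting indices, self-matches excluded)."""
    n = len(x)
    nt = n - m  # number of template starting positions

    def matches(length):
        c = 0
        for i in range(nt):
            for j in range(i + 1, nt):
                if max(abs(x[i + k] - x[j + k]) for k in range(length)) <= r:
                    c += 1
        return c

    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")
```

A perfectly regular series yields a SampEn of 0, and more irregular joint trajectories yield larger values; comparing per-joint values with and without motion restriction is the essence of the method described above.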
In this paper, a method for measuring synchronic corpus (dis-)similarity put forward by Kilgarriff (2001) is adapted and extended to identify trends and correlated changes in diachronic text data, using the Corpus of Historical American English (Davies 2010a) and the Google Ngram Corpora (Michel et al. 2010a). This paper shows that this fully data-driven method, which extracts word types that have undergone the most pronounced change in frequency in a given period of time, is computationally very cheap and that it allows interpretations of diachronic trends that are both intuitively plausible and motivated from the perspective of information theory. Furthermore, it demonstrates that the method is able to identify correlated linguistic changes and diachronic shifts that can be linked to historical events. Finally, it can help to improve diachronic POS tagging and complement existing NLP approaches. This indicates that the approach can facilitate an improved understanding of diachronic processes in language change.
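The kind of frequency-change ranking described above can be sketched with a chi-square-style score per word type (our simplified variant of Kilgarriff's corpus-comparison measure; the toy counts are invented):

```python
# Hedged sketch: rank word types by how sharply their frequency differs
# between an earlier and a later corpus slice.

def change_scores(freq_early, freq_late):
    """Chi-square-style score per word type: large values indicate a
    pronounced frequency change between the two slices."""
    n1, n2 = sum(freq_early.values()), sum(freq_late.values())
    scores = {}
    for w in set(freq_early) | set(freq_late):
        o1, o2 = freq_early.get(w, 0), freq_late.get(w, 0)
        e1 = (o1 + o2) * n1 / (n1 + n2)  # expected count in slice 1
        e2 = (o1 + o2) * n2 / (n1 + n2)  # expected count in slice 2
        scores[w] = (o1 - e1) ** 2 / e1 + (o2 - e2) ** 2 / e2
    return scores

# Invented toy counts for two time slices:
early = {"thou": 50, "you": 30, "car": 1}
late = {"thou": 2, "you": 60, "car": 40}
scores = change_scores(early, late)
print(max(scores, key=scores.get))  # "thou" changed most
```

The word types at the top of this ranking are the ones the method extracts as having undergone the most pronounced frequency change in the given period.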
In a wide range of applications, the stochastic properties of the observed time series change over time. The changes often occur gradually rather than abruptly: the properties are (approximately) constant for some time and then slowly start to change. In such situations, it is frequently of interest to locate the time point where the properties start to vary. In contrast to the analysis of abrupt changes, methods for detecting smooth or gradual change points are less developed and often require strong parametric assumptions. In this paper, we develop a fully nonparametric method to estimate a smooth change point in a locally stationary framework. We set up a general procedure which allows us to deal with a wide variety of stochastic properties including the mean, (auto)covariances and higher-order moments. The theoretical part of the paper establishes the convergence rate of the new estimator. In addition, we examine its finite sample performance by means of a simulation study and illustrate the methodology by applications to temperature and financial return data.
Most of the literature on change-point analysis by means of hypothesis testing considers hypotheses of the form H0 : θ_1 = θ_2 vs. H1 : θ_1 != θ_2, where θ_1 and θ_2 denote parameters of the process before and after a change point. This paper takes a different perspective and investigates null hypotheses of no relevant change, i.e. H0 : ||θ_1 - θ_2|| \leq Δ, where || \cdot || is an appropriate norm. This formulation of the testing problem is motivated by the fact that in many applications a modification of the statistical analysis might not be necessary if the difference between the parameters before and after the change point is small. A general approach to problems of this type is developed, based on the CUSUM principle. For the asymptotic analysis, weak convergence of the sequential empirical process has to be established under the alternative of non-stationarity, and it is shown that the resulting test statistic is asymptotically normally distributed. Several applications of the methodology are given, including tests for relevant changes in the mean, variance, a parameter in a linear regression model, and the distribution function, among others. The finite sample properties…
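The classical CUSUM principle on which the approach builds can be sketched as follows (the textbook mean-change statistic, not the paper's relevant-change test):

```python
import math

def cusum_stat(x):
    """max_k |S_k - (k/n) S_n| / (sigma_hat * sqrt(n)): large values
    indicate a change in mean somewhere in the sample. sigma_hat is the
    plain sample standard deviation (a simplification; serially dependent
    data would need a long-run variance estimate)."""
    n = len(x)
    mean = sum(x) / n
    s, best = 0.0, 0.0
    for k, xi in enumerate(x, start=1):
        s += xi
        best = max(best, abs(s - k * mean))
    sd = (sum((v - mean) ** 2 for v in x) / n) ** 0.5
    return best / (sd * math.sqrt(n))

# A series with a clear mean shift scores higher than one without:
no_change = [0.0, 1.0] * 25
shift = [0.0] * 25 + [1.0] * 25
print(cusum_stat(shift) > cusum_stat(no_change))  # True
```

The relevant-change formulation rejects only when the estimated difference exceeds the threshold Δ, rather than whenever it is statistically distinguishable from zero.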
In this paper we investigate the evolution of the IPv4 and IPv6 Internet topologies at the autonomous system (AS) level over a long period of time. We provide abundant empirical evidence that there is a phase transition in the growth trend of the two networks. For the IPv4 network, the phase change occurred in 2001. Before then the network's size grew exponentially, and thereafter it followed a linear growth. Changes are also observed around the same time for the maximum node degree, the average node degree and the average shortest path length. For the IPv6 network, the phase change occurred in late 2006. It is notable that the observed phase transitions in the two networks are different; for example, the size of the IPv6 network initially grew linearly and then shifted to exponential growth. Our results show that following decades of rapid expansion up to the beginning of this century, the IPv4 network has now evolved into a mature, steady stage characterised by relatively slow growth with a stable network structure, whereas the IPv6 network, after a slow startup process, has just taken off to full-speed growth. We also provide insight into the possible impact of IPv6-over-IPv4 t…
We study statistical properties of the highest degree, or most popular, nodes in growing networks. We show that the number of lead changes increases logarithmically with network size $N$, independent of the details of the growth mechanism. The probability that the first node retains the lead approaches a finite constant for popularity-driven growth, and decays as $N^{-\phi}(\ln N)^{-1/2}$, with $\phi = 0.08607...$, for growth with no popularity bias.
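The lead-change statistic can be illustrated with a small simulation (our toy model of popularity-driven growth, not the authors' code):

```python
import random

def lead_changes(n_nodes, seed=0):
    """Grow a network by preferential attachment and count how often the
    identity of the strictly-highest-degree node changes."""
    rng = random.Random(seed)
    degrees = [1, 1]  # start from one edge between nodes 0 and 1
    stubs = [0, 1]    # edge-endpoint list: sampling it is proportional to degree
    leader, changes = 0, 0
    for new in range(2, n_nodes):
        target = rng.choice(stubs)  # popularity-biased attachment
        degrees.append(1)
        degrees[target] += 1
        stubs.extend([new, target])
        # A non-leader can only take the lead by strictly exceeding it:
        if target != leader and degrees[target] > degrees[leader]:
            leader, changes = target, changes + 1
    return changes
```

In runs of this toy model the count grows only slowly with network size, consistent with the logarithmic behaviour described above; ties are not counted as lead changes.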
We look at the effect of the tick size changes on the TOPIX 100 index names made by the Tokyo Stock Exchange on Jan-14-2014 and Jul-22-2014. The intended consequence of the change is price improvement and shorter time to execution. We look at security-level metrics that include the spread, trading volume, number of trades and the size of trades to establish whether this goal is accomplished. An unintended effect might be the reduction in execution sizes, which would then mean that institutions with large orders would have greater difficulty in sourcing liquidity. We look at a sample of real orders to see if execution costs have gone up across the orders since the implementation of this change. We study the mechanisms that affect how securities are traded on an exchange before delving into the specifics of the TSE tick size events. Some of the topics we explore are: The Venue Menu and How to Increase Revenue; To Automate or Not to Automate; Microstructure under the Microscope; The Price of Connections to High (and Faraway) Places; Speed Thrills but Kills; Pick a Size for the Perfect Tick; TSE Tick Size Experiments, Then and Now; Sergey Bubka and the Regulators; Bird's Eye View; …
A new bivariate partial sum process for locally stationary time series is introduced and its weak convergence to a Brownian sheet is established. This construction enables the development of a novel self-normalized CUSUM test statistic for detecting changes in the mean of a locally stationary time series. For stationary data, self-normalization relies on the factorization of a constant long-run variance and a stochastic factor. In this case, the CUSUM statistic can be divided by another statistic proportional to the long-run variance, so that the latter cancels, avoiding estimation of the long-run variance. Under local stationarity, the partial sum process converges to $\int_0^t σ(x) d B_x$ and no such factorization is possible. To overcome this obstacle, a bivariate partial-sum process is introduced, allowing the construction of self-normalized test statistics under local stationarity. Weak convergence of the process is proven, and it is shown that the resulting self-normalized tests attain asymptotic level $α$ under the null hypothesis of no change, while being consistent against abrupt, gradual, and multiple changes under mild assumptions. Simulation studies show that the proposed…