Economists and social scientists have debated the relative importance of nature (one's genes) and nurture (one's environment) for decades, if not centuries. This debate can now be informed by the ready availability of genetic data in a growing number of social science datasets. This paper explores the potential uses of genetic data in economics, with a focus on estimating the interplay between nature (genes) and nurture (environment). We discuss how economists can benefit from incorporating genetic data into their analyses even when they do not have a direct interest in estimating genetic effects. We argue that gene--environment (GxE) studies can be instrumental for (i) testing economic theory, (ii) uncovering economic or behavioral mechanisms, and (iii) analyzing treatment effect heterogeneity, thereby improving the understanding of how (policy) interventions affect population subgroups. We introduce the reader to essential genetic terminology, develop a conceptual economic model to interpret gene-environment interplay, and provide practical guidance to empirical researchers.
With the reversal of Roe v. Wade in 2022, many U.S. employers announced they would reimburse employees for abortion-related travel expenses. This action complements increasingly common employer policies subsidizing employee access to assisted reproductive technologies such as in-vitro fertilization and egg freezing. This article reflects on why employers offer these benefits and whether they enhance or undermine reproductive justice. From the employer's perspective, abortion and assisted reproductive technologies help women to plan childbearing around the demands of their jobs. Both are associated with delayed childbirth and reduced fertility, which lower the costs of motherhood to employers. However, firm subsidization of these services does not further reproductive justice because it reifies structures which incentivize women to delay childbirth and reduce fertility, and it reinforces economic and reproductive inequalities. We conclude by questioning whether reproductive justice is possible without transforming the economy so that it prioritizes care over profits.
Rapid increases in food supplies have reduced global hunger, while rising burdens of diet-related disease have made poor diet quality the leading cause of death and disability around the world. Today's "double burden" of undernourishment in utero and in early childhood, followed by undesired weight gain and obesity later in life, is accompanied by a third, less visible burden of micronutrient imbalances. The triple burden of undernutrition, obesity, and unbalanced micronutrients underlies many diet-related diseases such as diabetes, hypertension, and other cardiometabolic disorders, and its components often coexist in the same person, household, and community. All kinds of deprivation are closely linked to food insecurity and poverty, but income growth does not always improve diet quality, in part because consumers cannot directly or immediately observe the health consequences of their food options, especially for newly introduced or reformulated items. Even after direct experience and epidemiological evidence reveal the relative risks of dietary patterns and nutritional exposures, many consumers may not consume a healthy diet because food choice is driven by other factors. This chapter reviews the evidence on diet
This paper presents a novel quantitative approach for comparative economic studies, addressing limitations in current classification methods. Conventional approaches in comparative economics often rely on ad hoc, categorical classifications that disregard the continuous nature of the spectrum of economic systems, resulting in subjective judgments and significant information loss, particularly for countries with systems near categorical borders. To overcome these shortcomings, the present paper proposes distance-based indices for objective categorization, grounded in economic foundations and hard data. Accordingly, the paper introduces institutional similarity indices--the Capitalism Similarity Index (CapSI), Communism Similarity Index (ComSI), and Socialism Similarity Index (SocSI)--which reflect countries' positions along the economic system continuum. These indices adhere to mathematical rigor and are grounded in the mathematical fields of real analysis, metric spaces, and distance functions. By classifying 135 countries and creating GIS maps, the paper demonstrates the practical applicability of the proposed approach. Results show a high explanator
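The abstract does not spell out its distance functions, but the general idea of a distance-based similarity index can be sketched as follows. This is a minimal illustration under assumed archetype vectors and a hypothetical `1/(1+d)` transform, not the paper's actual CapSI/ComSI/SocSI definitions:

```python
import math

def similarity_index(country, archetype):
    """Distance-based similarity: 1 / (1 + Euclidean distance).

    `country` and `archetype` are vectors of institutional indicators
    (e.g. state ownership share, extent of price controls), each scaled
    to [0, 1]. Returns a value in (0, 1]; 1 means the country coincides
    with the archetype.
    """
    d = math.sqrt(sum((c - a) ** 2 for c, a in zip(country, archetype)))
    return 1.0 / (1.0 + d)

# Hypothetical archetypes with purely illustrative indicator values:
capitalism = [0.1, 0.1, 0.9]   # low state ownership, few controls, open markets
communism  = [0.9, 0.9, 0.1]

country_x = [0.3, 0.2, 0.8]
cap_si = similarity_index(country_x, capitalism)
com_si = similarity_index(country_x, communism)
# country_x sits closer to the capitalism archetype, so cap_si > com_si.
```

Because the index is a continuous function of position in indicator space, two countries on opposite sides of a categorical border receive nearly identical scores, which is exactly the information loss the paper's approach avoids.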
Can AI effectively perform complex econometric analysis traditionally requiring human expertise? This paper evaluates AI agents' capability to master econometrics, focusing on empirical analysis performance. We develop ``MetricsAI'', an Econometrics AI Agent built on the open-source MetaGPT framework. This agent exhibits outstanding performance in: (1) planning econometric tasks strategically, (2) generating and executing code, (3) employing error-based reflection for improved robustness, and (4) allowing iterative refinement through multi-round conversations. We construct two datasets from academic coursework materials and published research papers to evaluate performance against real-world challenges. Comparative testing shows our domain-specialized AI agent significantly outperforms both benchmark large language models (LLMs) and general-purpose AI agents. This work establishes a testbed for exploring AI's impact on social science research and enables cost-effective integration of domain expertise, making advanced econometric methods accessible to users with minimal coding skills. Furthermore, our AI agent enhances research reproducibility and offers promising pedagogical applic
This paper investigates the ability of Large Language Models (LLMs) to assess the economic soundness and theoretical consistency of empirical findings in spatial econometrics. We created original and deliberately altered "counterfactual" summaries from 28 published papers (2005-2024), which were evaluated by a diverse set of LLMs. The LLMs provided qualitative assessments and structured binary classifications on variable choice, coefficient plausibility, and publication suitability. The results indicate that while LLMs can expertly assess the coherence of variable choices (with top models like GPT-4o achieving an overall F1 score of 0.87), their performance varies significantly when evaluating deeper aspects such as coefficient plausibility and overall publication suitability. The results further reveal that the choice of LLM, the specific characteristics of the paper, and the interaction between these two factors significantly influence the accuracy of the assessment, particularly for nuanced judgments. These findings highlight LLMs' current strengths in assisting with initial, more surface-level checks and their limitations in performing comprehensive, deep economic reasoning, suggesti
This paper establishes the theoretical and practical foundations for using Large Language Models (LLMs) as measurement instruments for latent economic variables -- specifically variables that describe the cognitive content of occupational tasks at a level of granularity not achievable with existing survey instruments. I formalize four conditions under which LLM-generated scores constitute valid instruments: semantic exogeneity, construct relevance, monotonicity, and model invariance. I then apply this framework to the Augmented Human Capital Index (AHC_o), constructed from 18,796 O*NET task statements scored by Claude Haiku 4.5, and validated against six existing AI exposure indices. The index shows strong convergent validity (r = 0.85 with Eloundou GPT-gamma, r = 0.79 with Felten AIOE) and discriminant validity. Principal component analysis confirms that AI-related occupational measures span two distinct dimensions -- augmentation and substitution. Inter-rater reliability across two LLM models (n = 3,666 paired scores) yields Pearson r = 0.76 and Krippendorff's alpha = 0.71. Prompt sensitivity analysis across four alternative framings shows that task-level rankings are robust. Obv
The goal of this paper is to investigate the importance of providing visual "big pictures" in the teaching of economics. The plurality and variety of concepts, variables, diagrams, and models involved in economics can be a source of confusion for many economics students. A review of the existing literature on the role of visual "big pictures" in learning suggests that furnishing students with a visual "big picture" illustrating how these numerous, diverse concepts connect to each other could be an effective way to clear up this confusion. As a practical example, this paper introduces a "big picture" that can serve as a resource in intermediate macroeconomics classes. The figure presents twenty-seven commonly discussed diagrams from the intermediate macroeconomics course, with only brief detail on some of them, aiming to help students grasp the whole picture at once on a single piece of paper. This macroeconomics big picture focuses mostly on the routes through which common diagrams in macroeconomics are connected to each other, and finally introduces the general ma
This study investigates the relationship between the market volatility of the iShares Asia 50 ETF (AIA) and economic and market sentiment indicators from the United States, China, and globally during periods of economic uncertainty. Specifically, it examines the association between AIA volatility and key indicators such as the US Economic Uncertainty Index (ECU), the US Economic Policy Uncertainty Index (EPU), China's Economic Policy Uncertainty Index (EPUCH), the Global Economic Policy Uncertainty Index (GEPU), and the Chicago Board Options Exchange's Volatility Index (VIX), spanning the years 2007 to 2023. Employing methodologies such as the two-covariate GARCH-MIDAS model, regime-switching Markov Chain (MSR), and quantile regressions (QR), the study explores the regime-dependent dynamics between AIA volatility and economic/market sentiment, taking into account investors' sensitivity to market uncertainties across different regimes. The findings reveal that the relationship between realized volatility and sentiment varies significantly between high- and low-volatility regimes, reflecting differences in investors' responses to market uncertainties under these conditions. Additiona
This paper describes a fundamental and empirically conspicuous problem inherent to surveys of human feelings and opinions in which subjective responses are elicited on numerical scales. The paper also proposes a solution. The problem is a tendency by some individuals -- particularly those with low levels of education -- to simplify the response scale by considering only a subset of possible responses such as the lowest, middle, and highest. In principle, this ``focal value rounding'' (FVR) behavior renders invalid even the weak ordinality assumption often used in analysis of such data. With ``happiness'' or life satisfaction data as an example, descriptive methods and a multinomial logit model both show that the effect is large and that education and, to a lesser extent, income level are predictors of FVR behavior. A model simultaneously accounting for the underlying wellbeing and for the degree of FVR is able to estimate the latent subjective wellbeing, i.e.~the counterfactual full-scale responses for all respondents, the biases associated with traditional estimates, and the fraction of respondents who exhibit FVR. Addressing this problem helps to resolve a longstanding puzzle in
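As a toy illustration of the response behavior the abstract describes (assuming a 0-10 response scale with focal values 0, 5, and 10; the paper's actual scale and econometric model may differ):

```python
def focal_round(response, focal=(0, 5, 10)):
    """Map a full-scale 0-10 response to the nearest focal value,
    mimicking focal value rounding (FVR) behavior."""
    return min(focal, key=lambda f: abs(f - response))

full_scale = list(range(11))          # latent full-scale responses 0..10
fvr = [focal_round(r) for r in full_scale]
# Responses 0-2 collapse to 0, 3-7 to 5, and 8-10 to 10, so the
# ordinal information within those bands is lost for FVR respondents:
# even the ranking between a latent 3 and a latent 7 disappears.
```

This is why the abstract notes that FVR can invalidate even weak ordinality assumptions: two FVR respondents reporting the same focal value may differ substantially in latent wellbeing, and recovering the counterfactual full-scale responses requires modeling the rounding process itself.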
We analyse 'stop-and-go' containment policies that produce infection cycles as periods of tight lockdowns are followed by periods of falling infection rates. The subsequent relaxation of containment measures allows cases to increase again until another lockdown is imposed and the cycle repeats. The policies followed by several European countries during the Covid-19 pandemic seem to fit this pattern. We show that 'stop-and-go' should lead to lower medical costs than keeping infections at the midpoint between the highs and lows produced by 'stop-and-go'. Increasing the upper and reducing the lower limits of a stop-and-go policy by the same amount would lower the average medical load. But increasing the upper and lowering the lower limit while keeping the geometric average constant would have the opposite effect. We also show that with economic costs proportional to containment, any path that brings infections back to the original level (technically a closed cycle) has the same overall economic cost.
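One way to rationalize the first claim, under the simplifying assumption that infections grow and decay exponentially between the policy bounds (consistent with the abstract's reference to the geometric average; this is our sketch, not the paper's derivation): the time average of $I(t) = I_{\min} e^{gt}$ over a growth phase of length $T = \ln(I_{\max}/I_{\min})/g$ is
\[
\bar{I} \;=\; \frac{1}{T}\int_0^T I_{\min}\, e^{gt}\,dt \;=\; \frac{I_{\max} - I_{\min}}{\ln(I_{\max}/I_{\min})},
\]
the logarithmic mean of $I_{\min}$ and $I_{\max}$, and the same value obtains over an exponential decay phase. Since the logarithmic mean lies strictly below the arithmetic midpoint $(I_{\min}+I_{\max})/2$, a stop-and-go cycle carries a lower average infection load than holding infections at the midpoint. Writing $I_{\min} = G e^{-s}$ and $I_{\max} = G e^{s}$ with geometric mean $G$ gives $\bar{I} = G\,\sinh(s)/s$, which is increasing in the spread $s$, matching the claim that widening the band at a constant geometric average raises the average medical load.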
Recent studies in psychology and neuroscience offer systematic evidence that fictional works exert a surprisingly strong influence on readers and have the power to shape their opinions and worldviews. Building on these findings, we study what we term Potterian economics: the economic ideas, insights, and structures found in the Harry Potter books, to assess how the books might affect economic literacy. A conservative estimate suggests that more than 7.3 percent of the world population has read the Harry Potter books, and millions more have seen their movie adaptations. These extraordinary figures underscore the importance of the messages the books convey. We explore the Potterian economic model and compare it to professional economic models to assess the consistency of Potterian economic principles with existing economic models. We find that some of the principles of Potterian economics are consistent with economists' models. Many other principles, however, are distorted and contain numerous inaccuracies, contradicting professional economists' views and insights. We conclude that Potterian economics can teach us about the formation and dissemination of folk economics, the intuiti
A fundamental challenge for modern economics is to understand what happens when actors in an economy are replaced with algorithms. Just as rationality has enabled an understanding of the outcomes of classical economic actors, no-regret can enable an understanding of the outcomes of algorithmic actors. This review article covers the classical computer science literature on no-regret algorithms to provide a foundation for an overview of the latest economics research on no-regret algorithms, focusing on the emerging topics of manipulation, statistical inference, and algorithmic collusion.
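To make "no-regret" concrete, here is the multiplicative-weights (Hedge) algorithm, a textbook no-regret method whose cumulative loss exceeds that of the best fixed action by only O(sqrt(T log n)) over T rounds with n actions. This is illustrative code, not drawn from the article:

```python
import math

def hedge(losses, eta=0.1):
    """Multiplicative-weights (Hedge) algorithm. `losses[t][i]` is the
    loss of action i in round t, in [0, 1]. Returns the sequence of
    probability distributions over actions that the algorithm plays."""
    n = len(losses[0])
    w = [1.0] * n
    played = []
    for round_losses in losses:
        total = sum(w)
        played.append([wi / total for wi in w])
        # Exponentially down-weight each action in proportion to its loss.
        w = [wi * math.exp(-eta * l) for wi, l in zip(w, round_losses)]
    return played

# Two actions; action 0 is consistently better, so the algorithm shifts
# almost all probability mass onto it over 50 rounds.
dist = hedge([[0.1, 0.9]] * 50)
```

The economic interest arises when several such learners interact: each has vanishing regret against fixed actions, yet their joint play can converge to outcomes (e.g. coarse correlated equilibria, or supra-competitive prices in the collusion literature) that no classical rational-agent analysis would predict directly.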
Business growth is a goal of great importance for both its private and social benefits. Many firms view business growth as an imperative for their survival, stability, and long-term success. Business growth can be socially beneficial, too, as it enables businesses to expand into new territories where they can stimulate economic growth and development, create more jobs, increase living standards, and better serve their communities by giving back more through Corporate Social Responsibility initiatives. Business growth must be planned reasonably and optimally so that it can effectively achieve its critical ambitions in business practice. Current common practices for planning the supply side of business growth are usually ad hoc and lack well-established mathematical and economic foundations. The present paper argues that business growth planning can be pursued more structurally, reliably, and meaningfully within the framework of Growth Accounting (GA), which was first introduced by Economics Nobel Laureate Robert Solow to study economic growth. It is shown that, although GA was initially put forth as a procedure to explain "economic growth" ex-post, it can similarly be used to p
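The ex-post/ex-ante idea can be sketched with Solow's decomposition. Under a Cobb-Douglas technology Y = A K^alpha L^(1-alpha), growth rates satisfy gY = gA + alpha*gK + (1-alpha)*gL; read ex-post this backs out TFP growth (the Solow residual), and read ex-ante it can be inverted to plan the input growth needed for a growth target. The numbers below are purely illustrative, not from the paper:

```python
def solow_residual(gY, gK, gL, alpha=1/3):
    """Ex-post growth accounting: back out TFP growth gA from
    gY = gA + alpha*gK + (1 - alpha)*gL."""
    return gY - alpha * gK - (1 - alpha) * gL

def required_capital_growth(gY_target, gA, gL, alpha=1/3):
    """Ex-ante planning: invert the same identity to find the capital
    growth rate needed to hit an output growth target."""
    return (gY_target - gA - (1 - alpha) * gL) / alpha

# With 4% output growth, 6% capital growth, 1% labor growth:
gA = solow_residual(gY=0.04, gK=0.06, gL=0.01)       # ~1.33% TFP growth
# Capital growth needed next period to reach 5% output growth,
# holding gA and gL fixed:
gK_needed = required_capital_growth(0.05, gA, 0.01)   # 9%
```

The planning step is just the accounting identity run in reverse; the paper's contribution is presumably to develop this inversion rigorously for firm-level growth targets.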
As the amount of economic and other data generated worldwide increases vastly, a challenge for future generations of econometricians will be to master efficient algorithms for inference in empirical models with large information sets. This Chapter provides a review of popular estimation algorithms for Bayesian inference in econometrics and surveys alternative algorithms developed in machine learning and computing science that allow for efficient computation in high-dimensional settings. The focus is on scalability and parallelizability of each algorithm, as well as their ability to be adopted in various empirical settings in economics and finance.
The Growth-at-Risk (GaR) framework has garnered attention in recent econometric literature, yet current approaches implicitly assume a constant Pareto exponent. We introduce novel and robust econometrics to estimate the tails of GaR based on a rigorous theoretical framework and establish validity and effectiveness. Simulations demonstrate consistent outperformance relative to existing alternatives in terms of predictive accuracy. We perform a long-term GaR analysis that provides accurate and insightful predictions, effectively capturing financial anomalies better than current methods.
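The abstract's object of interest, the Pareto tail exponent, is classically estimated with the Hill estimator; the sketch below shows that standard textbook method (not the paper's novel econometrics) on simulated Pareto data:

```python
import math
import random

def hill_estimator(sample, k):
    """Hill estimator of the Pareto tail exponent alpha, computed from
    the k largest observations of a positive-valued sample:
        alpha_hat = k / sum_{i<=k} log(X_(i) / X_(k+1)),
    where X_(1) >= X_(2) >= ... are the descending order statistics."""
    xs = sorted(sample, reverse=True)
    threshold = xs[k]  # the (k+1)-th largest observation
    return k / sum(math.log(x / threshold) for x in xs[:k])

# Simulate an exact Pareto(alpha=2) tail via inverse transform:
# if U ~ Uniform(0,1), then X = U^(-1/alpha) has P(X > x) = x^(-alpha).
random.seed(0)
alpha = 2.0
sample = [random.random() ** (-1.0 / alpha) for _ in range(20_000)]
est = hill_estimator(sample, k=1_000)
# est is close to the true exponent 2; a constant-exponent assumption
# is exactly what the abstract's framework relaxes.
```

In practice the choice of k trades bias against variance, and GaR applications must additionally let the exponent vary with macro-financial conditions, which is where the paper's robust approach departs from this static baseline.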
This study analyses the impacts of economic complexity on environmental performance in BRICS-T countries. Annual data for the period 1999-2021, Durbin-Hausman cointegration test and Augmented Mean Group (AMG) estimator are used in the analysis. The robustness of the Panel AMG results is tested with CCEMG and CS-ARDL methods. The results indicate that economic complexity has a positive impact on environmental performance. An increase of 1% in the economic complexity index increases environmental performance in BRICS-T countries between 0.020% and 1.243%. However, economic growth, energy intensity and population density were found to have a negative impact on environmental performance. Renewable energy use, in contrast, contributes positively to environmental performance.
This paper studies the macroeconomic impact of economic freedom on foreign direct investment (FDI) inflows in both global and regional panel analyses covering 156 countries over the period 1995-2016. Unlike prior literature, it includes often-neglected nations such as Fragile and Conflict-Affected states and Sub-Saharan, Oceanian, and Post-Soviet countries. The paper finds a positive impact of economic freedom on FDI under a fixed-effects model in the global case, where a unit change in economic freedom scales FDI inflows up by as much as 1.15 units. More specifically, all 9 regions also show a positive and significant impact of economic freedom on FDI. The highest impact is recorded in European countries, whereas the lowest are documented in Fragile and Conflict-Affected states, the Sub-Saharan zone, and Oceanian countries.
This paper investigates the economic feasibility of replacing human labor with robotics and automation in Qatar's manufacturing and service sectors. By analyzing labor costs, productivity gains, and implementation expenses, the study assesses the potential financial impact and return on investment of robotic integration. Results indicate the sectors where automation is economically viable and identify challenges related to workforce adaptation, policy, and infrastructure. These insights provide guidance for policymakers and industry stakeholders considering automation strategies in Qatar.
The analogies between economics and classical mechanics can be extended from constrained optimization to constrained dynamics by formalizing economic (constraint) forces and economic power in analogy to physical (constraint) forces in Lagrangian mechanics. In the differential-algebraic equation framework of General Constrained Dynamics (GCD), households, firms, banks, and the government employ forces to change economic variables according to their desire and their power to assert their interest. These ex-ante forces are completed by constraint forces from unanticipated system constraints to yield the ex-post dynamics. The flexible out-of-equilibrium model can combine Keynesian concepts such as the balance sheet approach and slow adaptation of prices and quantities with bounded rationality (gradient climbing) and interacting agents discussed in behavioral economics and agent-based models. The framework integrates some elements of different schools of thought and overcomes some restrictions inherent to optimization approaches, such as the assumption of markets operating in or close to equilibrium. Depending on the parameter choice for power relations and adaptation speeds, the model
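Schematically (in our notation, not necessarily the paper's), the differential-algebraic structure described here pairs ex-ante forces with constraint forces in analogy to Lagrangian mechanics:
\[
\dot{x} \;=\; \sum_a F_a(x) \;+\; \sum_j \lambda_j \,\nabla g_j(x), \qquad g_j(x) = 0,
\]
where the $F_a$ are the ex-ante forces that agents $a$ (households, firms, banks, the government) exert according to their desires and power, the algebraic constraints $g_j(x)=0$ encode unanticipated system constraints such as accounting identities, and the multipliers $\lambda_j$ scale the constraint forces that complete the ex-post dynamics. The differential and algebraic parts must be solved jointly, which is what places the model in the differential-algebraic equation framework rather than in a standard ODE setting.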