This paper quantifies how geopolitical alignment shapes global trade across three distinct eras: the Cold War, hyper-globalization, and contemporary fragmentation. We construct a novel measure of bilateral alignment using large language models to compile and analyze 833,485 political events spanning 193 countries from 1950 to 2024. Our analysis reveals that trade flows systematically track geopolitical alignment in both bilateral relationships and aggregate patterns. Using local projections within a gravity framework, we estimate that a one-standard-deviation improvement in geopolitical alignment increases bilateral trade by 20 percent over ten years. Integrating these elasticities into a quantitative general equilibrium model, we find that deteriorating geopolitical relations reduced global trade by 7 percentage points between 1995 and 2020. Our findings provide empirical benchmarks for evaluating the costs of geopolitical fragmentation in an era of renewed great power competition.
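The local-projection idea in the abstract can be sketched on simulated data: one regression per horizon h of the h-step-ahead change in log trade on today's alignment. This is a toy illustration with an assumed data-generating process and made-up parameters, not the paper's estimator (which would also include gravity-style pair and time fixed effects).

```python
import random

random.seed(0)

# Toy country-pair panel: an alignment shock permanently raises log trade.
T, PAIRS = 40, 200
panel = []
for _ in range(PAIRS):
    align = [random.gauss(0, 1) for _ in range(T)]
    trade = [0.0] * T
    for t in range(1, T):
        trade[t] = trade[t - 1] + 0.02 * align[t - 1] + random.gauss(0, 0.05)
    panel.append((align, trade))

def ols_slope(x, y):
    # Univariate OLS slope: cov(x, y) / var(x).
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return num / sum((a - mx) ** 2 for a in x)

# Local projection: a separate regression for each horizon h.
betas = {}
for h in (1, 5, 10):
    xs, ys = [], []
    for align, trade in panel:
        for t in range(T - h):
            xs.append(align[t])
            ys.append(trade[t + h] - trade[t])
    betas[h] = ols_slope(xs, ys)
    print(f"h={h:2d}  beta={betas[h]:+.3f}")
```

Because the toy shock is permanent, the estimated beta is roughly flat across horizons; in the paper's setting the horizon profile traces out how the trade response builds over ten years.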
As platforms increasingly deploy robots alongside human labor in last-mile logistics, little is known about how contextual features like product attributes, environmental conditions, and psychological mechanisms shape consumer preference in real-world settings. To address this gap, this paper conducts an empirical study of consumer choice between human and robot service, analyzing 241,517 package-level choices from Alibaba's last-mile delivery stations. We identify how product privacy sensitivity, product value, and environmental complexity affect consumer preference. Our findings reveal that consumers are significantly more likely to choose robot delivery for privacy-sensitive packages (11.49%) and high-value products (0.97% per 1% increase in value), but prefer human couriers under adverse weather conditions (1.63%). These patterns are robust to alternative specifications and controls. These results also underscore that delivery choices are shaped not only by functional considerations but also by psychological concerns, highlighting the need for context-aware service design that aligns strategies with consumer perceptions.
This paper estimates the effect of Hurricane Harvey on wages and employment in the construction industry across impacted counties in Texas. Using data from the Quarterly Census of Employment and Wages (QCEW) for 2016-2019, I adopt a difference-in-differences event study approach, comparing outcomes in 41 FEMA-designated disaster counties with a set of unaffected southern control counties. I find that Hurricane Harvey had a large and long-lasting impact on labor market outcomes in the construction industry. More precisely, average log wages in treated counties rose by around 7.2 percent relative to control counties two quarters after the hurricane and remained elevated for the next two years. Employment effects were more gradual, becoming statistically significant only after six quarters, in line with the lagged nature of large-scale reconstruction activity. These results imply that natural disasters can generate persistent labor demand shocks in local construction markets, with policy implications for disaster recovery planning and workforce mobilization.
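The core difference-in-differences comparison reduces to a simple 2x2 contrast of group means. The numbers below are hypothetical, chosen only so the toy estimate reproduces the 7.2 log-point headline effect; the paper's event-study specification additionally traces the effect quarter by quarter.

```python
# Stylized 2x2 difference-in-differences on mean log wages:
# treated = FEMA-designated disaster counties, post = quarters after Harvey.
wages = {
    # (group, period): mean log wage (hypothetical values for illustration)
    ("treated", "pre"): 6.80,
    ("treated", "post"): 6.95,
    ("control", "pre"): 6.78,
    ("control", "post"): 6.858,
}

did = (wages[("treated", "post")] - wages[("treated", "pre")]) - (
    wages[("control", "post")] - wages[("control", "pre")]
)
print(f"DiD estimate: {did:.3f} log points")  # ~0.072, i.e. about 7.2 percent
```

The control-group change nets out the common trend, so the estimate isolates the treated counties' excess wage growth under the parallel-trends assumption.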
Gravity equations are often used to evaluate counterfactual trade policy scenarios, such as the effect of regional trade agreements on trade flows. In this paper, we argue that the suitability of gravity equations for this purpose crucially depends on their out-of-sample predictive power. We propose a methodology that compares different versions of the gravity equation, both among themselves and with machine learning-based forecast methods such as random forests and neural networks. We find that the 3-way gravity model is difficult to beat in terms of out-of-sample average predictive performance, further justifying its place as the predominant tool for applied trade policy analysis. However, when the goal is to predict individual bilateral trade flows, the 3-way model can be outperformed by an ensemble machine learning method.
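The out-of-sample horse race the paper describes can be illustrated with a minimal holdout comparison. Everything below is simulated and simplified (a univariate log-linear "gravity" fit against a no-covariate mean benchmark), not the authors' 3-way model or their ensemble methods.

```python
import math
import random

random.seed(1)

# Simulated bilateral data: log trade falls log-linearly in distance.
pairs = []
for _ in range(500):
    log_dist = random.uniform(5, 9)
    log_trade = 12.0 - 1.2 * log_dist + random.gauss(0, 0.5)
    pairs.append((log_dist, log_trade))

train, holdout = pairs[:400], pairs[400:]

# Fit the toy gravity line log_trade ~ a + b * log_dist on the training fold.
xs = [d for d, _ in train]
ys = [t for _, t in train]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx

def rmse(pred):
    # Root-mean-squared error on the held-out fold.
    return math.sqrt(sum((pred(d) - t) ** 2 for d, t in holdout) / len(holdout))

print(f"gravity RMSE:   {rmse(lambda d: a + b * d):.3f}")
print(f"mean-only RMSE: {rmse(lambda d: my):.3f}")
```

Comparing predictive accuracy on data the model never saw is the criterion the paper argues should govern which specification is used for counterfactual policy analysis.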
Agent-based models (ABMs) are gaining increasing traction in several domains, due to their ability to represent complex systems that are not easily expressible with classical mathematical models. This expressivity and richness come at a cost: ABMs can typically be analyzed only through simulation, making their analysis challenging. Specifically, when studying the output of ABMs, the analyst is often confronted with practical questions such as: (i) how many independent replications should be run? (ii) how many initial time steps should be discarded as a warm-up? (iii) after the warm-up, how long should the model run? (iv) what are the right parameter values? Analysts usually resort to rules of thumb and experimentation, which lack statistical rigor. This is mainly because addressing these points takes time, and analysts prefer to spend their limited time improving the model. In this paper, we propose a methodology, drawing on the field of Statistical Model Checking, to automate the process and provide guarantees of statistical rigor for ABMs written in NetLogo, one of the most popular ABM platforms. We discuss MultiVeStA, a tool that dramatically reduces the time and human intervention needed to run statistically rigorous checks on ABM outputs, and introduce its integration with NetLogo. Using two ABMs from the NetLogo library, we showcase MultiVeStA's analysis capabilities for NetLogo ABMs, as well as a novel application to statistically rigorous calibration. Our tool-chain makes it immediate to perform statistical checks with NetLogo models, promoting more rigorous and reliable analyses of ABM outputs.
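Question (i), how many replications to run, has a standard statistically rigorous answer of the kind MultiVeStA automates: keep adding independent replications until the confidence interval on the output statistic is tight enough. The sketch below uses a stand-in function for an ABM run and a simple CI-width stopping rule; it illustrates the idea only and is not MultiVeStA's actual algorithm.

```python
import random
import statistics

def run_replication(seed):
    # Stand-in for one independent ABM run: returns a noisy output statistic.
    rng = random.Random(seed)
    return 10.0 + rng.gauss(0, 2.0)

def replications_needed(target_halfwidth, z=1.96, min_reps=5, max_reps=10_000):
    """Add replications until the CI half-width of the mean output
    falls below the target (a common sequential stopping rule)."""
    outputs = [run_replication(s) for s in range(min_reps)]
    while len(outputs) < max_reps:
        mean = statistics.mean(outputs)
        halfwidth = z * statistics.stdev(outputs) / len(outputs) ** 0.5
        if halfwidth <= target_halfwidth:
            return len(outputs), mean, halfwidth
        outputs.append(run_replication(len(outputs)))
    raise RuntimeError("did not converge within max_reps")

n, mean, hw = replications_needed(target_halfwidth=0.5)
print(f"{n} replications -> mean {mean:.2f} +/- {hw:.2f}")
```

The replication count is thus chosen by the data rather than by a rule of thumb; analogous sequential criteria handle warm-up detection and run length.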
What is the best macroprudential regulation when households differ in their exposure to profits from the financial sector? To answer the question, I study a real business cycle model with household heterogeneity and market incompleteness. In the model, shocks are amplified in states with high leverage, leading to lower investment. I consider the problem of a Ramsey planner who can finance transfers with a distortive tax on labor and levy taxes on the balance sheet components of experts. I show that the optimal tax on capital purchases is zero and the optimal policy relies mostly on a tax on deposit issuance. The latter redistributes between agents by affecting the equilibrium rate on deposits. The welfare gains from optimal policy are due to both redistribution and insurance and are larger the more unequal the initial distribution is. A simple tax rule that targets a level of leverage can achieve most of the welfare gains from optimal policy.
Background: It is unclear what the relative impacts of prevention or treatment of NCDs are on future health system expenditure. First, we estimated expenditure in Australia for prevention vs treatment pathways to achieve SDG target 3.4. Second, we applied the method to 34 other OECD countries. Methods: We used GBD data to estimate average annual percentage changes in disease incidence, remission, and case-fatality rates (CFRs) from 1990-2021, and projected to 2030 to estimate business-as-usual (BAU) reductions in NCD mortality risk (40q30). For countries not on track to meet SDG3.4 under BAU, we modelled two intervention scenarios commencing in 2022 to achieve SDG3.4: (1) prevention via accelerated incidence reduction; (2) treatment via accelerated increases in remission and decreases in CFRs. Australian disease expenditure data were input into a PMSLT model to estimate expenditure changes from 2022 to 2040. Assuming similar expenditure patterns, the method was applied across OECD countries. Findings: In Australia, current trends project a 25% reduction in 40q30 by 2030, short of the 33.3% SDG3.4 target. Achieving this requires a 2.53 percentage point (pp) annual acceleration in incidence decline (prevention) or 1.56pp acceleration in CFR reduction and remission increase (treatment). Prevention reduces disease expenditure by 0.72%-3.17% by 2030 and 2040; treatment initially increases expenditure by 0.16%, before reducing it by 0.98%. A treatment scenario that reduced only CFRs increased expenditure initially; increasing remission alone achieved savings similar to prevention. Only Sweden, Ireland, and South Korea were on track to meet SDG3.4. Other OECD countries showed similar expenditure impacts to Australia. Interpretation: Whether reducing NCD mortality saves money depends on the pathway taken (prevention or treatment). Care is needed when linking NCD mortality reduction to health system savings.
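The target indicator 40q30 is the unconditional probability of dying between exact ages 30 and 70 from the four main NCDs. It can be computed from age-specific mortality rates for the eight 5-year bands 30-34 through 65-69 using the standard life-table conversion; the rates below are hypothetical for illustration.

```python
def forty_q_thirty(mort_rates):
    """40q30 from annual NCD mortality rates for the eight 5-year age
    bands 30-34 ... 65-69, using the life-table conversion
    5qx = 5*mx / (1 + 2.5*mx), then chaining survival across bands."""
    assert len(mort_rates) == 8
    surv = 1.0
    for m in mort_rates:
        q5 = 5 * m / (1 + 2.5 * m)  # probability of dying within the band
        surv *= 1 - q5              # survive this band
    return 1 - surv

# Hypothetical rates rising with age (deaths per person-year):
rates = [0.001, 0.0015, 0.002, 0.003, 0.005, 0.008, 0.012, 0.018]
print(f"40q30 = {forty_q_thirty(rates):.3f}")
```

SDG target 3.4 corresponds to cutting this quantity by one third between 2015 and 2030, which is why the scenarios are expressed as accelerations in incidence, remission, and CFR trends.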
This paper evaluates the short- and medium-term effectiveness of hiring incentives aimed at promoting the permanent conversion of temporary contracts through social contribution exemptions. Using rich administrative data from Tuscany that provide detailed employment histories, we exploit a unique change in eligibility criteria in 2018 through difference-in-differences and regression discontinuity designs. We find that the incentives immediately increased the probability of conversion, with no evidence of substitution against non-eligible cohorts. However, these positive effects were short-lived and appear to reflect anticipated conversions, as we find null longer-term effects on permanent hires.
This paper investigates the economic feasibility of replacing human labor with robotics and automation in Qatar's manufacturing and service sectors. By analyzing labor costs, productivity gains, and implementation expenses, the study assesses the potential financial impact and return on investment of robotic integration. Results indicate the sectors where automation is economically viable and identify challenges related to workforce adaptation, policy, and infrastructure. These insights provide guidance for policymakers and industry stakeholders considering automation strategies in Qatar.
Artificial intelligence (AI) is a key enabler of innovation against climate change. In this study, we investigate the intersection of AI and climate adaptation and mitigation technologies through patent analyses of a novel dataset of approximately 63,000 Green AI patents. We analyze patenting trends, corporate ownership of the technology, the geographical distributions of patents, their impact on follow-on inventions, and their market value. We use topic modeling (BERTopic) to identify 16 major technological domains, track their evolution over time, and identify their relative impact. We uncover a clear shift from legacy domains such as combustion engine technology to emerging areas like data processing, microgrids, and agricultural water management. We find evidence of growing concentration in corporate patenting even as the number of patenting firms rises rapidly. Looking at the technological and economic impact of patents, we find that while some Green AI domains combine technological impact and market value, others reflect weaker private incentives for innovation despite their relevance for climate adaptation and mitigation strategies. This is where policy intervention might be required to foster the generation and use of new Green AI applications.
As variable renewable energy increases and more demand is electrified, we expect price formation in wholesale electricity markets to transition from being dominated by fossil fuel generators to being dominated by the opportunity costs of storage and demand management. In order to analyse this transition, we introduce a new method to investigate price formation based on a mapping from the dual variables of the energy system optimisation problem to the bids and asks of electricity suppliers and consumers. This allows us to build the full supply and demand curves in each hour. We use this method to analyse price formation in a sector-coupled, climate-neutral energy system model for Germany, PyPSA-DE, with high temporal resolution and myopic foresight in 5-year steps from 2020 until full decarbonisation in 2045. We find a clear transition from distinct price levels, corresponding to fossil fuels, to a smoother price curve set by variable renewable energy sources, batteries and electrolysis. Despite higher price volatility, the fully decarbonised system clears with non-zero prices in 75% of all hours. Our results suggest that flexibility and cross-sectoral demand bidding play a vital role in stabilising electricity prices in a climate-neutral future. These findings are highly relevant for guiding investment decisions and informing policy, particularly in support of dynamic pricing, the expansion of energy storage across multiple timescales, and the coordinated development of renewable and flexibility technologies.
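The mapping from bids and asks to a clearing price can be illustrated with a toy uniform-price auction: stack supply offers in merit order, stack demand bids by willingness to pay, and clear where they cross. The technologies and numbers below are hypothetical, and the convention that the marginal accepted ask sets the price is an assumption of this sketch, not of PyPSA-DE.

```python
# Toy uniform-price market clearing from supply asks and demand bids.
asks = [  # (price EUR/MWh, quantity MW) offered by suppliers
    (0.0, 30),   # wind/solar at zero marginal cost
    (45.0, 20),  # battery discharge (opportunity cost of stored energy)
    (70.0, 25),  # electrolysis flexibility / backup
]
bids = [  # (price EUR/MWh, quantity MW) demand is willing to pay
    (120.0, 40),
    (60.0, 20),
    (20.0, 15),
]

def clear(asks, bids):
    asks = sorted(asks)                # cheapest supply first (merit order)
    bids = sorted(bids, reverse=True)  # highest willingness to pay first
    price, traded = 0.0, 0.0
    i = j = 0
    ra, rb = asks[0][1], bids[0][1]    # remaining quantity in current ask/bid
    while i < len(asks) and j < len(bids) and asks[i][0] <= bids[j][0]:
        q = min(ra, rb)
        traded += q
        price = asks[i][0]             # marginal accepted ask sets the price
        ra -= q
        rb -= q
        if ra == 0:
            i += 1
            if i < len(asks):
                ra = asks[i][1]
        if rb == 0:
            j += 1
            if j < len(bids):
                rb = bids[j][1]
    return price, traded

p, q = clear(asks, bids)
print(f"clearing price {p} EUR/MWh, traded volume {q} MW")
```

In this example the battery's opportunity cost is the marginal accepted ask, echoing the paper's finding that storage and flexible demand increasingly set prices as fossil generators exit the stack.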
Artificial intelligence (AI) is transforming financial planning by expanding access, lowering costs, and enabling dynamic, data-driven advice. Yet without clear safeguards, digital platforms risk reproducing longstanding market inefficiencies such as information asymmetry, misaligned incentives, and systemic fragility. This paper develops a framework for responsible AI in financial planning, anchored in five principles: fiduciary duty, adaptive personalization, technical robustness, ethical and fairness constraints, and auditability. We illustrate these risks and opportunities through case studies, and extend the framework into a five-level roadmap of AI financial intermediaries. By linking technological design to economic theory, we show how AI can either amplify vulnerabilities or create more resilient, trustworthy forms of financial intermediation.
We characterize the family of utility functions satisfying linear fractional relative risk aversion (LFRRA) in terms of the Gauss hypergeometric functions. We apply this family, which nests various utility functions used in different strands of literature, to monopolistic competition and obtain a closed-form solution for the profit-maximizing price by generalizing the Lambert W function. We let firm-level data decide whether the RRA in each sector or in the aggregate economy is increasing, decreasing, or constant, which in turn determines whether markups are decreasing, increasing, or constant with respect to marginal costs.
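The paper's closed-form price generalizes the Lambert W function, which solves w*exp(w) = z. As background, the standard principal branch can be evaluated numerically with a few Newton steps; this sketch covers only z >= 0 and is not the authors' generalized function.

```python
import math

def lambert_w(z, tol=1e-12):
    """Principal-branch Lambert W for z >= 0: solves w * exp(w) = z
    by Newton's method from the starting guess log(1 + z)."""
    w = math.log1p(z)
    for _ in range(100):
        ew = math.exp(w)
        f = w * ew - z
        w_next = w - f / (ew * (w + 1))  # Newton step: f'(w) = exp(w)*(w+1)
        if abs(w_next - w) < tol:
            return w_next
        w = w_next
    return w

w = lambert_w(1.0)
print(f"W(1) = {w:.6f}")  # the omega constant, ~0.567143
```

Having such a function in closed form is what lets the profit-maximizing price under LFRRA preferences be written explicitly rather than solved for numerically case by case.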
How does the climatic experience of previous generations affect today's attention to environmental questions? Using self-reported beliefs and environmental themes in folklore, we show empirically that the realized intensity of deviations from typical climate conditions in ancestral generations influences how much descendants care about the environment. The effect exhibits a U-shape where more stable and more unstable ancestral climates lead to higher attention today, with a dip for intermediate realizations. We propose a theoretical framework where the value of costly attention to environmental conditions depends on the perceived stability of the environment, prior beliefs about which are shaped through cultural transmission by the experience of ethnic ancestors. The U-shape is rationalized by a double purpose of learning about the environment: optimal utilization of typical conditions and protection against extreme events.
The increasing prevalence of chronic diseases poses a significant challenge to global efforts to alleviate poverty, promote health equity, and control healthcare costs. This study adopts a structural approach to explore how patients manage chronic diseases by making trade-offs between inpatient care and ambulatory (outpatient) care. Specifically, it investigates whether disadvantaged populations make distinct trade-offs compared to the general population and examines the impact of anti-poverty programs that reduce inpatient cost-sharing. Using health insurance claims data from a rural county in China, the study reveals that disadvantaged individuals tend to avoid ambulatory care unless it substantially lowers medical expenses. In contrast, the general population is more likely to prioritize ambulatory care, even at higher costs, to prevent disease progression. The findings also indicate that current anti-poverty insurance policies, which focus predominantly on hospitalization, inadvertently decrease ambulatory care usage by 23%, resulting in increased healthcare costs and a 46.2% decline in patient welfare. Counterfactual analysis suggests that reducing cost-sharing for ambulatory care would be a more cost-effective strategy for improving health outcomes and supporting disadvantaged populations than providing travel subsidies.
Despite growing policy interest, the determinants of supply chain resilience are still not well understood. We propose a new theory of supply chain formation with compatibility frictions: only compatible inputs can be used in final good production. Intermediate producers choose the degree of specialization of their goods, trading off higher productivity against a lower share of compatible final producers. We model supply chains as complex production processes in which multiple complementary inputs must be sourced for final production to take place. Specialization choices, production complexity, and search frictions jointly determine supply chain resilience. Relative to the efficient allocation, the equilibrium is characterized by over-specialization due to a novel network externality arising from the interplay between frictional markets, endogenous specialization, and complex production. Over-specialization makes supply chains more productive in normal times but less resilient to disruptions than socially desirable. We show how a targeted transaction subsidy can decentralize efficient resilience in supply chains, and examine the implications of setting compatibility standards.
The chemicals industry accounts for about 5% of global greenhouse gas emissions today and is among the most difficult industries to abate. We model decarbonization pathways for the most energy-intensive segment of the industry, the production of basic chemicals: olefins, aromatics, methanol, ammonia, and chlor-alkali. Unlike most prior pathways studies, we apply a scenario-analysis approach that recognizes the central role of corporate investment decision making for capital-intensive industries, under highly uncertain long-term future investment environments. We vary the average pace of decarbonization capital allocation allowed under plausible alternative future world contexts and construct least-cost decarbonization timelines by modeling abatement projects individually across more than 2,600 production facilities located in four major producing regions. The timeline for deeply decarbonizing production varies by chemical and region but depends importantly on the investment environment context. In the best-of-all environments, to deeply decarbonize production, annual average capital spending for abatement for the next two to three decades will need to be greater than (and in addition to) historical "business-as-usual" investments, and cumulative investment in abatement projects would exceed $1 trillion. In futures where key drivers constrain investment appetites, timelines for decarbonizing the industry extend well into the second half of the century.
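A least-cost decarbonization timeline of the kind described above can be caricatured as a greedy selection over individual abatement projects under an annual capital budget. The projects, costs, and budget below are entirely hypothetical and far simpler than the paper's facility-level model across 2,600 plants.

```python
# Toy greedy least-cost selection of abatement projects under a capital budget.
projects = [  # (name, abatement MtCO2/yr, capex $bn) -- hypothetical
    ("ammonia CCS retrofit", 12.0, 6.0),
    ("methanol electrification", 8.0, 10.0),
    ("olefins H2 furnace", 15.0, 22.0),
    ("chlor-alkali efficiency", 3.0, 0.9),
]

# Rank by capex per tonne abated, then fund projects that fit the budget.
ranked = sorted(projects, key=lambda p: p[2] / p[1])
budget, chosen = 12.0, []
for name, abatement, capex in ranked:
    if capex <= budget:
        budget -= capex
        chosen.append(name)

print("funded this period:", chosen)
```

Repeating such an allocation year by year, with the budget scaled by the investment-environment scenario, is the intuition behind the result that constrained investment appetites stretch decarbonization timelines into the second half of the century.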
Regulators and utilities have been exploring hourly retail electricity pricing, with several existing programs providing day-ahead hourly pricing schedules. At the same time, customers are deploying distributed energy resources and smart energy management systems that have significant flexibility and can optimally follow price signals. In aggregate, these optimally controlled loads can create congestion management issues for distribution system operators (DSOs). In this paper, we describe a new linear pricing mechanism for day-ahead retail electricity pricing that provides a signal for customers to follow to mitigate over-consumption while still consuming energy at hours that are preferential for system performance. We show that by broadcasting a linear price designed for price-signal control of cost-optimizing loads, we can shape customer load profiles to provide congestion management without the need for bi-directional communication or customer bidding programs.
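The price-signal control idea can be sketched with an assumed customer model (not the paper's): each load minimizes a linearly increasing price on its own consumption plus a quadratic penalty for deviating from its desired load, giving the closed-form response x*(a, b) = (c*d - a) / (c + 2*b). The DSO then picks the slope b of the broadcast price so the aggregate response fits under a feeder limit, with no bidding or bi-directional communication.

```python
def optimal_load(d, a, b, c=1.0):
    # Minimizer of (a + b*x)*x + (c/2)*(x - d)**2, truncated at zero.
    return max(0.0, (c * d - a) / (c + 2 * b))

desired = [8.0, 6.0, 5.0, 3.0]  # desired hourly loads (kW), hypothetical
capacity = 15.0                 # feeder limit the DSO must respect

def find_slope(a, lo=0.0, hi=10.0):
    """Bisect on the price slope b so the aggregate optimal response
    of all cost-optimizing loads just fits under the feeder capacity."""
    for _ in range(60):
        mid = (lo + hi) / 2
        total = sum(optimal_load(d, a, mid) for d in desired)
        if total > capacity:
            lo = mid  # price too flat: loads overshoot the limit
        else:
            hi = mid  # feasible: try a flatter price
    return hi

b = find_slope(a=0.5)
loads = [optimal_load(d, 0.5, b) for d in desired]
print(f"slope b={b:.3f}, loads={[round(x, 2) for x in loads]}, total={sum(loads):.2f}")
```

Because every load responds to the same broadcast price, the DSO only needs a forecast of aggregate flexibility to cap consumption, while customers still shift energy toward their preferred hours.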