Energy storage can promote the integration of renewables by operating with charge and discharge policies that balance an intermittent power supply. This study investigates the scheduling of energy storage assets under energy price uncertainty, with a focus on electricity markets. A two-stage stochastic risk-constrained approach is employed, whereby electricity price trajectories for specific power markets are observed, allowing for recourse in the schedule. Conditional value-at-risk is used to quantify tail risk in the optimization problems, which allows a probabilistic risk limit to be specified explicitly. The proposed approach is tested on an integrated hydrogen system (IHS) and a battery energy storage system (BESS). In the joint design and operation context for the IHS, the risk constraint results in larger installed unit capacities, increasing capital cost but enabling more energy inventory to buffer price uncertainty. Both case studies exhibit an operational trade-off between risk and expected reward, reflected in higher expected costs (or lower expected profits) as risk aversion increases. Despite the decrease in expected reward, both systems derive substantial benefits from increasing risk aversion. This work provides a general method for addressing uncertainty in energy storage scheduling, allowing operators to impose their level of risk tolerance on asset decisions.
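As a minimal sketch of the risk-constrained idea (not the paper's model), the snippet below schedules a toy battery against made-up price scenarios with a CVaR constraint in the Rockafellar-Uryasev form; it uses a single deterministic schedule, so the two-stage recourse structure is omitted, and all parameters are illustrative.

```python
# Minimal sketch: battery schedule with a CVaR constraint on scenario costs,
# written in the Rockafellar-Uryasev formulation. Prices, limits, and sizes
# are made up; the paper's two-stage recourse structure is not reproduced.
import numpy as np
import cvxpy as cp

T, S = 24, 50                                     # hours, price scenarios
rng = np.random.default_rng(0)
prices = 50 + 20 * rng.standard_normal((S, T))    # EUR/MWh scenarios

p = cp.Variable(T)                 # power: >0 charge (buy), <0 discharge (sell)
soc = cp.Variable(T + 1)           # state of charge (MWh)
eta = cp.Variable()                # CVaR auxiliary variable
alpha, cvar_limit = 0.95, 2000.0   # risk level and cost limit (illustrative)

scenario_cost = prices @ p         # cost of the schedule in each scenario
constraints = [
    soc[0] == 5, soc[1:] == soc[:-1] + p,     # energy balance (100% efficiency)
    soc >= 0, soc <= 10, p >= -2, p <= 2,     # capacity and power limits
    eta + cp.sum(cp.pos(scenario_cost - eta)) / ((1 - alpha) * S) <= cvar_limit,
]
prob = cp.Problem(cp.Minimize(cp.sum(scenario_cost) / S), constraints)
prob.solve()
print("expected cost:", prob.value)
```

Tightening `cvar_limit` mimics increasing risk aversion: the expected cost rises while the worst-case scenario costs shrink, which is the trade-off the abstract describes.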
The main purpose of this article is to give a general overview and understanding of the first widely used option-pricing model, the Black-Scholes model. The history and context are presented, together with the model's usefulness and implications in economics. A brief review of fundamental calculus concepts is provided in order to derive and solve the model. The equation is then solved using both an analytical method (separation of variables) and a numerical method (finite differences). Conclusions are drawn about how Black-Scholes is employed nowadays. Appendix A collects some economics notions to ease the reader's comprehension of the paper, and Appendix B provides code scripts that allow the reader to put some of the concepts into practice.
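For reference, a minimal sketch of the closed-form Black-Scholes call price (the analytical solution the abstract refers to); the parameters below are made up.

```python
# Closed-form Black-Scholes European call: C = S*N(d1) - K*exp(-rT)*N(d2).
from math import log, sqrt, exp
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Price of a European call under Black-Scholes assumptions."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)

print(bs_call(S=100.0, K=105.0, T=1.0, r=0.03, sigma=0.2))
```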
Machine learning models used for high-stakes predictions in domains like credit risk face critical degradation due to concept drift, requiring robust and transparent adaptation mechanisms. We propose an architecture in which a dedicated correction layer is employed to efficiently capture systematic shifts in predictive scores when a model becomes outdated. The key element of this architecture is the design of the correction layer using Probabilistic Rule Models (PRMs) based on Markov Logic Networks, which guarantees intrinsic interpretability through symbolic, auditable rules. This structure transforms the correction layer from a simple scoring mechanism into a powerful diagnostic tool capable of isolating and explaining the fundamental changes in borrower riskiness. We illustrate this diagnostic capability using Fannie Mae mortgage data, demonstrating how the interpretable rules extracted by the correction layer successfully explain the structural impact of the 2008 financial crisis on specific population segments, providing essential insights for portfolio risk management and regulatory compliance.
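The two-layer idea (frozen base model plus an interpretable correction fitted on recent data) can be illustrated roughly as below; a shallow decision tree stands in for the paper's probabilistic rule model, and the data are synthetic.

```python
# Illustrative sketch only: a frozen, outdated base model followed by a small,
# interpretable correction layer fitted on recent (drifted) data. A shallow
# decision tree is a stand-in for the paper's Markov Logic Network rules.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
X_old = rng.normal(size=(2000, 3)); y_old = (X_old[:, 0] > 0).astype(int)
X_new = rng.normal(size=(2000, 3)); y_new = (X_new[:, 0] + X_new[:, 1] > 0).astype(int)  # drifted regime

base = LogisticRegression().fit(X_old, y_old)            # outdated base model
score = base.predict_proba(X_new)[:, 1].reshape(-1, 1)   # its scores on new data

# Correction layer: learn where the base score disagrees with recent outcomes.
corr = DecisionTreeClassifier(max_depth=2).fit(np.hstack([score, X_new]), y_new)
print(export_text(corr, feature_names=["base_score", "x0", "x1", "x2"]))
```

The printed rules show which segments the base score mis-ranks after the drift, which is the diagnostic role the abstract assigns to the correction layer.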
Recent work has emphasized the diversification benefits of combining trend signals across multiple horizons, with the medium-term window (typically six months to one year) long viewed as the "sweet spot" of trend-following. This paper revisits this conventional view by reallocating exposure dynamically across horizons using a Bayesian optimization framework designed to learn the optimal weights assigned to each trend horizon at the asset level. The common practice of equal weighting implicitly assumes that all assets benefit equally from all horizons; we show that this assumption is both theoretically and empirically suboptimal. We first optimize the horizon-level weights at the asset level to maximize the informativeness of trend signals before applying Bayesian graphical models, with sparsity and turnover control, to allocate dynamically across assets. The key finding is that the medium-term band contributes little incremental performance or diversification once short- and long-term components are included. Removing the 125-day layer improves Sharpe ratios and drawdown efficiency while maintaining benchmark correlation. We then rationalize this outcome through a minimum-variance formulation, showing that the medium-term horizon largely overlaps with its neighboring horizons. The resulting "barbell" structure, which combines short- and long-term trends, captures most of the performance while reducing model complexity. This result challenges the common belief that more horizons always improve diversification and suggests that some forms of time-scale diversification may conceal unnecessary redundancy in trend premia.
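A rough sketch of the multi-horizon setup: trend signals at short, medium, and long lookbacks, combined either equally or in a "barbell" that drops the 125-day band. The signal construction and weights here are illustrative, not the paper's calibrated Bayesian weights.

```python
# Hedged sketch: multi-horizon trend signals as volatility-scaled lookback
# returns, combined with per-horizon weights. Horizons and weights are
# placeholders for the paper's Bayesian, asset-level optimization.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
prices = pd.Series(100 * np.exp(np.cumsum(0.0002 + 0.01 * rng.standard_normal(2000))))

horizons = [21, 125, 252]                        # short, medium, long (trading days)
signals = pd.DataFrame({
    h: (prices / prices.shift(h) - 1)
       / (prices.pct_change().rolling(h).std() * np.sqrt(h))
    for h in horizons
})

equal_weight = signals.mean(axis=1)              # common practice: equal weights
barbell = 0.5 * signals[21] + 0.5 * signals[252] # drop the medium-term band
print(equal_weight.corr(barbell))
```

A high correlation between the two composites is the kind of redundancy argument the abstract formalizes through its minimum-variance analysis.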
Execution algorithms are vital to modern trading: they enable market participants to execute large orders while minimising market impact and transaction costs. As these algorithms grow more sophisticated, optimising them becomes increasingly challenging. In this work, we present a reinforcement learning (RL) framework for discovering optimal execution strategies, evaluated within a reactive agent-based market simulator. This simulator creates reactive order flow and allows us to decompose slippage into its constituent components: market impact and execution risk. We assess the RL agent's performance using the efficient frontier based on work by Almgren and Chriss, measuring its ability to balance risk and cost. Results show that the RL-derived strategies consistently outperform baselines and operate near the efficient frontier, demonstrating a strong ability to optimise for risk and impact. These findings highlight the potential of reinforcement learning as a powerful tool in the trader's toolkit.
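As background for the benchmark frontier, a minimal sketch of the continuous-time Almgren-Chriss optimal liquidation trajectory follows; the impact and volatility parameters are illustrative, and this is not the paper's simulator or RL agent.

```python
# Almgren-Chriss optimal holdings trajectory (continuous-time solution):
# x(t) = X0 * sinh(kappa*(T - t)) / sinh(kappa*T), with kappa^2 = lambda*sigma^2/eta.
# Sweeping the risk-aversion lambda traces out the risk/cost efficient frontier.
import numpy as np

def ac_trajectory(X0, T, n_steps, sigma, eta, risk_aversion):
    """Holdings x(t_k) along the Almgren-Chriss schedule (illustrative parameters)."""
    t = np.linspace(0.0, T, n_steps + 1)
    if risk_aversion == 0:
        return X0 * (1 - t / T)                          # risk-neutral limit: TWAP
    kappa = np.sqrt(risk_aversion * sigma**2 / eta)
    return X0 * np.sinh(kappa * (T - t)) / np.sinh(kappa * T)

for lam in [0.0, 1e-6, 1e-5]:                            # increasing risk aversion
    x = ac_trajectory(X0=1e6, T=1.0, n_steps=10, sigma=0.3, eta=2.5e-6, risk_aversion=lam)
    print(lam, np.round(x[:4], 0))
```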
This paper develops and empirically implements a continuous functional framework for analyzing systemic risk in financial networks, building on the dynamic spatial treatment effect methodology established in our previous studies. We extend the Navier-Stokes-based approach from that work to characterize contagion dynamics in the European banking system through the spectral properties of network evolution operators. Using high-quality bilateral exposure data from the European Banking Authority Transparency Exercise (2014-2023), we estimate the causal impact of the COVID-19 pandemic on network fragility using spatial difference-in-differences methods adapted from the same framework. Our empirical analysis reveals that COVID-19 elevated network fragility, measured by the algebraic connectivity $\lambda_2$ of the system Laplacian, by 26.9% above pre-pandemic levels (95% CI: [7.4%, 46.5%], p<0.05), with effects persisting through 2023. Paradoxically, this occurred despite a 46% reduction in the number of banks, demonstrating that consolidation increased systemic vulnerability by intensifying interconnectedness, consistent with theoretical predictions from continuous spatial dynamics. Our findings validate the key predictions from \citet{kikuchi2024dynamical}: treatment effects amplify over time through spatial spillovers, consolidation increases fragility when coupling strength rises, and systems exhibit structural hysteresis that prevents automatic reversion to pre-shock equilibria. The results demonstrate the empirical relevance of continuous functional methods for financial stability analysis and provide new insights for macroprudential policy design. We propose network-based capital requirements targeting spectral centrality, as well as stress testing frameworks incorporating diffusion dynamics, to address the coupling externalities identified in our analysis.
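The fragility measure used here, the algebraic connectivity $\lambda_2$, is simply the second-smallest eigenvalue of the graph Laplacian of the exposure network; a minimal sketch on a random, purely illustrative exposure matrix is given below.

```python
# Minimal sketch: algebraic connectivity (lambda_2) of a weighted exposure
# network, computed as the second-smallest eigenvalue of the graph Laplacian.
# The exposure matrix here is random and not related to the EBA data.
import numpy as np

rng = np.random.default_rng(3)
W = rng.uniform(0, 1, size=(20, 20)) * (rng.random((20, 20)) < 0.3)
W = (W + W.T) / 2                       # symmetrize bilateral exposures
np.fill_diagonal(W, 0.0)

L = np.diag(W.sum(axis=1)) - W          # graph Laplacian: D - W
eigvals = np.sort(np.linalg.eigvalsh(L))
lambda_2 = eigvals[1]                   # algebraic connectivity
print("lambda_2 =", lambda_2)
```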
Robust yield curve estimation is crucial in fixed-income markets for accurate instrument pricing, effective risk management, and informed trading strategies. Traditional approaches, including the bootstrapping method and parametric Nelson-Siegel models, often struggle with overfitting or instability issues, especially when underlying bonds are sparse, bond prices are volatile, or contain hard-to-remove noise. In this paper, we propose a neural network-based framework for robust yield curve estimation tailored to small mortgage bond markets. Our model estimates the yield curve independently for each day and introduces a new loss function to enforce smoothness and stability, addressing challenges associated with limited and noisy data. Empirical results on Swedish mortgage bonds demonstrate that our approach delivers more robust and stable yield curve estimates compared to existing methods such as Nelson-Siegel-Svensson (NSS) and Kernel-Ridge (KR). Furthermore, the framework allows for the integration of domain-specific constraints, such as alignment with risk-free benchmarks, enabling practitioners to balance the trade-off between smoothness and accuracy according to their needs.
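The sketch below illustrates the general shape of such a composite objective: a data-fit term plus curvature and day-over-day stability penalties. The simple grid-based "curve", the weights, and the yields are placeholders, not the paper's network or loss.

```python
# Hedged sketch of a composite yield-curve loss: fit error on observed yields,
# a second-difference (curvature) penalty for smoothness, and a penalty against
# the previous day's curve for stability. Everything here is illustrative.
import numpy as np

def curve_loss(curve, maturities, obs_mat, obs_yield, prev_curve,
               w_smooth=1.0, w_stable=0.1):
    fitted = np.interp(obs_mat, maturities, curve)     # curve at quoted maturities
    fit_err = np.mean((fitted - obs_yield) ** 2)       # data fit
    smooth = np.mean(np.diff(curve, n=2) ** 2)         # curvature penalty
    stable = np.mean((curve - prev_curve) ** 2)        # day-over-day stability
    return fit_err + w_smooth * smooth + w_stable * stable

maturities = np.linspace(0.25, 10.0, 40)
prev = 0.02 + 0.003 * np.log1p(maturities)
print(curve_loss(prev + 1e-4, maturities, np.array([1.0, 2.0, 5.0]),
                 np.array([0.022, 0.024, 0.027]), prev))
```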
This paper studies the mathematical problem of allocating payouts (compensations) in an endowment contingency fund using a risk-sharing rule that satisfies full allocation. Besides the participants, an administrator manages the fund by collecting ex-ante contributions to establish the fund and distributing ex-post payouts to members. Two types of administrators are considered. An 'active' administrator both invests in the fund and receives the payout of the fund when no participant receives a payout. A 'passive' administrator performs only administrative tasks and neither invests in nor receives a payout from the fund. We analyze the actuarial fairness of both compensation-based risk-sharing schemes and provide general conditions under which fairness is achieved. The results extend earlier work by Denuit and Robert (2025) and Dhaene and Milevsky (2024), who focused on payouts based on Bernoulli distributions, by allowing for general non-negative loss distributions.
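In generic notation (ours, not the paper's), the two properties at play can be stated as follows.

```latex
% With fund payout X, a risk-sharing rule h = (h_1,\dots,h_n) and ex-ante
% contributions \pi_i, the full-allocation and actuarial-fairness conditions read
\sum_{i=1}^{n} h_i(X) = X
\qquad\text{(full allocation)},
\qquad
\pi_i = \mathbb{E}\!\left[ h_i(X) \right]
\qquad\text{(actuarial fairness for participant } i\text{)}.
```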
Estimating risk measures such as large loss probabilities and Value-at-Risk is fundamental in financial risk management and often relies on computationally intensive nested Monte Carlo methods. While Multi-Level Monte Carlo (MLMC) techniques and their weighted variants are typically more efficient, their effectiveness tends to deteriorate when dealing with irregular functions, notably indicator functions, which are intrinsic to these risk measures. We address this issue by introducing a novel MLMC parametrization that significantly improves performance in practical, non-asymptotic settings while maintaining theoretical asymptotic guarantees. We also prove that antithetic sampling of MLMC levels enhances efficiency regardless of the regularity of the underlying function. Numerical experiments motivated by the calculation of economic capital in a life insurance context confirm the practical value of our approach for estimating loss probabilities and quantiles, bridging theoretical advances and practical requirements in financial risk estimation.
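For readers unfamiliar with MLMC, the method rests on the standard telescoping identity below (generic notation; the paper's specific parametrization and weighting are not reproduced), with each correction term estimated from coupled coarse/fine samples, optionally antithetic.

```latex
% Multi-Level Monte Carlo telescoping identity: with approximations
% P_0,\dots,P_L of increasing accuracy of the quantity of interest,
\mathbb{E}[P_L] \;=\; \mathbb{E}[P_0] \;+\; \sum_{\ell=1}^{L} \mathbb{E}\!\left[ P_\ell - P_{\ell-1} \right].
```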
Natural hedging allows life insurers to manage longevity risk internally by offsetting the opposite exposures of life insurance and annuity liabilities. Although many studies have proposed natural hedging strategies under different settings, calibration methods, and mortality models, a unified framework for constructing and evaluating such hedges remains undeveloped. While graphical risk assessment has been explored for index-based longevity hedges, no comparable metric exists for natural hedging. This paper proposes a structured natural hedging framework paired with a graphical risk metric for hedge evaluation. The framework integrates valuation, calibration, and evaluation, while the graphical metric provides intuitive insights into residual dependencies and hedge performance. Applied to multiple hedging scenarios, the proposed methods demonstrate flexibility, interpretability, and practical value for longevity risk management.
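As a point of reference only (a textbook variance-minimizing mix, not the paper's calibration method), the offsetting weight between the two liability blocks can be written as follows.

```latex
% With liability changes \Delta L_{\text{ins}} (life insurance) and
% \Delta L_{\text{ann}} (annuities), the weight w on the annuity block that
% minimizes the variance of \Delta L_{\text{ins}} + w\,\Delta L_{\text{ann}} is
w^{*} \;=\; -\,\frac{\operatorname{Cov}\!\left(\Delta L_{\text{ins}},\, \Delta L_{\text{ann}}\right)}
                  {\operatorname{Var}\!\left(\Delta L_{\text{ann}}\right)}.
```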
Risk assessment in casualty insurance, such as flood risk, traditionally relies on extreme-value methods that emphasize rare events. These approaches are well suited for characterizing tail risk, but they do not capture the broader dynamics of environmental variables, such as moderate or frequent loss events. To complement these methods, we propose a modelling framework for estimating the full (daily) distribution of environmental variables as a function of time, that is, a distributional version of typical climatological summary statistics, thereby incorporating both seasonal variation and gradual long-term changes. Aside from the time trend, to capture seasonal variation our approach simultaneously estimates the distribution for each instant of the seasonal cycle, without explicitly modelling the temporal dependence present in the data. To do so, we adopt a framework inspired by GAMLSS (Generalized Additive Models for Location, Scale, and Shape), where the parameters of the distribution vary over the seasonal cycle as a function of explanatory variables that depend only on the time of year, not on past values of the process under study. Ignoring the temporal dependence in the seasonal variation greatly simplifies the modelling but poses inference challenges that we clarify and overcome. We apply our framework to daily river flow data from three hydrometric stations along the Fraser River in British Columbia, Canada, and analyse the flood of the Fraser River in the early winter of 2021.
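A rough sketch of the GAMLSS-style idea: distribution parameters that vary with the time of year through harmonic covariates, fitted by maximum likelihood. The gamma family, the single harmonic, and the synthetic "daily flow" data are assumptions for illustration, not the paper's model or data.

```python
# Hedged sketch: fit a gamma distribution whose log-mean varies with harmonics
# of the day of year, by maximum likelihood on synthetic daily data.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gamma

rng = np.random.default_rng(4)
doy = np.arange(1, 366)
true_mean = np.exp(5.0 + 0.8 * np.sin(2 * np.pi * doy / 365.25))
y = gamma.rvs(a=4.0, scale=true_mean / 4.0, random_state=rng)   # synthetic flows

def negloglik(theta):
    b0, b1, b2, log_shape = theta
    mu = np.exp(b0 + b1 * np.sin(2 * np.pi * doy / 365.25)
                   + b2 * np.cos(2 * np.pi * doy / 365.25))      # seasonal mean
    shape = np.exp(log_shape)
    return -gamma.logpdf(y, a=shape, scale=mu / shape).sum()

fit = minimize(negloglik, x0=np.array([5.0, 0.0, 0.0, 1.0]), method="Nelder-Mead")
print(fit.x)   # harmonic coefficients of the seasonal mean and the log-shape
```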
We study Pareto-optimal risk sharing in economies with heterogeneous attitudes toward risk, where agents' preferences are modeled by distortion risk measures. Building on comonotonic and counter-monotonic improvement results, we show that agents with similar attitudes optimally share risks comonotonically (risk-averse) or counter-monotonically (risk-seeking). We show how the general $n$-agent problem can be reduced to a two-agent formulation between representative risk-averse and risk-seeking agents, characterized by the infimal convolution of their distortion risk measures. Within this two-agent framework, we establish necessary and sufficient conditions for the existence of optimal allocations, and we identify when the infimal convolution yields an unbounded value. When existence fails, we analyze the problem under nonnegative allocation constraints and characterize optima explicitly under piecewise-linear distortion functions and Bernoulli-type risks. Our findings suggest that the optimal allocation structure is governed by the relative strength of risk aversion versus risk-seeking behavior, in line with intuition.
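For reference, the standard definitions behind this setting (generic notation) are the distortion risk measure and the infimal convolution of two such measures.

```latex
% For a distortion function g and survival function S_X:
\rho_g(X) \;=\; \int_0^{\infty} g\!\left(S_X(x)\right)\,dx
\;-\; \int_{-\infty}^{0} \left[\, 1 - g\!\left(S_X(x)\right) \right] dx,
\qquad
\left(\rho_{g_1} \,\square\, \rho_{g_2}\right)(X)
\;=\; \inf_{Y} \left\{ \rho_{g_1}(X - Y) + \rho_{g_2}(Y) \right\}.
```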
We propose a new approach, termed Realized Risk Measures (RRM), to estimate Value-at-Risk (VaR) and Expected Shortfall (ES) using high-frequency financial data. It extends the Realized Quantile (RQ) approach proposed by Dimitriadis and Halbleib by lifting the assumption of return self-similarity, which has some limitations in describing empirical data. More specifically, like the RQ approach, the RRM method transforms intra-day returns into intrinsic time using a subordinator process, in order to capture the inhomogeneity of trading activity and/or volatility clustering. Then, microstructural effects resulting in non-zero autocorrelation are filtered out using a suitable moving-average process. Finally, a fat-tailed distribution is fitted to the cleaned intra-day returns. The return distribution at low frequency (daily) is then extrapolated via either a characteristic-function approach or Monte Carlo simulations. VaR and ES are estimated as the quantile and the tail mean of this distribution, respectively. The proposed approach is benchmarked against the RQ through several experiments. Extensive numerical simulations and an empirical study on 18 US stocks show the outperformance of our method, both in terms of the in-sample estimated risk measures and in out-of-sample risk forecasting.
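The final aggregation step can be sketched as follows: fit a fat-tailed law to (already cleaned) intra-day returns, extrapolate the daily distribution by Monte Carlo, and read off VaR and ES. Subordination and the moving-average filtering are omitted, and the returns below are synthetic.

```python
# Hedged sketch of the aggregation step only: Student-t fit on intra-day
# returns, Monte Carlo extrapolation to daily frequency, VaR/ES as quantile
# and tail mean. Not the full RRM pipeline.
import numpy as np
from scipy.stats import t as student_t

rng = np.random.default_rng(5)
intraday = 0.001 * student_t.rvs(df=4, size=5000, random_state=rng)   # stand-in data

df, loc, scale = student_t.fit(intraday)                  # fat-tailed fit
n_intra, n_sims, alpha = 78, 100_000, 0.99                # 5-min bars/day, sims, level

daily = student_t.rvs(df, loc=loc, scale=scale,
                      size=(n_sims, n_intra), random_state=rng).sum(axis=1)
var = -np.quantile(daily, 1 - alpha)                      # Value-at-Risk (loss > 0)
es = -daily[daily <= -var].mean()                         # Expected Shortfall
print(f"VaR_{alpha:.0%} = {var:.4f}, ES_{alpha:.0%} = {es:.4f}")
```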
With the rise of emerging risks, model uncertainty poses a fundamental challenge in the insurance industry, making robust pricing a first-order question. This paper investigates how insurers' robustness preferences shape competitive equilibrium in a dynamic insurance market. Insurers optimize their underwriting and liquidity management strategies to maximize shareholder value, leading to equilibrium outcomes that can be analytically derived and numerically solved. Compared to a benchmark without model uncertainty, robust insurance pricing results in significantly higher premiums and equity valuations. Notably, our model yields three novel insights: (1) The minimum, maximum, and admissible range of aggregate capacity all expand, indicating that insurers' liquidity management becomes more conservative. (2) The expected length of the underwriting cycle increases substantially, far exceeding the range commonly reported in earlier empirical studies. (3) While the capacity process remains ergodic in the long run, the stationary density becomes more concentrated in low-capacity states, implying that liquidity-constrained insurers require longer to recover. Together, these findings provide a potential explanation for recent skepticism regarding the empirical evidence of underwriting cycles, suggesting that such cycles may indeed exist but are considerably longer than previously assumed.
Although Micro, Small, and Medium Enterprises (MSMEs) account for 96.1% of all businesses in Malaysia, access to financing remains one of their most persistent challenges. Newly established or young businesses are often excluded from formal credit markets because traditional underwriting approaches rely heavily on credit bureau data. This study investigates the potential of bank statement data as an alternative data source for credit assessment to promote financial inclusion in emerging markets. Firstly, we propose a cash flow-based underwriting pipeline in which bank statement data are used for end-to-end data extraction and machine learning credit scoring. Secondly, we introduce a novel dataset of 611 loan applicants from a Malaysian lending institution. Thirdly, we develop and evaluate credit scoring models based on application information and bank transaction-derived features. Empirical results show that the use of such data boosts the performance of all models on our dataset, which can improve credit scoring for new-to-lending MSMEs. Lastly, we intend to release the anonymised bank transaction dataset to facilitate further research on MSME financial inclusion within Malaysia's emerging economy.
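The general shape of such a pipeline, reduced to its last stage, is sketched below: simple cash-flow features derived from bank statements scored with a standard classifier. The features, synthetic data, and model choice are placeholders, not the paper's pipeline or dataset.

```python
# Illustrative sketch only: cash-flow features -> standard credit classifier.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n = 600
features = pd.DataFrame({
    "avg_monthly_inflow": rng.gamma(2.0, 5000, n),     # hypothetical features
    "inflow_volatility": rng.gamma(1.5, 2000, n),
    "min_monthly_balance": rng.normal(3000, 4000, n),
    "n_bounced_payments": rng.poisson(0.5, n),
})
# Synthetic default labels: more likely when the minimum balance is low.
default = (rng.random(n) < 1 / (1 + np.exp(0.0005 * features["min_monthly_balance"]))).astype(int)

model = GradientBoostingClassifier()
print("AUC:", cross_val_score(model, features, default, scoring="roc_auc", cv=5).mean())
```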
Spot covariance estimation is commonly based on high-frequency open-to-close return data over short time windows, but such approaches face a trade-off between statistical accuracy and localization. In this paper, I introduce a new estimation framework using high-frequency candlestick data, which include open, high, low, and close prices, effectively addressing this trade-off. By exploiting the information contained in candlesticks, the proposed method improves estimation accuracy relative to benchmarks while preserving local structure. I further develop a test for spot covariance inference based on candlesticks that demonstrates reasonable size control and a notable increase in power, particularly in small samples. Motivated by recent work in the finance literature, I empirically test the market neutrality of the iShares Bitcoin Trust ETF (IBIT) using 1-minute candlestick data for the full year of 2024. The results show systematic deviations from market neutrality, especially in periods of market stress. An event study around FOMC announcements further illustrates the new method's ability to detect subtle shifts in response to relatively mild information events.
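To convey why open/high/low/close candles carry more information than close-to-close returns, the snippet below computes the classical Garman-Klass range-based variance estimator on synthetic 1-minute candles; this is a well-known illustration, not the paper's spot covariance estimator or test.

```python
# Garman-Klass per-bar variance from OHLC prices:
# sigma^2 = 0.5*ln(H/L)^2 - (2*ln(2) - 1)*ln(C/O)^2. Candles are synthetic.
import numpy as np

def garman_klass_var(o, h, l, c):
    """Per-bar variance estimate from open/high/low/close prices."""
    log_hl = np.log(h / l)
    log_co = np.log(c / o)
    return 0.5 * log_hl**2 - (2 * np.log(2) - 1) * log_co**2

rng = np.random.default_rng(7)
c = 100 * np.exp(np.cumsum(0.001 * rng.standard_normal(390)))   # 1-min closes
o = np.roll(c, 1); o[0] = 100.0                                  # opens = prior closes
h = np.maximum(o, c) * (1 + 0.0005 * rng.random(390))            # synthetic highs
l = np.minimum(o, c) * (1 - 0.0005 * rng.random(390))            # synthetic lows

print("avg per-minute variance:", garman_klass_var(o, h, l, c).mean())
```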
Stablecoins have emerged as a significant component of global financial infrastructure, with aggregate market capitalization surpassing USD250 billion in 2025. Their increasing integration into payment and settlement systems has simultaneously introduced novel channels of systemic exposure, particularly liquidity risk during periods of market stress. This study develops a hybrid monetary architecture that embeds fiat-backed stablecoins within a central bank-anchored framework to structurally mitigate liquidity fragility. The proposed model combines 100 percent reserve backing, interoperable redemption rails, and standing liquidity facilities to guarantee instant convertibility at par. Using the 2023 SVB USDC de-peg event as a calibrated stress scenario, we demonstrate that this architecture reduces peak peg deviations, shortens stress persistence, and stabilizes redemption queues under high redemption intensity. By integrating liquidity backstops and eliminating maturity-transformation channels, the framework addresses run dynamics ex ante rather than through ad hoc intervention. These findings provide empirical and theoretical support for a hybrid stablecoin-CBDC architecture that enhances systemic resilience, preserves monetary integrity, and establishes a credible pathway for stablecoin integration into regulated financial systems.
The SABR model is a cornerstone of interest rate volatility modeling, but its practical application relies heavily on the analytical approximation by Hagan et al., whose accuracy deteriorates for high volatility, long maturities, and out-of-the-money options, and which can admit arbitrage. While machine learning approaches have been proposed to overcome these limitations, they have often been limited by simplified SABR dynamics or a lack of systematic validation against the full spectrum of market conditions. We develop a novel SABR DNN, a specialized Artificial Deep Neural Network (DNN) architecture that learns the true SABR stochastic dynamics using an unprecedentedly large training dataset (more than 200 million points) of interest rate Cap/Floor volatility surfaces, including very long maturities (30Y) and extreme strikes, consistent with market quotations. Our dataset is obtained via high-precision unbiased Monte Carlo simulation of a special scaled shifted-SABR stochastic dynamics, which allows dimensional reduction without any loss of generality. Our SABR DNN provides arbitrage-free calibration of real market volatility surfaces and Cap/Floor prices for any maturity and strike with negligible computational effort and without retraining across business dates. Our results fully address the gaps in the previous machine learning SABR literature in a systematic and self-consistent way, and can be extended to cover any interest rate European option in different rate tenors and currencies, thus establishing a comprehensive functional SABR framework that can be adopted for daily trading and risk management activities.
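The general idea of mapping SABR inputs to implied volatilities with a network can be sketched as below; the layer sizes, the assumed input columns, and the lack of training data are placeholders, and the paper's SABR DNN architecture, scaling, and Monte Carlo training set are not reproduced.

```python
# Hedged sketch: a small fully connected network mapping (scaled) SABR inputs
# to a positive implied volatility. Purely illustrative, untrained.
import torch
import torch.nn as nn

class SabrVolNet(nn.Module):
    def __init__(self, n_inputs=5, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_inputs, width), nn.ReLU(),
            nn.Linear(width, width), nn.ReLU(),
            nn.Linear(width, 1), nn.Softplus(),   # volatilities are positive
        )

    def forward(self, x):
        # x columns (assumed): alpha, beta, rho, nu, and a scaled moneyness/maturity term
        return self.net(x)

model = SabrVolNet()
dummy = torch.rand(8, 5)        # batch of illustrative inputs
print(model(dummy).shape)       # torch.Size([8, 1])
```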