Main Conference Day Three - GMT (Greenwich Mean Time)
How to position, plan, and assess climate change expectations.
- Chris Kenyon - Global Head of Quant Innovation, MUFG Securities
Where are we now? Where are we headed? What are the current practical applications?
How do we expect quantum ML to play a sharper role in finance than modern ML?
- Julien Guyon - Professor of Applied Mathematics, École nationale des ponts et chaussées
- The flaws of the Heston model
- Variance curve models
- Rough volatility
- The term-structure of the equity at-the-money skew
- Time-shifted power-law model (a fitting sketch follows this session block)
- Bergomi models
- Path-dependent volatility
- Julien Guyon - Professor of Applied Mathematics, École nationale des ponts et chaussées
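As a concrete illustration of the time-shifted power-law bullet above, here is a minimal sketch (not from the talk; data and starting values are placeholders) fitting S(T) = alpha / (T + delta)^gamma to a synthetic term structure of equity ATM skews:

```python
# Minimal sketch: fit a time-shifted power law to the ATM skew term structure,
# skew(T) ~ alpha / (T + delta)**gamma. All numbers are synthetic placeholders.
import numpy as np
from scipy.optimize import curve_fit

def shifted_power_law(T, alpha, delta, gamma):
    """ATM skew magnitude as a function of maturity T (in years)."""
    return alpha / (T + delta) ** gamma

T = np.array([0.02, 0.08, 0.25, 0.5, 1.0, 2.0])        # maturities in years
skew = np.array([1.60, 0.95, 0.60, 0.45, 0.33, 0.24])  # |d(implied vol)/d(log-strike)|

params, _ = curve_fit(shifted_power_law, T, skew, p0=[0.35, 0.05, 0.5])
alpha, delta, gamma = params
print(f"alpha={alpha:.3f}, delta={delta:.4f}, gamma={gamma:.3f}")
```

The shift delta keeps the skew finite as T goes to 0, which is one practical motivation for this parameterization over a pure power law.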
We investigate the dynamic properties of various stochastic and notably rough volatility models, focusing on the dynamics of implied volatilities. While recent literature has extensively analyzed static properties, such as a model's calibration power or the term structure of ATM skews, dynamic features have received less attention. We focus on the Skew-Stickiness Ratio (SSR), an industry-standard indicator of joint spot price and implied volatility dynamics, pursuing the analysis of [Bergomi, Smile dynamics IV, Risk 2009] and extending it to rough volatility models. Using different numerical estimators, we compare the behavior of the model SSR for several models (not limited to the affine framework) with the empirical market SSR for the SPX Index; this comparison sheds light on the suitability of certain modeling choices. Notably, we observe that Bergomi's original intuition, namely that a forward variance model with a power-law kernel should generate an SSR with a constant term structure, turns out to be accurate, but only for small volatilities of volatility. By contrast, the typical parameter sets required to calibrate fractional models to the SPX options surface (with high levels of volatility of volatility) generate a term structure of the SSR that deviates significantly from the market, leading to preliminary conclusions that do not favor models such as rough Bergomi or rough Heston.
- Florian Bourgey - Quantitative Researcher, Bloomberg LP
- Stefano De Marco - Professor, Ecole Polytechnique
- Brian Huge - Head of Quant, Saxo Bank
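The SSR discussed in the abstract has a standard empirical estimator: regress ATM implied-volatility changes on spot log-returns and normalize by the ATM skew. A minimal numpy sketch (synthetic placeholder series, not SPX data) follows:

```python
# Minimal sketch of an empirical Skew-Stickiness Ratio (SSR) estimator:
# SSR(T) = Cov(d ln S, d sigma_ATM(T)) / (skew(T) * Var(d ln S)).
# The series below are synthetic placeholders, not SPX data.
import numpy as np

rng = np.random.default_rng(0)
n = 500
dlogS = 0.01 * rng.standard_normal(n)   # daily spot log-returns
skew_T = -0.30                          # ATM skew d(sigma)/d(ln K) at maturity T
# Synthetic ATM implied-vol changes, built to have SSR = 1.5 by construction.
dvol_atm = 1.5 * skew_T * dlogS + 0.002 * rng.standard_normal(n)

ssr = np.cov(dlogS, dvol_atm)[0, 1] / (skew_T * np.var(dlogS, ddof=1))
print(f"estimated SSR at maturity T: {ssr:.2f}")  # close to 1.5
```

Sticky-strike dynamics correspond to SSR = 1, while classical stochastic volatility models typically produce a short-maturity SSR near 2, which is what makes the SSR a useful discriminating diagnostic.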
- Why it is a difficult problem
- Approximate fits by parametric models
- Minimum-entropy calibration: VIX-constrained martingale Schrödinger bridges (a Sinkhorn-style sketch follows below)
- Julien Guyon - Professor of Applied Mathematics, École nationale des ponts et chaussées
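Martingale Schrödinger bridges of this kind are typically computed with Sinkhorn-style iterations on dual potentials. As a stripped-down illustration (plain entropic projection between two marginals only, ignoring the martingale and VIX constraints of the talk), here is a minimal Sinkhorn sketch:

```python
# Minimal Sinkhorn sketch: minimum-entropy coupling of two discrete marginals
# relative to a Gaussian reference kernel. The talk's method adds martingale
# and VIX constraints on top of this kind of fixed-point scheme.
import numpy as np

n = 50
x = np.linspace(-3.0, 3.0, n)
mu = np.exp(-0.5 * x**2); mu /= mu.sum()            # marginal at t1
nu = np.exp(-0.5 * (x / 1.2)**2); nu /= nu.sum()    # wider marginal at t2

eps = 0.1                                            # entropic regularization
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / eps)  # Gibbs reference kernel
u = np.ones(n); v = np.ones(n)
for _ in range(500):                                 # Sinkhorn fixed-point loop
    u = mu / (K @ v)
    v = nu / (K.T @ u)
pi = u[:, None] * K * v[None, :]                     # minimum-entropy coupling
print("marginal errors:",
      np.abs(pi.sum(axis=1) - mu).max(),
      np.abs(pi.sum(axis=0) - nu).max())
```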
Economic concerns, such as the impact of geopolitical conflicts like the Russia-Ukraine war, rising inflation, and supply chain disruptions, exacerbate market instability: equity markets have experienced sharp declines followed by swift rebounds, underscoring their continued choppiness.
Within such a context, Gap options, which pay off when a significant one-day downside move occurs, effectively hedge against abrupt drops in the underlying asset price that happen with no trading in between. They can be priced using jump models calibrated to short-term European options and provide an effective and efficient hedging strategy through the use of short-dated, deep out-of-the-money puts, taking advantage of the huge growth in weekly options since 2011 (30-50% of options volume).
Tailored to banking financial structured products and life structured annuities, Gap options make it possible to significantly improve tail-risk mitigation (vs. vanilla options) while significantly reducing regulatory capital requirements, in a cost-effective and agnostic way.
- Aymeric Kalife - Associate Professor, Paris Dauphine University
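As a toy complement to the abstract, here is a minimal Monte Carlo sketch of a digital gap option (pays 1 if the one-day return gaps below a threshold) under a Merton-style jump-diffusion; all parameters are placeholders, not calibrated values:

```python
# Minimal sketch: Monte Carlo price of a digital gap option paying 1 when the
# one-day return falls below -alpha, under a Merton-style jump-diffusion.
import numpy as np

rng = np.random.default_rng(1)
n_paths = 1_000_000
dt = 1 / 252                           # one trading day
sigma, r = 0.15, 0.03                  # diffusive vol, risk-free rate
lam, mu_j, sig_j = 1.0, -0.08, 0.05    # jump intensity, jump-size mean and vol
alpha = 0.05                           # trigger: a one-day drop of more than 5%

diff = (r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
n_jumps = rng.poisson(lam * dt, n_paths)
z = rng.standard_normal(n_paths)
jumps = mu_j * n_jumps + sig_j * np.sqrt(n_jumps) * z   # 0 when no jump occurs
ret = np.exp(diff + jumps) - 1.0                        # one-day simple return

price = np.exp(-r * dt) * (ret < -alpha).mean()
print(f"digital gap option price (notional 1): {price:.6f}")
```

Because the diffusive part almost never moves 5% in a day, essentially all of the value comes from the jump component, which is exactly why such payoffs isolate gap risk.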
Black-Scholes, local volatility, and stochastic volatility models
We present a unified framework for computing CVA sensitivities using probabilistic machine learning meant as refined regression tools on simulated data, validatable by low-cost companion Monte Carlo procedures. Various notions of sensitivities are introduced and benchmarked numerically. We identify the sensitivities representing the best practical tradeoffs in downstream tasks including CVA hedging and risk assessment.
- Stephane Crepey - Professor of Mathematics, Université Paris Cité
- Bouazza Saadeddine - Quantitative Researcher, Crédit Agricole CIB
- Hoang Nguyen - PhD Student, Université Paris Cité
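A toy version of the regress-then-differentiate idea in the abstract (not the authors' implementation): learn a value function from simulated data by polynomial regression, differentiate the fit to get a sensitivity, and validate against a closed-form benchmark.

```python
# Toy sketch of sensitivities from regression on simulated data: fit
# V(x) ~ E[payoff | X_t = x] by polynomial least squares, differentiate the
# fit at the point of interest, and compare with the closed-form delta.
import numpy as np
from math import erf, log, sqrt

rng = np.random.default_rng(2)
n, sigma, K, t, T, x0 = 400_000, 0.2, 1.0, 1.0, 2.0, 1.0

# Simulate (X_t, X_T) under a driftless lognormal model.
z1, z2 = rng.standard_normal((2, n))
Xt = x0 * np.exp(sigma * sqrt(t) * z1 - 0.5 * sigma**2 * t)
XT = Xt * np.exp(sigma * sqrt(T - t) * z2 - 0.5 * sigma**2 * (T - t))
payoff = np.maximum(XT - K, 0.0)

# Regression step: the fitted polynomial approximates the value function.
coeffs = np.polyfit(Xt, payoff, deg=6)
delta_reg = np.polyval(np.polyder(coeffs), x0)

# Companion closed-form check (Black-Scholes delta, zero rates).
tau = T - t
d1 = (log(x0 / K) + 0.5 * sigma**2 * tau) / (sigma * sqrt(tau))
delta_bs = 0.5 * (1.0 + erf(d1 / sqrt(2.0)))
print(f"regression delta ~ {delta_reg:.3f} vs closed form {delta_bs:.3f}")
```

The same pattern, with richer regressions in place of the polynomial, is the kind of setup the abstract describes; here a closed form serves as the validator, whereas for CVA one would rely on the companion Monte Carlo procedures mentioned above.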
We address incomplete market situations where model calibration faces a lack of liquid assets. A common framework is first provided for the historical and risk-neutral worlds, together with a rationale for the choice of probability measure underlying risk indicators. A procedure is then proposed for learning market behavior and using it to complete the available market setup and build consistent arbitrage-free dynamics.
- Bringing historical and risk-neutral worlds together
- Probability measure rationale for risk indicators
- Market evolution and historical learning
- Completing markets with observed dynamics
- No-arbitrage constraints and calibration bounds
- Nadhem Meziou - Quantitative Expert Leader, Global Markets, Natixis
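One classical, concrete instance of completing a sparse market with historical information is minimum-relative-entropy reweighting of historical scenarios (weighted Monte Carlo). The sketch below is a generic illustration of that idea, not the speaker's method, and all quotes are placeholders:

```python
# Generic sketch (not the speaker's method): reweight historical scenarios by
# minimum relative entropy so that the few liquid quotes are repriced exactly,
# yielding a consistent pricing measure built on historical dynamics.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n = 5_000
returns = 0.02 * rng.standard_normal(n)     # historical scenarios (placeholder)
S = 100.0 * np.exp(returns)                 # scenario terminal prices

# Liquid constraints: a forward and one ATM call must be repriced.
payoffs = np.stack([S, np.maximum(S - 100.0, 0.0)])   # shape (2, n)
targets = np.array([100.2, 1.1])                      # observed prices (placeholder)

def dual(lmbda):
    # Dual of: min KL(q || uniform) subject to E_q[payoffs] = targets.
    a = lmbda @ payoffs
    m = a.max()                                       # log-sum-exp stabilization
    return m + np.log(np.mean(np.exp(a - m))) - lmbda @ targets

res = minimize(dual, x0=np.zeros(2), method="BFGS")
a = res.x @ payoffs
w = np.exp(a - a.max()); w /= w.sum()                 # tilted scenario weights
print("repriced:", payoffs @ w, "targets:", targets)
```

The weights stay as close as possible (in entropy) to the historical measure while matching the observed quotes, which is one way to read the "probability measure rationale" bullet above.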
The role of quantum computing in data analysis for the most time-sensitive algorithms.
Algo wheels have proliferated across the industry and allow buy-side firms to select their brokers based on performance with respect to a particular benchmark (e.g., arrival slippage, VWAP slippage, and many others). The benchmarks are client-specific, and multiple factors are involved. In this work, we introduce the concept of Lenses in algorithmic trading. We show concrete examples of Lenses and how to use them to improve algorithmic trading performance, with the objective of being more competitive in these wheels.
- Gabriel Tucci - Global Head of Equities Cash Quant Trading, Citi
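"Lenses" is the speaker's concept; as a generic illustration, one can picture a lens as a named filter over the order flow plus a benchmark used to rank brokers within that slice. A hypothetical pandas sketch (all column names and numbers invented):

```python
# Hypothetical illustration of a "lens": a filter over order flow plus a
# benchmark, used to rank brokers within that slice. Data is invented.
import pandas as pd

orders = pd.DataFrame({
    "broker":           ["A", "A", "B", "B", "C", "C"],
    "adv_pct":          [0.5, 6.0, 0.4, 7.0, 0.3, 5.5],        # size as % of ADV
    "arrival_slippage": [-2.1, -9.5, -1.4, -6.3, -3.0, -8.1],  # in bps
})

# Lens: "small, low-participation orders, judged on arrival slippage".
lens = orders[orders["adv_pct"] < 1.0]
ranking = (lens.groupby("broker")["arrival_slippage"]
               .mean()
               .sort_values(ascending=False))   # less negative = better
print(ranking)
```

Different lenses (by urgency, capitalization, spread regime, and so on) can rank the same brokers very differently, which is what makes slicing performance useful before competing in a wheel.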
Convolutional and recurrent neural networks (CNNs & RNNs)
- Julien Hok - Quantitative Analysis, Investec Bank
The paper uses a graph model to examine the effects of financial market regulations on systemic risk. Focusing on central clearing, we model the financial system as a multigraph of trade and risk relations among banks. We then study the impact of central clearing by a priori estimates in the model, stylized case studies, and a simulation case study. These case studies identify the drivers of regulatory policies on risk reduction at the firm and systemic levels. The analysis shows that the effect of central clearing on systemic risk is ambiguous, with potential positive and negative outcomes, depending on the credit quality of the clearing house, netting benefits and losses, and concentration risks. These computational findings align with empirical studies, yet do not require intensive collection of proprietary data. In addition, our approach enables us to disentangle various competing effects. The approach thus provides policymakers and market practitioners with tools to study the impact of a regulation at each level, enabling decision-makers to anticipate and evaluate the potential impact of regulatory interventions in various scenarios before their implementation.
- Nikolai Nowaczyk - Technical Specialist, NatWest
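As a toy illustration of one effect discussed in the abstract (netting benefits), the sketch below compares total bilateral netted exposure with exposures once all trades face a CCP; the trade graph and numbers are invented:

```python
# Toy sketch: multilateral netting through a CCP. Compare the sum of bilateral
# net exposures with the bank-CCP exposures after central clearing.
from collections import defaultdict

# Directed net trade notionals between banks: (payer, receiver) -> amount.
trades = {("A", "B"): 10.0, ("B", "A"): 4.0,
          ("B", "C"): 7.0, ("C", "A"): 6.0}

# Bilateral netting: each pair of banks nets down to a single exposure.
pair_net = defaultdict(float)
for (payer, receiver), amount in trades.items():
    key = tuple(sorted((payer, receiver)))
    pair_net[key] += amount if (payer, receiver) == key else -amount
bilateral = sum(abs(v) for v in pair_net.values())

# Central clearing: each bank nets all of its trades against the CCP.
position = defaultdict(float)
for (payer, receiver), amount in trades.items():
    position[payer] -= amount
    position[receiver] += amount
central = sum(abs(v) for v in position.values())   # sizes of bank-CCP links
print(f"bilateral netted exposure: {bilateral}, centrally cleared: {central}")
```

In this toy graph, central clearing shrinks exposures dramatically; as the abstract notes, though, the overall effect is ambiguous once CCP credit quality, lost cross-product netting, and concentration risks are taken into account.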
The integration of cloud computing into the decision-making process