Main Conference Day Three - GMT (Greenwich Mean Time)
- Julien Hok - Quantitative Analysis, Investec Bank
- Milena Vuletić - DPhil Candidate, University of Oxford
Local stochastic volatility refers to a popular model class in applied mathematical finance that allows for “calibration-on-the-fly,” typically via a particle method, derived from a formal
McKean-Vlasov equation. Well-posedness of this limit is a well-known problem in the field; the general case is largely open, despite recent progress in Markovian situations.
Our take is to start with a well-defined Euler approximation to the formal McKean-Vlasov equation, followed by a newly established half-step scheme that allows for good approximations of conditional expectations. In a sense, we do Euler first, particle second, in contrast to previous works that start with the particle approximation. We show weak rate one, plus error terms that account for this approximation. The particle approximation is discussed in detail, and the error rate is given in dependence on all parameters. Joint work with B. Jourdain (Paris) and T. Wagenhofer (Berlin).
- Peter Friz - Professor of Mathematics, TU Berlin, Weierstraß-Institut Berlin
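As an illustration of the particle approximation of conditional expectations central to such local stochastic volatility schemes, here is a minimal Nadaraya-Watson sketch. This is not the authors' half-step scheme; all function names, the Gaussian kernel choice, and the bandwidth parameter are illustrative assumptions:

```python
import math

def gauss_kernel(u):
    # Gaussian kernel for the Nadaraya-Watson regression
    return math.exp(-0.5 * u * u)

def cond_expectation(spots, variances, s, bandwidth):
    """Particle estimate of E[v_t | S_t = s] from a cloud of (S_i, v_i) pairs."""
    num = den = 0.0
    for S_i, v_i in zip(spots, variances):
        w = gauss_kernel((S_i - s) / bandwidth)
        num += w * v_i
        den += w
    return num / den if den > 0.0 else float("nan")

def leverage(sigma_dupire, spots, variances, s, bandwidth):
    """Leverage function l(t, s) = sigma_Dup(t, s) / sqrt(E[v_t | S_t = s])."""
    return sigma_dupire / math.sqrt(cond_expectation(spots, variances, s, bandwidth))
```

In a full scheme this estimate would be recomputed at each Euler time step from the current particle cloud.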
How to position, plan, and assess climate change expectations.
- Chris Kenyon - Global Head of Quant Innovation, MUFG Securities
- Davide Venturelli - Associate Director - Quantum Computing & Research Scientist, USRA
- David Shaw - Chief Analyst, Global Quantum Intelligence
- Davide Venturelli - Associate Director - Quantum Computing & Research Scientist, USRA
- Jack Jacquier - Professor, Imperial College
- Marco Paini - VP Financial Services and EMEA, Rigetti Computing
- Saavan Patel - Chief Technology Officer, Infinity Q
- Mert Esencan - CEO, Icosa Computing
- Julien Guyon - Professor of Applied Mathematics, École nationale des ponts et chaussées
- The flaws of the Heston model
- Variance curve models
- Rough volatility
- The term-structure of the equity at-the-money skew
- Time-shifted power-law model
- Bergomi models
- Path-dependent volatility
- Julien Guyon - Professor of Applied Mathematics, École nationale des ponts et chaussées
We investigate the dynamic properties of various stochastic and notably rough volatility models, focusing on the dynamics of implied volatilities. We pay attention to the Skew-Stickiness Ratio (SSR), an industry-standard indicator of joint spot price and implied volatility dynamics, pursuing the analysis of [Bergomi, Smile dynamics IV, Risk 2009] and extending it to rough volatility models.
- Using different numerical estimators, we compare the model-generated SSR for several models with an estimation of the empirical market SSR of the SPX Index.
- We notice that Bergomi's original intuition, that a forward variance model with a power-law kernel should generate an SSR with a constant term structure, turns out to be accurate, but only for small levels of volatility of volatility.
- On the contrary, we observe that typical parameter sets calibrated to recent SPX options data (with high levels of vol of vol) induce a model SSR whose term structure deviates considerably from the market, for the models we investigate.
- Florian Bourgey - Quantitative Researcher, Bloomberg LP
- Stefano De Marco - Professor, Ecole Polytechnique
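One common numerical estimator of the SSR regresses daily ATM implied volatility changes on daily log returns and normalizes by the ATM skew. The sketch below illustrates that idea only; it is not the estimators used in the talk, and all names are hypothetical:

```python
def slope(x, y):
    # Ordinary least-squares slope of y on x
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

def ssr_estimate(dlog_spot, dvol_atm, atm_skew):
    """SSR estimate: regression slope of daily ATM implied vol changes on
    daily log returns, divided by the ATM skew."""
    return slope(dlog_spot, dvol_atm) / atm_skew
```

On synthetic data where vol changes are exactly SSR * skew * return, the estimator recovers the SSR.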
Highly structured or illiquid fixed income instruments with complex covenants are hard to perform due diligence on, risk-manage, or value. Yet investment grade, high yield and emerging markets debt seem stretched in valuation, and investors are searching for good risk/return features in securitisations, in particular in leveraged finance, and in illiquid private debt. An approach to performance measurement and risk management for these more complex investments is crafted. For risk management, looking through structures and through covenants is essential. Mark-to-model valuation, also known as "Level 3" pricing, attempts to update pricing to current market conditions rather than relying on stale, historic prices.
- Erik Vynckier - Board Member, Chair of the Investment Committee, Foresters Friendly Society and Institute and Faculty of Actuaries
An exploration of low-latency, fault-tolerant sequencer trading systems, tailored for application within advanced machine learning research teams. This presentation delves into the architectural advantages of sequencers for high-performance research environments, highlighting key attributes like model coordination, feature consolidation, and seamless zero-downtime upgrades.
- Why it is a difficult problem
- Approximate fits by parametric models
- Minimum-entropy calibration: VIX constrained martingale Schrödinger bridges
- Julien Guyon - Professor of Applied Mathematics, École nationale des ponts et chaussées
- Roza Galeeva - Senior Lecturer, Johns Hopkins, AMS Department
Economic concerns, such as the impact of geopolitical conflicts like the Russia-Ukraine war, rising inflation, and supply chain disruptions, exacerbate market instability: equity markets have experienced sharp declines followed by swift rebounds, underscoring their continued choppiness.
Within such a context, gap options, which pay off when a significant one-day downside move occurs, effectively hedge against abrupt underlying asset price drops with no trading in between. They can be priced using jump models calibrated to short-term European options and provide an effective and efficient hedging strategy through the use of short-dated, deep out-of-the-money puts, taking advantage of the huge growth in weekly options since 2011 (30-50% of options volume).
Tailored to banking Financial Structured Products and Life Structured Annuities, gap options make it possible to significantly improve tail risk mitigation (vs. vanilla options) while significantly reducing regulatory capital requirements, in a cost-effective and agnostic way.
- Aymeric Kalife - Associate Professor, Paris Dauphine University
- Nadhem Meziou - Quantitative Expert Leader, Global Markets, Natixis
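A digital gap option of this kind can be priced by Monte Carlo under a jump model. The toy sketch below pays a notional when the one-day log return breaches a downside threshold, with at most one jump per day occurring with probability lambda * dt; all parameters are illustrative assumptions, not calibrated values from the talk:

```python
import math, random

def gap_option_price(n_paths=100000, seed=7, sigma=0.15, lam=3.0,
                     jump_mean=-0.08, jump_sd=0.03, threshold=-0.05,
                     notional=1.0, dt=1.0 / 252.0):
    """Monte Carlo price of a digital gap option paying `notional` when the
    one-day log return falls below `threshold`, under a jump diffusion where
    at most one jump per day occurs with probability lam * dt."""
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n_paths):
        # Diffusive part of the one-day log return
        r = -0.5 * sigma * sigma * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        # At most one jump per day (Bernoulli approximation of the Poisson clock)
        if rng.random() < lam * dt:
            r += rng.gauss(jump_mean, jump_sd)
        if r < threshold:
            payoff_sum += notional
    return payoff_sum / n_paths
```

With these parameters the diffusion alone essentially never breaches the threshold, so the price is driven by the jump intensity and the jump size distribution.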
We present a unified framework for computing CVA sensitivities using probabilistic machine learning meant as refined regression tools on simulated data, validatable by low-cost companion Monte Carlo procedures. Various notions of sensitivities are introduced and benchmarked numerically. We identify the sensitivities representing the best practical tradeoffs in downstream tasks including CVA hedging and risk assessment.
- Bouazza Saadeddine - Quantitative Researcher, Crédit Agricole CIB
- Hoang Dung Nguyen - PhD Student, Université Paris Cité
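The idea of regression-based sensitivities validated by a low-cost companion Monte Carlo check can be sketched in one dimension: fit a least-squares line to simulated samples, read the sensitivity off the slope, and compare with a bump-and-revalue finite difference. This is a crude stand-in for the probabilistic machine learning regressors of the talk; all names are hypothetical:

```python
def regression_sensitivity(xs, ys):
    """Sensitivity as the slope of a least-squares linear fit of simulated
    samples ys on the risk factor xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

def bump_sensitivity(pricer, x0, h=1e-4):
    """Companion finite-difference (bump-and-revalue) check."""
    return (pricer(x0 + h) - pricer(x0 - h)) / (2.0 * h)
```

On a linear toy pricer the two notions of sensitivity agree, which is the kind of cross-validation the abstract alludes to.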
We address incomplete market situations where model calibration faces a lack of liquid assets. A common framework is first provided for the historical and risk-neutral worlds, with a rationale for the choice of probability measure for risk indicators. A procedure is then proposed for learning market behavior and using it to complete the available market setup and build a consistent, arbitrage-free dynamics.
- Bringing historical and risk-neutral worlds together
- Probability measure rationale for risk indicators
- Market evolution and historical learning
- Completing markets with observed dynamics
- No-arbitrage constraints and calibration bounds
- Nadhem Meziou - Quantitative Expert Leader, Global Markets, Natixis
Algo wheels have proliferated across the industry and allow buy-side firms to select their brokers based on performance with respect to a particular benchmark (e.g. arrival slippage, VWAP slippage, and many others). The specific benchmarks are client-specific and involve multiple factors. In this work, we introduce the concept of Lenses in Algorithmic Trading. We show concrete examples of Lenses and how to use them to improve algorithmic trading performance, with the objective of being more competitive in these wheels.
- Gabriel Tucci - Global Head of Equities Cash Quant Trading, Citi
In this talk, I will discuss the Local Extrema Predictor (LEAP), a machine learning algorithm designed to detect mean reversion in FX markets. LEAP leverages both FX and non-FX data to build extensive feature spaces, which are then optimised through support vector classifiers refined by a genetic algorithm. Given its impressive performance, we’ll also examine the optimised features driving its success.
The talk will cover:
- Our feature extraction techniques
- Training and hyperparameter tuning of support vector machines
- Applying genetic algorithms for feature selection
- Examining the chosen indicators that effectively detect mean reversion in the FX market
- Mike Emambakhsh - Senior Research Scientist, Mesirow
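Genetic-algorithm feature selection of the kind used in LEAP can be sketched as follows. To stay self-contained, a naive "sum of selected features > 0" rule stands in for training a support vector classifier on each subset; the data, fitness function, and GA parameters are all illustrative assumptions, not those of the talk:

```python
import random

def make_data(n=300, n_features=5, seed=1):
    # Synthetic data: only features 0 and 2 drive the label
    rng = random.Random(seed)
    X = [[rng.gauss(0.0, 1.0) for _ in range(n_features)] for _ in range(n)]
    y = [1 if row[0] + row[2] > 0 else 0 for row in X]
    return X, y

def fitness(mask, X, y):
    # Accuracy of the stand-in classifier on the features selected by `mask`
    correct = 0
    for row, label in zip(X, y):
        s = sum(v for v, m in zip(row, mask) if m)
        correct += int((1 if s > 0 else 0) == label)
    return correct / len(y)

def genetic_select(X, y, pop_size=40, generations=30, mut_rate=0.2, seed=2):
    rng = random.Random(seed)
    n_feat = len(X[0])
    pop = [[rng.randint(0, 1) for _ in range(n_feat)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda m: fitness(m, X, y), reverse=True)
        elite = pop[: pop_size // 2]           # keep the best half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_feat)      # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < mut_rate:         # bit-flip mutation
                i = rng.randrange(n_feat)
                child[i] = 1 - child[i]
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda m: fitness(m, X, y))
```

On this synthetic data the exact feature subset {0, 2} achieves perfect accuracy, and the GA's job is to find it or a close superset.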
Motivated by optimal execution with stochastic signals, concave market impact and almost sure constraints in financial markets, we formulate and solve an optimal trading problem with a general non-linear propagator model under linear functional inequality constraints. In the general concave transient impact case, the first-order condition reduces to the resolution of a non-linear Fredholm equation whose source term is an effective signal process dependent on the Lagrange multipliers accounting for the corresponding constraints. In the particular case of a linear transient impact, such a Fredholm equation can be semi-explicitly solved in terms of the Lagrange multipliers and their conditional expectations. Leveraging both generalized stochastic Karush-Kuhn-Tucker optimality conditions and this semi-explicit solution in the linear case, we present novel numerical schemes to build sample paths of the optimal trading strategy facing constraints or concave transient market impact. We illustrate our findings on various applications: (i) an optimal execution problem with an exponential or a power-law decaying transient impact, with either a "no-shorting" constraint in the presence of a "sell" signal, a "no-buying" constraint in the presence of a "buy" signal, or a stochastic "stop-trading" constraint whenever the exogenous price drops below a specified reference level; (ii) a trader facing concave transient market impact for various types of memory decays.
- Nathan De Carvalho - PhD Student, Université Paris Cité
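In the linear case, solving a Fredholm equation of the second kind numerically amounts to discretizing the integral on a grid and solving the resulting linear system. The toy sketch below does exactly that for a generic equation x(t) + int_0^T K(t, s) x(s) ds = f(t); it is not the authors' numerical scheme, and the kernel and signal are hypothetical:

```python
def solve_linear(A, b):
    # Gaussian elimination with partial pivoting (pure-Python helper)
    n = len(b)
    M = [A[i][:] + [b[i]] for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fredholm_trading_rate(kernel, signal, T=1.0, n=50):
    """Discretize x(t) + int_0^T K(t, s) x(s) ds = f(t) on a uniform midpoint
    grid and solve the resulting linear system."""
    dt = T / n
    grid = [(i + 0.5) * dt for i in range(n)]
    A = [[(1.0 if i == j else 0.0) + kernel(grid[i], grid[j]) * dt
          for j in range(n)] for i in range(n)]
    b = [signal(t) for t in grid]
    return grid, solve_linear(A, b)
```

For a constant kernel c and constant signal f, the exact solution is the constant f / (1 + c * T), which the discretization reproduces.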
Computing the impact of central clearing on systemic risk - a generative approach
The paper uses a graph model to examine the effects of financial market regulations on systemic risk. Focusing on central clearing, we model the financial system as a multigraph of trade and risk relations among banks. We then study the impact of central clearing by a priori estimates in the model, stylized case studies, and a simulation case study. These case studies identify the drivers of regulatory policies on risk reduction at the firm and systemic levels. The analysis shows that the effect of central clearing on systemic risk is ambiguous, with potential positive and negative outcomes, depending on the credit quality of the clearing house, netting benefits and losses, and concentration risks. These computational findings align with empirical studies, yet do not require intensive collection of proprietary data. In addition, our approach enables us to disentangle various competing effects. The approach thus provides policymakers and market practitioners with tools to study the impact of a regulation at each level, enabling decision-makers to anticipate and evaluate the potential impact of regulatory interventions in various scenarios before their implementation.
- Nikolai Nowaczyk - Technical Specialist, NatWest
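The ambiguous effect of central clearing can be seen already in a minimal netting example (an illustrative sketch, not the paper's multigraph model): novating all trades to a single CCP collapses each member's positions into one net position, which can reduce total exposure dramatically (a ring of obligations) or not at all (a single trade):

```python
def bilateral_exposure(trades):
    """Total counterparty exposure without a CCP: sum over ordered pairs of
    the positive net amount owed; trades are (debtor, creditor, amount)."""
    net = {}
    for debtor, creditor, amount in trades:
        net[(debtor, creditor)] = net.get((debtor, creditor), 0.0) + amount
        net[(creditor, debtor)] = net.get((creditor, debtor), 0.0) - amount
    return sum(v for v in net.values() if v > 0.0)

def cleared_exposure(trades):
    """Total exposure when every trade is novated to a single CCP: each
    member nets all its trades into one position against the CCP."""
    pos = {}
    for debtor, creditor, amount in trades:
        pos[debtor] = pos.get(debtor, 0.0) - amount
        pos[creditor] = pos.get(creditor, 0.0) + amount
    return sum(v for v in pos.values() if v > 0.0)
```

A ring A→B→C→A of equal obligations nets to zero through a CCP, while a single one-way trade is unchanged; which regime dominates in practice depends on netting benefits, concentration, and the CCP's own credit quality, as the abstract notes.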
We provide a framework for modeling risk and quantifying payment shortfalls in cleared markets with multiple central counterparties (CCPs). Building on the stylized fact that clearing membership is shared among CCPs, we develop a modeling framework that captures the interconnectedness of CCPs and clearing members. We illustrate stress transmission mechanisms using simple examples as well as empirical evidence based on calibrated data. Furthermore, we show how stress mitigation tools such as variation margin gains haircutting by one CCP can have spillover effects on other CCPs. The framework can be used to enhance CCP stress-testing, which currently relies on the “Cover 2” standard requiring CCPs to be able to withstand the default of their two largest clearing members. We show that who these two clearing members are can be significantly affected if one considers higher-order effects arising from interconnectedness through shared clearing membership. Looking at the full network of CCPs and shared clearing members is, therefore, important from a financial stability perspective. This is joint work with Iñaki Aldasoro.
- Luitgard Veraart - Professor, London School of Economics and Political Science