Main Conference Day Two - GMT (Greenwich Mean Time)
Modelling techniques and investment strategies for better decision-making and improved investment performance.
- Hamza Bahaji - Head of Financial Engineering and Investment Solutions, Amundi ETF, Indexing & Smart Beta, Amundi
- Alisa Rusanoff - Head of Credit and Technology, Crescendo Asset Management
- Yehuda Dayan - Head of Thematic Data Science, Citigroup
- Svetlana Borovkova - Associate Professor of Quantitative Finance, Vrije Universiteit Amsterdam
Roma Agrawal, engineer, author and broadcaster, looks at historical examples of the importance of human creativity and instinct in dealing with large amounts of data. She will also share stories of how time is a human-made construct and why we should reframe our understanding of it.
- Roma Agrawal MBE - Structural Engineer, Author and Broadcaster, Roma the Engineer
The financial and strategic implications of the lack of diversity in the workplace and the wider industry
- Samar Gad - Associate Professor, Kingston Business School
- Laura Lise - Markets Quantitative Analytics Director – Equities Prime and Delta One, Citi
- Elissa Ibrahim - Associate, Quantitative Finance, Model Validation, EBRD
- Svetlana Borovkova - Associate Professor of Quantitative Finance, Vrije Universiteit Amsterdam
- Benchmarking Monte Carlo simulations to accurately price risk-free yields for any holding period is essential
- Traditional lattice-based models are often incorrect in this regard
- This session provides a practical guide to Heath, Jarrow and Morton modeling of risk-free yields in multiple countries
- Results for a worked example are discussed
- Donald van Deventer - Managing Director, Risk Research and Quantitative Solutions, SAS
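For orientation, a minimal sketch of the benchmarking idea in the session above, assuming the simplest constant-volatility one-factor HJM specification (Ho-Lee), where the short rate is known in closed form so Monte Carlo zero-coupon prices can be checked exactly against the initial curve. The multi-country setup of the talk is not reproduced here, and all parameter values are illustrative.

```python
import numpy as np

# One-factor constant-volatility HJM (Ho-Lee). Under the risk-neutral measure
# the short rate is available in closed form,
#     r(t) = f(0,t) + 0.5 * sigma^2 * t^2 + sigma * W(t),
# so Monte Carlo zero-coupon prices E[exp(-int_0^T r dt)] can be benchmarked
# against the initial curve P(0,T) = exp(-int_0^T f(0,s) ds).

sigma, T, n_steps, n_paths = 0.01, 5.0, 250, 10_000
dt = T / n_steps
t = np.linspace(0.0, T, n_steps + 1)
f0 = 0.03 + 0.002 * t                              # illustrative initial forward curve f(0,t)

rng = np.random.default_rng(0)
W = np.cumsum(rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt), axis=1)
W = np.hstack([np.zeros((n_paths, 1)), W])

r = f0 + 0.5 * sigma**2 * t**2 + sigma * W         # short-rate paths
integral = 0.5 * (r[:, :-1] + r[:, 1:]).sum(axis=1) * dt   # trapezoidal int r dt
mc_price = np.exp(-integral).mean()
curve_price = np.exp(-(0.03 * T + 0.001 * T**2))   # analytic int of f(0,s)

print(f"Monte Carlo P(0,{T}): {mc_price:.5f}")
print(f"Curve       P(0,{T}): {curve_price:.5f}")  # should agree to MC error
```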
- Mahdi Anvari - Head of Equity Derivatives Quantitative Analysis, Millennium
- Stability and accuracy of finite difference solution: myths, facts and orthodoxy
- Discrete consistency: backward, forward, Dupire FD solution, Monte Carlo.
- Local volatility and absence of arbitrage.
- Positive transition probabilities and exact match to European option prices.
- Discrete dividends, stochastic volatility, jumps.
- Barrier options, American options.
- Jesper Andreasen - Head of Quantitative Analytics, Verition Fund Management
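As a companion to the stability and accuracy discussion above, a minimal theta-scheme solver (Crank-Nicolson at theta = 0.5) for the Black-Scholes backward PDE, checked against the closed-form price. Grid sizes and parameters are illustrative, and none of the talk's specific constructions (exact fit to European prices, positive transition probabilities) are claimed here.

```python
import numpy as np
from scipy.linalg import solve_banded
from scipy.stats import norm

# Theta-scheme finite difference solver for the Black-Scholes backward PDE
# on a uniform S-grid. theta = 0.5 gives Crank-Nicolson.
S0, K, T, r, sigma, theta = 100.0, 100.0, 1.0, 0.03, 0.2, 0.5
n_s, n_t, s_max = 400, 400, 400.0
S = np.linspace(0.0, s_max, n_s + 1)
dS, dt = S[1] - S[0], T / n_t

i = np.arange(1, n_s)                                     # interior nodes
a = 0.5 * (sigma**2 * S[i]**2 / dS**2 - r * S[i] / dS)    # sub-diagonal of L
b = -(sigma**2 * S[i]**2 / dS**2 + r)                     # diagonal of L
c = 0.5 * (sigma**2 * S[i]**2 / dS**2 + r * S[i] / dS)    # super-diagonal of L

# Banded storage of (I - theta*dt*L) for solve_banded
ab = np.zeros((3, n_s - 1))
ab[0, 1:], ab[1, :], ab[2, :-1] = -theta * dt * c[:-1], 1 - theta * dt * b, -theta * dt * a[1:]

V = np.maximum(S - K, 0.0)                                # terminal call payoff
for m in range(n_t):
    tau = (m + 1) * dt                                    # time to maturity after this step
    rhs = V[1:-1] + (1 - theta) * dt * (a * V[:-2] + b * V[1:-1] + c * V[2:])
    hi = s_max - K * np.exp(-r * tau)                     # upper boundary (call)
    rhs[-1] += theta * dt * c[-1] * hi                    # implicit boundary term
    V[1:-1] = solve_banded((1, 1), ab, rhs)
    V[0], V[-1] = 0.0, hi

fd_price = np.interp(S0, S, V)
d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
bs_price = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d1 - sigma * np.sqrt(T))
print(fd_price, bs_price)                                 # should agree closely
```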
- We extract forward-looking, market-consistent information on the correlation between the S&P and the VIX markets
- We achieve this with a joint model for the two indices built on time changed Lévy processes
- Closed analytical expressions are available directly from the joint characteristic function
- We illustrate the approach by calibrating the model to market data
- Laura Ballotta - Professor of Mathematical Finance, Bayes Business School (formerly Cass)
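For context, the standard subordination identity that makes time-changed Lévy models analytically tractable; this is the generic composition rule, not the talk's specific joint model.

```latex
% L: Levy process with characteristic exponent psi,   E[e^{iuL_t}] = e^{-t\psi(u)}
% T: independent subordinator with Laplace exponent,  E[e^{-\lambda T_t}] = e^{-t\ell(\lambda)}
\mathbb{E}\left[e^{iu L_{T_t}}\right]
   = \mathbb{E}\left[e^{-T_t\,\psi(u)}\right]
   = e^{-t\,\ell(\psi(u))}
```

Closed-form characteristic functions of this type feed directly into Fourier-based calibration and pricing, which is what makes the joint calibration described above feasible.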
- Neil Palmer - Co-Head, VP of Quantitative Engineering, Beacon
- Matthias Arnsdorf - Global Head of Counterparty Credit & Market Risk Modelling, JP Morgan Chase
- Sachin Anandikar - Chief Technology Officer, Pemberton
We introduce a new way to decompose the P&L of a delta hedged option, and apply that decomposition to revisit several key trading questions, such as the fair value of implied volatility, the best choice for a delta hedging scheme, and the ex-ante risk profile of an option portfolio.
- Olivier Daviaud - Quantitative Strategist, Executive Director, JP Morgan
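For reference, the classical Black-Scholes version of the delta-hedged P&L decomposition, which the talk's new decomposition refines:

```latex
% Delta-hedged P&L of an option marked and hedged at implied volatility sigma_i,
% when realized volatility over [t, t+dt] is sigma_r (Black-Scholes setting):
dP\&L_t \;=\; \tfrac{1}{2}\,\Gamma_t\,S_t^2\,\bigl(\sigma_r^2 - \sigma_i^2\bigr)\,dt
```

Already in this classical form, the fair value of implied volatility appears as expected realized volatility weighted by dollar gamma, which motivates the trading questions listed above.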
In this talk, we explore the application of GenAI methods in the context of deep pricing and we study numerical efficiencies gained compared to more traditional Deep Learning approaches.
- Youssef Elouerkhaoui - Managing Director, Global Head of Markets Quantitative Analysis, Citigroup
Federated Learning revolutionizes how machine learning models are trained, keeping sensitive data local and secure. In this presentation, we’ll unveil how insurers can join forces to build powerful neural network models for claims frequency prediction—all while preserving data privacy. Discover the cutting-edge techniques that make it possible to model insights without ever exposing sensitive information.
- Malgorzata Smietanka - Researcher, University College London
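A minimal federated-averaging (FedAvg) sketch in the spirit of the session above: several "insurers" fit a local Poisson claims-frequency regression and only model weights, never data, are shared with the server. The data, model and hyperparameters are all invented for illustration, not the presenter's setup.

```python
import numpy as np

rng = np.random.default_rng(1)
n_clients, n_features, lr = 3, 5, 0.1

def local_data(rng, n=1000):
    # synthetic claims data held privately by one insurer
    X = rng.standard_normal((n, n_features))
    y = rng.poisson(np.exp(X @ np.array([0.3, -0.2, 0.1, 0.0, 0.05])))
    return X, y

clients = [local_data(rng) for _ in range(n_clients)]
w = np.zeros(n_features)                     # global model weights

for round_ in range(50):
    local_ws = []
    for X, y in clients:                     # each client trains locally
        w_local = w.copy()
        for _ in range(5):                   # a few local gradient steps
            grad = X.T @ (np.exp(X @ w_local) - y) / len(y)  # Poisson NLL gradient
            w_local -= lr * grad
        local_ws.append(w_local)
    w = np.mean(local_ws, axis=0)            # server averages weights only

print("global weights:", np.round(w, 3))
```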
In this talk, we will make use of Malliavin calculus techniques to study the relationship between the volatility swap, the variance swap, the at-the-money implied volatility (ATMI) and the zero vanna implied volatility. In particular, we derive exact expressions for the differences between these quantities. These expressions allow us to establish the role of the correlation parameter and, for volatilities driven by a fractional Brownian motion (rough volatilities), the impact of the corresponding Hurst parameter. This talk is based on some joint papers with David García-Lorite, Aitor Muguruza, Frido Rolloos, and Kenichiro Shiraya.
- Elisa Alòs Alcalde - Associate Professor, Universitat Pompeu Fabra
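For reference, the definitions of the swap strikes compared in the talk (the exact difference expressions are the subject of the talk itself):

```latex
K_{\mathrm{var}} \;=\; \mathbb{E}\left[\frac{1}{T}\int_0^T \sigma_t^2\,dt\right],
\qquad
K_{\mathrm{vol}} \;=\; \mathbb{E}\left[\sqrt{\frac{1}{T}\int_0^T \sigma_t^2\,dt}\;\right],
\qquad
K_{\mathrm{vol}} \;\le\; \sqrt{K_{\mathrm{var}}}\ \ \text{(Jensen)}
```

The ATMI and the zero vanna implied volatility are both approximations to K_vol, and the talk quantifies how the correlation and Hurst parameters drive the gaps between them.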
A new measure for realized volatility is introduced as an alternative to the traditional log return formula, with enhanced theoretical accuracy and predictive power for options strategies. This measure is then backtested historically and compared against implied volatilities of options. Examples are provided, highlighting implied volatility premiums and discounts across different historical contexts. Additionally, potential applications of this measure for constructing hedging instruments are demonstrated.
- Behzad Alimoradian - Quantitative and Actuarial Analyst, Valerian Capital
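The talk's new measure is not reproduced here; for contrast, a short sketch of the traditional close-to-close log-return estimator it is benchmarked against, together with one well-known range-based alternative (Parkinson). Data are synthetic and illustrative.

```python
import numpy as np

def realized_vol_close(close, periods_per_year=252):
    # traditional annualized close-to-close realized volatility
    r = np.diff(np.log(close))
    return np.sqrt(periods_per_year * np.mean(r**2))

def realized_vol_parkinson(high, low, periods_per_year=252):
    # Parkinson range-based estimator: sigma^2 = E[ln(H/L)^2] / (4 ln 2)
    hl = np.log(high / low)
    return np.sqrt(periods_per_year * np.mean(hl**2) / (4 * np.log(2)))

rng = np.random.default_rng(7)
ret = rng.normal(0, 0.2 / np.sqrt(252), 252)     # ~20% annualized vol
close = 100 * np.exp(np.cumsum(ret))
high, low = close * 1.01, close * 0.99           # crude placeholder ranges

print(realized_vol_close(close))
print(realized_vol_parkinson(high, low))
```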
It is commonly assumed that a detailed and accurate description of the joint dynamics of risk factors and option prices is a pre-requisite for the design of (good) hedging and risk management strategies for derivative instruments. This has motivated the development of increasingly complex stochastic models with many risk factors and parameters, which are challenging to estimate and implement.
We argue that this assumption is incorrect. We show that ANY 'auxiliary' pricing model capable of calibrating the cross section of liquid option prices and satisfying a certain identifiability assumption recovers market dynamics through parameter recalibration, and enables the computation of correct hedge ratios without explicit knowledge of market dynamics.
- Rama Cont - Professor of Mathematics and Chair of Mathematical Finance, University of Oxford
Recent default and near-default events (the LDI crisis, LME Nickel, Archegos, CS, SVB…) have brought greater focus on the need for effective CCR stress testing, as well as on the complexity of modelling highly levered and wrong-way-risk (WWR) counterparties. In this contribution, we build on recent analytical progress and show how a suitable combination of Gaussian copula and mixture-model techniques can be used to realise a flexible Monte Carlo based CCR stress-testing framework.
- Fabrizio Anfuso - Senior Technical Specialist, Bank of England
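A minimal illustration of the Gaussian-copula ingredient above: coupling a counterparty's default indicator to its exposure so that losses concentrate in high-exposure states (wrong-way risk). This is a toy sketch, not the presented framework; all parameters are illustrative.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
n, rho, pd_1y = 1_000_000, 0.6, 0.02

z = rng.standard_normal(n)                          # common systematic factor
x_def = rho * z + np.sqrt(1 - rho**2) * rng.standard_normal(n)
x_exp = rho * z + np.sqrt(1 - rho**2) * rng.standard_normal(n)

default = x_def < norm.ppf(pd_1y)                   # defaults when factor is low
exposure = np.exp(-0.5 * x_exp)                     # exposure also peaks when factor
                                                    # is low -> wrong-way risk

# Expected loss with WWR vs. under an independence assumption
print("EL with WWR      :", np.mean(exposure * default))
print("EL (independent) :", exposure.mean() * default.mean())
```

The first number exceeds the second, which is exactly the effect a WWR-aware stress-testing framework needs to capture.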
We present a new, general-purpose approach to portfolio optimization that can be used to control turnover in equity strategies, without introducing the path dependence normally associated with L1 trading constraints or penalty terms in the objective. We define the optimization objective function to be a measure of distance to a generalized target portfolio, specified as a set of assets and associated weights. The target portfolio allows a wide variety of investment preferences to be expressed, including target portfolios selected to have inherently low turnover. Solution portfolios will resemble the target portfolio, subject to constraints that can be used to specify a wide variety of additional preferences and mandate requirements, including risk diversification preferences. Where constraints are suitably selected, the problem can remain entirely convex. We illustrate the success of the approach with some examples where Fidelity International proprietary research is used in the formation of target portfolios.
- Barney Rowe - Senior Quantitative Analyst, Fidelity International
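A hedged sketch of the "distance to a generalized target portfolio" objective described above, using the open-source cvxpy modeller. The target, covariance and constraint values are invented for illustration; this is not Fidelity International's implementation.

```python
import cvxpy as cp
import numpy as np

n = 50
rng = np.random.default_rng(0)
w_target = rng.dirichlet(np.ones(n))        # illustrative target weights
A = rng.standard_normal((5, n))             # illustrative factor loadings
Sigma = A.T @ A / 5 + 0.01 * np.eye(n)      # illustrative covariance

w = cp.Variable(n)
objective = cp.Minimize(cp.norm(w - w_target, 2))   # distance to target portfolio
constraints = [
    cp.sum(w) == 1,                         # fully invested
    w >= 0,                                 # long only
    w <= 0.05,                              # position cap
    cp.quad_form(w, Sigma) <= 0.04,         # risk budget (vol <= 20%)
]
cp.Problem(objective, constraints).solve()
print("turnover proxy:", np.abs(w.value - w_target).sum())
```

Because the objective and all constraints are convex, the problem stays entirely convex, mirroring the point made in the abstract.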
Stocks are classically classified according to Fama-French features: market capitalization, value/growth, momentum, and by industrial sectors. It is an implicit assumption of the market that stocks that fall into the same category tend to be more correlated than when they are in different categories. We challenge this assumption by reclassifying stocks using their polymodel representation, that is, how they react to a very large set of risk factors. A stock that officially belongs to a given industrial sector, but whose activity has shifted to the point of behaving like stocks of another sector, will be requalified in this new sector.
From this polymodel representation, we use machine learning techniques to predict the trend in their forthcoming return. We also derive fragility/antifragility properties of each stock. The portfolio we design, selecting the best predictions together with strong antifragility, has much better characteristics than the S&P 500 index, both in terms of performance and risk.
- Raphaël Douady - Research Professor, University of Paris 1 Pantheon Sorbonne
We study the impact of different order types on bitcoin price formation within and between Coinbase and Binance, two of the largest centralized crypto asset exchanges. We find that limit order submissions and cancellations are considerably more frequent than market orders and have a larger total contribution to price discovery. While prices are generally integrated across markets, this integration breaks down at high frequencies, allowing for price discrepancies to persist. We also highlight the differences between crypto and traditional asset markets, including the absence of order and trade interaction rules across crypto exchanges, the lack of external oversight, and the fragmentation in the market. Recent regulatory actions reflect a broader push to increase oversight in the crypto market to protect investors.
- Carol Alexander - Professor, University of Sussex and Exponential Science Foundation
In today's dynamic financial landscape, effective model monitoring is crucial for maintaining the health and integrity of both AI and large language models (LLMs). This presentation will introduce Modelscape Monitor, a robust platform designed for continuous monitoring of production models. We will explore strategies for managing model drift, ensuring compliance in regulated environments, and optimizing performance in unregulated workflows. Attendees will gain insights into advanced instrumentation, observability techniques, and the integration of automated alerts to enhance model reliability and business value.
- Hannu Harkonen - Principal Software Engineer, MathWorks
Using path integrals, we develop an accurate and easy-to-compute semi-analytical approximation for generalized short-rate models. We illustrate the accuracy of the method by presenting results for the Black-Karasinski model, for which the proposed approximation provides remarkably accurate results, even in regimes of high volatility and for multi-year time horizons. The accuracy and the computational efficiency of the proposed approximation make it a viable alternative to fully numerical schemes for a variety of applications in derivatives pricing and XVA.
- Luca Capriotti - Managing Director - Head of NCL Quants, UBS Group
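For reference, the Black-Karasinski short-rate dynamics referred to above:

```latex
d\ln r_t \;=\; \kappa\bigl(\theta(t) - \ln r_t\bigr)\,dt + \sigma\,dW_t
```

Because the rate is lognormal, bond prices have no closed form under this model, which is what makes a fast and accurate semi-analytical approximation valuable.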
Local Volatility (LV) is a very powerful tool for market modeling. This tool can be used to generate arbitrage-free scenarios calibrated to all available options. It is demonstrated how to implement LV to reproduce most cap/floor and swaption prices within a single model. The crucial part of this approach is a Small Volatility Approximation in the HJM interest rate model. This approximation is used to calculate sensitivities of forward volatilities. These calculations are deterministic and therefore fast. Calibration accuracy is very good.
- Viatcheslav Belyaev - Senior Quantitative Analyst, U.S. Bank
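For orientation, the classical equity-style Dupire formula (zero rates and dividends) that underlies any local volatility construction; the session's contribution is the transplant of LV into the HJM rates setting via the Small Volatility Approximation.

```latex
% C(K,T): call price surface; the local volatility consistent with it is
\sigma_{\mathrm{loc}}^2(K,T) \;=\;
\frac{\partial C/\partial T}{\tfrac{1}{2}\,K^2\,\partial^2 C/\partial K^2}
```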
The comparison between Stochastic Local Volatility (SLV) and Minimum Relative Entropy (MRE) models is a novel topic in quant finance. SLV models are a fusion of local volatility and stochastic volatility features, aiming to capture the dynamic nature of financial markets by calibrating to the implied volatility surface. They combine the best aspects of local and stochastic volatility models, offering a robust framework for option pricing. On the other hand, MRE is a technique that helps in calibrating asset-pricing models to market prices by finding a probability distribution that is closest to a prior distribution while satisfying certain constraints, such as market prices of options. This approach is grounded in the principle of entropy, which in this context, measures the distance between probability distributions. The purpose of this presentation is to evaluate the advantages and disadvantages of both pricing models.
- Daniel Arrieta Rodriguez - Head of XVA Model Validation, Santander
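For reference, the MRE calibration problem in its standard form, with the well-known exponential-tilting solution (notation ours, not the presenter's): given a prior P and market prices C_i of payoffs H_i,

```latex
\min_{Q}\; \mathbb{E}_Q\!\left[\ln\frac{dQ}{dP}\right]
\quad\text{s.t.}\quad \mathbb{E}_Q[H_i] = C_i,\; i=1,\dots,m,
\qquad\Longrightarrow\qquad
\frac{dQ^*}{dP} \;=\;
\frac{\exp\!\left(\sum_i \lambda_i H_i\right)}
     {\mathbb{E}_P\!\left[\exp\!\left(\sum_i \lambda_i H_i\right)\right]}
```

where the multipliers lambda_i are chosen so that the price constraints hold; this is the sense in which MRE finds the calibrated distribution "closest" to the prior.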
- Mahdi Anvari - Head of Equity Derivatives Quantitative Analysis, Millennium
- Abhishek Gupta - Associate Director, Head of Product, Scientific Infra and Private Assets
- Model risk: from finance to climate
- The butterfly effect
- A conic finance view towards model risk
- Wim Schoutens - Professor Of Financial Engineering, University of Leuven
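In the conic finance view mentioned in the final bullet, model risk manifests as bid-ask spreads computed from distorted expectations; in the standard Cherny-Madan formulation with a concave distortion Psi (a background formula, not necessarily the talk's notation):

```latex
\mathrm{bid}(X) \;=\; \int_{-\infty}^{\infty} x \, d\Psi\bigl(F_X(x)\bigr),
\qquad
\mathrm{ask}(X) \;=\; -\,\mathrm{bid}(-X)
```

The spread widens as the distortion's stress level increases, giving a natural quantitative handle on model risk.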
We construct a non-Gaussian family of stochastic processes which are statistically indistinguishable from (fractional) Brownian motions. We also construct processes whose Hölder roughness is not the reciprocal of their p-th variation index. Therefore, when observing a sample path from a process in a financial market, such as a price or volatility process, one should not measure its Hölder roughness by computing the p-th variation, and one should not conclude that the sample follows a Brownian motion or fractional Brownian motion even though it exhibits the same (statistical) properties as those Gaussian processes.
- Purba Das - Lecturer in Mathematical Finance, King's College London
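For orientation, the notion of p-th variation along a sequence of partitions pi_n = {t_i^n} with vanishing mesh that the talk contrasts with Hölder roughness:

```latex
[x]^{(p)}_t \;=\; \lim_{n\to\infty} \sum_{t_i^n \le t}
\bigl|x(t_{i+1}^n) - x(t_i^n)\bigr|^{p}
```

For fractional Brownian motion with Hurst parameter H, this is finite and non-zero precisely at p = 1/H, matching the reciprocal of the Hölder exponent; the talk constructs processes for which exactly this matching fails.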
- Switch to ARR and latest data: impact on universal regimes for rates
- Volatility (diffusion) and jump dependencies on rate levels
- Central Bank rate role. Short rates are special.
- Micro and macro structure of the regimes
- CB rates as a part of a more advanced model
- 1-day vs 10-day returns: FRTB angle
- Inflation regimes and their evolution
- Vladimir Chorniy - Managing Director, Head of Risk Model Fundamentals and Research Lab, Senior Technical Lead, BNP Paribas
- Vinay Kotecha - Managing Director, Market Risk Methodology Lead, BNP Paribas
The Markowitz portfolio optimization model was put forward by Harry Markowitz in 1952. With some modifications it remains the primary mode of thinking when it comes to allocation of resources (including portfolio management). Markowitz himself was aware of the model's limitations. In this talk we will consider these limitations, ways of thinking about them mathematically and statistically, and ways of remedying them using modern statistical and machine learning methods.
- Paul Bilokon - Visiting Professor, Imperial College
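A minimal sketch of the classical model under discussion: closed-form mean-variance weights under a full-investment constraint. The inputs are illustrative, and the sensitivity of the resulting weights to the estimated mu and Sigma is precisely the limitation the talk addresses with modern statistical and machine learning remedies.

```python
import numpy as np

# Markowitz: maximize w'mu - (gamma/2) w'Sigma w  subject to sum(w) = 1,
# solved in closed form via the KKT conditions.
mu = np.array([0.06, 0.05, 0.08, 0.04])            # expected returns (illustrative)
Sigma = np.array([[0.040, 0.006, 0.010, 0.002],
                  [0.006, 0.025, 0.004, 0.001],
                  [0.010, 0.004, 0.090, 0.003],
                  [0.002, 0.001, 0.003, 0.010]])   # covariance (illustrative)
gamma = 5.0                                        # risk aversion
ones = np.ones(len(mu))

inv = np.linalg.inv(Sigma)
# KKT: w = inv(Sigma)(mu - lam*1)/gamma, with lam chosen so sum(w) = 1
lam = (ones @ inv @ mu - gamma) / (ones @ inv @ ones)
w = inv @ (mu - lam * ones) / gamma
print("weights:", np.round(w, 3), "sum:", w.sum())
```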
This paper analyses the market valuations of climate innovations. We rely on the granularity of patent data and look at the relationship between Tobin's Q and firms' stock of patents in different categories of climate technologies. Only a small group of patents in non-carbon-intensive climate technologies, namely those contributing to both adaptation and mitigation, are valued positively. Other types of non-carbon-intensive climate innovations are not associated with higher firm valuation. Within carbon-intensive technologies, only those with a climate tag, namely inventions that can improve the efficiency of these technologies and lead to a potential emission reduction, are positively correlated with firms' valuation.
- Murad Nuriyev - Research Analyst, Amundi
Quantitative strategies for portfolio management
- We introduce the Kurtosis-based Factor Risk Parity (KFRP)
- KFRP evaluates tail risk contributions at the factor level and utilizes kurtosis as a risk measure
- This unique feature allows us to distribute tail risks in the factor investing framework
- Application to several portfolios and factors shows that portfolio tail risk is almost entirely due to factors’ tail risk
- KFRP method outperforms traditional Volatility-based Factor Risk Parity (VFRP) in managing risks, while maintaining strong financial efficiency
- Svetlana Borovkova - Associate Professor of Quantitative Finance, Vrije Universiteit Amsterdam
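A sketch of the volatility-based baseline (VFRP) referenced in the final bullet of the session above: choose factor weights so that each factor contributes equally to portfolio volatility. KFRP, per the abstract, replaces these variance-based contributions with kurtosis-based tail-risk contributions; that step is not reproduced here. The covariance matrix is illustrative.

```python
import numpy as np
from scipy.optimize import minimize

Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])     # illustrative factor covariance

def risk_contributions(w):
    # RC_i = w_i (Sigma w)_i / vol; contributions sum to portfolio vol
    vol = np.sqrt(w @ Sigma @ w)
    return w * (Sigma @ w) / vol

def objective(w):
    rc = risk_contributions(w)
    return np.sum((rc - rc.mean())**2)     # equalize contributions

n = Sigma.shape[0]
res = minimize(objective, np.ones(n) / n,
               bounds=[(0.0, 1.0)] * n,
               constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1}])
print("weights:", np.round(res.x, 3))
print("risk contributions:", np.round(risk_contributions(res.x), 4))
```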
Appropriate responses to mitigate and adapt to climate change, and avoid detrimental impact on nature, require substantial and consistent economic efforts, at a global scale. Innovative financial mechanisms, and instruments which support the widespread adoption thereof, can address the need for impactful capital and investment flows designed to support various decision timescales and priorities across different regions of the world. In this presentation, we shall focus on how modern financial markets can play a central role in the deployment and allocation of funds to accelerate and sustain a just transition to a resilient and prosperous future.
- Andrea Macrina - Professor of Mathematics, University College London
A modification of Newton’s method for solving systems of n>0 nonlinear equations is presented. The new matrix-free method is exact as opposed to a range of inexact Newton methods in the sense that both the Jacobians and the solutions to the linear Newton systems are computed without truncation. It relies on a given decomposition of a structurally dense invertible Jacobian of the residual into a product of structurally sparse invertible elemental Jacobians according to the chain rule of differentiation. Inspired by the adjoint mode of algorithmic differentiation, explicit accumulation of the Jacobian of the residual is avoided.
A series of sparse linear systems is solved instead. Reductions in computational cost by an order of complexity (O(n^2) vs. O(n^3)) can be observed.
We outline the method and we discuss generalization to first-order optimality criteria. The latter are expected to be crucial for applicability in computational finance in general and for the calibration of financial models in particular.
- Uwe Naumann - Professor Of Computer Science, RWTH Aachen University
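A toy illustration of the factorized-Jacobian idea described above, assuming a residual composed of two elemental maps, F(x) = g(h(x)), so the chain rule gives J_F = J_g J_h. Instead of accumulating the (dense) product and factorizing it at O(n^3), the Newton system is solved via two sparse solves in sequence. The elemental maps and sizes are invented for illustration.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

n = 200
D = sp.diags([2.0 * np.ones(n)], [0], format="csc")
L = sp.diags([np.ones(n), 0.5 * np.ones(n - 1)], [0, -1], format="csc")

def h(x):  return L @ x                      # elemental map 1 (sparse Jacobian L)
def g(y):  return D @ y + 0.1 * y**3         # elemental map 2

def J_g(y):                                  # sparse Jacobian of g at y
    return D + sp.diags(0.3 * y**2, format="csc")

def newton_step(x):
    y = h(x)
    F = g(y)                                 # residual of F(x) = 0
    u = spsolve(J_g(y), -F)                  # solve J_g u = -F   (sparse)
    return x + spsolve(L, u)                 # solve J_h dx = u   (sparse)

x = np.ones(n)
for _ in range(20):
    x = newton_step(x)
print("residual norm:", np.linalg.norm(g(h(x))))
```

The two sparse solves replace one dense solve of the accumulated Jacobian, which is where the order-of-complexity saving comes from.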
At Man Group we’ve spent time investing in our data pipelines and optimising for a full range of quant use cases. We’ll look at some of the common mistakes people make when implementing data solutions. We’ll consider modern OLAP vs alternative solutions and the trade-offs between them. Are your systems living up to their promise? We’ll look at common anti-patterns when implementing data infra, processes and teams. We’ll also look at how and why we designed ArcticDB to address time-series storage and scaling to support large data-centric organisations.
- James Munro - Head of ArcticDB, Man Group
In Banking, Large Language Models (LLMs) and Generative AI have the potential to largely redefine what Quants can do. As the sector turns to this technology, it becomes clear that model risk management is one of the most important enablers for large-scale adoption.
This session will present the unique challenges and best practices for establishing robust model risk management frameworks for LLMs. Attendees will gain new insights into regulatory expectations and validation techniques to ensure the safe and effective deployment of LLMs in Banking.
- Fabien Choujaa - Head of Strats and Model Risk Management, HSBC
- Alex Morris - Vice President - Quantitative Research & Trading, Selby Jennings
- Fabio Mercurio - Global Head of Quant Analytics, Bloomberg L.P.