Articles & Video
What does 2020 have in store for quant finance?
Understanding new trends, challenges, and solutions
Machine learning with quantum annealing
2018 Quant of the Year Alexei Kondratyev shares his passion for machine learning and quantum computing.
Artificial intelligence, machine learning, and data in quant finance
In this compilation of articles, FutureQuantMinds explore AI, ML, and the impact of data on quant finance
Another STAC-A2 record for Intel – what are these guys doing?
James Reinders, James Reinders Consulting, reviews some of the research done by the Securities Technology Analysis Center (STAC) to find the best solutions that allow quant finance to enter its next evolutionary phase.
Demystifying AAD – what is the role of adjoint algorithmic differentiation in quant finance?
Leading quant finance experts show us what adjoint algorithmic differentiation means for this sector
On vectorisation of automatic adjoint C++ code
To vectorise or not? If you ask Johannes Lotz, Klaus Leppkes, and Uwe Naumann from RWTH Aachen University, Germany, the answer is an obvious yes.
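The payoff of vectorisation is easy to see even outside C++. Below is a minimal illustrative sketch (in Python with NumPy, not the authors' C++ code): the same payoff computation written as a scalar loop and as a single vectorised array expression. The function names and figures are hypothetical examples, not taken from the article.

```python
import numpy as np

# Hypothetical example: a call-option payoff evaluated over many spot prices.
spots = np.random.default_rng(0).uniform(80.0, 120.0, 100_000)
strike = 100.0

def payoffs_loop(spots, strike):
    """Scalar loop: one payoff at a time."""
    out = np.empty_like(spots)
    for i, s in enumerate(spots):
        out[i] = max(s - strike, 0.0)
    return out

def payoffs_vec(spots, strike):
    """Vectorised: the same arithmetic applied to the whole array at once."""
    return np.maximum(spots - strike, 0.0)
```

The two functions return identical results, but the vectorised version lets the library (and ultimately the hardware's SIMD units) process many values per instruction, which is the gain the authors pursue for adjoint code.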
Holistic view of XVAs – Chebyshev interpolation for speedup and unification
Kathrin Glau, Lecturer in Financial Mathematics at Queen Mary University of London and EPFL Fellow, tackles a computational bottleneck in XVA calculations.
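The core idea is simple: replace an expensive pricing routine with a cheap Chebyshev polynomial proxy fitted at a handful of nodes. A minimal sketch (the pricing function here is a hypothetical stand-in, not Glau's XVA model):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def expensive_price(s):
    """Hypothetical stand-in for a costly pricing call inside an XVA loop."""
    return np.exp(-0.5 * s) * np.sin(s)

# Chebyshev nodes on the interval [a, b]
a, b = 0.0, 2.0
n = 16
k = np.arange(n + 1)
nodes = 0.5 * (a + b) + 0.5 * (b - a) * np.cos(np.pi * k / n)

# Fit a degree-n Chebyshev series through the n+1 node values:
# the expensive function is called only n+1 times.
coeffs = C.chebfit(nodes, expensive_price(nodes), n)

# Afterwards, every evaluation is a cheap polynomial evaluation.
s = np.linspace(a, b, 200)
err = np.max(np.abs(C.chebval(s, coeffs) - expensive_price(s)))
```

For smooth functions, Chebyshev interpolation converges extremely fast, which is why a low-degree proxy can replace thousands of repeated pricing calls in an XVA calculation.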
It’s about time! Intel introduces a new category of memory
A revolutionary new memory technology hasn’t been introduced in decades—until now.
Quantitative finance in the digital age
Download our latest eMagazine for exclusive articles focusing on data, alternative data, AAD, vectorisation, and quantum computing.
Data’s big year: The last frontiers of quantitative edge
Stocktwits’ Director of Business Development and Revenue Strategy Pierce Crosby on the two data trends that quants should look out for in 2019.
Bond flotation with exotic commodity collateral
Professor Michael Dempster introduces his presentation at QuantMinds International, diving deep into the intricacies of bond flotation with exotic commodity collateral.
A brief introduction to Automatic Adjoint Differentiation (AAD)
AAD is at once a simple yet subtle mathematical algorithm and a highly challenging computer programming practice. Antoine Savine breaks it down for us.
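To give a flavour of the idea, here is a minimal tape-based reverse-mode (adjoint) differentiation sketch in Python. This is an illustrative toy, not Savine's implementation: operations are recorded on a tape on the forward pass, then adjoints are propagated backwards, yielding all partial derivatives in a single reverse sweep.

```python
class Tape:
    """Toy AAD tape: records each operation with its local partial derivatives."""

    def __init__(self):
        self.values = []   # value of every variable on the tape
        self.ops = []      # (result_index, [(input_index, local_derivative), ...])

    def variable(self, value):
        self.values.append(value)
        return len(self.values) - 1

    def _record(self, value, partials):
        idx = self.variable(value)
        self.ops.append((idx, partials))
        return idx

    def add(self, a, b):
        return self._record(self.values[a] + self.values[b], [(a, 1.0), (b, 1.0)])

    def mul(self, a, b):
        return self._record(self.values[a] * self.values[b],
                            [(a, self.values[b]), (b, self.values[a])])

    def gradient(self, output):
        """Reverse sweep: propagate adjoints from the output back to the inputs."""
        adj = [0.0] * len(self.values)
        adj[output] = 1.0
        for idx, partials in reversed(self.ops):
            for inp, d in partials:
                adj[inp] += d * adj[idx]
        return adj

# Example: f(x, y) = x*y + y at (3, 4); df/dx = y = 4, df/dy = x + 1 = 4
t = Tape()
x, y = t.variable(3.0), t.variable(4.0)
f = t.add(t.mul(x, y), y)
grad = t.gradient(f)
```

The key property, and the reason AAD matters for risk, is that the reverse sweep costs a small constant multiple of one function evaluation regardless of how many inputs there are.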
Who put the quants in charge?
Guntram Werther suggests machine-human interface ‘best practitioner’ characteristics for differing kinds of risk problems and asks, "who put the quants in charge?"
AI and the Finance Sector: Innovation or Misdirection?
We got in touch with several finance, FinTech, and digital transformation experts to ask them what they thought the challenges and limitations of AI were, including Christina Qi, Jim Marous, Kunal Patel, Spiros Margaris, Sally Eaves, and Theodora Lau.
The best Python tools to analyze alternative investment data for $0
Building an alternative data effort can be expensive, but fortunately, unlike buying datasets and hiring people, the technology infrastructure required to analyze alternative data can be acquired at minimal to zero cost.
Alternative Data in Quantitative Strategies: Use Cases
Quantitative strategies all aim at maximising returns while minimising risk. Here, SESAMm describes a Natural Language Processing methodology aimed at creating investment signals.
Quant finance and quantum computing - a match made in heaven?
How will quantum computing impact encryption and machine learning for quants?
Lessons from 10+ years of Algorithmic Differentiation in computational finance
What has Uwe Naumann learned about algorithmic differentiation over 10 years of implementing it?
Breakthroughs in technology shaping quant finance
Live from QuantMinds International, our panel discusses machine learning, big data, and quantum computing, and how they are changing quant finance.
Quant vs. machine: derivative pricing by Machine Learning
Wim Schoutens took an old problem – exotic option pricing – using a model with a limited set of parameters and the options under that model, and subjected it to machine learning. The result was a tremendous speed-up in computation.