The asset management industry has faced four main challenges since the 2008 financial crisis, affecting not only alpha generation but also risk mitigation:
- Persistently low interest rates, driven by some $20 trillion of central bank stimulus (the Fed’s $7 trillion balance sheet nearly doubled since August 2019, growing by $120bn per month);
- A drastic change in volatility regime patterns: a persistent drag on volatility, driven by low interest rates combined with significant share buybacks (fuelling roughly 40% of S&P 500 growth since 2012), punctuated by recurrent, unexpected, short-lived but severe volatility spikes (June 2010, August 2011, June 2012, August 2015, February 2016, June 2016, February 2018, October-December 2018, March-May 2020);
- A deflationary environment and decreasing alpha, driven by lower household consumption and falling energy prices;
- And increased competition among asset managers.
These challenges have significantly penalised active management over the past decade, whether based on...
...such as sector rotation strategies, undermined as diversification dies: 75% cross-correlation of stocks within the S&P 500, which is dominated by a few names (five tech stocks account for 20% of the index performance, and 50% when combined with consumer stocks); or value strategies, which have been systematically destroyed by debt-funded corporate buybacks and central banks.
...such as VIX futures drawdown-protection strategies, penalised by a significant cost of carry since 2014 except in times of heightened volatility and dispersion; these tail-risk derivatives strategies suffer from
- transaction costs (from the successive rolls of option premiums),
- delta-hedging discreteness losses during volatility spikes,
- a high cost of carry (due to the volatility risk premium and time decay) combined with market-timing issues, requiring experienced active management of both timing and volatility regime.
Such limitations initially favoured low-fee passive management (e.g. managed volatility funds, CTAs, or ARP) that aims to deliver a portfolio with a stable level of volatility in all market environments, by systematically shifting the allocation away from securities with lower expected risk-adjusted returns and towards securities with higher expected risk-adjusted returns. This relies on the ability to forecast near-term risk by exploiting three commonly observed regularities:
- the highly negative correlation between equities and bonds during periods of elevated equity volatility,
- the negative short-term relationship between volatility and returns,
- the persistence of volatility.
As a result, these passive managed-volatility strategies consistently contain volatility within a much tighter range while reducing outsized drawdowns in extreme market conditions, improving risk-adjusted returns by a tangible 20-30% for holding periods of up to a quarter, enhancing portfolio skewness and achieving robust tail-risk reduction.
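The scaling mechanism described above can be sketched in a few lines (a simplified illustration, not any fund's actual implementation; the 10% volatility target and the trailing-window estimator are assumptions):

```python
import numpy as np

def vol_target_weight(daily_returns, target_vol=0.10, cap=1.0):
    """Scale equity exposure so forecast portfolio volatility matches the
    target; the strategy de-risks when realised volatility exceeds it."""
    realised = np.std(daily_returns) * np.sqrt(252)  # annualised daily vol
    return min(cap, target_vol / realised)           # capped to limit leverage
```

With realised volatility near 16% annualised and a 10% target, the rule holds roughly 63% in equities; in calm markets the exposure is capped at 100%.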
Equity volatility spikes combined with increasingly correlated markets have made most passive management inefficient, especially since 2013 when a new volatility regime pattern emerged.
Many passive management strategies performed poorly, as illustrated by the -40% return of equity alternative risk premia during the Covid-19 crash. They suffer from several key empirical weaknesses.
They do not meet every investor’s risk appetite and may incur significant transaction costs, owing to their dynamic adjustment of equity exposure and, over some periods, a high tracking error to the market benchmark.
They exhibit under-risking and over-risking patterns. Even though long-term realised volatility tracks the target volatility (thanks to the autocorrelation of daily returns), short-term volatility may oscillate widely and deviate significantly from the target (implying potential under-risking during uptrends and over-risking during downturns).
This stems from the limited accuracy of the widely used exponentially weighted moving average (EWMA) volatility estimators (based on the assumption of i.i.d., zero-mean returns), which notably lack a robust inverse relationship with equity drawdowns beyond daily horizons, and are too sensitive to drawdown outliers and to the choice of volatility model (constant, stochastic, or with jumps).
This can even translate into overly aggressive equity de-allocation (August 2011, June 2012, February 2018, March 2020; some $50bn of equities sold in August 2015, $25bn in June 2016 and $150-200bn in February 2018) and into slow re-equitisation that can take many weeks, if not months (e.g. single-digit performance in 2019 versus +29% for the S&P 500).
The correlation between bonds and equities may turn positive (e.g. 1995-2002, 2006-2007, 2013, 2015, 2016, 2018 and March 2020; since 1885, bonds and equities have moved in opposite directions only 11% of the time, versus 30% of the time in tandem).
They contribute to feedback loops that exacerbate both sell-offs (January and June 2016, February and October-December 2018, March 2020) and rallies: they sell equities when volatility is rising and buy them when volatility is falling. They commonly sell within a day of the emergence of elevated volatility, which can amplify market volatility further because so much money moves out of equities at once.
As a result, neither active nor passive styles appear to have significantly and sustainably captured velocity changes over the past decade. Going beyond the passive-versus-active asset management paradigm, through a mix of fundamental and systematic quantitative investment strategies, may be appropriate to help capture them.
Capturing velocity changes requires a sound mix between fundamentals and systematic market technical indicators.
Mixing the two approaches is likely to improve the P&L: fundamental approaches select assets based on KPIs and macro drivers, while systematic ones detect technical patterns and industrialise mitigating actions.
Given these structural weaknesses, capturing velocity changes requires embedding some fundamentals within systematic, algorithmic passive asset management, so that strategies do not overreact to an initial market fall but act only in line with sound economic concerns.
Fundamentals can be introduced in the following ways:
Fundamental economic catalysts
For example, economic growth (persistently low real rates and inflation indicate low growth, hence a low cost of capital and low volatility), M&A activity, share buybacks (exerting downward pressure on volatility), and central bank policies (e.g. quantitative easing and expanding balance sheets depressing volatility).
Fundamentals of the volatility regime
These provide anticipative signals about the behaviour of near-term future volatility. For instance, elevated short-dated implied volatility can indicate a shift from a low- to a high-volatility regime, while a high volatility risk premium can encourage investors to seek volatility short-selling opportunities, in turn lowering volatility.
Supply and demand fundamentals
For example, dealers hedging their gamma positions, and the increasingly massive use of weekly options (30-50% of all options, versus under 5% in 2011): positive net long equity gamma positions dampen volatility, while net short equity gamma positions tend to exacerbate it.
These flows largely drive asset rebalancing and are volatile and path-dependent. Technically, the P&L is the product of the spread between expected and realised volatility and the net asset-versus-liability gamma. As a result, a sustainable “break-even” volatility target is one for which the average asset-liability gamma-theta P&L is zero over the time horizon.
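The break-even condition above corresponds to the standard gamma-theta P&L approximation for a delta-hedged book (a sketch; Γ_net denotes the net asset-versus-liability gamma and σ* the break-even volatility target, both notational assumptions):

```latex
% Gamma P&L over a small interval \Delta t (sketch)
\mathrm{P\&L} \;\approx\; \tfrac{1}{2}\,\Gamma_{\mathrm{net}}\,S^{2}
\left(\sigma_{\mathrm{realised}}^{2}-\sigma_{\mathrm{expected}}^{2}\right)\Delta t

% break-even target \sigma^{*}: the average P&L vanishes over the horizon
\mathbb{E}\!\left[\tfrac{1}{2}\,\Gamma_{\mathrm{net}}\,S^{2}
\left(\sigma_{\mathrm{realised}}^{2}-\sigma^{*2}\right)\Delta t\right] = 0
```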
Capturing velocity changes also relies on selecting resilient assets, which mixes both fundamentals and systematic market technical indicators.
Contrary to conventional wisdom, less volatile stocks empirically tend to outperform over the long term because they lose significantly less during drawdowns. Volatile stocks, by contrast, have to work much harder, first to restore the value lost during declines and then to grow. This historical performance of low-risk stocks defies the central paradigm of traditional finance theory, which holds that lower risk comes with lower returns; it stems from
- a lottery mentality driving most investors to consistently overpay for the small chances of winning big in riskier stocks,
- an inclination to avoid low-beta stocks, and
- the widespread use of log-Gaussian modelling assumptions for return distributions, even though actual returns are left-skewed (i.e. have a long tail to the left of the returns distribution).
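The asymmetry behind “volatile stocks have to work much harder” is simple arithmetic: a fractional loss d requires a gain of d/(1-d) to be recovered, which grows much faster than d itself:

```python
def required_recovery(drawdown):
    # gain needed to recover a fractional loss d: 1/(1 - d) - 1 = d/(1 - d)
    return drawdown / (1.0 - drawdown)

# a 20% loss needs only +25% to recover, but a 50% loss needs +100%
```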
As a result, a sound mix of fundamentals and systematic technicals within the asset selection and allocation process, combining “volatility control” technicals with the selection of high-quality stocks on fundamentals (healthy and stable profitability, strong free cash flow, low debt, shareholder-friendly practices, above-average dividend payout, low net equity issuance), can produce stronger performance. Since the 2008 crisis this has proved a more efficient way not only to mitigate significant declines but also to generate significantly higher returns at similar levels of risk.
Beyond asset value, capturing velocity changes relies on an optimised trade-off between sustainably low tracking errors and low costs.
The portfolio may drift from its target asset allocation during velocity changes, producing risk/return characteristics that may be inconsistent with an investor’s goals and preferences (e.g. a 60/40 equity/bond target). As a result, keeping a tight tracking error while minimising transaction costs may become the primary objective.
Since rebalancing costs are linear in the size of the trade while rebalancing benefits are quadratic in the drift, at some trigger point the benefits of rebalancing begin to outweigh the costs and the net benefit turns positive. Optimised rebalancing (halfway back, or to the target boundary) significantly outperforms periodic rebalancing, by a factor of more than 2 in fund value, and by even more (up to a factor of 10) when scaled by the volatility of the fund.
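A tolerance-band rule of the kind described above can be sketched as follows (an illustration only; the 60% target, 5% band and rule names are assumptions):

```python
def rebalance_decision(weight, target=0.60, band=0.05, mode="halfway"):
    """Tolerance-band rebalancing (sketch): trade only when the actual
    weight drifts outside [target - band, target + band]; then move
    either halfway back to target or just back to the band edge,
    trading off the quadratic drift benefit against linear costs."""
    drift = weight - target
    if abs(drift) <= band:
        return weight                            # inside the band: no trade
    if mode == "boundary":
        return target + band * (1 if drift > 0 else -1)
    return (weight + target) / 2                 # "halfway" rule
```

For example, an equity weight drifting to 70% against a 60% target triggers a trade back to 65% under the halfway rule, while a 63% weight stays untouched.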
Similarly, rule-based hedging strategies help capture velocity changes through an optimised trade-off between low tracking errors vs. low costs.
Rebalancing only at discrete time intervals reduces total transaction costs, at the price of a hedging error.
In terms of mean-variance P&L, the best trade-off lies in either time-based or move-based rules (i.e. rebalancing whenever the change in asset value or delta exceeds a bandwidth), depending on whether unit transaction costs are small (systematic weekly rebalancing, combined with daily emergency thresholds based on a variable-bandwidth delta tolerance) or large (gamma-bandwidth delta tolerance, asset-tolerance rebalancing, or fixed-bandwidth delta tolerance).
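A minimal sketch of a move-based re-hedging rule, with a fixed bandwidth for small unit costs and a gamma-dependent bandwidth for large ones (the cube-root scaling and the constant k are illustrative assumptions, loosely inspired by asymptotic bandwidth results):

```python
def should_rehedge(current_delta, last_hedged_delta, gamma=None,
                   fixed_band=0.05, k=0.5):
    """Move-based re-hedging rule (sketch). With small unit costs a fixed
    bandwidth applies; with large costs the tolerance is widened with
    gamma, so high-gamma books are allowed more drift between trades."""
    band = fixed_band if gamma is None else k * abs(gamma) ** (1 / 3)
    return abs(current_delta - last_hedged_delta) > band
```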
Rule-based long/short dynamic allocation strategies make it possible to cheapen the cost of carry by monetising some yield while participating in upward-trending markets, such as:
Using VIX futures
This reduces the cost of carry arising from the implied-to-realised volatility risk premium and the associated market-timing issues, by systematically rolling VIX futures contracts to maintain a constant one-month forward-starting variance exposure.
It also partially mitigates the cost of carry due to theta time decay, through opportunistic VIX futures overlays whenever the VIX term structure is upward sloping, which is consistent with non-distressed equities.
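The constant-maturity roll described above amounts to blending the front and second VIX futures contracts with weights proportional to the time remaining on the front contract (a sketch; the 21-business-day roll period is an assumption):

```python
def constant_maturity_weights(days_to_front_expiry, roll_period=21):
    """Daily-roll weights for a constant one-month VIX futures exposure
    (sketch): as the front contract approaches expiry, weight shifts
    linearly to the second contract so the blended maturity stays fixed."""
    w_front = days_to_front_expiry / roll_period
    return w_front, 1.0 - w_front
```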
Using tail risk puts:
- Corridor barrier puts, i.e. vanilla puts subject to the underlying staying within a range, provide discounts of up to 60%.
- “Best-of” puts, which exploit the very high correlation between puts during market crashes, cheapen the cost by up to 50% when combined with a notional that increases as the market goes up.
- Resettable puts monetise the view that the market is expected to make new highs.
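As an illustration of the corridor structure above, one common interpretation is a put that is knocked out if the underlying ever leaves the corridor before expiry (a sketch under that assumption; actual term sheets vary):

```python
def corridor_put_payoff(path, strike, lower, upper):
    """Corridor barrier put (sketch): vanilla put payoff, conditional
    on the underlying staying within [lower, upper] over the path."""
    if any(s < lower or s > upper for s in path):
        return 0.0                       # knocked out: corridor breached
    return max(strike - path[-1], 0.0)   # otherwise vanilla put payoff
```

The restriction is what funds the discount: the buyer gives up protection in the scenarios where the underlying has already moved violently out of the range.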
With respect to the three value propositions above, digital technologies can be used for
- securely merging huge quantities of data across different sources without damage (using data virtualisation),
- devising a suitable portfolio rebalancing strategy (using a multi-step automated process of back-testing and stress tests across a wide range of market patterns);
- selecting sustainable investment and hedging assets, and managing semi-static hedging and liquidity issues (using big data, artificial intelligence, non-linear market impact modelling and optimal control).