More than ten years on, we are still battling the fallout from the 2008 financial crisis. The industry’s practices were challenged, forcing enterprises to re-evaluate their strategies and reassess customers’ risk appetite. But with technology developing at an unprecedented rate, could these stale strategies be revamped using digital innovations? Aymeric Kalife, Associate Professor at Paris Dauphine University and Founding Partner at iDigital Partners, explores the opportunities provided by big data and automation.
Since the 2008 financial crisis, major challenges (persistently low interest rates, a deflationary environment, decreasing alpha, and sweeping changes in volatility regime patterns) have drastically penalised the performance of the asset management industry, putting significant pressure on fees for both active and passive management. Additionally, the development of digital technologies has shifted the focus from products to customers, reversing the bargaining power between the customer and the asset manager in favour of the former.
Such a new paradigm requires switching from a top-down to a bottom-up approach. Being customer-centric is no longer an alternative or a “plus”; it is a requirement for survival. Reputational risk has become key, and past success no longer guarantees future success.
As a result, succeeding in this new business era requires delivering efficient, customised services in due time. Customers’ needs may change often, quickly, and drastically, so staying afloat requires agility at minimum cost. Leveraging the large datasets that have been piling up over the past 20 years can help, while innovations such as big data, process automation, data virtualisation, and prospective intelligence can accelerate and tailor the development of services to customers.
This change in approach must take into account sustainable growth and solvency constraints, while meeting both customers’ and shareholders’ interests, notably through the following three innovations.
Funds’ design optimisation and transaction cost mitigation for better target allocation, returns, and sustainable growth
Optimising portfolio rebalancing for target allocation consistency and risk management
Over time, the portfolio may drift from its target asset allocation, producing risk/return characteristics that may be inconsistent with an investor’s goals and preferences (e.g. a 60/40 equity/bonds target). A rebalancing strategy addresses this risk by formalising guidelines about how often the portfolio should be rebalanced, how far an asset allocation can deviate from its target before it is rebalanced, and how much rebalancing should restore the portfolio to its target.
We take the target asset ratio as given and focus only on staying close to it: the primary objective is not maximising the fund value but keeping a tight tracking error while minimising transaction costs. The need to rebalance increases with the dispersion of returns across the portfolio’s assets: higher volatility implies more fluctuation around the target allocation, whereas higher correlation among returns means the assets tend to move together, reducing the need to rebalance.
The principal benefit of rebalancing is the reduction of tracking error risk, which quadruples as a portfolio drifts twice as far off target (the risk is quadratic in the drift). Since rebalancing costs are linear while rebalancing benefits are quadratic, at some “trigger point” the benefits of rebalancing begin to outweigh the costs, and the net benefit of rebalancing becomes positive.
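This trade-off can be sketched numerically. In the toy model below (a hypothetical illustration, not the author’s calibration), the risk benefit of closing a drift d is quadratic, k·d², while the transaction cost is linear, c·d, so the net benefit turns positive once the drift exceeds c/k:

```python
def net_benefit(drift, k, c):
    # Risk reduction from closing a drift of size `drift` is quadratic
    # (k * drift**2); the transaction cost of closing it is linear (c * drift).
    return k * drift ** 2 - c * drift

def trigger_point(k, c):
    # Drift level beyond which the net benefit of rebalancing turns positive.
    return c / k
```

With k = 2.0 and c = 0.01 (illustrative values), rebalancing only pays once the drift exceeds 0.5% of the portfolio.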
Digital technologies can be leveraged through a multi-step automated process using backtesting and stress tests. These can explore a wide range of patterns and, in a customer-centric approach, weight the tracking error by the customer’s risk tolerance.
Optimised rebalancing significantly outperforms “periodic rebalancing”: trading only when the drift exceeds a tolerance band, rather than on a fixed schedule, can mean far lower transaction costs. “Halfway” rebalances (restoring only part of the drift) are triggered less frequently than rebalancing to the “target boundary”, but this is offset by higher transaction costs per rebalancing.
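As a minimal sketch of such a move-based rule (the 5% band, the “halfway” fraction, and the two-asset setup are illustrative assumptions, not the author’s parameters):

```python
def rebalance(weights, target, band, mode="target"):
    """Move-based rebalancing: trade only when some weight drifts past its band.

    mode "target" rebalances fully back to target; mode "halfway" closes only
    half of each drift, reducing the turnover of each individual trade.
    """
    if all(abs(weights[a] - target[a]) <= band for a in target):
        return weights, False          # inside the tolerance band: no trade
    if mode == "target":
        return dict(target), True      # full rebalance to the target allocation
    halfway = {a: weights[a] + 0.5 * (target[a] - weights[a]) for a in target}
    return halfway, True
```

Starting from a 66/34 drift of a 60/40 target with a 5% band, the rule triggers and moves the portfolio back to 60/40 (or 63/37 in “halfway” mode); with a 10% band it leaves the portfolio alone.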
Capped volatility funds combined with high fundamental quality stock selection to provide sustainable growth and resilience to equity market downturns
Contrary to conventional wisdom, less volatile stocks empirically tend to outperform over the long term: they behave countercyclically in market slumps, losing significantly less during downturns and drawdowns and avoiding large performance swings. In contrast, volatile stocks must work much harder, first to restore the value lost during periods of decline and then to grow.
This historical performance of low-risk stocks defies the central paradigm of traditional finance theory, which states that lower risk goes with lower returns (a positive mean-variance relationship). The anomaly stems from (i) a lottery mentality driving most investors to consistently overpay for the small chance of winning big in riskier stocks, (ii) an inclination to avoid low-beta stocks, and (iii) the general use of log-Gaussian modelling assumptions for return distributions, which ignore their empirically left-skewed shape (a long tail to the left of the returns distribution).
As a result, a sound mix of passive and active asset management that combines volatility control mechanisms with high fundamental quality stock selection (healthy and stable profitability, strong free cash flows, low debt and shareholder-friendly practices, above-average dividend payout, low net equity issuance) can not only mitigate significant declines but also generate significantly higher returns at similar levels of risk.
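A volatility-cap overlay of the kind described can be sketched as follows (the cap level, estimation window, and proportional scaling rule are illustrative assumptions):

```python
import math
import statistics

def realised_vol(returns, periods_per_year=252):
    # Annualised realised volatility from a window of periodic returns.
    return statistics.stdev(returns) * math.sqrt(periods_per_year)

def equity_exposure(returns, vol_cap=0.10, max_exposure=1.0):
    # Scale equity exposure down proportionally when realised volatility
    # exceeds the cap; never exceed the maximum exposure.
    vol = realised_vol(returns)
    if vol <= vol_cap:
        return max_exposure
    return max_exposure * vol_cap / vol
```

In calm markets the fund stays fully invested; when realised volatility doubles the cap, exposure halves, mechanically cushioning the fund during drawdowns.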
Digital technologies can be leveraged to
- merge and visualise data (data virtualisation and BI),
- forecast returns/risk and customise the customers’ utility function (artificial intelligence),
- model transaction costs (artificial intelligence techniques using “optimal control”).
Hedging rebalancing optimisation for safer long term products and better solvency
Since volatility and transaction costs rise significantly during extreme markets, hedging strategies that work well under normal market conditions may deteriorate in performance during crisis periods. A delta rebalancing strategy formalises guidelines about how often the delta should be rebalanced, how far it can deviate from its target, and how much a rebalance should restore it to its target.
Rebalancing the delta only at discrete time intervals reduces total transaction costs while introducing a hedging error. We seek the delta strategy that best trades off hedging accuracy against transaction costs, comparing “time-based strategies” (rebalancing at fixed intervals) with “move-based strategies” (rebalancing whenever the change in the asset price or delta exceeds a bandwidth).
Optimised delta hedging rebalancing outperforms the “periodic rebalancing” significantly, notably by making use of adequate hedge instruments:
- as long as unit transaction costs are small (e.g. Bond Futures), systematic weekly delta hedging rebalancing coupled with daily emergency thresholds (based on variable bandwidth delta metrics), aka “Variable Bandwidth Delta Tolerance” proves to be the most efficient in most market conditions;
- as unit transaction costs rise (such as for swaps in extreme conditions), some alternatives may prove more efficient, such as the so called “Gamma Bandwidth Delta Tolerance” or “Asset Tolerance Rebalancing”. In extreme market conditions, a “fixed bandwidth” delta rebalancing may also prove more efficient.
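A minimal sketch of the “Variable Bandwidth Delta Tolerance” idea described above, combining a weekly systematic leg with a daily move-based emergency trigger whose bandwidth tightens as gamma rises (the period, base band, and gamma scaling below are hypothetical parameters):

```python
def rehedge_today(day_index, current_delta, target_delta, gamma,
                  weekly_period=5, base_band=0.05, gamma_coeff=2.0):
    # Systematic leg: rebalance the hedge on a fixed weekly schedule.
    if day_index % weekly_period == 0:
        return True
    # Emergency leg: a move-based trigger whose no-trade band narrows as
    # |gamma| grows, since delta then drifts faster for the same market move.
    band = base_band / (1.0 + gamma_coeff * abs(gamma))
    return abs(current_delta - target_delta) > band
```

Low-gamma positions tolerate larger delta drifts between the weekly rebalances, while high-gamma positions trip the emergency threshold on much smaller moves.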
Digital technologies can be leveraged in ways similar to those described above.
Guarantees design optimisation for higher customer income and improved behaviour risk management
Behaviour risks are plural and complex: lapses, deferral periods, fund switches, withdrawals…
Behaviour risks are indeed critical for investment and insurance guaranteed products. They are the most important of the life underwriting risks for Variable Annuities, as illustrated by the solvency issues caused by policyholder runs in the late 1980s and early 2010s. They are also crucial to a proper calibration of regulatory standard models and internal risk models. Managing them requires not only sustainable growth but also an appropriate product design tailored to both customers’ needs and insurers’ risk appetite.
Two modelling approaches exist: the dynamic approach (backward-looking, based on data, econometrics, and statistics) and the “efficient behaviour” approach (prudent and forward-looking, based on an individual policyholder’s optimisation of net profit over the full contract duration, driven by factors such as the moneyness of the guarantee or the level of interest rates). The scarcity of extreme-scenario samples and the inability to dynamically extrapolate observed behaviour to varied market conditions make the “rational” lapse approach useful.
If we focus on lapse risks (surrender and partial withdrawals), where data across the past 20 years are the most available and extreme scenarios have actually been experienced, “efficient” lapse aims to measure customer behaviour from a purely individual financial rationality, allowing us to predict the customer behaviours most undesirable for insurers. As with Bermudan-style options exercisable at discrete dates, the policyholder is assumed to compare the account value with the value of the liability to decide whether to lapse: if the account value exceeds the liability value, policyholders surrender the contract and receive the account value; otherwise, they continue to hold the policy.
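The decision rule just described can be written down directly (the surrender charge parameter is an illustrative addition, not part of the original description):

```python
def efficient_lapse(account_value, liability_value, surrender_charge=0.0):
    # Rational policyholder: surrender when the cash received on lapse
    # (account value net of any surrender charge) exceeds the value of
    # continuing to hold the guarantee (the liability value).
    cash_on_lapse = account_value * (1.0 - surrender_charge)
    return cash_on_lapse > liability_value
```

Note how a 10% surrender charge can flip the decision: an account value of 110 becomes 99 net, below a liability of 100, so the rational policyholder holds on, which is one lever product design offers against lapse risk.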
“Efficient" modelling framework is a useful benchmark that can be used in pricing once key drivers are also considered to get realistic prices consistent with practice: market conditions, policy durations, past decisions (potential partial withdrawals), tax regime. Such a modelling framework can also be used in product designs to stir policyholder behaviour whilst also fitting client appetite. For instance, the design of the fees opens a way to improve both the competitiveness and the clients’ risk appetite fit, ensuring a sustainable growth. Regarding “efficient” annuitisation within a GMIB Variable Annuities, high roll-up rate boosts the “guarantee” level, which delays annuitisation, in contrast with roll-up rates indexed on interest rates in favour of an early annuitisation.