During the first day of Global Derivatives 2016, conference participants had an opportunity to learn about the latest regulatory developments facing the world of quant finance. As the compliance landscape continues to evolve, firms are grappling with definitions, standardization, customization, and key decisions that will affect their relationships with regulators, their internal risk monitoring systems, and the costs of implementation and ongoing risk management. The speakers addressed the unique role of quants in the current regime, offering highly detailed presentations and insightful comments from the trenches. This article highlights some of the key points made in those presentations, drawing directly from the speakers' material.
Bilateral Margin – SIMM: From Theory to Practice
Martin Baxter, Global Head of Quant Research, Nomura
In the post-crisis environment, policymakers are working to improve the state of inter-bank collateral arrangements for uncleared OTC derivatives. In addressing this area, Martin Baxter focused in part on the differences between variation margin and initial margin. For variation margin (VM), the existing procedures will be regulated, made consistent, and strengthened under the new regime. VM is linked to the mark-to-market (current PV) of the portfolio. In contrast, initial margin (IM) is a new form of bilateral margin to be posted every day (so, despite its name, it is not posted only at trade inception), and both sides will post similar amounts. IM is linked to the variability of the portfolio.
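The distinction can be sketched numerically. In the toy calculation below, VM simply tracks the current portfolio PV, while IM is sized to a tail quantile of simulated portfolio moves over a margin horizon. All figures, the normal distribution, the 10-day horizon, and the 99% quantile are illustrative assumptions, not part of the talk or of any regulatory formula.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy portfolio: current mark-to-market (PV), in millions (assumed figure).
portfolio_pv = 12.5

# Variation margin is linked to the current PV: it simply tracks it.
variation_margin = portfolio_pv

# Initial margin is linked to the *variability* of the portfolio: here,
# the 99% quantile of simulated 10-day PV changes (illustrative choice).
ten_day_pv_changes = rng.normal(loc=0.0, scale=1.8, size=100_000)
initial_margin = np.quantile(ten_day_pv_changes, 0.99)

print(f"VM posted (tracks PV):         {variation_margin:.2f}m")
print(f"IM posted (99% 10-day move):   {initial_margin:.2f}m")
```

Because IM is driven by the distribution of future moves rather than the current mark, both sides post a similar amount even when the portfolio PV strongly favors one party.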
After explaining some of the challenges facing firms in this area of compliance, Martin turned to SIMM, a standardized model that is available to all derivative users globally and may be applied to any portfolio. The model is based on risks that are already calculated by banks and includes all major and material risks, vega and convexity, as well as spread risk, basis risk, and term structure. SIMM takes a portfolio-based approach, handling trades through their contribution to the total risks of the portfolio, rather than having a different model for each product. It allows netting and diversification within an asset class, but not between asset classes.
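The aggregation principle described above (netting within an asset class, none across) can be sketched as follows. The sensitivities and the simple absolute-sum rule are invented for illustration; the actual ISDA SIMM formula additionally applies risk weights, intra-class correlations, and concentration thresholds.

```python
# Illustrative sketch of SIMM-style aggregation: risks net within an
# asset class, but class-level margins add across asset classes.
# Numbers and the abs-of-sum rule are assumptions, not the real SIMM.

portfolio = {
    "Rates":  [+4.0, -2.5, +1.0],  # same class -> sensitivities net
    "Credit": [-3.0, +0.5],
    "Equity": [+2.0, -2.0],        # fully offsetting within the class
}

def class_margin(sensitivities):
    # Netting and diversification within the asset class:
    # offsetting risks cancel before the margin is taken.
    return abs(sum(sensitivities))

# No netting between asset classes: class margins are simply summed.
total_im = sum(class_margin(s) for s in portfolio.values())

for name, sens in portfolio.items():
    print(f"{name:6s} margin: {class_margin(sens):.2f}")
print(f"Total IM (no cross-class netting): {total_im:.2f}")
```

Note how the fully offsetting equity positions contribute zero, while the rates and credit books cannot offset each other: their margins add.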
MARTIN EMPHASIZED THAT, LIKE ALL MODELS, THIS IS AN APPROXIMATION OF THE TRUTH, BUT IT HAS BEEN SUBJECTED TO EXTENSIVE BACKTESTING AND HAS SHOWN GOOD QUALITY.
He offered detailed analysis and commentary on the testing process and concluded that SIMM is a powerful, accurate, and practical model that may be helpful to a wide range of derivatives users. In closing, he noted that a significant mathematical effort has been made to keep the formula simple, yet with underlying sophistication.
Counterparty Credit Exposure in the Presence of Dynamic Initial Margin
Alexander Sokol, Head of Quant Research, CompatibL
Continuing on the theme of initial margin, Alexander Sokol stated that even under full collateralization, considerable exposure is still present due to the Margin Period of Risk (MPoR). He explained that the BCBS-IOSCO dynamic bilateral initial margin (IM) was introduced to eliminate this residual exposure and CVA. However, he showed that because of the way in which ISDA and CSA operate, dynamic IM is not as effective as people have believed it to be. He discussed the normal exchange of collateral, highlighting the fact that a margin call is actually a chain of events that takes several days to complete. This means that changes in variation margin will always lag behind changes in portfolio value, as a result of operational and legal delays that are baked into the workings of the ISDA/CSA contracts that govern OTC trading.
Moving on to settlement risk, Alexander outlined the dynamics during the period in advance of actual default. In brief, there are two payment types. The first type, trade flows, covers the contractual cash and asset flows defined in the trade’s term sheet. A missed trade flow is a serious event, and a “failure to pay” can rapidly lead to default and trade termination unless addressed promptly. The second payment type, margin flows, is the exchange of collateral between the parties. The ISDA/CSA affords relatively mild treatment to a party who misses a margin flow and, in fact, partially missing a margin payment is a common occurrence, as disputes about margin amounts tend to happen regularly and sometimes persist for years.
However, the delays in normal exchange of collateral and the settlement risk inherent in margin and trade flows wind up creating complexity in how the prospect of default is handled. Alexander noted that credit default cannot be treated as a one-time event for the purposes of modeling exposure. Instead, the sequence of events leading up to and following the default must be considered, from the market observation date of the last successful margin call to the time when the amount of loss becomes known (in industry parlance, "crystallized"). This interval is the margin period of risk (MPoR).
There are several models for the MPoR. One, known as Classical-, assumes that margin and trade flows from both counterparties terminate simultaneously at the beginning of the MPoR. Another, Classical+, assumes that margin flows terminate at the beginning of the MPoR, while trade flows from both counterparties continue until its end. Alexander observed that the Classical- and Classical+ approaches co-exist in the market and neither has become the sole market practice. Both classical models, however, have a number of shortcomings, centered on the timing of payments, failure to pay, and the observation and recognition of that failure by either or both parties.
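The difference between the two conventions can be illustrated with a single made-up scenario. Real MPoR models work with simulated paths and the full legal timeline, not a one-off calculation; every number below is an assumption for the sketch.

```python
# Illustrative comparison of Classical- and Classical+ exposure over
# the margin period of risk (MPoR). All figures are invented; the PV
# at default is taken to exclude the trade flows falling in the MPoR.

vm_at_last_call = 10.0   # VM reflects PV at the last successful margin call
pv_at_default   = 13.0   # portfolio PV at the end of the MPoR
flows_due_to_us = 1.5    # trade flows the counterparty owed during the MPoR
flows_due_by_us = 0.4    # trade flows we owed during the MPoR

# Classical-: margin AND trade flows stop at the start of the MPoR.
# Neither side pays, so the net unpaid flows add to the exposure.
exposure_minus = max(
    pv_at_default + flows_due_to_us - flows_due_by_us - vm_at_last_call,
    0.0,
)

# Classical+: margin flows stop, but trade flows from both sides settle
# normally until the end of the MPoR; only the market move contributes.
exposure_plus = max(pv_at_default - vm_at_last_call, 0.0)

print(f"Classical- exposure: {exposure_minus:.2f}")
print(f"Classical+ exposure: {exposure_plus:.2f}")
```

Even in this crude single-scenario form, the two conventions disagree whenever trade flows fall inside the MPoR, which is one reason both continue to co-exist in practice.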
After providing detailed comments on models for MPoR and analysis of exposures under variation and initial margin, Alexander concluded that in a highly stylized classical model for MPoR, residual CVA under IM is around 1% of CVA without IM. However, once the precise legal terms of ISDA/CSA are incorporated into the model, this number could be as high as 20% to 50%, depending on the portfolio. So there is sure to be more work on models and margins ahead.
E-Trading and Regulatory Requirements: How to Ensure that Your Algorithms Are Fully Compliant
Alexander Giese, MD, Head of Equity & Commodity Quants, UniCredit
There is no question about the rising importance of algorithmic trading, asserted Alexander Giese. Algorithms are widely used in banks for market making in bonds, warrants, certificates, listed derivatives, and other instruments. They also run behind market-making activity, driving optimal hedging and auto-hedging. As market microstructure has evolved, algorithms have been deployed for best execution of orders, particularly large orders, and in more recent years, smart order routing algorithms fan out across trading venues to find the best prices and liquidity available.
FOR ALL THEIR USEFULNESS, ALGORITHMS HAVE ALSO BROUGHT NEW CHALLENGES TO ORDERLY MARKET OPERATIONS AND RISK MANAGEMENT SYSTEMS.
Alexander offered several well-known examples of malfunctions that caused serious damage to the firms involved and in some cases to the broader markets. High-profile cases include the U.S. firm Knight Capital, which suffered a loss of $460 million in less than 45 minutes when a malfunction in a newly deployed system pushed millions of erroneous buy and sell orders on 140 U.S. equity names to the New York Stock Exchange in August 2012.
Just a year later, in August 2013, Goldman Sachs lost several million dollars due to a computer error whereby automated trading systems accidentally sent what were intended to be indications of interest as real orders to be filled at the exchange during the first 15 minutes of trading.
In December 2015, Barclays agreed to pay $150 million to resolve an investigation by New York’s banking regulator into the “last look” function of its electronic foreign-exchange trading business. The function allowed the bank to exploit a milliseconds-long lag between an order and its execution. The feature was designed to protect the firm from price swings, but could also be used to help the bank profit at the expense of its own customers.
In similar headlines at the end of the year, Deutsche Bank faced a lawsuit over high-speed trading, based on the assertion that the firm had used a software platform known as Autobahn to take advantage of millisecond changes in exchange rates to give clients worse prices than they were entitled to.
The Flash Crash is, of course, one of the most dramatic instances of market disruption due to algo trading in recent years, but all of these examples highlight the strong drive for regulation of algorithmic trading. Objectives include the avoidance of high operational losses, prevention of market abuse, increased market transparency, and promotion of greater stability in the financial markets.
To date, the main European regulations focused on algorithmic trading are the European Securities and Markets Authority (ESMA) Guidelines 2012/122, the German High-Frequency Trading Act (2013), the French Regulation on Algo Trading (2015), the Swiss Financial Market Infrastructure Act (FMIA), and the Markets in Financial Instruments Directive MiFID II (from 2018 on). Alexander outlined the MiFID II requirements in detail and concluded with remarks on the role of quants in this environment.
Clearly, quants can add a great deal of value at the intersection of IT infrastructure, algorithms, and regulation. Those who can cope with the policy language can help to translate regulatory requirements into practical implementations. On the more purely technical side, they can also assist in creating documentation on trading algorithms and related pricing functions, in collaboration with the IT department. Quants can also help to ensure efficient implementation of regulatory requirements and may support the mandatory testing of trading algorithms; on the last point Alexander pointed to increased awareness in the front office concerning approval processes and requirements, not only for new products, but also for new algorithms.
THE JOB OF THE QUANT IS CHANGING, BUT THOSE WHO ARE ABLE TO BRIDGE THE GAP WILL SURVIVE THE CURRENT ERA.
Prudent Valuation: Here We Go
Marco Bianchetti, Head of Fair Value Policy, Intesa Sanpaolo
Traditionally, quantitative finance practitioners are divided into two populations: those who seek fair value, i.e. means of price distributions, and those who seek risk measures, i.e. quantiles of price distributions. Fair value people and risk people typically live in separate lands and worship different gods: the profit-and-loss balance sheet and regulatory capital, respectively. Prudent Valuation is a rather unexplored midland that has recently emerged somewhere between the well-known mainlands of Pricing and Risk Management. The Capital Requirements Regulation (CRR) requires financial institutions to apply prudent valuation to all fair value positions. The difference between the prudent value and the fair value, called the Additional Valuation Adjustment (AVA), is deducted directly from Common Equity Tier 1 (CET1) capital. The Regulatory Technical Standards (RTS) for prudent valuation proposed by the EBA were adopted by the EU (Regulation 2016/101) on 28 January 2016. This talk explored the issues in depth, with detailed qualitative and quantitative observations backed by an array of data.
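The AVA mechanics can be sketched in a few lines. All position values below are invented for illustration; the RTS actually prescribes detailed AVA categories (market price uncertainty, close-out costs, model risk, and others) and, in the core approach, a per-position confidence target rather than a simple per-position difference.

```python
# Illustrative AVA mechanics under prudent valuation (CRR / EBA RTS).
# Figures are invented; prudent value is at most fair value by design,
# since prudent valuation takes a conservative view of each position.

positions = [
    # (fair value, prudent value) in millions (assumed figures)
    (100.0, 98.5),
    (250.0, 249.2),
    (40.0,  39.9),
]

# AVA per position: the gap between fair value and prudent value.
avas = [fair - prudent for fair, prudent in positions]
total_ava = sum(avas)

# The total AVA is deducted directly from CET1 capital.
cet1_before = 500.0
cet1_after = cet1_before - total_ava

print(f"Total AVA:  {total_ava:.2f}m")
print(f"CET1 after: {cet1_after:.2f}m")
```

The direct CET1 deduction is what pulls Prudent Valuation out of the pure pricing world: a valuation adjustment computed by fair-value people lands straight in the capital numbers watched by risk people.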