BCBS d352 – Impact of FRTB rules

By Philippe Carrel

Brace for change. By releasing its Fundamental Review of the Trading Book (FRTB)[1] on January 14, 2016, the Basel Committee on Banking Supervision (BCBS) effectively mandates the integration of all traded asset classes into a risk-factor-based computation framework. This will have the lasting effect of bringing enterprise risk management technology closer to the front office.

The new rules introduce three major changes to how trading book risk is defined and calculated:

  1. The new framework focuses on intentions (trading versus holding) rather than actual activities, and on the availability of reliable, transparent data provided on a daily basis.
  2. Risk is now decomposed into sensitivity-based risk factors, departing from the previous instrument-focused approach, which disregarded adjacent risks and the correlations between them.
  3. A default risk charge is now applicable to all relevant instruments, including equities.

The new internal model approach (IMA) requires calculating a liquidity-adjusted expected shortfall (ES), calibrated for horizons provided by the regulators and based on risk factors also defined by them. Less debated but at least as constraining is the back-testing requirement, involving P/L attribution and model validation, as well as daily reporting of each desk's VaR/ES and P/L p-value.[2] Each desk's eligibility for the IMA remains subject to back-testing results on a continuous basis; a desk can be downgraded to the standardized approach (SA-MR) whenever any of these requirements is no longer met (e.g., if market data become unavailable).
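
In simplified notation, the liquidity adjustment cascades a base-horizon ES across the prescribed liquidity horizons LH_j; the rendering below condenses the d352 formula and omits the stressed calibration details:

    ES = \sqrt{ \big( ES_T(P) \big)^2 + \sum_{j \ge 2} \Big( ES_T(P, j) \, \sqrt{ (LH_j - LH_{j-1}) / T } \Big)^2 }

where T is the 10-day base horizon, ES_T(P) is the ES computed over the full set of risk factors, and ES_T(P, j) is the ES in which only the risk factors with a liquidity horizon of at least LH_j are allowed to vary.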

The general wisdom is that, for the majority of banks that have relied on VaR-based models since the market risk amendment to Basel I, the new regulation will generally result in substantial capital increases. In our view, the necessary changes in operating models and IT infrastructure could be just as disruptive as the financial implications.

Policing the Trading Books

The definition of the trading book now hinges on trading intent rather than on the nature of the traded instruments. Unless it qualifies as a trading book item, any transaction is by default allocated to the banking book. Computing a daily P/L on a position is presumed to indicate trading intent; likewise, market making or running naked short positions necessarily pertains to trading activity.

The real complications start with the notion of internal risk transfers (IRT), which receive no recognition from a regulatory capital point of view unless the hedge occurs with an “eligible hedging third party” (i.e., a CCP) and exactly matches the IRT. Routine hedge operations involving derivatives to cover the bank's treasury exposure, bond holdings, or banking book equity exposures, for example, will need to be labeled as IRT and tracked individually in a process similar to hedge accounting.

Banks will need to create and maintain new intermediate portfolios containing internal trades only. The rules on IRT will require simplified pricing sources, perfect data integrity, and front office systems capable of tracking the hedges as bundles. Transparency rules will require banks to provide details of all curve calculations and models in use and to back-test all portfolio values.

The Standardized Approach

A key aspect of the new regulation is the principle of exposure decomposition by risk factor in a Sensitivity-Based Approach (SBA). The regulator has retained the following risk classes for the trading book: interest rates, credit spread (differentiated for securitization), equity, forex, and commodities, as well as default risk where applicable. See Exhibit 1 for instrument-specific details.

[Exhibit 1: Risk factor allocations for each asset class under the standardized market risk approach]

For the banking book, only forex and commodities risk will be considered. The net sensitivity for each risk factor within a risk class is multiplied by the corresponding risk weight provided by the regulator.
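
Schematically (using condensed notation rather than the full regulatory definitions), a net sensitivity s_k to risk factor k becomes a weighted sensitivity WS_k = RW_k · s_k, and the weighted sensitivities within a bucket b are combined using the prescribed correlations ρ_kl:

    K_b = \sqrt{ \max\Big( 0, \; \sum_k WS_k^2 + \sum_k \sum_{l \ne k} \rho_{kl} \, WS_k \, WS_l \Big) }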

Rates and spread risks are further decomposed by term structure (vertices). Futures are decomposed into underlying risk and forward rate risk. Commodity and forex risks, however, each remain a single net value, regardless of basis effects and their potential impact on forward volatility.

The General Interest Rate Risk (GIRR) framework specifies the risk-free and credit curves to be used and the time buckets onto which they are mapped. Vertex risk weights are applied to the buckets, together with inter-bucket correlations provided by the regulator. The capital charge is finally calculated with a formula that recognizes diversification across currencies through the prescribed correlation factors.
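
A minimal numerical sketch of that two-level aggregation is shown below; the function names, the example sensitivities, and the flat cross-bucket correlation are illustrative assumptions, not the regulatory parameters:

    import numpy as np

    def bucket_charge(weighted_sens, rho):
        """Intra-bucket charge: weighted sensitivities combined with prescribed correlations."""
        ws = np.asarray(weighted_sens, dtype=float)
        return np.sqrt(max(0.0, float(ws @ rho @ ws)))  # rho carries ones on its diagonal

    def cross_bucket_charge(bucket_charges, bucket_sums, gamma):
        """Aggregate bucket (e.g., currency) charges with a cross-bucket correlation gamma."""
        k = np.asarray(bucket_charges, dtype=float)
        s = np.asarray(bucket_sums, dtype=float)
        off_diag = float(np.sum(np.outer(s, s)) - np.sum(s * s))  # sum of S_b * S_c for b != c
        return np.sqrt(max(0.0, float(np.sum(k ** 2)) + gamma * off_diag))

    # Illustrative inputs: two currency buckets, three vertices each
    rho = np.array([[1.0, 0.7, 0.5],
                    [0.7, 1.0, 0.7],
                    [0.5, 0.7, 1.0]])
    ws_usd = [120.0, -80.0, 40.0]   # risk-weighted net sensitivities per vertex
    ws_eur = [60.0, 30.0, -10.0]
    charges = [bucket_charge(ws_usd, rho), bucket_charge(ws_eur, rho)]
    sums = [sum(ws_usd), sum(ws_eur)]
    print(round(cross_bucket_charge(charges, sums, gamma=0.5), 2))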

The Credit Spread Risk (CSR) framework similarly relies on decomposition by time bucket, with an option-adjusted spread (OAS) calculation for each credit spread. Equities and equity forwards/futures must include default risk in addition to the value-based exposure. Options are measured in delta-equivalent terms, with an additional default risk allocation for equity options.
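
As a stylized illustration of the delta-equivalent treatment (the generic notion of a delta-equivalent position, not the exact regulatory sensitivity definition), an option position contributes to the delta exposure of its underlying roughly as

    s_{\delta} \approx \Delta \cdot S \cdot q

where Δ is the option delta, S the price of the underlying, and q the number of underlying units, with vega and, for equity options, default risk captured separately.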

A three-pronged approach is chosen for securitization to encompass credit and market risks across the various time buckets. Similarly, credit default swaps (CDS) must combine three risk classes: default, spread, and interest rates.

Exhibit 1 above shows the factor allocations for each asset class under the standardized market risk approach.

The decomposition of exposure by risk factor and vertex, carried out at the trading desk level, is one of the most disruptive changes introduced by the new regulation. It requires data derived from trade tickets, inventories, pricing models, and more.
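
One way to picture the resulting data requirement is a per-trade sensitivity record carrying the keys needed for desk-level aggregation and regulatory bucketing; the field names below are purely illustrative assumptions:

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class SensitivityRecord:
        """One decomposed exposure, derived from a trade ticket plus a pricing model."""
        as_of: date                # valuation date
        desk_id: str               # regulatory trading desk
        trade_id: str              # link back to the trade ticket / inventory position
        risk_class: str            # e.g. "GIRR", "CSR", "Equity", "FX", "Commodity"
        risk_factor: str           # e.g. a specific curve, spread, or spot factor
        vertex: Optional[str]      # term-structure bucket, where applicable (e.g. "5Y")
        sensitivity: float         # net sensitivity to the prescribed shift in the factor
        currency: str              # reporting currency of the sensitivity
        pricing_model: str         # model used, so capital and risk views stay aligned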

In terms of IT architecture, the structure of the existing application portfolio will determine how arduous the task will be:

  • Firms that have achieved straight-through processing vertically (i.e., using standalone applications dedicated to specific asset classes) will either need to create an additional platform to collect and process data at various levels of the processing chain, or evolve toward a cross-asset structure capable of computing multi-asset sensitivities
  • Firms with a portfolio of front and back office applications based on the legal structures of the traded instruments (cash, derivatives, OTC, money market securities, exchange-traded securities, etc.) will find it almost impossible to achieve the risk decomposition and data matching without a complete overhaul
  • Firms with horizontal integration (i.e., a cross-asset front office and a single back office platform interfaced together) will feel a lot less pain

Internal Models

A bank can apply for the internal model approach (IMA), subject to regulatory approval. Each trading desk will then be placed under the supervision of an independent risk manager tasked with ex-post comparisons of risk measures with the actual P/L generated each day, over long periods of time. That “desk-level risk department” is also responsible for periodic methodology audits, data integrity, and model validation, specifically verifying that the models used for regulatory capital purposes are the same as those used for actual risk management.

The expected shortfall (ES) is calculated daily for each trading desk, at the 97.5th percentile, for n-business-day movements in risk factors (with n determined by liquidity). A period of stress should be included in the data history. Banks will propose their reduced (relevant) set of risk factors (necessarily including the SA factors), for which they can justify 10 years of quality data history. The data must be updated at least monthly. Correlations are derived empirically from the prescribed risk factors, using the appropriate liquidity horizons. Options will also include a vega risk sensitivity. Default risk needs to be computed independently, using VaR with 10 years of data over a one-year horizon. Finally, each risk factor needs to be validated each month with evidence of “real price” market activity.
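
A minimal sketch of the desk-level ES number itself, assuming a vector of simulated n-day P/L outcomes is already available (scenario generation, the stressed calibration, and the liquidity-horizon cascade are outside this snippet):

    import numpy as np

    def expected_shortfall(pnl_scenarios, level=0.975):
        """Average loss beyond the 97.5th percentile of the loss distribution."""
        losses = -np.asarray(pnl_scenarios, dtype=float)   # express losses as positive numbers
        var = np.quantile(losses, level)
        return float(losses[losses >= var].mean())

    # Illustrative only: 250 simulated 10-day P/L outcomes for one desk
    rng = np.random.default_rng(0)
    pnl = rng.normal(0.0, 1_000_000.0, size=250)
    print(f"97.5% ES: {expected_shortfall(pnl):,.0f}")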

Implications

To create these IMA measures, a full daily revaluation is necessary for every liquidity horizon and asset class, using the relevant sets of stressed data. Besides the obvious performance challenge, the most daunting part of the exercise certainly lies in data consistency. Since compliance depends on the convergence of the P/L measures, achieving it with different data sets or pricing methodologies would be a real challenge. Risk factors backed by insufficient data will be deemed “non-modellable” and attract a higher capital cost.
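
As a sketch of the kind of convergence check this implies, the P/L attribution test compares the desk's hypothetical P/L with the P/L explained by the risk model; the two ratios below follow the spirit of the January 2016 text, but the exact definitions and thresholds should be taken from the regulation itself:

    import numpy as np

    def pl_attribution_metrics(hypothetical_pnl, risk_theoretical_pnl):
        """Bias and noise of the unexplained daily P/L relative to the hypothetical P/L."""
        hpl = np.asarray(hypothetical_pnl, dtype=float)
        rtpl = np.asarray(risk_theoretical_pnl, dtype=float)
        unexplained = rtpl - hpl
        mean_ratio = unexplained.mean() / hpl.std(ddof=1)           # bias of unexplained P/L
        variance_ratio = unexplained.var(ddof=1) / hpl.var(ddof=1)  # noise of unexplained P/L
        return mean_ratio, variance_ratio

Persistent breaches of the attribution thresholds push the desk back to the standardized approach, as noted earlier.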

For both the standardized and internal approaches, institutions will need a data model capable of accommodating transactions across all desks and splitting exposures into risk factors. The platform will need to warehouse the pricing models, revalue the books, shock and interpolate all relevant curves and surfaces, and run simulations at high speed. It could be an insert within the STP process itself, or it could be its core. Either way, it will likely require a material re-architecture of the STP process.
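
A stylized interface for such a platform might look as follows; the names and granularity are assumptions, intended only to show where warehoused pricing models, shocked curves, and simulations meet in a single revaluation path:

    from typing import Callable, Dict, Iterable, List

    # Maps an instrument type to its warehoused pricing function: (position, scenario) -> value
    PricingLibrary = Dict[str, Callable[[dict, dict], float]]

    def revalue_book(positions: Iterable[dict], pricers: PricingLibrary, scenario: dict) -> float:
        """Full revaluation of a book under one scenario of shocked curves and surfaces."""
        return sum(pricers[p["instrument_type"]](p, scenario) for p in positions)

    def scenario_pnl(positions, pricers, base_scenario, shocked_scenarios) -> List[float]:
        """P/L of each shocked scenario against the base; feeds ES, stress tests, back-testing."""
        base = revalue_book(positions, pricers, base_scenario)
        return [revalue_book(positions, pricers, s) - base for s in shocked_scenarios]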

Philippe Carrel is a member of the Global Association of Risk Professionals.
[1] BCBS d352, Minimum Capital Requirements for Market Risk, January 2016.
[2] The probability, under the desk's risk model, of observing a profit less than or a loss greater than the actual daily result.