In January 2016, the evolution of FRTB culminated in the Basel Committee on Banking Supervision (BCBS) publishing the finalised standards, titled Minimum capital requirements for market risk. The new standards replace the existing regulatory framework for market risk and go beyond the quantitative measurement of risk alone. They also consider internal practices, processes, and other qualitative aspects of a bank’s risk management landscape.
FRTB is set to revolutionise current market risk practices, placing emphasis on the coordination of operational, risk and data management processes as well as systems and technology.
One of the biggest challenges of the new standards is the collection and management of quality market risk data. Banks will need to source, process and store more data, and trace its lineage through the various processes it passes through. Data used in risk models must also adhere to much stricter quality requirements; therefore, more effort will be expended on data analysis and cleaning. The growth in the amount of data required has been intensified by a stronger focus on ‘what if’ capital analysis and budgeting.
For IMA, the new qualitative criteria include longer time spans of historical data used as inputs for modelling. Expected shortfall must be calibrated to the bank’s most stressful period over an observation horizon going back to 2007. Banks are also required to update their observations on a monthly basis, meaning there will be an ever-growing mountain of data that needs to be stored, maintained and fed into modelling processes.
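The calibration described above can be sketched with a simple historical-simulation approach: compute expected shortfall at the 97.5% confidence level used under FRTB over rolling one-year windows and keep the worst result. The function names and window length are illustrative assumptions, not the regulatory algorithm itself.

```python
import numpy as np

def expected_shortfall(returns, alpha=0.975):
    """ES at confidence alpha: the mean loss beyond the VaR quantile."""
    losses = -np.asarray(returns)
    var = np.quantile(losses, alpha)
    tail = losses[losses >= var]
    return tail.mean()

def stressed_es(returns, window=250, alpha=0.975):
    """Scan rolling ~one-year windows (250 trading days, an assumption)
    over the full history and return the highest ES and the start index
    of that window, approximating calibration to the most stressful
    observed period."""
    r = np.asarray(returns)
    best_es, best_start = -np.inf, 0
    for start in range(len(r) - window + 1):
        es = expected_shortfall(r[start:start + window], alpha)
        if es > best_es:
            best_es, best_start = es, start
    return best_es, best_start
```

In practice the scan would run over risk-factor histories reaching back to 2007, which is precisely why the monthly refresh requirement compounds the data storage burden.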
For SA, banks will need detailed position and instrument data to accurately identify risk factor sensitivities and allocate positions to risk buckets. This needs to go hand-in-hand with new mapping and dimensional data, e.g. industry sector or market capitalisation, as gaps here can lead to unclassified positions, which attract the highest risk weight and forgo hedging benefits within buckets.
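The cost of a gap in dimensional data can be made concrete with a small sketch. The bucket names and risk weights below are invented for illustration, not the BCBS tables; the point is the fall-through to a punitive catch-all when mapping data is missing.

```python
# Hypothetical mapping of equity positions to SA risk buckets by market-cap
# band and sector. Bucket labels and weights are illustrative only.
SECTOR_BUCKETS = {
    ("large", "financials"): ("bucket_3", 0.30),
    ("large", "technology"): ("bucket_4", 0.35),
    ("small", "financials"): ("bucket_7", 0.50),
}
# Unclassified positions land here: highest weight, no within-bucket netting.
OTHER_BUCKET = ("bucket_other", 0.70)

def assign_bucket(market_cap_band, sector):
    """Return (bucket, risk_weight); any position with missing or
    unrecognised dimensional data falls through to the 'other' bucket."""
    return SECTOR_BUCKETS.get((market_cap_band, sector), OTHER_BUCKET)
```

A single missing sector code is enough to push a position into the most expensive bucket, which is why the mapping data deserves the same quality controls as the market data itself.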
The BCBS wants to encourage banks towards sourcing better quality data for use within their models. For a risk factor to be considered modellable, banks need to continuously obtain “real prices” across a sufficient set of representative transactions.
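A modellability check along these lines can be sketched as below. The thresholds encode a simplified, commonly cited reading of the January 2016 criterion, at least 24 real-price observations over the trailing year with no more than roughly a month between consecutive observations, and should not be taken as the verbatim rule.

```python
from datetime import date, timedelta

def is_modellable(obs_dates, as_of, min_obs=24, max_gap_days=31):
    """Illustrative real-price test: require at least `min_obs` distinct
    observation dates in the trailing year and no gap between consecutive
    observations exceeding `max_gap_days`. Thresholds are a simplified
    assumption, not the regulatory text."""
    window_start = as_of - timedelta(days=365)
    dates = sorted(d for d in set(obs_dates) if window_start <= d <= as_of)
    if len(dates) < min_obs:
        return False
    gaps = [(later - earlier).days for earlier, later in zip(dates, dates[1:])]
    return all(g <= max_gap_days for g in gaps)
```

Failing either leg of the test would push the risk factor into the non-modellable population, with the capital consequences described below.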
A lack of sufficient quality data to model risk factors or arrive at prescribed sensitivities would transfer more risk to the capital-intensive NMRF (non-modellable risk factor) component (RRA or SES), thereby increasing the overall market risk charge. Even where risk factors can be modelled, poor-quality data can lead to desks losing their IMA approval or result in a less-than-optimal capital requirement.
In a recent Quantifi survey*, 80% of respondents said that, in order to deliver the front-to-back architecture needed for market risk calculations, reporting and management, they plan to adopt a shared or single market data feed across front office and risk. This confirms the need for a unified set of front-to-back aligned risk models, calibrations and data capabilities to achieve optimal trading, risk pricing, performance and cost-efficiency.
*’FRTB – Are Banks Prepared?’
Operational and strategic considerations
To address the problem of banks building models using data of insufficient quality or quantity, the new eligibility criteria require desks to demonstrate that data is real and derived from actual transactions. This means banks will need to attribute more risk to NMRF if they cannot show lineage by tracking data at desk and portfolio level. To achieve this, banks will need visual audit trails, for both data and processes, plus rigorous metadata management practices to maintain data dictionaries and ensure all data assets are catalogued.
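The kind of traceability involved can be illustrated with a minimal append-only audit trail. The schema, asset names and step labels are hypothetical; a production catalogue would sit in dedicated metadata tooling rather than in-process objects.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class LineageEvent:
    """One step in a data asset's audit trail (hypothetical schema)."""
    asset: str          # e.g. "eurusd_spot_close" (illustrative name)
    step: str           # e.g. "sourced", "cleaned", "fed_to_es_model"
    actor: str          # system or desk responsible for the step
    timestamp: datetime

class LineageLog:
    """Append-only trail keyed by asset, sketching the evidence a desk
    would keep to show data is real and traceable end to end."""
    def __init__(self):
        self._events = []

    def record(self, asset, step, actor):
        self._events.append(
            LineageEvent(asset, step, actor, datetime.now(timezone.utc)))

    def trail(self, asset):
        """Return the recorded steps for one asset, in order."""
        return [e for e in self._events if e.asset == asset]
```

Queried per desk and portfolio, a trail like this is what lets a bank demonstrate which transactions a modelled risk factor ultimately rests on.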
FRTB also presents a business optimisation problem, because banks will have to redefine their trading desk structure to produce the best capital and operational outcome. This may conflict with some banks’ existing incentives to structure desks around trading mandate or to optimise individual risks.