How to Accelerate XVA Performance

As banks look to reduce, mitigate, and optimize XVA and other capital charges, they are investing in XVA capabilities to solve the computational challenge of simulating a full universe of risk factors.

In the post-crisis world, an increasing number of banks have set up a centralized XVA desk. With the introduction of new regulations to ensure banks are adequately capitalized, it has become common practice to include certain costs in the pricing of OTC derivatives that, in many cases, had previously been ignored. To price the cost of dealing with a counterparty in a derivative transaction, the market has developed various metrics, including CVA, DVA, FVA, ColVA, KVA, and MVA, collectively known as XVAs.

One of the key challenges of XVAs is that adjustments need to be calculated on a portfolio basis rather than trade by trade, which demands orders of magnitude more calculations for accurate results. The calculation of XVAs is highly complex, combining the intricacies of derivative pricing with the computational challenges of simulating a full universe of risk factors. Given the strategic importance of XVA, banks require enhanced capabilities and modern infrastructure to calculate the required credit, funding, and capital adjustments.
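To make the scale of the problem concrete, the following is a minimal sketch of a unilateral CVA calculation by Monte Carlo. All model choices here are illustrative assumptions (a single GBM-driven risk factor, a forward-like trade, flat hazard and discount curves), not a production methodology; a real XVA engine repeats this over thousands of risk factors and every trade in a netting set.

```python
import numpy as np

rng = np.random.default_rng(42)

n_paths, n_steps, T = 10_000, 40, 10.0   # paths, time steps, horizon (years)
dt = T / n_steps
times = np.linspace(dt, T, n_steps)

# Simulate one illustrative risk factor (e.g. an FX rate) under GBM.
sigma = 0.15
z = rng.standard_normal((n_paths, n_steps))
log_paths = np.cumsum((-0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
fx = np.exp(log_paths)

notional = 1_000_000.0
mtm = notional * (fx - 1.0)          # mark-to-market of a toy forward-like trade
exposure = np.maximum(mtm, 0.0)      # counterparty exposure is the positive part
epe = exposure.mean(axis=0)          # expected positive exposure profile over time

# Flat hazard rate and discount curve (assumed parameters).
hazard, r, recovery = 0.02, 0.03, 0.4
surv = np.exp(-hazard * times)                           # survival probabilities
default_prob = -np.diff(np.concatenate(([1.0], surv)))   # PD in each interval
df = np.exp(-r * times)                                  # discount factors

# CVA = loss-given-default times discounted expected exposure weighted by PD.
cva = (1.0 - recovery) * np.sum(epe * df * default_prob)
print(f"CVA estimate: {cva:,.0f}")
```

Even this single-factor toy already requires 400,000 path-step valuations; the portfolio-level problem multiplies that by the number of trades and risk factors, which is why the computational burden dominates XVA infrastructure decisions.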

Another key challenge is how to efficiently calculate XVA sensitivities. While sensitivities have always been an important component of XVA desk risk management, the FRTB-CVA framework published by the Basel Committee in 2015 has made managing regulatory capital a priority for banks globally, further driving the demand for the calculation of sensitivities. Banks that are unable to calculate the CVA capital charge using the sensitivity-based FRTB approach will have to use the rather punitive formula-based basic approach.
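The cost of sensitivities is easy to see with the common bump-and-revalue approach: every market input requires additional full revaluations. The sketch below uses a stand-in pricer (an assumption for illustration, not an XVA engine) with central finite differences; when the "pricer" is itself a portfolio-level Monte Carlo, each bump multiplies the total compute.

```python
import numpy as np

def present_value(params: dict) -> float:
    # Toy stand-in "pricer": PV depends smoothly on a few market inputs.
    s, r, sigma = params["spot"], params["rate"], params["vol"]
    return s * np.exp(-r) * (1.0 + 0.5 * sigma**2)

def bump_and_revalue(pricer, params, bump=1e-4):
    """Central finite differences: dPV/dx for every market input x.

    Each sensitivity costs two extra revaluations, so N inputs cost
    2N full pricings on top of the base valuation.
    """
    sens = {}
    for key in params:
        up, down = dict(params), dict(params)
        up[key] += bump
        down[key] -= bump
        sens[key] = (pricer(up) - pricer(down)) / (2.0 * bump)
    return sens

base = {"spot": 100.0, "rate": 0.03, "vol": 0.2}
greeks = bump_and_revalue(present_value, base)
for name, value in greeks.items():
    print(f"d PV / d {name}: {value:.4f}")
```

This quadratic growth in revaluations is what pushes banks toward techniques such as algorithmic differentiation and toward the hardware optimisations discussed below.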

The demand for higher performance has highlighted the need to get the most out of the latest generation of software. A distributed architecture that supports the heavy demands of big data provides a number of benefits when dealing with large, complex portfolios, the main ones being scalability, reliability, and resilience. However, the use of distributed computing for calculating XVA also presents a number of challenges, mainly with regard to I/O performance and central processing unit (CPU) processing.
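Within a single node, much of the available performance comes from vectorisation. The sketch below, with illustrative array shapes, contrasts a naive Python loop with a single vectorised NumPy expression for the same expected-positive-exposure aggregation; both produce identical results, but the vectorised form eliminates interpreter overhead and lets the library use SIMD-friendly kernels.

```python
import numpy as np

rng = np.random.default_rng(0)
mtm = rng.normal(size=(5_000, 50))   # paths x time steps of netted mark-to-market

def epe_loop(mtm):
    # Scalar Python loops: every element access goes through the interpreter.
    n_paths, n_steps = mtm.shape
    epe = [0.0] * n_steps
    for t in range(n_steps):
        for p in range(n_paths):
            if mtm[p, t] > 0.0:
                epe[t] += mtm[p, t]
        epe[t] /= n_paths
    return epe

def epe_vectorised(mtm):
    # One fused array expression: the inner loops run in optimised
    # native code instead of the Python interpreter.
    return np.maximum(mtm, 0.0).mean(axis=0)

assert np.allclose(epe_loop(mtm), epe_vectorised(mtm))
```

The same principle, keeping data in contiguous arrays and expressing the calculation as bulk operations, carries over to distributed XVA engines, where it also reduces the I/O pressure of shipping results between nodes.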
