How is Data Science Transforming Banking and Capital Markets?

Part 1 of this blog series explores data science in the context of risk management technology and operations. This post reflects on how the financial environment, and the broader landscape, have changed over the last decade, and on the market trends driving the rise of data science approaches.
26 May, 2020

Harsh Realities from the Financial Crisis

Looking at the headline statistics that characterize banking and capital markets over the last 10 years, it seems inevitable that the harsh realities of the 2008 financial crisis will be amplified post-COVID-19. For example, over the last decade the average banking industry return on equity has dropped from around 15% to the 5-10% range, a significant decline. At the same time, regulatory capital buffers have increased by more than 300%, which roughly translates to capital of about 12-14% of risk-weighted assets, compared with buffers of 4-6% before the crisis. Simultaneously, bank operating expenses associated with risk, compliance, technology and operations have skyrocketed, with spending expected to exceed $90 billion globally by 2022. Judging by these figures, even without the uncertainties that lie ahead, it seems reasonable to conclude that conditions have already been challenging for many sell-side banks, and increasingly for buy-side firms as well.

Market Trends – Amplification of Change Drivers

There are several big-picture themes expected to drive change for financial institutions in the coming years. The first of these drivers is technology-related and includes digitization and the application of big data, high-performance computing and data science across different parts of a bank's value chain. Financial services and their risk management functions have always been participants in, as well as beneficiaries of, technology advancement. However, the current wave of emerging technologies is both a source of risk and an enabler for many institutions. As with any change, there are the typical discussions around better automation, performance and new functionality. Forward-thinking executives are now exploring what a digital operating model might actually look like for the risk and control function. Some are asking how to go beyond being just a control function and meaningfully use these technology advancements to become a more useful resource for the business.

Secondly, across all financial services sectors, including banking, insurance, trading and investment management, there has been a rise of FinTech adopters and challenger firms. These FinTech firms have fewer legacy encumbrances to contend with, they focus on differentiated customer experience, and some are employing more innovative risk algorithms; for example, supervised machine learning for credit underwriting or for detecting fraud and cyber-attacks, both of which are especially pertinent now.
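
As a concrete illustration of that last point, the sketch below trains a simple supervised classifier on synthetic credit application data. It is a minimal example assuming scikit-learn; the feature names, coefficients and labels are invented for illustration and do not reflect any firm's actual underwriting model.

```python
# A minimal sketch of supervised learning for credit underwriting.
# Assumes scikit-learn; features, coefficients and labels are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 5_000

# Synthetic applicant features: income, debt-to-income ratio, prior defaults.
income = rng.normal(50_000, 15_000, n)
dti = rng.uniform(0.05, 0.6, n)
prior_defaults = rng.poisson(0.2, n)

# Synthetic default labels: higher DTI and prior defaults raise default odds.
logit = -2.0 + 4.0 * dti + 0.8 * prior_defaults - income / 100_000
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = np.column_stack([income, dti, prior_defaults])
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Scale features, then fit a simple, explainable classifier.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]
print(f"Holdout AUC: {roc_auc_score(y_test, scores):.3f}")
```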

All of these changes are raising customer expectations around speed and quality of service, and control functions will therefore need to feature seamlessly as part of the overall customer journey. For example, if you are doing pre-deal risk and capital-adjusted pricing, or credit risk underwriting and approvals, then some of these risk processes and calculations will need to come in line with the core business process provided to the customer.
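
In code terms, bringing such a check inline might look something like the following sketch. The function, limit values and flat capital charge are hypothetical assumptions used only to illustrate the idea of folding a limit check and a capital adjustment directly into the deal-capture flow.

```python
# Hypothetical sketch of a pre-deal check run inline with deal capture.
# The function, limits and flat capital charge are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class PreDealResult:
    approved: bool
    quoted_spread_bps: float
    reason: str

def pre_deal_check(notional: float, current_exposure: float,
                   counterparty_limit: float, base_spread_bps: float,
                   capital_charge_bps: float = 15.0) -> PreDealResult:
    """Limit check plus capital-adjusted pricing at the point of quoting."""
    if current_exposure + notional > counterparty_limit:
        return PreDealResult(False, 0.0, "counterparty limit breached")
    # Fold a flat capital charge into the quoted spread so the cost of
    # capital is reflected at the point of sale, not reconciled after.
    return PreDealResult(True, base_spread_bps + capital_charge_bps, "ok")

print(pre_deal_check(10e6, 30e6, 50e6, 80.0))   # approved at 95 bps
print(pre_deal_check(10e6, 45e6, 50e6, 80.0))   # limit breached
```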

Regulation continues to be a source of complexity, inefficiency and risk for institutions. Prudential and conduct-related regulations are typically principles-based rather than prescriptive, and are therefore subject to regulatory discretion, especially in uncertain times. Systemic institutions, for example, are more likely to face increased supervisory queries and to be subjected to higher capital add-ons. Some regulator-facing activities will therefore need to be better streamlined.

Lastly, there are new types of uncertainties and risks that have not conventionally been areas of focus for the risk function: for example, digital assets and their intersection with the real economy, cyber risk, climate change, and some of the emerging risks associated with the increased usage of AI and machine learning. All of these areas are now receiving more attention from a risk management perspective. More pertinently, when lockdown ends in the coming months, human beings, in collective terms, will also be considered risk factors in the workplace from a business standpoint.

Taking the current challenges into account, the stakes are not just about new processes and operational efficiency but also about achieving greater customer advantage and capital efficiency across lending and trading. This will only become more important in the years ahead as firms deal with ongoing waves of change and the fallout from the current global pandemic.

The Great Quantitative & Analytics Challenge

As a general statement, it is probably fair to say that to remain differentiated, or at least to keep pace with the curve in risk management technology and analytical capabilities, a firm needs a fair degree of investment, both up front and on an ongoing basis. Analytics development is still highly labour-intensive and is often replicated across the different asset classes and instruments a firm trades. It also involves specialized programming and quantitative finance skill sets that are conventionally difficult to recruit for. The majority of model development activities still involve significant manual data wrangling and data quality issues, with limited reusability of models and code. Finally, many firms with tightly coupled connections between applications and system components find such changes prohibitively costly and risky to make.
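
To give a feel for the kind of specialized quantitative code involved, here is a minimal Black-Scholes call pricer: the sort of routine frequently re-implemented desk by desk rather than reused. It uses only the Python standard library and is a sketch, not a production pricer.

```python
# A minimal Black-Scholes European call pricer (no dividends).
# Standard-library only; a sketch of typical per-desk quant code.
from math import exp, log, sqrt
from statistics import NormalDist

def bs_call(spot: float, strike: float, rate: float,
            vol: float, t: float) -> float:
    """Price a European call under Black-Scholes assumptions."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol**2) * t) / (vol * sqrt(t))
    d2 = d1 - vol * sqrt(t)
    N = NormalDist().cdf
    return spot * N(d1) - strike * exp(-rate * t) * N(d2)

# Example: at-the-money one-year call, 2% rate, 25% vol.
print(f"ATM 1y call: {bs_call(100, 100, 0.02, 0.25, 1.0):.2f}")
```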

Celent has conducted a study on the cost of deploying analytics capabilities for front and middle office trading. The study focused on derivatives pricing and risk, and it illustrates the costs associated with some of these efforts when developing in-house.

Figure 1: Total Cost of Derivatives Analytics

The key findings in Figure 1 suggest that firms pursuing areas such as derivatives pricing and risk analytics would require an upfront investment of at least $9 million and a total cost of ownership of $25-36 million over five years. It is important to note that this study applies more to larger Tier 1 institutions, and whilst the majority of firms may not reach the higher end of that range, the levels of expenditure can still be significant relative to the size of any firm. Spend would also be dictated by the nature of a firm's investment strategy and how dynamic and sophisticated its analytical models need to be.

Data and Alignment Issues

The provisioning and deployment of analytics for front office positioning, risk measurement and portfolio modelling still represents one of the last bastions where costs can be further streamlined. Because many firms contend with fragmented tools, data and development environments, analytical integration, validation and governance activities remain highly manual and onerous. Against the current challenging backdrop, firms will need to ensure that data analytics are efficient and lean from a cost perspective, so that there is headroom for innovation and more forward-looking change. This tends to be where firms typically fall down, or where there is a technical disconnect across the organization.

What poses the biggest complication for firms wanting to quickly assess financial impact, forecasting and planning is that multifaceted dimensions, scenarios and granular factors are involved. The planning, modelling and formulation of business response strategies are not always easily compartmentalised within single systems. For many organizations the disconnect lies in the ability to respond quickly based on timely and accurate information. Figure 2 highlights the results from a 2018 Celent survey. Common pain points relate to information management issues: fragmented tools, data and information quality, and inadequately granular reporting. This does not reveal anything ground-breaking or new, yet it does highlight that there are still improvements to be made.

Figure 2: Challenges with Existing Risk Systems/Data

The gap between the day-to-day challenges faced by business unit and departmental stakeholders and what the organization needs in order to be strategically responsive still presents a stark picture for many firms. This is where next-generation technologies and approaches could help when deployed appropriately. As firms move beyond immediate business continuity responses to prepare for life after lockdown, digitization trends and firms' appetite to adopt data science approaches are likely to increase, and this holds interesting possibilities for the risk function as well.

Technology and Innovation

The last decade, and especially the past five years, has seen the rise of symbiotic emerging technologies and next-generation approaches that are fanning the flames of innovation and change across the financial industry and beyond. There are now synergistic groups of technologies and IT paradigms operating at scale that will further accelerate digitization and boost resiliency. This is happening around a combination of areas such as high-performance virtualization of hardware and software infrastructure via the cloud, advanced analytics and data science, coupled with collaborative development paradigms that drive operational efficiency and business collaboration.


These modern data science approaches, and the ecosystem of components that has emerged in recent years, can, when properly deployed, enable firms to address some of the limitations inherent in incumbent risk systems and operations. For instance, in a typical front and middle office risk management environment, quants, developers, data scientists and risk analysts may not work efficiently due to bottlenecks in their IT departments, such as slow provisioning of computing resources or a lack of timely access to data. Collaboration with peers is also hampered by the fragmentation of tools and datasets, as well as a general lack of collaboration features in some of those tools, such as Excel spreadsheets.

In contrast, on-demand access to large computational resources via the cloud, combined with high-performance data stores using in-memory architectures, enables firms to do more ad-hoc analysis, testing and validation of models in a centralized framework, using the most granular levels of data without the need to pre-aggregate or pre-format it. Celent has seen growth in firms looking to infuse a data science approach into both front line and risk management activities.
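
The sketch below illustrates the pattern: answering an ad-hoc question directly at trade granularity rather than against a pre-built aggregate. It assumes pandas, and the dataset and column names are synthetic placeholders.

```python
# Sketch of ad-hoc, trade-granular analysis with no pre-aggregation step.
# Assumes pandas; the dataset and column names are synthetic placeholders.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 100_000
trades = pd.DataFrame({
    "desk": rng.choice(["rates", "credit", "fx"], n),
    "counterparty": rng.choice([f"CPTY_{i:02d}" for i in range(50)], n),
    "mtm": rng.normal(0, 1e5, n),   # mark-to-market per trade
})

# One ad-hoc question, answered straight from trade-level data:
# which counterparties drive the largest net exposure on each desk?
exposure = (trades.groupby(["desk", "counterparty"])["mtm"]
                  .sum()
                  .sort_values(ascending=False))
print(exposure.groupby(level="desk").head(3))
```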

Improving the use of data and analysis is a top management priority

Figure 3 is taken from a 2019 Celent survey of risk management functions. It shows that the most important focus for firms is improving the use of data and analytics, alongside a growing interest in augmenting or enhancing risk systems. The goal of these enhancements is to deliver more timely information to frontline positions and to allow dynamic risk monitoring based on changes in the risk conditions a firm faces. In Figure 4 we can see that firms are looking for better risk information both to inform specific decisions and to assess emerging risks.

Figure 3
Figure 4

When firms look to use risk information to support specific decisions, especially more one-off strategic decisions or the identification of emerging risks facing the organization, this usually involves steps that are exploratory and typically not well defined. Risk analysts, data scientists or business unit groups may need to collaboratively conduct research and gather conventional and alternative data sets to designate proxies in order to make these bespoke assessments. The nature of these tasks is usually ad-hoc, and they require iterative what-if queries to test and validate hypotheses, for which conventional risk systems, and even transactional systems, may not be well suited. Data science tools are apt for these activities, and this is where there have been shifts in strategic intent towards the use of data science approaches.
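
As a flavour of what such iterative what-if queries can look like in a data science tool, the sketch below applies joint rate and equity shocks to a toy portfolio. The portfolio, sensitivities and shock sizes are all illustrative assumptions.

```python
# A hedged sketch of iterative what-if queries over a toy portfolio.
# Positions, sensitivities and shocks are illustrative assumptions.
import pandas as pd

portfolio = pd.DataFrame({
    "instrument": ["UST_10Y", "CORP_BBB", "EQ_INDEX"],
    "value": [10e6, 5e6, 8e6],
    "rate_beta": [-8.5, -5.0, 0.0],   # % value change per 1% rate rise
    "equity_beta": [0.0, 0.3, 1.0],   # sensitivity to an equity shock
})

def what_if(rates_up_pct: float, equity_down_pct: float) -> float:
    """Total P&L of the portfolio under a joint rates/equity scenario."""
    pnl = (portfolio["value"] * portfolio["rate_beta"] / 100 * rates_up_pct
           + portfolio["value"] * portfolio["equity_beta"] * -equity_down_pct / 100)
    return pnl.sum()

# Iterate over candidate scenarios interactively, as an analyst might.
for rates, eq in [(0.5, 5), (1.0, 10), (2.0, 20)]:
    print(f"rates +{rates}%, equities -{eq}%: P&L {what_if(rates, eq):,.0f}")
```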

Growing demand for quantitative finance and data analytics expertise

The results of a 2019 Celent survey show that demand for data science and data analytics expertise has grown significantly in recent years. Risk functions are recruiting data scientists and machine learning experts who can perform advanced mathematical and statistical analysis. At present, these skill sets are still expensive and relatively difficult to find.

Figure 5: Desired Expertise in Future Risk Function Hires


Fragmented quantitative and development infrastructure, tools and data sources can hamper a firm's ambition to grow and scale up: for instance, in the efficiency of model and IT development life cycles, in the ability to work collaboratively (sharing code, findings and charts productively), and in streamlining supervisory interactions and the regulatory transparency requirements around underlying models and calculations.

Risk and compliance functions will be significant beneficiaries of collaborative, enterprise-grade data and quantitative platforms

In relation to some of the points outlined above, the one change that will be significant in the coming years is the emergence of next-generation data science offerings. This is connected to the rise of centralized, rapid workbench environments with enterprise-grade data and quantitative analytics for development, testing and production tasks. These industrialized data and model-building sandbox facilities will allow data scientists, quants, business experts and end-users to consolidate, prepare and manufacture analytical and data components in a collaborative manner. The concept of a financial and risk modelling tool for ad-hoc prototyping, data analysis and data interrogation is not new, but until now the use of more industrial-grade tools has yet to take off in a meaningful way.

Technology advancements are making it more compelling for firms to embrace and standardize on studio environments based on cloud-native computing and storage with interactive, on-the-fly analysis. These environments use script-based languages, such as Python, that are business-friendly and have facilities and native integration for BI visualization and charting, along with machine learning libraries.
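
To make this concrete, the snippet below shows the style of interactive analysis these environments target: a rolling historical VaR computed and charted in a few lines. The P&L series is synthetic; in a real deployment it would come from the platform's shared data store.

```python
# Notebook-style sketch: pandas for data, matplotlib for charting.
# The P&L series is synthetic and stands in for a shared data store.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
dates = pd.bdate_range("2019-01-01", periods=250)
pnl = pd.Series(rng.normal(0, 1e5, len(dates)), index=dates)

# Rolling 1-day 99% historical VaR over a 60-day window.
var_99 = -pnl.rolling(60).quantile(0.01)

ax = var_99.plot(title="Rolling 60-day 99% VaR (synthetic P&L)")
ax.set_ylabel("VaR")
plt.tight_layout()
plt.show()
```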

There are a number of these next-generation components and offerings on the market; some are commercial offerings, while others are more generic, open source data science tools. In many ways these offerings are the next evolution of data warehouses or data lakes, although they are more sophisticated. These next-generation environments combine functionalities that allow risk, compliance and development teams to construct user interfaces, define business rules and logic, and prepare data.

These platforms can be used to distribute, and interface with, applications that need access to shareable data and analytics within these environments. Many larger financial institutions have developed and deployed their own proprietary environments to achieve some of these capabilities; examples include Goldman Sachs with SecDB, JP Morgan with Athena and Bank of America with Quartz. In recent years, a number of commercially packaged offerings have emerged on the market to address various industry use cases, with differences in the specific functional content, workflows and templates that reflect industry-specific needs. Organizations looking to deploy these environments will need to take into account the breadth and depth of their requirements, as well as the extent to which they want to standardize these tools across the firm versus using them merely as specialist tools. Many firms are still at the early stage of piloting, migrating some core front office risk, compliance and regulatory datasets onto such platforms to test and learn. This space is changing rapidly, but the full potential will only be realized when firms start linking up specific control activities in line with front office production workflows.
