James Zante, Product Manager, Risk Analytics, Integrated Market and Credit Risk
The latest report on Basel III adoption per country and pending actions on the US implementation of Dodd-Frank highlights how regulation is changing financial services, and why financial firms are adapting their business models and operations in ways that are dramatically increasing Big Data demands.
Regulatory reporting requirements call for aggregate measures of enterprise-wide exposure, at increasingly frequent intervals, and these new regulations also offer incentives that will reduce regulatory capital requirements for firms that adopt more sophisticated risk analytics methodologies. The competitive pressure to improve return on capital has capital markets firms in a race to build infrastructures that can support more advanced risk calculation methodologies within shorter timeframes. IBM clients like Scotiabank are leading the way on Internal Model Method (IMM) as explained in this GARP Basel III webcast.
Banks have always faced Big Data challenges within capital markets, where traders yearn for insights into what trades will increase the profitability of the portfolio. Big Data solutions can provide traders with a competitive advantage by delivering up-to-the-minute insights on how market moves are driving potential exposures and creating new opportunities.
These insights come at a cost: the analytics behind them are data-hungry and computationally intensive. But when a single decision can be worth millions, most banks are unwilling to take analytical shortcuts, and are using the onslaught of new regulation to support their business case for making strategic investments in their enterprise risk management infrastructure.
This is a new era of Big Data for capital markets that demands accurate insights aggregated from massive volumes of trade data across the firm’s entire book of business. Both banks and regulators are demanding that the approaches taken are able to track the data sources used, how the data has changed over time, what is driving the changes, and who is involved in the process. We can examine emerging Big Data demands and practices specific to risk management within capital markets using the four dimensions of Volume, Variety, Velocity, and Veracity – as outlined in the recent white paper by the IBM Institute for Business Value, Analytics: The real-world use of big data:
Volume – the more the merrier
Understanding how the credit risk of a portfolio responds to thousands of different risk factors - such as interest rates or commodity prices - requires simulating portfolio valuations across thousands of scenarios, projecting trade valuations through hundreds of time steps into the future, aggregating the results across all scenarios for each individual risk factor, and segmenting the exposures into counterparty relationships and hierarchies.
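The scale of that computation is easier to see in miniature. The sketch below simulates one hypothetical risk factor across scenarios and time steps, aggregates trade valuations into a portfolio value, and reduces the result to an expected exposure profile; the sizes, the random-walk factor model, and the linear trade sensitivities are all illustrative assumptions, not a production methodology.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Illustrative sizes; real books run thousands of scenarios and trades.
n_scenarios, n_steps, n_trades = 2000, 60, 25

# Simulate one risk factor (e.g. an interest rate) as a random walk
# across scenarios and monthly time steps.
shocks = rng.normal(0.0, 0.01, size=(n_scenarios, n_steps))
factor_paths = 0.03 + np.cumsum(shocks, axis=1)

# Hypothetical linear trade sensitivities to the factor.
sensitivities = rng.uniform(-1e6, 1e6, size=n_trades)

# Value each trade on each scenario path, then aggregate across
# trades to get a portfolio value per scenario and time step.
trade_values = factor_paths[:, :, None] * sensitivities[None, None, :]
portfolio_values = trade_values.sum(axis=2)

# Counterparty exposure is the positive part of the value; averaging
# across scenarios gives the expected exposure profile over time.
expected_exposure = np.maximum(portfolio_values, 0.0).mean(axis=0)

print(expected_exposure.shape)  # one exposure number per time step
```

Even this toy version materializes a scenarios-by-timesteps-by-trades array, which is why full-scale runs demand distributed processing rather than a single machine.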
This requires an infrastructure that can pull in the huge volume of required input market data and efficiently distribute the processing that turns it into exposure analytics. Firms already have vast data warehouse infrastructures for external market data and internal data on individual trades, and in pursuit of a competitive edge on trading decisions, they are looking to establish links with an increasing variety of structured and unstructured information.
Variety – more of everything
Structured information includes trade terms and conditions, trade confirmation records, and events such as limit violations or excesses. Unstructured information includes current market news about the associated country, counterparty, or currency, which can be drawn from an ever-growing number of news sources and social media feeds. Banks have taken several different approaches to processing and structuring this information for their risk analytics engines. Participants see this as a new space in which to develop sustainable competitive advantage, with systems that can deliver pre-deal risk-reward metrics - such as CVA, funding costs, and capital charges - across a variety of products, risk relationships, and scenarios. This enables the bank's managers to set the right incentives for traders to make decisions that satisfy the risk policy of the firm and the individual stakeholders involved.
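A pre-deal CVA metric of the kind described above is, at its core, an incremental calculation: the charge for a candidate trade is the difference the trade makes to the counterparty-level CVA. The sketch below uses a deliberately simplified unilateral CVA formula (flat hazard rate, flat discount rate, constant loss-given-default); real desks use market-implied credit curves, and the exposure profiles here are made-up placeholders.

```python
import numpy as np

def cva(expected_exposure, hazard_rate=0.02, lgd=0.6, dt=1/12, r=0.03):
    """Simplified unilateral CVA: discounted expected exposure weighted
    by the counterparty's marginal default probability in each period.
    Illustrative formula only; real desks use market-implied curves."""
    t = np.arange(1, len(expected_exposure) + 1) * dt
    discount = np.exp(-r * t)
    # Marginal default probability per interval under a flat hazard rate.
    pd_marginal = np.exp(-hazard_rate * (t - dt)) - np.exp(-hazard_rate * t)
    return lgd * np.sum(discount * expected_exposure * pd_marginal)

# Pre-deal check: the charge quoted is the *incremental* CVA.
ee_current = np.full(60, 1.0e6)     # current exposure profile (placeholder)
ee_with_trade = np.full(60, 1.2e6)  # profile if the candidate trade is added

incremental_cva = cva(ee_with_trade) - cva(ee_current)
print(f"Incremental CVA charge: {incremental_cva:,.0f}")
```

Quoting the incremental rather than standalone number is what lets the charge reflect netting and diversification against the existing portfolio, which is exactly the incentive-setting lever the paragraph above describes.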
Velocity – deliver more, faster
How to leverage the volume and variety of available data to make better decisions is the biggest challenge facing trading desks today. Demand for higher-quality information in ever-shorter timeframes has outpaced advances in hardware, and firms that have approached this challenge with brute-force methods alone have continued to find themselves bound by the input/output constraints of their underlying hardware.
Firms need integrated hardware and software approaches that can manage end-user demands for analytics in ways that prioritize workloads and deliver the required risk insights within the appropriate timeframes of seconds, minutes, or hours. Global banks must handle committed transactions and pre-deal checks in ‘real time’, where requests arrive in waves of 50 or more per second during peak trading periods. Societe Generale had to overcome similar challenges, as Kai Pohl, Head of CVA Trading at Societe Generale, explains in this case study video.
Understanding the movements of risk measures day-over-day requires the ability to rank-order trades by some fixed result, and firms face challenges processing huge volumes of data stored on disk, where hardware read/write speed becomes a bottleneck. The latest approaches to hardware acceleration, explored in this IBM labs video, and in-memory analytics can work around some of these I/O constraints. Ultimately, deciding how best to optimize the infrastructure is an exercise for each individual firm and its specific requirements. For example, an architecture that loads everything into memory would be fast, but for most firms the cost would not be an efficient use of resources, so banks are looking for software that is smarter about what gets loaded into memory and when, so that the required data is ready when needed. This is essential for tasks like rank-ordering trades with an individual counterparty by their standalone exposure or by their CVA, so that the user can quickly understand why risk measures on a counterparty have changed.
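One simple illustration of being "smarter about what gets loaded to memory" is rank-ordering without sorting the whole book: a bounded heap keeps only the top-k candidates in memory while trade records stream in from disk. The sketch below is a minimal version of that idea; the record fields and the simulated stream are hypothetical.

```python
import heapq

def top_exposures(trade_stream, k=10):
    """Keep only the k largest standalone exposures in memory while
    streaming trade records, instead of loading and sorting the full
    book. heapq.nlargest maintains a bounded heap internally.
    (Sketch; the record fields are hypothetical.)"""
    return heapq.nlargest(k, trade_stream, key=lambda t: t["exposure"])

# Simulated stream of trade records for one counterparty; in practice
# this would be a generator reading from disk or a results store.
trades = ({"trade_id": i, "exposure": (i * 37) % 1000}
          for i in range(100_000))

worst = top_exposures(trades, k=3)
print([t["exposure"] for t in worst])
```

Memory use stays proportional to k rather than to the size of the book, which is the trade-off the paragraph above points at: load what the ranking needs, not everything.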
Veracity – more requirements and much more transparency
Regulations such as Basel III and Dodd-Frank require banks to track a trade's history, lineage, and all events around it, and this is a growing dimension of Governance, Risk and Compliance in capital markets.
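One common way to make that lineage tamper-evident is an append-only event log where each event carries a hash of its predecessor, so any after-the-fact edit breaks the chain. The sketch below is an illustrative pattern, not a specific regulatory format; the class name, fields, and sample events are all assumptions.

```python
import hashlib
import json
from datetime import datetime, timezone

class TradeAuditLog:
    """Append-only, hash-chained event log for a trade's lineage:
    what changed, when, and who was involved. Illustrative sketch of
    tamper-evident lineage tracking, not a regulatory standard."""

    def __init__(self):
        self.events = []

    def record(self, trade_id, action, user, details):
        prev_hash = self.events[-1]["hash"] if self.events else "0" * 64
        event = {
            "trade_id": trade_id,
            "action": action,
            "user": user,
            "details": details,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "prev_hash": prev_hash,  # links this event to its predecessor
        }
        event["hash"] = hashlib.sha256(
            json.dumps(event, sort_keys=True).encode()
        ).hexdigest()
        self.events.append(event)

    def verify(self):
        """Recompute the chain; any edited event breaks a link."""
        prev = "0" * 64
        for e in self.events:
            body = {k: v for k, v in e.items() if k != "hash"}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if e["prev_hash"] != prev or digest != e["hash"]:
                return False
            prev = e["hash"]
        return True

log = TradeAuditLog()
log.record("T-1001", "booked", "trader_a", {"notional": 5e6})
log.record("T-1001", "amended", "ops_b", {"notional": 4.5e6})
print(log.verify())  # prints True while the log is untampered
```

The point of the chain is that reporting can demonstrate, not merely assert, that the recorded history of each trade has not been altered.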
Regulators are also becoming more stringent on what practices are acceptable, and reporting must prove that banks are following regulatory guidelines. For example, a big issue in capital markets is the shrinking liquidity of the CDS market, which makes it more challenging to price credit risk into trades via measures like CVA. It is becoming more difficult to get regulators to accept hedges on the CVA capital charge since the hedge must be 1:1 and cannot be from a proxy name.
The final word
Regulatory forces are squeezing the profit margins of banks, which presents a clear business case for banks to invest in more sophisticated risk analytics systems that can help mitigate the profitability impacts of increased regulation. With interactive risk-aware decision support, firms can improve their strategies to minimize regulatory capital and create the right incentives for traders to balance risk and reward with guidance on what to trade, who to trade with, and how to best structure the trade.
For the latest client case studies and papers on related topics, visit the IBM resources pages for Basel III and CVA.