Cyber security expert issues call to modernise Patriot Act
By Dan DeFrancesco | News | 21 March 2018
OpRisk North America: current policies make it difficult to share information on cyber attacks
Restrictions on data sharing in the USA Patriot Act and at the Financial Services Information Sharing and Analysis Center are hindering efforts to combat cyber attacks, the leader of a cyber crimes unit at Wells Fargo has said.
FS-ISAC is an industry-run initiative that allows financial firms to anonymously exchange information on cyber threats. Banks can also share data about money laundering and terrorist activities under Section 314(b) of the Patriot Act, which provides a safe harbour from legal liability. In both cases, the sharing of personally identifiable information (PII) is restricted.
“There is a real need to either evolve the information-sharing process under the FS-ISAC or modernise 314(b) sharing [in the Patriot Act] to include PII alongside cyber threats,” said Kelley Chamberlain, who leads a team within Wells Fargo’s Global Financial Crimes Intelligence Group focusing on cyber threats. “I personally don’t care which one gets modernised, but it has to happen otherwise we are just going to keep continuing to be owned by the bad guys.”
PII includes any information that can be used to ascertain the identity of an individual. However, Chamberlain said this information was necessary to effectively combat cyber threats. “You can’t decouple PII from cyber threats and expect to be effective,” she said.
Chamberlain also criticised the speed at which 314(b) forms are processed, saying response times can be a “month, maybe more, maybe never”.
“Cyber attacks obviously happen very quickly and some of these mechanisms are not necessarily as fast as they need to be for information sharing,” she said.
Chamberlain was speaking on a panel at the OpRisk North America conference in New York earlier today (March 21).
Speaking on the same panel, Filippo Curti, a financial economist in the quantitative supervision and research unit of the Federal Reserve Bank of Richmond, urged the industry to increase the number of cyber attacks it gathers information on.
“What is important for cyber security, at least in these early stages of this issue, is to share as much as you can, meaning that you don’t only look at the events above a certain threshold,” Curti said. “I am sure there is information in smaller [cyber attacks].… They can be informative.”
Curti also called on banks to identify the most important factors in mitigating cyber attacks and to include that information in a standardised format when sharing data on a breach.
“I do believe it would be better for the industry to share more,” he said. “At this point, it is very hard to give a standardised factor. I think the industry needs to understand together what they believe are the factors and then they can share and understand.”
Cyber regulations prompt banks to elevate Ciso role
By Steve Marlin | News | 21 March 2018
Information security increasingly seen as risk management function
New cyber risk regulations are forcing banks to give their chief information security officers (Cisos) a more senior role in risk management.
The New York State Department of Financial Services’ Cybersecurity Regulation, which became effective in March 2017, requires Cisos to update the board of directors at least annually on banks’ cyber security programmes and vulnerabilities.
“Our friends at the NYDFS have made life even busier for organisations by prescribing different points that have to be met,” said Thomas Kartanowicz, head of information security at Natixis North America. “The Ciso is going to have to adopt a risk posture; it can’t just be an IT function. It has to be in line with the business, totally engaged in all activities.”
Kartanowicz was speaking on a panel at the CyberRisk North America conference in New York on March 20.
Cyber-related risks featured prominently in Risk.net’s Top 10 op risks survey, with respondents citing IT disruption and data compromise as the biggest threats facing financial firms in 2018.
However, a report published by the Financial Services Information Sharing and Analysis Center (FS-Isac) on February 12 found 39% of Cisos at financial firms currently report to the chief information officer, with only 14% reporting to the chief risk officer. Another 13% of Cisos report to the chief operations officer, while 8% report directly to the chief executive officer.
Cyber risk specialists expect that to change. “In the future, we will see more Cisos report to the risk side of the organisation, especially in banks, where the three lines of defence are clearly defined, either by regulation or best practice,” said Henry Jiang, head of cyber risk at Societe Generale Corporate and Investment Banking, who was speaking on the same panel.
Regulators are pushing banks in this direction. The Federal Financial Institutions Examination Council, which prescribes standards for bank examiners, has already recommended information security should be separated from daily IT operations.
FS-Isac has also called on Cisos to provide more frequent and timely reports to the board of directors. Meanwhile, Cisos have become more adept at communicating to a board-level audience.
“Going back 10 or 15 years, the Ciso would be unchained from the server room, wheeled out, and give a presentation, and the rest of the C-suite would be looking at them like they’re speaking a different language,” said Kartanowicz. “Fast forward 10 years, the Ciso is starting to talk more to C-level people. In 10 years, you’re going to have to have your game at an even higher level.”
CCAR gives op risk modelling a new lease of life
By Steve Marlin | News | 21 March 2018
OpRisk North America: Fed’s annual stress tests are rehabilitating ‘black box’ op risk modelling
The US Federal Reserve’s annual stress tests for large banks may have accomplished what the Basel Committee on Banking Supervision could not: legitimising the field of operational risk modelling.
The committee’s advanced measurement approach (AMA) for operational risk – which is being scrapped in favour of a simpler, standardised approach – allowed banks to use their own models to calculate regulatory capital charges. However, AMA models were often perceived as ‘black boxes’ that failed to reflect the way bankers think about operational risk.
“We’ve been more successful in getting buy-in for the way we measure losses for CCAR [the Fed’s Comprehensive Capital Analysis and Review] than for capital, because with CCAR, we tap into the existing BAU [business as usual] processes and the way the lines of business think about the loss-generating mechanisms,” said Nedim Baruh, head of operational risk measurement and analytics at JP Morgan.
Baruh was speaking on a panel at the OpRisk North America conference in New York yesterday (March 20).
CCAR requires banks to identify operational risks using scenario analysis. But unlike with the AMA, risk managers, business heads and senior executives now have a direct say in how these scenarios are constructed.
“Too many people in the businesses saw scenario output being thrown into a black box [under the AMA],” said Evan Sekeris, a partner in the financial services practice at Oliver Wyman. “Now, there is significantly more involvement because there’s a tangible causal effect. Because they are part of the risk identification process, it forces them to identify the different control failures and provides useful information on areas of improvement.”
CCAR is also forcing banks to devise new ways to model emerging operational risks – such as cyber threats – that may not have materialised yet in loss histories.
“We are coming up with more innovative ways to model risk,” said Justin Hahn, head of internal capital adequacy and assessment process at Deutsche Bank. “Modelling cyber security risk is not something we would’ve been hearing about two or three years ago.”
We’ve been more successful in getting buy-in for the way we measure losses for CCAR [the Fed’s Comprehensive Capital Analysis and Review] than for capital
Nedim Baruh, JP Morgan
Deutsche Bank has also established so-called “uncertainty buffers” around the amount of op risk-weighted assets associated with a business line based on the quality of information it provides for stress-testing purposes.
“It’s very convincing when you start to tie losses to RWAs and capital,” said Hahn. “A very motivating factor is to say, ‘The more ambiguous the output I’m getting, the more it will result in uncertainty buffers that we end up layering on to your business or segment, which directly impacts the P&L of your business’.”
Banks are also finding that much of the op risk modelling needed for CCAR is already being performed by various business units within the bank. For instance, the consumer banking division has models for estimating losses associated with credit card fraud, while the legal department quantifies potential losses arising from litigation and regulatory fines.
“These segments of potential losses can be measured using existing information that is part of BAU processes within the bank,” said Baruh.
JP Morgan segments op risk losses into three buckets: recurring losses; large, infrequent losses such as legal settlements and regulatory penalties; and idiosyncratic losses that have relatively little loss history such as cyber breaches. It then leverages existing bank processes for collecting data on these losses, Baruh said.
Op risk modelling will continue to improve as banks refine their approach to CCAR, Sekeris said: “We’re not fully there yet. A lot of the CCAR machinery has been built around the macro drivers and not as much on firm-specific drivers, but over time we will move in that direction.”
UBS hoping for capital relief for past op risk losses
By Alexander Campbell, Tom Osborn | News | 21 March 2018
OpRisk North America: Swiss bank has taken action to prevent a repeat of costly missteps
UBS is hoping its regulators will reduce the level of operational risk capital it must hold against past losses under the forthcoming standardised measurement approach (SMA), according to James Oates, UBS’s global head of compliance and operational risk control.
Speaking at the OpRisk North America conference in New York yesterday (March 20), Oates set out how the bank had systematically overhauled its operational risk and compliance function in a bid to prevent a repeat of costly missteps – from the Kweku Adoboli rogue trading case to its involvement in the forex and Libor benchmark-rigging scandals – which have harmed its reputation and driven up its operational-risk-weighted assets.
Under the forthcoming switch to the standardised approach to measuring op risk capital (SMA), the amount of capital a bank must hold is skewed by the losses it has suffered over the past decade. However, a compromise clause inserted into the Basel III framework allows national regulators to let banks under their jurisdiction ignore the impact of past losses from their calculations – significantly reducing their overall op risk capital requirement.
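The loss-sensitivity mechanism works through the internal loss multiplier (ILM), which scales the business indicator component (BIC) by a function of average historical losses; setting the ILM to one under the national discretion neutralises a bank’s loss history. A minimal sketch of the calculation, with purely illustrative figures:

```python
import math

def sma_capital(bic: float, avg_annual_losses: float,
                ignore_losses: bool = False) -> float:
    """Sketch of the Basel III standardised approach (SMA) op risk capital.

    bic: business indicator component, already derived from the business
         indicator via the marginal coefficients.
    avg_annual_losses: average annual op risk losses over the past 10 years.
    ignore_losses: models the national discretion to set the internal loss
                   multiplier (ILM) to 1, neutralising past losses.
    """
    loss_component = 15 * avg_annual_losses  # LC = 15 x average annual losses
    if ignore_losses:
        ilm = 1.0
    else:
        # ILM = ln(e - 1 + (LC / BIC) ** 0.8); exceeds 1 when LC > BIC
        ilm = math.log(math.e - 1 + (loss_component / bic) ** 0.8)
    return bic * ilm

# Illustrative figures only ($m): a heavy loss history inflates capital,
# while a regulator setting ILM = 1 leaves capital at the BIC itself.
with_losses = sma_capital(2_000, 500)
without_losses = sma_capital(2_000, 500, ignore_losses=True)
```

On these assumed figures the loss history raises the requirement well above the BIC, which is why the discretion Oates refers to is so valuable to banks with large past losses.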
Asked whether UBS hoped to benefit from this clause, Oates said: “That’s the million-dollar question – maybe even billion-dollar question, actually. We are engaging in that very question with our regulators.”
Even if the bank’s regulators decide not to show clemency, several of its largest fines will have rolled off by the time the SMA is implemented from 2022, Oates pointed out – such as the $780 million settlement it reached with US regulators in 2009 over criminal charges that it helped clients avoid paying taxes.
“I think the time horizon certainly helps, in terms of when [past losses] roll off. Some of our big losses are early on in that cycle. But we are keen to work with our stakeholders to figure out a way of [assessing]: what’s the real root cause of some of those losses, and can we demonstrate for some of the businesses we are no longer in or have exited, that we’ve not only exited the businesses, we’ve exited the root causes associated with those losses as well?” he said.
That is easier said than done, Oates noted. Several banks have struggled to convince regulators that, though they have divested business lines, they should no longer be forced to hold op risk capital against them.
Our control framework was not designed to stop people trying to commit a crime
James Oates, UBS
UBS has certainly taken its medicine in terms of overhauling its risk culture, though. Curing its op risk problems required a complete change of focus for the bank, said Oates. In recent years, the lender has reorganised its risk and compliance functions, devised a new firm-wide risk taxonomy, revamped its risk and control assessments, and stepped up its monitoring of employee behaviour and conduct.
But it was its implication in the Libor scandal which brought home that the bank’s approach to operational risk was aimed at the wrong target – mistakes, rather than malice, said Oates.
“We were very focused on end-of-day controls – trade confirmation, profit and loss and so on. But we had one trader in Japan who had all his trades and P&L booked properly at the end of each day, but he tried to collude with someone at another bank – to try to get him to submit a price he didn’t believe in. Our control framework was not designed to stop people trying to commit a crime,” Oates said.
Cultural change was an early focus as the bank struggled to move past the scandals, Oates said, with a change of mindset required of the compliance and operational risk functions that had missed the developing benchmark-rigging scandals.
“We found that most compliance officers believed their job was to have good policies in place for each regulation – and good compliance officers would also see that these policies were being followed. So they didn’t pay attention to unregulated markets. And when we asked why the operational risk function didn’t notice [the scandals], we found that operational risk saw itself mainly as a reporting function.”
But the biggest challenge was working with the concept of risk appetite. At first, Oates said, raising the subject met with a knee-jerk reaction that appetite for operational risk should be zero – “we can’t have another unauthorised trading incident”, he was told.
The result was to force business leaders to take ownership and accountability of operational risk, he said, and think about their risk tolerance in terms of appetite for controls. Exercises such as risk and control self-assessments, which op risk staff had been faithfully compiling, are best conducted against a defined risk appetite, Oates noted; without an appetite statement, there is no way to judge whether controls are adequate.
“If I had my life to do over again, I would have started with that risk appetite discussion very early on,” Oates said.
Precise cyber modelling ‘a pipe dream’, expert says
By Dan DeFrancesco | News | 20 March 2018
OpRisk North America: Cyber risk models should aim for accuracy, not precision
Jack Jones, co-founder of RiskLens and architect of the Factor Analysis of Information Risk (Fair) model, urged banks to “get over this notion of precision” and focus on generating accurate results when modelling cyber risk.
“When people think about quantifying risk who aren’t in the profession of quantifying risk, they think we are supposed to quantify it precisely, which is a pipe dream,” Jones said. “Precise measurement of risk in our problem space is not going to happen in my lifetime.”
Jones was speaking on a panel at the OpRisk North America conference earlier today (March 20).
Fair was developed in the early 2000s and provides a standard map of cyber risk factors and their interrelationships. The model’s outputs can be used to inform quantitative analysis, such as Monte Carlo simulations or sensitivities-based analysis.
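A frequency/severity Monte Carlo of the kind such model outputs can feed might look like the following sketch – the Poisson frequency and lognormal severity parameters are hypothetical placeholders, not calibrated Fair inputs:

```python
import random

def simulate_annual_losses(freq_rate: float, sev_mu: float, sev_sigma: float,
                           n_trials: int = 10_000, seed: int = 42) -> list[float]:
    """Illustrative frequency/severity Monte Carlo of annual cyber losses.

    Each trial draws a number of loss events from a Poisson process with
    rate freq_rate per year, then sums lognormal severities for those
    events. The result is a distribution of annual losses - which may be
    wide and flat when the input data are poor.
    """
    rng = random.Random(seed)
    trials = []
    for _ in range(n_trials):
        # Poisson draw via exponential inter-arrival times within one year
        n_events, t = 0, rng.expovariate(freq_rate)
        while t < 1.0:
            n_events += 1
            t += rng.expovariate(freq_rate)
        trials.append(sum(rng.lognormvariate(sev_mu, sev_sigma)
                          for _ in range(n_events)))
    return sorted(trials)

losses = simulate_annual_losses(freq_rate=2.0, sev_mu=12.0, sev_sigma=1.5)
median = losses[len(losses) // 2]
p99 = losses[int(0.99 * len(losses))]
```

The gap between the median and the 99th percentile is one simple way to show management how flat the distribution is, in the spirit of Jones’ point below.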
While proponents of the model say it helps banks order and prioritise their defences against cyber threats, its detractors say it produces “guesses and estimates” rather than actionable outputs.
Jones said incomplete data made it difficult – but not impossible – to model cyber risk. “You can still have an accurate measurement, it’s just going to be a wide, flat distribution. [There’s] nothing wrong with that. It’s a faithful representation of the quality of your data, which by itself is a really important part of the conversation with management,” he said.
Even imprecise model results can help a bank to improve its risk management, Jones argued: “If you go to management and say it’s a medium risk, that is not conveying confidence in any way, shape or form. But if you go to them with a quantitative measurement – a distribution of possible outcomes that is wide and flat – then a part of the conversation can be, ‘it’s wide and flat because we have really poor data over here. If we did this over the coming years, we could sharpen that up’.”
Manan Rawal, head of US model risk management at HSBC USA, agreed with Jones’ assessment. The challenge is translating the model results – however imprecise – into action points, he said.
“That articulation has got to be: what is the uncertainty of your estimate of your quantification? What is the likely outcome if you don’t do anything? Where are the holes in your control infrastructure to allow you to mitigate some of that potential uncertainty? All those types of things have to be fully laid out,” said Rawal, speaking on the same panel as Jones. “If you do that in a clean and concise way, you will generally get a more proactive engagement factor from those folks.”
Jones added that senior managers rarely have a problem with imprecise risk management – only modellers themselves believe the results have to be precise.
“Quantification is not a problem,” he said. “Precise quantification is a pipe dream.”
OCC forms working group to tackle fraud
By Tom Osborn | News | 20 March 2018
OpRisk North America: Rise in cyber fraud prompts US regulator to reassess guidance to examiners
The US Office of the Comptroller of the Currency is reviewing its guidance to bank examiners on assessing the adequacy of fraud detection and prevention frameworks amid a rise in cyber threats.
Speaking at the OpRisk North America conference earlier today (March 20), Lazaro Barreiro, director of governance and operational risk policy at the OCC, said the watchdog had spent the last six months building a working group dedicated to the task.
“We’re starting a fraud working group, to assess whether there is a need for additional guidance on fraud, and if there is, what kind of guidance do we need? Should it be a handbook with which to equip the examiners? Right now, we don’t have much on fraud, and when it comes to cyber, we’re seeing an increase in fraudulent activities,” he said.
Asked whether the working group’s focus was on bolstering guidance on internal or external fraud, Barreiro indicated it would encompass both, saying: “I think it’s going to be very broad fraud assessment, mostly focused on [banks’] risk management programmes for fraud.”
Senior operational risk practitioners ranked the threat of losses from theft and fraud fourth on their list of top concerns for 2018, with many citing the fast-growing number of losses from cyber sources such as phishing attacks or online identity theft as major worries. Losses from fraud were also among the industry’s top realised losses last year.
Speaking on the sidelines of the conference, Barreiro said the working group’s primary aim was to bolster the ability of the regulator’s own bank examiners to assess whether a firm has adequate frameworks in place for fraud detection and prevention.
He gave the example of improving guidance to OCC examiners on assessing whether banks were correctly reporting losses in their credit card businesses as genuine credit losses, or whether there were in fact cases of credit card fraud going undetected or underreported.
Although the working group is still in its embryonic stages, Barreiro indicated the OCC was actively discussing the initiative with other US prudential regulators, with a view to sharing policy insights. He also indicated the watchdog could look to invite industry stakeholders to share their views with the group in future, including banks.
Speaking on the same panel as Barreiro, Glenna Hagopian, chief conduct officer and head of enterprise risk management at Citizens Financial Group, suggested her bank was reassessing the way it assigned tolerance limits to various categories of operational risk, particularly those related to fraud.
Many banks use so-called key risk indicators (KRIs) to measure their level of exposure to a given risk at a particular point in time. A classical approach to op risk dictates that by monitoring KRIs and checking outputs against internal limits and thresholds, a bank can determine whether its op risk exposures are within its risk appetite. But Hagopian questioned whether KRIs were an appropriate tool for measuring loss events for which banks have a permanently low tolerance, such as fraud.
Fraud losses, which Hagopian characterised as a cost of doing business, might warrant a hard limit. By contrast, she questioned the value of assigning hard limits to potential loss events such as data breaches or failed penetration tests, pointing out that an institution has no appetite for even one such incident. Such events, she suggested, are better candidates for measurement and monitoring over time within a KRI programme.
“90% of our op risk losses are fraud. So what are we measuring, and what are we limiting? Why are we establishing limits on anything else but fraud, basically? We build the cost of fraud into our projected cost of doing business.”
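The classical KRI check described above reduces to comparing indicator values against amber and red thresholds; a minimal sketch, with hypothetical indicators and limits:

```python
from dataclasses import dataclass

@dataclass
class KRI:
    """A key risk indicator with amber/red thresholds (hypothetical values)."""
    name: str
    value: float
    amber: float   # warning threshold
    red: float     # appetite-breach threshold

    def status(self) -> str:
        if self.value >= self.red:
            return "breach"            # outside risk appetite
        if self.value >= self.amber:
            return "warning"           # trending towards the limit
        return "within appetite"

# Illustrative indicators only - real KRI programmes define their own
# metrics, units and limits. Note the zero-tolerance event: any value
# at or above the (identical) amber/red threshold is a breach.
indicators = [
    KRI("monthly fraud losses ($m)", value=3.2, amber=2.5, red=5.0),
    KRI("failed penetration tests", value=1, amber=1, red=1),
]
statuses = {k.name: k.status() for k in indicators}
```

The second indicator illustrates Hagopian’s objection: when tolerance is effectively zero, the threshold comparison adds nothing, and tracking the indicator’s trend over time is more informative than a limit.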
Barreiro later suggested many banks – and in turn, the OCC – were struggling to put a dollar value on potential losses from certain types of operational risk, due to loss data and other relevant information being siloed between divisions.
“We’re having a very difficult time quantifying operational risk. The information is so scattered that it’s very difficult for us to get the metrics we need to clearly assess it. And I think it’s the same for many institutions: the exposure lies within so many different divisions, that it’s hard to step back and say ‘OK, this is our holistic view of op risk across our enterprise,’ versus looking at it within your department or division. It’s important to have a broad perspective when you’re quantifying op risk.”
Update, March 22, 2018: This article has been updated to add further context to Glenna Hagopian’s remarks.
By Ariane Chapelle | Chapter | 16 March 2018
A good deed is never lost. These are the words of St Basil the Great, a fourth-century bishop, and they have a particular resonance here. In June 2014, I was grateful to Incisive Media for inviting me to speak at their OpRisk Europe conference, the biggest gathering of operational risk professionals in the financial industry. To thank them for their kind invitation, I wrote an article about the conference and how, at last, the industry seemed to be making significant progress in understanding the causes and mechanisms of operational risks (chapter 19). Alexander Campbell, editor of Operational Risk & Regulation, liked it and offered me a column in the magazine. Occasional contributions became regular bimonthly articles, which I have found demanding but also enjoyable and liberating. The articles gave me a platform to express new ideas, and to communicate to a wider audience some of the most inspiring conversations and debates that I was having with numerous professionals during operational risk training classes, workshops, conferences and consulting assignments.
The regular column came after several articles addressing hot topics at the time, such as rogue trading (chapters 21 & 22) and key risk indicators (chapters 9, 10, 11), a topic still widely discussed today. I wrote with others, with whom I shared views on Conduct (chapter 18) and reservations about the SMA reform and capital modeling (part 6). I also wrote for other publications, such as Risk Universe and Risk Professional, and wrote blogs for PRMIA.
This book is a collection of these articles, spanning more than five years. From the fundamentals of operational risk to regulation and capital, they are presented in six thematic parts. They are reflections, opinions and reactions to what I observed and experienced over the years. The first one, “The Rogue’s Path”, published in 2012, is based on the experience I gained as a trainer in Société Générale’s remediation programme, “Fighting Back”, following the loss of €4.9 billion by Jerome Kerviel. The most recent chapters, co-authored with Evan Sekeris, are a heartfelt defense of operational risk management and capital, in response to some contemptuous comments from Jamie Dimon regarding operational risk.
Operational risk management and measurement disciplines are still in their infancy in financial services, and that is what makes them fascinating. New methods and ideas arise every day and practices are constantly evolving. I hope that this publication will give readers a variety of subjects, images, analogies or suggestions to stimulate awareness and understanding of operational risk in the financial industry and beyond.
I was lucky enough to start my journey in operational risk with ING, in 2001, when the bank decided, alongside the 12 ORX founding members, to adopt an Advanced Measurement Approach (AMA) for reputational reasons, regardless of the effects on its capital. ING, from the beginning, wanted to be at the forefront of operational risk management developments in the financial sector. I benefited early on from some of the best methodology in the sector. Sixteen years later, there is still much work ahead to understand the characteristics, causes and consequences of operational risks, particularly in our constantly changing world.
I encourage you to absorb and reflect on the topics in this publication. Your experiences and comments will inform a fascinating and much-needed debate on what operational risk management and measurement should look like in the financial industry.
Monthly op risk losses: MetLife suffers $510m pension reversal
By Risk staff | Opinion | 13 March 2018
Also: UK credit card mis-selling; Italy crypto loss; LendingClub class action settlement. Data by ORX News
The biggest operational risk loss in February was incurred by US insurer MetLife, which reinstated $510 million of pension reserves it had previously released in the 1990s. The firm discovered it had failed to make appropriate efforts to locate almost 13,500 people before declaring them “unresponsive and missing” and releasing their funds from its reserves. As a result, annuitants failed to receive due payments over a 25-year period.
Its internal controls at the time meant MetLife would release funds after only two unsuccessful attempts to contact pension holders, once at age 65 and once at age 70. Following the admission, MetLife made changes to its procedures. For example, it has now pledged to make multiple attempts to contact pensioners and to conduct additional data checks before releasing funds.
The deficiencies and remedial action came to light in an 8-K filing the insurer lodged with the US Securities and Exchange Commission.
In the second biggest loss, Vanquis, the credit card subsidiary of the UK’s Provident Financial, reached a £170.8 million ($239 million) settlement with the UK Financial Conduct Authority over its failure to disclose the full price of an add-on product it sold. According to the regulator, Vanquis informed customers how the product worked and what the monthly charge was, but did not tell them that the product attracted interest if there was an unpaid balance on a customer’s card at the end of the month. The regulator imposed a penalty of £2 million on Vanquis, and the firm will refund £168.8 million to affected customers.
In third place, another cryptocurrency exchange suffered a large-scale hack. Following the theft of NEM coins from Japanese exchange Coincheck in January, last month 17 million Nano coins worth up to $270 million were stolen from BitGrail, an exchange based in Italy. BitGrail’s owner claimed the theft was due to a coding flaw in Nano coins, which left the cryptocurrency vulnerable to a hack. The coin’s developers denied these allegations, leaving the attack’s causes unclear.
In the fourth biggest loss, US peer-to-peer lender LendingClub agreed to pay $125 million to settle class action claims it misled investors about its compliance practices. Shareholders alleged that LendingClub misrepresented its internal controls as adequate to ensure loans met investors’ criteria. The claimants are seeking to recover losses incurred when LendingClub’s share price tumbled in 2016. The fall occurred when the firm admitted governance failings relating to ousted chief Renaud Laplanche and the improper sale of $22 million of low-quality loans to an institutional investor earlier that year.
Finally, Indian Overseas Bank has suffered alleged losses of 7.71 billion rupees ($119.4 million) on fraudulent loans made to a Kanpur-based stationery company, dating back to 2008. The losses came to light on February 18, when India’s Central Bureau of Investigation published a report on the fraud, accusing Vikram Kothari, director of Rotomac Pens, of “misappropriation of funds, criminal breach of trust and violation of the Foreign Exchange Management Act”. Six other Indian banks are implicated in the fraud for a total of around $550 million: Allahabad Bank, Bank of Baroda, Bank of India, Bank of Maharashtra, Oriental Bank of Commerce, and Union Bank of India. According to local press reports, Kothari represented that the money would be used for the procurement of goods, but it was actually diverted to a sham company and funnelled back to Rotomac.
Story spotlight: Punjab National Bank hit by data breach
Along with uncovering a $2 billion fraud in February, Punjab National Bank faced an additional challenge after it discovered that the credit and debit card information of over 10,000 of its customers was for sale on the dark web at $4.50 per card.
According to Cloudsek, the security firm that uncovered the breach, the information had been available to purchase for at least three months. The security firm’s initial attempt to contact PNB was frustrated when the email it sent to the cyber-crime contact address listed on the bank’s website bounced back. PNB is still investigating how the details ended up on the dark web.
In focus: Outsourcing risk
‘Outsourcing risk’, ‘third-party risk’ or ‘vendor risk’: firms may disagree over how to label the vulnerabilities that external suppliers introduce to their organisations, but they are unanimous about the damage those vulnerabilities can cause.
When outsourcing, firms put their trust in third parties to perform activities on their behalf. This brings reputational, regulatory and economic risks, making the need for strong governance and oversight key.
ORX News has recorded almost 100 operational risk events related to outsourcing in the past decade. Fines and settlements for these events total more than $1.1 billion, but reputational costs are likely to be far higher.
An ongoing ORX research study on outsourcing risk recently surveyed more than 50 firms worldwide on their practices. The majority of firms stated their outsourcing activity was increasing, as was their operational risk function’s oversight of the process.
With a greater number of suppliers and growing demand for second-line oversight, mature vendor risk management is a moving target. Data management, cyber concerns and the reliability of contingency plans rank highest among firms’ considerations over their vendor arrangements. If the internal resources devoted to measuring, managing and monitoring a firm’s vendor portfolio are significant, this raises the question of whether the perceived savings or efficiency gains from outsourcing are actually being realised.
Market participants ranked outsourcing risk as the fifth most important operational risk affecting their firm in the year ahead, down from third place the previous year, according to Risk.net’s annual roundup of the top 10 op risks.
All information included in this report and held in ORX News comes from public sources only. It does not include any information from other services run by ORX and we have not confirmed any of the information shown with any member of ORX.
While ORX endeavours to provide accurate, complete and up-to-date information, ORX makes no representation as to the accuracy, reliability or completeness of this information.
From Lehman to rupee crashes: India’s CCP chief on market stress
By Afiq Isa | Profile | 8 March 2018
Risk chief sets about bolstering CCIL’s risk modelling, lookback periods, and portfolio compression
In August 2013, India was on the brink of a financial crisis. The rupee had plunged to a record low against the dollar and stocks were plummeting amid growing worries over the health of the country’s economy.
While a broader emerging markets sell-off caused by the US Federal Reserve’s decision to scale back the pace of its monetary stimulus programme was partly to blame, India’s financial markets were also suffering from slowing growth and a lack of financial reforms.
The crisis remains relevant for Kausick Saha, chief risk officer at Clearing Corporation of India Ltd, mainly because it feeds into CCIL’s efforts to fine-tune its risk models. The central counterparty has chosen to incorporate the sharp market moves that accompanied the crisis into its models, ensuring their calculations cannot ‘forget’ the higher requirements seen during this volatile period when producing an initial margin number. The harder part was persuading members that the resultant increase in margins was for their own good, says Saha.
“Market stress periods should be incorporated for better modelling,” he says. “This may mean higher initial margins for market participants, but we do need to have procyclical risk mitigants. There has been a lot of analysis and literature on the 2013 crisis and by now the industry is more convinced that this is a necessary step.”
Saha, an alumnus of Lehman Brothers who shifted across to Nomura when the Japanese bank hoovered up the bankrupt dealer’s European and Asian business in 2009, is well versed in market disruption. He now hopes to bring that experience to bear as the clearing house looks to incorporate data from historic phases of volatility into its risk calculations.
“Lehman’s collapse in October 2008 was the most stressful period for the market in recent history. Such stress periods are important to have in risk models, so now we try to incorporate those even if they lie outside our historical observation periods,” he says.
Alongside his efforts to improve CCIL’s use of historical data, Saha’s other areas of focus include boosting the market’s use of portfolio compression. His mission is to equip CCIL, which is the venue for settlement of all secondary market transactions in government securities in India, with the tools to more accurately assess margin impacts during market gyrations, reduce the risk for counterparties and cut capital requirements for dealers.
Cleared for launch
CCIL was set up in 2001 by five Indian banks and a state-run life insurer to provide clearing and settlement services for transactions in money markets, government securities, foreign exchange and derivatives. Aside from being a trade repository for over-the-counter derivatives, CCIL also undertakes portfolio compression for non-cleared rupee interest rate swaps and cleared forex forwards.
Inclusive of debt market, money market and forex instruments, its total monthly average settlement volume exceeds $1 trillion. As of December 31, 2017, outstanding notionals on forex forwards and rupee derivatives at CCIL were approximately $650 billion.
Saha, who began his career at India’s largest non-state-run lender HDFC Bank in 2003, joined CCIL as senior vice-president in the risk team in February 2015. He replaced retiring chief risk officer Siddhartha Roy six months later. Saha’s background in model risk analytics and model validation is informing his current work on overhauling CCIL’s margin models.
He is looking to harmonise the lookback period for CCIL’s margin models across all segments, alongside adding stress data to improve their calculations. The firm’s current lookback period is four years, but stress data from the Lehman period or the rupee crash can be fed into the model and used as a floor, for a more accurate assessment of the margin impact in a stress situation, Saha explains.
Extended lookback periods in nascent markets run the risk that the data is patchy. It is only since liquidity in the rupee swaps market has improved over recent years that CCIL has been able to obtain adequate data for its four-year lookback period.
Other central counterparties have taken differing approaches to setting their lookback periods for their value-at-risk models, which are used to calculate trading book capital and initial margin requirements. LCH, for example, extended its lookback period to keep the Lehman episode in its time series.
“All the initial margin models now have a procyclical floor. In benign times your initial margin may be quite low, and in times of stress it might spike up and there will be some huge liquidity requirement from the members,” Saha says. “The procyclicality floor mitigates this. On top of this we can add the stress period which comprises around 25 to 50 consecutive maximum days of stress that we have seen in history.”
The firm plans to incorporate stress data into its clearing of forex forwards as well. “In our initial margin model, apart from the historical period, which is a rolling history comprising 750 days of observations, we will also incorporate 250 days from a stress period,” says Saha.
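The mechanics Saha describes can be sketched in a few lines: a historical-simulation value-at-risk over a rolling lookback window, floored by a VaR computed over a fixed stress-period sample. This is a minimal illustration under stated assumptions, not CCIL’s actual model; the function name, the confidence level and the `floor_weight` parameter are all hypothetical.

```python
import numpy as np

def initial_margin(returns, stress_returns, confidence=0.99,
                   lookback=750, stress_days=250, floor_weight=1.0):
    """Toy initial margin: rolling historical VaR floored by stressed VaR.

    `returns` is a 1-D array of daily portfolio returns (most recent last);
    `stress_returns` holds daily returns from a stress episode such as
    late 2008 or the August 2013 rupee crash.
    """
    # Historical-simulation VaR over the rolling lookback window.
    window = returns[-lookback:]
    hist_var = -np.percentile(window, 100 * (1 - confidence))

    # Stressed VaR over the fixed stress-period sample.
    stress_var = -np.percentile(stress_returns[-stress_days:],
                                100 * (1 - confidence))

    # The stressed figure acts as a floor, so margin cannot drift down
    # to benign-period levels and then spike when volatility returns.
    return max(hist_var, floor_weight * stress_var)
```

In benign markets the rolling VaR falls but the stress floor holds margin up, which is the anti-procyclical effect Saha describes: members pay somewhat more in quiet times in exchange for smaller liquidity calls in a crisis.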
Compressed into action
CCIL has also made inroads into portfolio compression. In September last year the firm carried out its thirteenth compression cycle in over-the-counter interest rate swaps. Portfolio compression reduces the gross notionals and the amount of capital that counterparties need to hold against such trades. The technique aims to allow participants to increase their trading activity without a corresponding increase in capital; it is a key catalyst for volume growth in India’s nascent derivatives market. The daily turnover for derivatives in India’s National Stock Exchange carried a notional value of $70 billion last year, according to data from Bloomberg.
CCIL achieved a compression rate of around 86% in its most recent cycle, with 20 banks participating. For forex forwards, the clearer has run six rounds of compression, achieving rates of 60–65% with up to 10 participants, Saha says.
CCIL has also made changes to its default waterfalls across all segments during Saha’s time. His team is working on implementing recovery tools at the end of the default waterfall – a necessary risk mitigation mechanism should members suffer losses that are higher than the resources available in the default fund. However, Saha says the CCP has ruled out the use of initial margin and variation margin gains haircutting in addressing such potential shortfalls – controversial recovery techniques which other CCPs are currently grappling with. Saha does not rule out further cash calls as part of its proposal on recovery tools, though.
Saha says 70% of his time is spent on core risk management efforts while the rest is for formulating policies and communicating these to clearing members. “We always work with our members. Any changes in our approach would have to go through consultation. This is how we develop the market,” he says.
2006–2009: Lehman Brothers, assistant vice-president, model market risk analytics
2009–2015: Nomura, assistant vice-president, model validation group
2015–present: CCIL, senior vice-president, appointed chief risk officer in August 2015
Advertisement | 7 March 2018
Webinar: Nasdaq BWise
Anton Lissone, Chief technology officer, Nasdaq BWise
Shelly Martin, Vice president in enterprise risk management, State Street Corporation
Gordon Liu, Executive vice president, US Head of global risk analytics, HSBC
Moderator: Tom Osborn, Desk editor, risk management, Risk.net
Financial services organisations are under pressure to significantly transform their governance, risk and compliance (GRC) processes. Managing responsiveness, the proliferation of regulations and IT systems, and exploding volumes and varieties of data is a high-stakes balancing act. In an increasingly complex landscape, addressing data and risk management head-on not only creates a GRC strategy, but helps build a better business.
Risk.net’s panel of industry experts discusses how new technologies are helping make GRC more efficient, creating value and improving outcomes, and delves deeper into evolving GRC frameworks.
Discussion points include:
How to use your GRC tool across various business lines and to uniformly respond to regulators
Trends in GRC technology and key drivers for GRC in the market
Why firms should invest in next-generation GRC technology
Integrating policy management, vulnerability risk management and content libraries
Using GRC tools and techniques to inform operational risk management.