Fraud makes up bulk of UK bank op risk loss events

By Louie Woodall | Data | 14 June 2018

Fraud was the highest-volume operational risk event suffered by large UK banks in 2017, though the cost of execution, delivery and process management failures was more burdensome.

Internal and external incidents of fraud accounted for 64% of all operational risk events at Barclays, Royal Bank of Scotland, Lloyds and Santander UK on average. However, these events made up just 14% of total losses by value.

RBS recorded the highest volume of fraud events, which made up 83% of its total, and Santander UK the lowest, with 50%. The respective costs associated with these events as a percentage of the banks’ totals were 6% and 42%. 

On the other hand, execution, delivery and process management failures comprised just 20% of operational risk events, but 28% of total operational risk costs. 

Santander UK reported 27% of total loss events in this category, the highest in the sample, and RBS the lowest, with 7%. The respective costs associated with these events as a percentage of the banks’ totals were 9% and 33%. 

Business disruption and system failures – recently targeted by the Bank of England for regulatory scrutiny – accounted for just 1% of operational risk events on average, and 2% of loss values. 

Standard Chartered disclosed data regarding loss values only. Execution, delivery and process management accounted for 36% of losses by value and fraud 22%. When Standard Chartered is included in the sample with the above-named banks, the average share of losses attributable to execution, delivery and process management rises to 29%.

HSBC did not disclose a breakdown of operational risk events by category or losses by value for 2017.

What is it?

Many banks disclose operational risk losses and event volumes broken down by categories set down in the Basel II framework. Basel standard-setters defined seven categories of operational loss event types: internal fraud; external fraud; employment practices and workplace safety; clients, products and business practices; damage to physical assets; business disruption and system failures; and execution, delivery and process management.

The business disruption and system failures category encompasses hardware, software, telecommunications and utility outages and disruptions. Execution, delivery and process management events include “losses from failed transaction processing or process management, from relations with trade counterparties and vendors”.

The calculations for RBS, Santander UK, Barclays and Lloyds are based on the volume and value of events where the associated loss is more than or equal to £10,000. Standard Chartered’s report did not disclose whether this threshold applied. Legal and conduct risks are excluded from the banks’ operational risk event templates. 
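
The threshold-based calculation described above can be reproduced from raw event data; a minimal sketch, using hypothetical loss records and the £10,000 cut-off:

```python
# Hypothetical loss events: (Basel category, loss in GBP)
events = [
    ("external fraud", 12_000),
    ("external fraud", 9_500),  # below threshold, so excluded
    ("execution, delivery and process management", 250_000),
    ("business disruption and system failures", 40_000),
]

THRESHOLD_GBP = 10_000  # events below this are dropped, as in the banks' templates
included = [(cat, loss) for cat, loss in events if loss >= THRESHOLD_GBP]

# Share of each category by volume (count) and by value (loss amount)
total_value = sum(loss for _, loss in included)
volume_pct, value_pct = {}, {}
for cat, loss in included:
    volume_pct[cat] = volume_pct.get(cat, 0) + 100 / len(included)
    value_pct[cat] = value_pct.get(cat, 0) + 100 * loss / total_value
```

With these toy figures, fraud is one of three included events (33% by volume) but only around 4% by value, illustrating the gap between event volumes and loss values that the article describes.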

Why it matters

Heightened scrutiny of non-financial risks is a theme bank executives have expounded on loudly for some time. The UK watchdog has also signalled its intention to scrutinise banks’ operational resilience, including firms’ ability to bounce back from technological disruptions, like those that afflicted TSB and RBS in recent years.

The Top 10 operational risks 2018 survey ranked IT disruptions as the number one risk, and fraud as number four. However, the above data suggests that disruption events still make up a tiny part of big banks’ overall operational risk profiles. This is not to say they can’t trigger huge losses: the 2017 data also shows there is scant correlation between the volume of incidents and their related costs.

It’s also possible, of course, that some banks allocate losses related to business disruptions, such as technology failures or cyber attacks, to other categories. Perhaps a more granular operational risk taxonomy would help provide a clearer view of the modern bank’s vulnerabilities. 

Get in touch

What explains the Bank of England's ramp-up of activity surrounding business disruption and operational resilience, if the data shows that banks aren't experiencing massive losses due to these events? Let us know your thoughts by tweeting @LouieWoodall or @RiskQuantum.

Tell me more

BoE to set tolerance levels for operational disruptions

Regulators zeroing in on non-financial risk, say banks

Monthly op risk losses: banks count the cost of IT failures

Top 10 operational risks for 2018

Problems remain with op risk standardised approach, say banks

By Alexander Campbell | News | 14 June 2018

US bill HR4296 could scupper US implementation of SMA, say op risk bankers

The final version of the Basel Committee on Banking Supervision’s operational risk capital rules is an improvement, but will still leave banks and regulators with multiple headaches as they go about implementing it, op risk experts say. 

Speaking at OpRisk Europe on June 12, Ruben Cohen, an independent risk consultant and previously director in op risk analytics with Citi, told delegates: “Many people in this room – including me – don’t believe this is a valid model.”

Attempts to make the SMA more consistent in its approach to different types of revenue have not been entirely successful, argued Michael Grimwade, international head of operational risk at MUFG Securities, speaking on the same panel.

Under the standardised approach – known as the standardised measurement approach (SMA) during its development – the primary driver of a bank’s op risk capital will be a measure of its size by revenue, dubbed the business indicator (BI). But the formula used to generate the BI treats interest income more punitively than fee income – meaning banks that rely more on lending than on fees from investment banking activities to generate their income will have a higher BI number.
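
For context, the BI feeds a bucketed business indicator component (BIC) that anchors the capital charge. A minimal sketch of that step, assuming the marginal coefficients in the published Basel III text – 12%, 15% and 18% over thresholds of €1 billion and €30 billion:

```python
def business_indicator_component(bi_eur_bn: float) -> float:
    """Apply the SMA's marginal coefficients to the business indicator (EUR bn)."""
    buckets = [(1.0, 0.12), (30.0, 0.15), (float("inf"), 0.18)]
    bic, lower = 0.0, 0.0
    for upper, coeff in buckets:
        if bi_eur_bn > lower:
            # Only the slice of BI falling in this bucket attracts its coefficient
            bic += (min(bi_eur_bn, upper) - lower) * coeff
        lower = upper
    return bic

# A EUR 40bn BI: 1*0.12 + 29*0.15 + 10*0.18 = 6.27 (EUR bn)
```

Whatever its composition, a larger BI lands in higher buckets – which is why the uneven revenue treatment Grimwade criticises feeds directly into capital.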

The approach also still differs in its treatment of loss types, the panellists noted: some op risk events such as fat-finger errors, rogue trading losses and algo malfunctions could be deducted from trading revenues instead, while the cost of replacing damaged assets could be deducted from service revenue.

The new SMA may also miss some loss information due to the time lag between an operational risk event occurring and the loss being registered, Grimwade pointed out; most types of risk register quickly, but some in the Basel ‘clients, products and business practices’ category – for example historic mis-selling cases – may take several years to come to light, especially if they are linked to cyclical causes such as a downturn causing mis-sold investment products to fail.

“About half of all major losses are linked to cyclical causes… and the SMA was not designed to capture this distinction,” Grimwade warned.

Andrew Sheen, head of operational risk regulatory advisory at Credit Suisse, added that the new rules also changed the effect of insurance on operational risk capital. While the formula recognises a recovery after a loss, it does not allow banks to hold less capital simply because they have insurance cover in place.


And he highlighted that the SMA’s implementation could be extremely inconsistent across jurisdictions: not only do national regulators have discretion on whether to apply the internal loss multiplier, but a US bill, HR 4296, lays down that any op risk capital requirement must be based on a forward-looking assessment.

The bill has passed the US House of Representatives; if it were to pass the Senate too and ultimately pass into law, op risk experts have said it could severely complicate US regulators’ implementation of the SMA.

Cohen pointed out, meanwhile, that amendments to the 2017 SMA versus the 2016 version – such as reducing the multiplier applied to the largest banks – reduced overall capital requirements, but also made the final figure less sensitive to both losses and the size of the business.

“When you incorporate the loss component into the standardised approach, you lose sensitivity there,” he said. “The standardised approach is stable, it’s simple, it’s transparent – unless you go into the areas of national discretion – and it’s comparable. The loss component is risk-sensitive, but the problem is the way it is incorporated,” argued Cohen, who pointed out that taking the natural log of the loss component in the calculation means variations in losses produce only a small effect on capital.
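
Cohen’s point can be made concrete with the internal loss multiplier formula in the published standard, ILM = ln(e − 1 + (LC/BIC)^0.8), where LC is the loss component and BIC the business indicator component. A minimal sketch:

```python
import math

def internal_loss_multiplier(loss_component: float, bic: float) -> float:
    """ILM = ln(e - 1 + (LC/BIC)^0.8); the natural log dampens swings in losses."""
    return math.log(math.e - 1 + (loss_component / bic) ** 0.8)

# When losses are 'average' (LC == BIC), the multiplier is exactly 1.0.
# Doubling losses relative to the BIC lifts it to only about 1.24 --
# a 100% rise in losses moves capital by roughly 24%.
```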

“It looks like they didn’t focus on resolving the issues [with the 2016 SMA] – it was purely a focus on reducing capital and making everyone happy,” Cohen concluded.

Editing by Tom Osborn

This article was amended on June 14 to correct Ruben Cohen’s former job title.

Regulators need to adopt AI for monitoring, experts say

By Dan DeFrancesco | News | 14 June 2018

Growing bank usage of artificial intelligence means authorities must hasten adoption themselves

The growing use of artificial intelligence in financial firms will eventually lead regulators to adopt the technology to monitor and manage firms’ automated models, an operational risk executive at a large European bank has said.

“In five to 10 years regulators will increasingly deploy AI to manage the risks that we are talking about today,” said Giles Spungin, global head of operational risk and regulatory compliance analytics at HSBC.

Some supervisors have begun this process already, Spungin said, pointing to the work the Bank of England is doing in its “fintech accelerator” initiative, which invites academics and technology firms to put forward test projects for industry use. The BoE is testing an AI tool developed by tech vendor MindBridge to analyse the quality of transaction data that goes towards constructing its Sonia overnight rate.

Spungin, who spoke on a panel at OpRisk Europe in London on June 13, said regulators need to harness AI not just to monitor banks’ own models, but to aid the broader supervisory effort. The speed at which the markets now move means regulators have no choice but to turn to the cutting-edge technology, he added.

“I think the only way for regulators to effectively manage the risk – not just AI risk, but financial industry oversight – is to utilise those automated decisions,” Spungin said. “The decision making is made by machines and it is much harder to monitor those decisions by utilising legacy, human-based processes.”

Previously, authorities have shown scepticism towards the use of opaque algorithmic techniques within financial institutions and other firms. Last year, the US Federal Reserve warned against using machine learning to assess contagion risks in model networks, saying these methods lack transparency and might obscure the true nature of banks’ vulnerabilities.

A lack of transparency is one of the criticisms levelled against so-called “black boxes”, computerised applications whose inner workings are isolated from human intervention. Regulators in Europe have acted to preserve a degree of human input in some companies’ automated decision-making processes through their recent General Data Protection Regulation. Under GDPR, an individual may challenge any decision that is automatically determined by a black box and request human intervention.

“It is pretty clear in GDPR that the regulator is not on board with 100% reliance [on black boxes] today,” said Tomas Hazleton, chief risk and compliance officer for CivilisedBank, who also sat on the panel. 


Risk managers have stressed the importance of tightening up their governance and controls in response to advances in AI models. Using AI solely for modelling and analysis doesn’t require risk management any different from how models are typically handled, said Nathan Jones, head of operational risk for international bank at State Street and a fellow panellist. The issue, Jones added, arises when automated decisions are made on the basis of the AI model’s results.

Jason Forrester, head of enterprise and op risk management at Credit Suisse, also raised concerns over AI model validation.

“Who is validating the AI models to make sure we know they are doing what they are supposed to be doing?” Forrester asked during a panel debate the previous day. He highlighted the risk that poor validation could trigger “the next financial crisis in five years when we realise the AI models are making the wrong decisions”.

Both Jones and Spungin said the debate over AI models is similar to the one that took place years ago regarding high-frequency trading platforms.

Jones said the risk management paradigm for AI models is similar to that of HFT platforms where a comprehensive understanding of the inner workings of the model is not always essential. The priority is a strong governance programme.

Layered controls – data governance, a change programme for new system updates, safety valves with thresholds and real-time human oversight – are all common among HFT platforms and would work well with AI models, he said.

“As long as somebody has the ability to pull the big handle and switch the thing off, and the machine is also trying to switch itself off if it doesn’t like what is happening, I don’t see a huge difference,” Jones said.

Editing by Alex Krohn

Stricter vendor regulation not ‘magic pill’, banks say

By Dan DeFrancesco | News | 13 June 2018

Third parties would still require oversight from banks, even if formally regulated

Outsourcing experts at banks say implementation of stricter regulatory standards on vendors would not affect how they manage their third-party relationships.

As largely unregulated startups enter the financial industry in growing numbers – in response to increased demand from banks for their services – regulators have begun to question whether vendors should fall under similar standards as the institutional clients they serve.

But greater regulation of third-party providers won’t reduce the burden of oversight for banks, experts say. Fiona O’Brien, head of group supplier assurance at Bank of Ireland, said more rigorous regulation of vendors wouldn’t change how banks undertake due diligence.

“What is the driver for wanting [vendors] to be regulated? If it is because you are hoping that it’s the magic pill and will reduce [banks’] monitoring, I don’t think it would hugely,” said O’Brien, who was speaking on a panel at OpRisk Europe in London on June 12. “I do agree that it means companies don’t have the same standards. But you still have to get the assurances that the standards are as you expect them to be; that the environment is operating effectively.”

Vendors have long benefited from facing little to no regulation, which has allowed them to remain agile and innovate more easily. As banks have come to rely more heavily on outsourced solutions that play critical roles in their day-to-day operations, regulators have put the onus on the banks to ensure their third-party – and sometimes fourth-party – relationships are resilient and secure.

In the US, the Federal Reserve Board and Office of the Comptroller of the Currency both issued guidance notes on third-party risks for banks in 2013. Some firms have complained of overregulation, with one bank saying the guidance from the OCC “has gone too far”. Another bank said it had scaled back its vendor roster by 25% since 2014 as part of its compliance with the OCC recommendations.

In Europe, authorities have issued guidance on the use of cloud computing providers, as banks look to these services to ramp up computational capacity and cut hardware costs.

Abhishek Khare, divisional director of outsourced services operational risk at Societe Generale, said banks would need to undertake the same level of due diligence, whether or not a vendor was subject to regulation.

“Having vendors regulated is a cherry on the cake,” said Khare, who also sat on the panel. “The risk from all kinds of areas is something you can’t get away from just because [vendors] are regulated. It is important to manage even those vendors who are regulated with an equal amount of due diligence.”


That’s not to say regulation of vendors can’t be beneficial for the industry overall. O’Brien said effects of the European Union’s General Data Protection Regulation, introduced in May, could already be seen. GDPR lays out new rules for how all firms – not just financial institutions – manage personal data. The new regime is a particular concern for companies due to the size of penalties for non-compliance: up to €20 million ($23.6 million) or 4% of global annual revenue, whichever is larger.
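
The cap itself is one line of arithmetic; a minimal sketch (the function name is illustrative):

```python
def gdpr_fine_cap(global_annual_revenue_eur: float) -> float:
    """GDPR's upper bound on a penalty: the larger of EUR 20m or 4% of revenue."""
    return max(20_000_000.0, 0.04 * global_annual_revenue_eur)

# For any firm with more than EUR 500m of global annual revenue, the 4% figure binds
```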


“I can see a big difference in the last year, as [vendors] assessed their readiness for GDPR,” O’Brien said. “They knew they had to because they are now liable for fines. I can see where there is a balance, but you still need to get your assurances.”

Aegon UK subjects some of its larger third-party relationships to the same rules and standards that apply to Aegon itself, explained fellow panellist Kurt Neilson, head of third-party relationship management at the insurer. This means the vendors are, in effect, regulated.

In some previous instances, Aegon had outsourced contracts without that type of oversight, he said.

“What I have seen – certainly at the executive level at Aegon – is a greater understanding today and more accountability in outsourcing,” Neilson said. “If we choose to go down that route, we are putting in place the structure, the processes and the risk mitigation to make sure it is a success.”

Outsourcing has consistently featured in the annual Top 10 Op Risks review, ranking fifth in the 2018 survey.

BoE to set tolerance levels for operational disruptions

By Dan DeFrancesco | News | 13 June 2018

Regulator cites recent TSB and RBS outages as it plans incident response framework

The Bank of England is planning to set tolerance levels for banks to maintain a minimum level of service provision during a “severe but plausible operational disruption”, according to a senior executive at the regulator.

Speaking today (June 13) at OpRisk Europe, Lyndon Nelson, deputy chief executive officer and executive director for regulatory operations and supervisory risk specialists at the BoE’s Prudential Regulation Authority, said banks’ increased reliance on technology had led to an uptick in operational incidents, whether due to internal failures or external attack. He highlighted RBS’s 2012 outage of its Irish operations and retail bank TSB’s recent loss of services as examples of resiliency issues firms are having to face.

“It has become, therefore, more important than ever for regulators to set out clear expectations of firms in respect of their operational resilience,” Nelson said. “The Financial Policy Committee, for example, has been considering its tolerance for disruption to the key economic functions that the finance sector performs. As part of this work, it is likely that the FPC will set a minimum level of service provision it expects for the delivery of key economic functions in the event of a severe but plausible operational disruption.”

Nelson suggested tolerances for service provision could be set according to time or volume, taking into account a firm’s market share and measures of interconnectedness. Tolerance levels could serve as a common framework for other regulators to build upon, Nelson said. The BoE has also been developing a suite of supervisory tools that can be used to assess firms’ resilience against its own expectations, he added.

The BoE will also publish a joint discussion paper with the Financial Conduct Authority seeking feedback from market participants and fellow regulators, Nelson said.

“I would perhaps not steal the thunder of the discussion paper where we will set out these proposals, but they will be incredibly clear about what we would regard as acceptable and within tolerance and not,” he said. “Resilience is essentially making sure that the service remains, regardless of the original circumstances. It includes the ability to withstand but also the ability to absorb shocks and recover from them. We will essentially define that in terms of these impacts, tolerances, either by times of outages or perhaps by volume, or something like that.”

Personally, Nelson said he would like firms to adopt a WAR footing approach to outages: withstand; absorb; recover.

The first point, Nelson said, was about banks setting their own tolerances for key business services, using clear metrics that indicate when a disruption would represent a threat to the firm, its consumers or financial stability. Banks will also be expected to test and demonstrate those tolerances to regulators, and to include their boards in the development of their operational and cyber resilience strategies, he added. That includes clearly defining and testing incident management procedures.


As for recovery, Nelson pointed to the importance of information sharing among banks regarding operational incidents. He cited the UK government’s Cross Market Operational Resilience Group as a critical tool for banks.

While CMORG – set up by the UK Treasury to bolster cross-regulatory co-ordination on operational resilience – has been successful, Nelson argued there was room for improvement. The group recently commissioned a review of incident management in the financial sector.


“The review has concluded that there is a need for greater co-ordination and more rapid information sharing during a cyber incident,” Nelson said. “Recommendations included: creating a standing cyber response capability for the financial sector, both during and outside standard working hours; creating a common incident taxonomy and maintaining the industry’s guidance on how to respond to a cyber attack; and bringing together risk assessment capabilities from across the financial sector and the NCSC [National Cyber Security Centre], with a view to regularly reporting shared analysis and creating a joint risk register.”

The most critical element of any bank’s resilience programme, Nelson added, is that resilience be baked into the core of any changes implemented at a firm from the start, rather than bolted on later.

“Often when we find problems, it is because at the beginning of the project the resilience aspects were not considered. Actually, it is often much more expensive and much more cumbersome for the firm to add those resilience aspects in later,” Nelson said. “So I sort of hope there will be a proper second-line function around the table when the decisions are taken and able to influence those decisions when they take place.”

Editing by Tom Osborn

Regulators zeroing in on non-financial risk, say banks

By Dan DeFrancesco | News | 12 June 2018

Several big financial firms said to be considering appointing heads of non-financial risk

Regulators are stepping up their focus on non-financial risks and banks should consider restructuring their risk management functions accordingly, operational risk managers said at a conference earlier today (June 12).

Jason Forrester, head of enterprise and operational risk management at Credit Suisse, said non-financial risk – in its broadest conception, all risks not covered by a bank’s market and credit risk frameworks – came in for special attention at a Basel Committee on Banking Supervision meeting held in New York about two months ago for chief risk officers (CROs) from global systemically important financial institutions (G-Sifis).

“I actually think there is going to be a lot more focus on non-financial risk from the regulators,” Forrester said. “The feedback from all the G-Sifi CROs was non-financial risk is their biggest concern, in particular tech risk and third-party risk management. A number of them were thinking of combining their non-financial risk functions together and appointing a single head of non-financial risk.”

Some banks have already moved in that direction: Balbir Bakhshi currently serves as Deutsche Bank’s group head of non-financial risk management, while UBS has merged its operational risk and compliance functions under one global head, James Oates.

Forrester predicted that, for big banks, non-financial risk will soon surpass market and credit risk in terms of importance. He explained that banks are generally extremely well-capitalised for and provisioned against credit risk exposure, while the use of clearing and collateralisation has dramatically reduced counterparty credit risk exposure. Market risk, meanwhile, accounts for a relatively low portion of banks’ overall risk-weighted assets and poses a lower threat now that firms have decreased their derivatives exposures compared with crisis-era levels, he added.

Credit Suisse has around 1,000 people currently focused specifically on non-financial risk management, Forrester said. He believes a major bank will appoint a group CRO with a non-financial risk background within the next decade.

“The next big financial crisis is going to be some combination of non-financial risk. Liquidity, operational, reputational [risks] together are going to cause some large event that dislocates the market,” he said. “Having a group of people that look at that…may better protect us going forward.”


Philip Umande, head of operational risk capital and analytics at Lloyds Banking Group, agreed with Forrester, adding that the skill sets of CROs need to evolve accordingly: previously, CROs have tended to have a background in credit risk but, going forward, they will need to have expertise in major non-financial risks, such as cyber crime.

Umande also echoed Forrester’s point about managing all non-financial risks together as their profile rises within firms.

“There is a huge drive towards simplification and agility. It makes a lot more sense to have all non-financial risk in one space,” he said. “It is going to become probably the biggest risk – maybe is the biggest risk for some banks already. There is going to be more focus on it at the highest level.”

Sean Miles, a senior op risk manager at Santander, agreed that, going forward, data, technology and third-party risks will be among the leading concerns within firms’ risk departments. Although his bank has not yet discussed appointing a head of non-financial risk, he said “it does feel like the market is going that way”.

Forrester argued non-financial risks were too interconnected to keep siloed. For example, operational or reputational risk events can have knock-on effects that impact business continuity or other areas.

“I think particularly now, given the interconnectivity of banks with market utilities, an operational event potentially at a market utility could cause a much broader contagion effect across the industry,” he said.

The comments at the conference chime with a recent prediction by Mark Yallop, a senior adviser to the Bank of England’s Prudential Regulation Committee, that supervisors will shift their attention from banks’ financial resilience to so-called operational resilience.

“While I’ve been on the PRC, we’ve still been in the business of de-risking banks – raising capital standards, increasing liquidity, improving governance standards. The next three years are going to be much more about how prudential regulators manage the challenges of new technology, and the revolution that has been brought about in the banking industry,” Yallop said.

Editing by Olesya Dmitracova

US bank RWA density edges higher

By Louie Woodall | Data | 11 June 2018

US systemically important firms increased their risk-weighted asset (RWA) density in the year to March 31, with Morgan Stanley building up risk the most over this period.

Median RWA density, calculated as standardised RWAs divided by total assets, across the eight global systemically important banks (G-Sibs), grew from 57.61% at end-March 2017 to 58.85% a year later.

Morgan Stanley’s RWA density increased 401 basis points, from 41.46% to 45.47%, over the period. Last March, the dealer had the lowest RWA density of the eight G-Sibs. As of end-March 2018, it had the third-lowest, leapfrogging BNY Mellon and State Street in the rankings.

State Street’s and Goldman Sachs’ RWA densities grew 255bps and 144bps, respectively.

On the flipside, BNY Mellon cut its RWA density by 159bps and Citigroup by 50bps. 

Wells Fargo had the highest RWA density, 66.73%, at end-March 2018, keeping the top spot it held a year earlier, when its density was 66.65%.

BNY Mellon had the lowest RWA density of the group, at 41.88%.

What is it?

RWA density is one means of measuring the riskiness of a bank. Banks’ RWAs fall as their balance sheets shrink, but by expressing RWAs as a ratio of total assets it’s possible to track which banks are de-risking and which are loading up on risk while controlling for size. 
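
The metric is a single ratio; a minimal sketch, with illustrative figures:

```python
def rwa_density(standardised_rwas: float, total_assets: float) -> float:
    """RWA density in percent: standardised RWAs divided by total assets."""
    return 100.0 * standardised_rwas / total_assets

# Illustrative: $500bn of standardised RWAs on a $1.1trn balance sheet
density = rwa_density(500, 1_100)  # roughly 45.5%
```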

The data for the above charts were extracted from FR Y-9C reports, which bank holding companies are required to submit to the Federal Reserve quarterly. The Fed uses the data to assess and monitor the financial condition of US banks, and as its primary means of tracking the health of big dealers between on-site inspections.

The RWA densities above were calculated using standardised RWAs, rather than advanced RWAs, for all banks.

Why it matters

RWA density tells us as much about regulators’ assessments of banks’ risks as it does the unique risk profile of each bank. Assets held for trading generally attract lower RWA densities than held-to-maturity assets, meaning those dealers with a higher proportion of loans and other credit assets to trading inventory on the books are likely to have higher RWA densities. 

When seen this way, it’s small wonder that Wells Fargo, Citigroup, Bank of America and JP Morgan – each prominent ‘main street’ lenders as well as investment banks – have the highest RWA densities. 

Goldman Sachs and Morgan Stanley, as prominent broker-dealers, have lighter ‘traditional’ loan books and hence lower RWA densities. BNY Mellon and State Street, as custody banks, are weighed down predominantly with high-grade collateral, such as US Treasuries, which score low on risk-based measures. 

Operational RWAs somewhat distort density readings, as these do not fluctuate in line with a bank’s size per se, but mechanically according to a regulatory formula. A large operational risk failing or regulatory fine one quarter, for instance, could hike a bank’s RWA density even when it had de-risked both its banking and trading books.

Get in touch

Did any of the data points in the above charts catch your eye? Are there better ways to gauge riskiness than RWA density? Let us know by tweeting @LouieWoodall or @RiskQuantum.

Tell me more

Five US banks below Collins floor

View all bank stories

Operational risk measurement beyond the loss distribution approach: an exposure-based methodology

By Michael Einemann, Joerg Fritscher, Michael Kalkbrener | Technical paper | 11 June 2018

Distortion risk measures for nonnegative multivariate risks

By Jaume Belles-Sampera, Montserrat Guillen, José María Sarabia, Faustino Prieto | Technical paper | 11 June 2018

Op risk data: Mexico bank hack fuels global payment network fears

By Risk staff | Opinion | 8 June 2018

Also: roundup of monthly loss data sees Wells Fargo in top slot for second month running. Data by ORX News

Following on from its $1 billion loss in April, US bank Wells Fargo topped the losses again in May. It has agreed to pay $97.3 million to settle a class action lawsuit filed by employees who accused the bank of denying them rest breaks.

The class, which consisted of 4,481 mortgage executives, claimed that Wells Fargo regularly required them to carry out work, such as networking events or coaching duties, during lunch times. Additionally, they claimed that Wells Fargo did not provide them with the mandated 10-minute rest break for shifts over four hours or the 30-minute meal break for shifts over five hours. Employees claimed this totalled 1.9 million shifts without breaks between 2013 and 2017.

The lawsuit was filed in California, whose state laws on labour protection are thought to be among the toughest in the US. Of the 142 “employee relations” losses recorded from North America in the ORX News database, one-third are from California.

In second place is now-defunct Russian lender El Bank, which lost 2.2 billion rubles ($35.8 million) after bank executives fraudulently issued loans to shell companies, state officials claim. El’s banking licence was revoked in May 2016 and it was declared bankrupt shortly afterwards, in August of that year. Russia’s Deposit Insurance Agency filed a lawsuit against the executives in April 2018.

In third place, Societe Generale was ordered to pay €22.9 million ($26.9 million) in compensation to Landesbank Hessen-Thüringen (Helaba) for acting as custodian bank in a share trading scheme known as “cum-ex” trades.

The trades, now illegal, exploited a loophole in German tax law. The scheme involved buying shares just before a dividend payment was due and then reselling them after the dividend payment date, but with the dividend as yet unpaid. This allowed both the buyer and the seller to claim a capital gains tax refund on the dividend. German authorities closed this loophole in 2012 and have aggressively pursued missed tax payments for the preceding years. ORX News has recorded €746 million in operational risk losses paid or provisioned by banks since 2015 in relation to the trades, although this is only a fraction of total lost taxes to German authorities, estimated at between €7 billion and €12 billion.

Helaba was ordered to repay the tax rebate on the cum-ex trades for which SocGen acted as the custodian bank, plus interest. In June 2016, Helaba sued SocGen in a German court, claiming that it expected the French bank to pay the capital gains tax. The court found in Helaba’s favour and ordered SocGen to cough up the funds.

Fourth, US broker-dealer LPL Financial has agreed to pay $25.9 million to US authorities over sales, since October 1, 2016, of unregistered, non-exempt equity and fixed-income securities that did not fully comply with state securities registration requirements.

Finally, employees of the National Bank of Pakistan’s foreign exchange branch were arrested on May 16 in connection with a fraud of up to 3 billion rupees ($25.8 million) involving faked letters of credit granted to a number of companies, according to local media.

Spotlight: Canada data loss

The data of around 90,000 customers of Bank of Montreal and CIBC’s Simplii Financial has been stolen by hackers, according to reports. The stolen data is being held for a ransom of $1 million, which is to be paid in the cryptocurrency Ripple. The hackers claimed they used an algorithm to gain partial access to customer accounts, allowing them to pose as users who had forgotten their passwords and, from there, change security questions to gain full access. The data includes names, account numbers, passwords, social security numbers and account balances.

In focus: Interbank payment weaknesses

In May, the Mexican central bank revealed that at least three banks under its remit had been affected by a cyber attack on its interbank payment network. Around 400 million pesos ($20 million) was stolen in the attack, which also caused transfer delays and service interruptions.

This is not the first time that banks have become the victims of cybercrime through their interbank payments network. The most high-profile case was the attempted theft of almost $1 billion from Bangladesh Bank in 2016. A group of hackers, aided by Bangladesh Bank employees, created malware that allowed them to infiltrate the bank’s Swift network and obtain its credentials. They used these credentials to send fraudulent transfer requests to the Federal Reserve Bank of New York. The Fed rejected most of the requests, but $81 million of transfers were successful – most of which has still not been recovered.

Swift is a global financial messaging service that enables banks to exchange coded payment instructions. The network transmits 31 million messages per day, on average, making it the main form of payment communication between financial firms.

A number of hacks using the same technique have followed in the past few years. Although direct cyber thefts from banks remain relatively rare – the most notable recent example being the 2016 attack on Tesco Bank – it seems that interbank payment networks offer a more accessible attack vector.

The Bank for International Settlements is aware of this problem, which it says is growing in sophistication and will continue evolving. Following the Bangladesh Bank attack, therefore, the BIS put together a task force to examine wholesale payments fraud.

One major problem the BIS identified is that interbank system operators such as Swift do not have control over all points of entry into their systems. The onus is therefore on network participants to ensure they have appropriate controls in place to prevent fraud. This is particularly pressing due to the interconnectedness of these networks: once a criminal has gained entry to the ecosystem, the damage can affect all network participants.

Preventing fraud is not always an easy task. There is one striking feature in the eight Swift attack losses shown above: they all occurred in developing countries. Institutions in these countries may not have the same fraud prevention budgets as their counterparts in more developed countries, and a wider culture of corruption may also have an impact.

This doesn’t mean that global banks are immune to the consequences of these attacks, as they are often participants in the transfers and suffer both financial losses and reputational impact. In February 2018, Wells Fargo reached a settlement for an undisclosed amount with Ecuador’s Banco del Austro for authorising $12.2 million in fraudulent Swift transfers after hackers breached Banco del Austro’s local security systems. Following the Bangladesh Bank attack, the New York Fed also faced criticism for not detecting more of the fraudulent transactions.

In addition to financial loss and reputational risk, these kinds of attacks carry another risk. If participants lose trust in their interbank payment networks, they could apply additional controls to transfers, which would slow down payments and could cause gridlock in the global financial system. Worse, if these attacks are not stopped, the BIS warns they could undermine confidence in the integrity of the entire system. In May it therefore published its strategy to combat wholesale payments fraud, proposing a global, co-ordinated effort to stop this type of fraud.

The seven elements of the BIS’s strategy are: identify and understand the range of risks; establish endpoint security requirements; promote adherence; provide and use information and tools to improve prevention and detection; respond in a timely way to potential fraud; support ongoing education, awareness and information sharing; and learn, evolve and co-ordinate.

Editing by Alex Krohn

All information included in this report and held in ORX News comes from public sources only. It does not include any information from other services run by ORX and we have not confirmed any of the information shown with any member of ORX.

While ORX endeavours to provide accurate, complete and up-to-date information, ORX makes no representation as to the accuracy, reliability or completeness of this information.