Basel III: final op risk framework leaves banks guessing

By Steve Marlin, Louie Woodall | News | 11 December 2017

Analysis suggests big capital savings on average, but uncertainty persists over uneven implementation

Final rules from global policymakers on operational risk capital appear to show a big cut for the largest banks – but the industry has been left guessing as to the ultimate impact by the unprecedented freedom given to national regulators over controversial elements of the framework.

A quantitative impact study (QIS) from the Basel Committee on the effect of its new operational risk framework, known as the standardised measurement approach (SMA) – published last week as part of the final package of revisions to the Basel III bank capital framework – suggests the 2015 cohort of global systemically important banks (G-Sibs) could realise weighted average op risk capital savings of 30%. 

But because the SMA will allow national regulatory authorities the freedom to pick and choose the historical loss data included in the calculations, as previously reported by Risk.net – something that will have a huge bearing on final numbers – banks are treating the projections with scepticism.

“It is not at all clear how they managed to do the analysis, so I would take the impact study on that piece with a large salt mine,” says a capital manager at a global bank. “There does seem to be quite a lot of injections of national supervisor choices, which, if you are pressing for global uniformity, looks a bit odd,” he adds. 

The Basel analysis also suggests the impact will diverge wildly between firms: under the 2015 numbers, one G-Sib would see its minimum op risk capital requirement spike by 222%, while another would see it drop by 66.1%. Crucially, however, the Basel QIS does not take into account current Pillar 2 capital add-ons from national regulators – meaning projected capital decreases could be understated and increases overstated.

Banks will have time to adjust to the new framework, which is slated to be implemented from 2022. 

Huge fines for crisis-era misconduct now mean that operational risk capital accounts for a significant proportion of many banks’ total capital requirements – as much as 30% in some cases. Unlike in most other parts of the bank capital framework, banks that opted for an own-models approach to op risk under Basel II have generally seen their capital requirements rise faster than those that stuck with non-model approaches.

The SMA dispenses with bank-modelled efforts to calculate op risk capital in favour of a straightforward Basel-defined formula. A simple accounting measure of bank total income – dubbed the business indicator (BI) – is used to divide firms into three size buckets. Coefficients are then applied marginally to the portion of the BI falling in each bucket to produce the business indicator component (BIC). From the smallest bucket to the largest, these are set at 12%, 15% and 18%, respectively.

The March 2016 SMA proposal featured five BI buckets, with coefficients set at 11%, 15%, 19%, 23% and 29%. The reduction to three buckets was achieved by merging the second, third and fourth buckets from the March 2016 proposal into one. Bucket one will contain banks with a BI of €0–1 billion ($1.18 billion); bucket two those with €1 billion–30 billion; and bucket three those with €30 billion and above.

Banks’ historical op risk losses are factored in through the internal loss multiplier (ILM). If average annual losses incurred over the previous decade, multiplied by 15, equal the BIC, the ILM is set equal to one. Where this loss number is greater, the ILM is greater than one; and if lower, less than one.  
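
In code, the calculation is compact. The sketch below is illustrative only: it assumes the marginal application of the bucket coefficients and the internal loss multiplier formula published in the final framework – ILM = ln(e − 1 + (LC/BIC)^0.8), where the loss component LC is 15 times average annual losses – and the figures and function names are invented for the example.

```python
import math

# Marginal BI buckets from the final framework: (upper bound in EUR, coefficient).
BUCKETS = [(1e9, 0.12), (30e9, 0.15), (float("inf"), 0.18)]

def business_indicator_component(bi: float) -> float:
    """Apply the 12%/15%/18% coefficients marginally to the business indicator."""
    bic, lower = 0.0, 0.0
    for upper, coeff in BUCKETS:
        bic += coeff * max(0.0, min(bi, upper) - lower)
        lower = upper
    return bic

def internal_loss_multiplier(avg_annual_losses: float, bic: float) -> float:
    """ILM = ln(e - 1 + (LC/BIC)^0.8), with LC = 15 x average annual losses.
    Equals 1 when LC == BIC, rises above 1 for larger losses, falls below for smaller."""
    lc = 15.0 * avg_annual_losses
    return math.log(math.e - 1.0 + (lc / bic) ** 0.8)

def sma_capital(bi: float, avg_annual_losses: float, ilm_is_one: bool = False) -> float:
    """Op risk capital = BIC x ILM; national regulators may simply set ILM = 1."""
    bic = business_indicator_component(bi)
    ilm = 1.0 if ilm_is_one else internal_loss_multiplier(avg_annual_losses, bic)
    return bic * ilm

# A hypothetical bucket-three bank: EUR 40bn BI, EUR 500m average annual losses.
print(f"{sma_capital(40e9, 500e6) / 1e9:.2f}bn")                  # loss history included
print(f"{sma_capital(40e9, 500e6, ilm_is_one=True) / 1e9:.2f}bn") # ILM set to one
```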

Maybe we will see a reduction in SMA op risk capital compared to our current AMA op risk charge… [but US authorities] could potentially ratchet it up

A regulatory expert at a US G-Sib

However, the final Basel III framework published on December 7 permits national watchdogs to allow banks under their supervision to ignore historical losses, and simply set the ILM to one. Even where a regulator decides to include historical losses, banks are given the freedom to lobby to have certain losses excluded. Dealers say this could lead to wildly divergent implementations across jurisdictions and reduce firms’ incentives to monitor and model operational losses.

“All specificities which were relative to losses will be subject to a supervisory decision. If a regulator sets the internal loss multiplier to 1, how do you consider the historical losses in your allocation? Is it important? Do they want to build a framework which has incentives for people to have a good framework to monitor their operational risk? These are the questions that are all raised by this approach,” says a capital manager at a European bank. 

Brad Carr, a director in regulatory affairs at the Institute of International Finance, says the methodology could see winners and losers within each jurisdiction, “across those banks with greater and smaller loss histories than their peers”.

US banks are predicted to see a measure of relief if the SMA is adopted as-is by the Federal Reserve. The regulator implemented a punitive version of the current advanced measurement approach (AMA), which burdened dealers under its jurisdiction with higher capital requirements than their international peers. A switch to the SMA is likely to result in similar, or even lower, capital levels than at present.

However, the Fed could dash these hopes if it believes lower capital buffers are unwarranted, warns a regulatory expert at a US G-Sib.

“Maybe we will see a reduction in SMA op risk capital compared to our current AMA op risk charge. However, if the US agencies take the interpretation that the new framework needs to remain capital neutral, they will once again use the standardised approach as a floor and then potentially ratchet it up, the way they have done with other Basel capital rules. [We’re] cautiously optimistic that it might lower the op risk capital charge, but it’s not certain because it’s all really up to national implementation,” he says.

Regulators will also impose discretionary Pillar 2 capital add-ons to make up for perceived shortfalls in capital requirements produced by the SMA. The Basel Committee notes that if these add-ons were included in the analysis as part of the baseline minimum capital requirement, the effect of the switch to the SMA would be to cut this requirement by a further 7.2% for those banks with more than €3 billion in Tier 1 capital.

Transatlantic divide

European banks are more likely to see an increase in total op risk capital from the SMA, as they will no longer be permitted to use scenario generation to calculate their requirements as they did under the AMA. While US banks are required to use fairly conservative loss distribution models, European firms generally took full advantage of the four broad categories of inputs allowed under the AMA: internal loss data, external loss data, scenarios, and business environment and internal control factors.

“You will see a transatlantic divide, where US banks will be happy, while European banks might be less satisfied because the impact will be greater on their capital requirements,” says Evan Sekeris, a partner at Oliver Wyman. “This is a relief for US banks, because now they will have numbers that are very much in line with what they have now, but not have to spend time developing and refining models.”

Differences in op risk capital outputs will also arise from the provision allowing banks to ask regulators to ignore op risk losses from divested businesses. Banks have long complained about the need to account for past operational risk losses regardless of any actions taken to remediate the underlying issues, including divestment.

However, banks will need to demonstrate that there is no similar or residual legal exposure and that the excluded loss experience has no relevance to other continuing activities or products. “National regulators will have to make the call as to whether a loss can be excluded or not,” says Sekeris.

Additional reporting by Philip Alexander

Basel III changes set to create big winners and losers

By Kris Devasabai, Joanna Wright, Philip Alexander | News | 8 December 2017

Capital hit for G-Sibs ranges from 28% drop to 43% jump, QIS reveals

The Basel Committee on Banking Supervision’s long-awaited additions to the Basel III package will have a muted impact on the banking industry’s aggregate capital requirements, but individual firms could see wildly different outcomes.

A quantitative impact study conducted by the Basel Committee suggests minimum required Tier 1 capital for global systemically important banks (G-Sibs) will decline by 1.4% on average as a result of the changes unveiled earlier today (December 7) by international regulators. However, the dispersion around the mean is significant, ranging from an increase of 43.4% for one bank to a drop of 27.8% for another.

“Even if you ignore the outliers and just look at the banks in the middle, the seventy-fifth percentile to the twenty-fifth is a 26% spread [from 17.3% to –9.1%] to cover half the sample. That is not as tight a distribution as I would have expected,” says a senior European banker. “What we have is a methodology that, in its aggregate output, is giving us RWAs [risk-weighted assets] that are on average about the same as they were before, but for different banks there is really quite a large variance.”

While all banks are expected to meet the minimum Common Equity Tier 1 capital requirement of 4.5% under the revised framework, the quantitative impact study finds that one G-Sib would see its total capital ratio fall below the 8% minimum if the changes were implemented today. G-Sibs will need to raise a combined €27.6 billion ($32.4 billion) of fresh CET1 capital to maintain their ratios above the target level of 7%, which includes a capital conservation buffer of 2.5%.

The total capital shortfall – including additional Tier 1 and Tier 2 capital and leverage ratio requirements – under the new rules is estimated to be around €90 billion.

Tougher in Europe

As expected, European banks bear the brunt of the changes, with the European Banking Authority anticipating a 15.2% rise in the minimum required Tier 1 capital for the region’s largest lenders once the revised capital framework is phased in.

“The bulk of the impact is in Europe,” says a regulatory capital expert at a European trade group. “In terms of the CET1 shortfall for the risk-based requirement, about 60% of the total global shortfall is due to European Union G-Sibs.”

The revised Basel III framework introduces an output floor that prevents RWAs calculated using internal models from falling below 72.5% of the figure produced by the standardised approaches. The package also makes significant tweaks to the standardised and advanced approaches to modelling credit risk and scraps the use of internal models for operational risk. G-Sibs will also face a leverage ratio surcharge, which is set at 50% of a firm’s risk-weighted higher-loss absorbency requirements.
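
The arithmetic of the floor is a simple lower bound. A minimal sketch, with invented numbers, assuming the floor is applied to aggregate RWAs:

```python
def floored_rwa(model_rwa: float, standardised_rwa: float, floor: float = 0.725) -> float:
    """Modelled RWAs cannot fall below 72.5% of the standardised-approach figure."""
    return max(model_rwa, floor * standardised_rwa)

# Illustrative: internal models produce EUR 60bn of RWAs against EUR 100bn under
# the standardised approach, so the floor binds at EUR 72.5bn.
print(floored_rwa(60e9, 100e9) / 1e9)  # 72.5
```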

European banks are hardest hit by the curbs on the use of internal models, but the aggregate capital impact of the revisions is largely offset by a reduction in RWAs for firms that rely on standardised approaches. “Banks using more standardised approaches have even seen some decreases in capital requirements,” says the regulatory capital expert. “You can see when you look at the distribution of impacts there are some G-Sibs that have had significant decreases.”

This is especially true for operational risk capital requirements, which are expected to see an aggregate drop of over 30%. However, the quantitative impact study reveals that for one G-Sib, op risk capital requirements will rise by 222% under the standardised measurement approach (SMA), while another can expect a reduction of 66.1%.

The restrictions on modelling credit RWAs are expected to produce mixed results. “The tweaks to credit risk are complex,” says the senior European banker. “There are so many changes in different areas.”

I am sure [the new accord] has been fine-tuned to give the right output, as opposed to anyone thinking these numbers reflect some sort of bottom-up risk approach

Senior European banker

The loss of the advanced internal ratings-based approach for modelling exposures to corporates with revenues of over €500 million is likely to have the most punitive impact on big banks. While the threshold was raised from a proposed €200 million, a significant number of corporates will be excluded from the advanced internal ratings-based approach. “It is a big range of corporates that will be caught, and it will impact different institutions very differently, depending on the mix of their portfolios,” says a credit specialist at a consulting firm. “You will see some banks with a disproportionate impact, and some banks with very little. It is not likely to be uniform.”

On the other hand, banks are likely to see a net reduction in RWAs on mortgages. “Some of the standardised approach calibrations have been dialled down. The whole loan risk-weight for mortgages is the one people are most familiar with. Applying standardised risk weights to a mortgage book will give a far lower outcome,” says the senior European banker.

However, he questions the motives behind the last-minute changes to the advanced internal ratings-based threshold and standardised risk weights. “There are these kinds of recalibrations throughout the whole thing. I am sure it has been fine-tuned to give the right output, as opposed to anyone thinking these numbers reflect some sort of bottom-up risk approach.”

The Basel Committee is counting on the 30% aggregate drop in op risk capital requirements, which largely accrues to US G-Sibs, to offset a large portion of the increases elsewhere. However, some question whether these savings will be realised. “The analysis that the Basel Committee did in terms of the change in the ratios incorporates a huge offset – a reduction in op risk RWAs – but we don’t know if that will matter or not. It all depends on how the US implements the SMA for US banks,” says a policy expert at a US financial industry association. “The SMA, if implemented as Basel is suggesting in the US, would reduce RWAs for US banks, but we don’t really know how much the US agencies will take that in.”

Market participants say further analysis of the Basel III revisions is necessary to fully understand the impact of the changes on specific institutions and markets. “All of this makes the case for further analysis to understand the impacts at the level of different products and businesses,” says the regulatory capital expert. “There needs to be consideration of how this impacts capital markets products – looking at repos, derivatives and a bank’s ability to make markets and hold assets. And looking at the corporate lending side, trade finance, project finance, commodity finance, all of those things. That is what we need to see now.”

Banks await Basel decision on legacy op risk losses

By Steve Marlin | Features | 5 December 2017

European banks could see big jump in capital if losses from legacy businesses are included in SMA

Earlier this year, Credit Suisse went to its regulator with a plea for clemency. The firm’s Strategic Resolution Unit, which houses its non-core assets and business lines, was saddled with $20 billion of operational risk-weighted assets. But much of this was linked to activities the SRU had long since exited, and the bank wanted its regulator to let bygones be bygones.

The Swiss Financial Market Supervisory Authority (Finma) is not the only regulator to have fielded such a request. Risk.net understands at least two US banks have sought capital relief from federal regulators for businesses that were sold or discontinued after suffering large operational risk losses, with mixed results.  

“Some of the major US banks have had discussions with the authorities,” says a Washington, DC-based lobbyist. “I don’t know if it’s been conclusive or if banks have necessarily been happy with how it’s played out to date, but I would imagine that if they didn’t get the answer they wanted previously, that they’re probably looking to revisit that issue.”

Finma declined to comment on its discussions with Credit Suisse.

Banks have long complained about the need to account for past operational risk losses regardless of any actions taken to remediate the underlying issues, including divestment. “Right now, when you incur an operational loss, it has got a plutonium-238 half-life,” John Gerspach, Citi’s chief financial officer, said last year in a presentation to investors. “It’s just something we all know we need to address.”

The Basel Committee’s proposed standardised measurement approach (SMA) for operational risk, which was first released in March 2016, has been criticised for its rigid use of 10 years of loss data to calculate exposures.

“One of the fundamental sticking points of the SMA is they’re saying ‘your past sins are staying with you’, even though you might have remedied the issues and the controls that led to them,” says the head of operational risk at a large international bank. “The next question is, ‘what if I’ve sold this thing off and it’s no longer mine?’ Surely that’s at least an argument not to count them.”

Right now, when you incur an operational loss, it has got a plutonium-238 half-life

John Gerspach, Citi

Regulators have indicated they are receptive to that argument. “We are open to suggestions from the industry to remove certain historical loss events – such as those loss events from divested business – from the capital calculation,” Mitsutoshi Adachi, chair of the Basel Committee’s working group on operational risk, said at a Risk conference in June 2016, shortly after the draft SMA was released for consultation.

“An obvious challenge here includes identifying relevant loss events for exclusion, because it is possible that a loss event from a divested business is still relevant for the bank’s current profile. Therefore, establishing transparent guidance for inclusion and exclusion of certain loss events will be required if we choose to move in that direction,” he added.

Bank risk managers admit that will be tricky. Operational risk losses are often a symptom of cultural failings, which cannot always be addressed by exiting a business line or disposing of problem assets.

“I would not allow firms to immediately remove losses related to disposed-of subsidiaries,” says the head of operational risk at a large European bank. “Barclays and JP Morgan have both been fined for manipulating Libor, foreign exchange and electricity in California. This is indicative of overarching cultural issues. Selling a subsidiary does not remove these cultural issues, and fixing a firm’s culture is a bit like turning an oil tanker.”

The Basel Committee is expected to address this issue when it publishes its revised bank capital framework on December 7.

“The issue pertaining to discontinued or divested businesses was a theme noted during the consultative process and has therefore been considered by the Basel Committee,” a spokesperson for the standard-setter tells Risk.net. “The operational risk framework’s details will be set out in the revised Basel III framework, which will be published once endorsed by the group of central bank governors and heads of supervision.”

Passing the buck

The worry is that the Basel Committee will simply punt the question to national regulators. The possible outlines of a revised SMA were revealed in a leaked Basel document in May, which suggested national regulators will have the freedom to let banks ignore past op risk losses when implementing the framework. Under the leaked proposal, national regulators would be expected to pick up the slack through Pillar 2 capital add-ons.

“One scenario that some have suggested is if the SMA is edited to downsize the emphasis on loss history, then individual supervisors of those that have had major losses may insist that those banks carry something in Pillar 2 to reflect that,” says Brad Carr, a director in regulatory affairs at the Institute of International Finance.

That could be a recipe for confusion. Inconsistent treatment of past losses is one of the common criticisms of the advanced measurement approach (AMA) for operational risk, which the SMA is intended to replace. The AMA requires banks to base internally generated operational risk estimates on a minimum historical observation period of five years. However, national regulators have the discretion to extend this look-back period indefinitely, and to exclude certain losses from the calculation.

Over time, a transatlantic divide emerged regarding the losses reflected in AMA calculations. “The exclusion of historical losses in the AMA was not something that was treated consistently across jurisdictions,” says Luke Carrivick, head of analytics and research at ORX, which provides data on operational risk. “For example, in the US, data from divested businesses had to be included in modelling, but this was not always the case in parts of Europe.”

In the US, guidance from the Federal Reserve requires banks to incorporate all relevant data within their AMA models. While the guidance stipulates a minimum of five years of internal loss data, banks are encouraged to use a longer horizon for estimation purposes wherever feasible.

“The Fed has taken the approach that all losses have to be in as long as they’re from a year where you feel that your dataset was comprehensive and complete,” says Evan Sekeris, a partner at Oliver Wyman. “If you have 15 years of data, then you have to use 15 years. If by 2050 you have 50 years of data, you have to use 50 years.”

The Fed guidance does not directly address the issue of operational risk losses incurred by divested businesses. However, it does state that following a merger or acquisition, banks should combine the operational loss histories of the underlying firms and treat the resultant loss history as if it had occurred at a single entity. This involves linking and aggregating losses into a single loss event for modelling and risk management purposes when there is a common trigger or instigating factor, including losses occurring before the merger or acquisition.

The implication is that – barring some regulatory relief – the seller would have to retain certain losses in its model to reflect its residual exposures and cultural issues as well as any indemnities provided to the acquiring firm.

Op risk bankers, especially those from European firms, say obtaining op risk capital relief from the Fed for divested businesses can be tricky. “I have experienced that [seeking capital relief for divested businesses] in prior roles,” says an operational risk specialist at a second European bank. “There was never a firm answer.”

A painful scenario

While US banks are required to use fairly conservative loss distribution models, European firms take full advantage of the four broad categories of inputs allowed under the AMA: internal loss data, external loss data, scenarios, and business environment and internal control factors.

“You’ve got differing versions of AMA between one that’s more centred on loss history and one centred on scenarios,” says Carr. “The scenario approach focuses on forward-looking risk identification, which gives more scope to look past those historical events that are no longer relevant and to pick up sources of loss that haven’t happened yet, but could arise in the future. This could give a more accurate picture, though critics would argue that it’s a more subjective approach.”

As a result, the move to the SMA is likely to be more punitive for European banks, which have been able to rely on scenario analysis to exclude certain loss events from their calculations. “By design, the SMA was mirroring US-type capital numbers, so US banks expect under the SMA to be holding fairly similar numbers as they’re holding under AMA,” says Sekeris.

But European firms could see a 63% increase in Pillar 1 operational risk capital requirements on average under the new framework, according to an ORX analysis of the original March 2016 SMA proposal.

Such an outcome will be especially painful for European banks that have worked hard to offload riskier businesses – often at the urging of national regulators. In the first three quarters of 2016, Credit Suisse’s SRU offloaded its entire credit derivatives portfolio, sold or restructured 40% of its loans and financing facilities, and reduced its derivatives trade count by roughly 50%.

Those moves cut the unit’s credit and market risk-weighted assets by 35%, from $54 billion to $35 billion. But over that same period, operational risk-weighted assets at the SRU rose from $19 billion to $20 billion, and are likely to be even higher when the SMA is implemented.

The same pattern can be observed at other so-called ‘bad banks’. For instance, Citi Holdings saw its credit and market risk-weighted assets fall from $132 billion at the end of 2014 to $55 billion at the end of last year after divesting a number of legacy business lines, including a US consumer lending unit and its Japanese retail and credit card business. However, its operational risk-weighted assets barely budged, from $57 billion in 2014 to $49 billion at the end of 2016.

Relief effort

Bank executives say regulators need to think seriously about granting capital relief for operational risk if they want banks to continue de-risking via divestments and asset sales.

“There are certainly instances where getting capital relief is the right incentive for organisations to de-risk the business,” says an operational risk specialist at a third European bank. “If you have a business that’s creating outsized risk and there is no capital incentive, then people might hang on to that business just to get whatever profit they can from it.”

Regulators will need to walk a fine line, however. “If the conduct losses are pervasive across an institution, then divesting a business won’t solve the problem,” says the specialist.

The exclusion of historical losses in the AMA was not something that was treated consistently across jurisdictions

Luke Carrivick, ORX

The consensus in op risk circles seems to be that banks will have an easier time persuading regulators to provide capital relief for businesses that have been sold than for ones that have merely been wound down.

“If it’s not clear to a regulator that you have no exposure at all to a certain risk, they will not lower your capital,” says Marcelo Cruz, adjunct professor at New York University and a former bank op risk chief. “There’s a difference between selling and discontinuing a business. If you sell the business, you might have a stronger case; the liabilities usually go with the business. But if they just discontinue the business, they will still be exposed to some operational risk as they can be sued years after for any perceived malfeasance when they owned the business.”

Even if a bank has sold a business, it may not always have a clear-cut case for relief. Cruz says one bank sold off a business after it suffered a major cyber-attack five years ago.

“In 2017, the FBI got a hard disk apprehended by Russian authorities from a hacker in Moscow, and got proof that the bank was hacked in 2012,” he says. “The bank claimed that they sold the business, but the lawyers from the buyer claim the bank was responsible for security at the time. Although the dispute is still in court, there is a significant risk that the bank will have to settle for some significant sum.”

In determining whether a discontinued business is eligible for op risk capital relief, regulators will likely pay close attention to the firm’s loss history following the closure, especially if the discontinued business has suffered a large loss in the past. For example, if the discontinued business previously suffered a $100 million loss, and the bank can show losses have not exceeded $50,000 for a period after the closure, it will have a stronger case for relief.

“There is a misconception among practitioners that the loss is staying, that it will always give high capital. That is not a fact; it is a myth,” says Kabir Dutta, a senior consultant at Charles River Associates. “Once you have many small losses, the intensity and importance of large losses decreases and goes away.”
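
One mechanical reading of that claim, under a rolling 10-year loss average of the kind the SMA uses, is sketched below with hypothetical figures: the large loss dominates the average while it sits in the window, then drops out entirely.

```python
# Hypothetical loss history: a $100m loss in year 0, then years in which
# losses never exceed $50,000, fed through a rolling 10-year average.
big_loss, routine_loss = 100e6, 50e3
history = [big_loss] + [routine_loss] * 14   # 15 years of annual loss totals

for year in range(9, 13):
    window = history[max(0, year - 9): year + 1]   # trailing 10-year window
    print(year, f"${sum(window) / len(window):,.0f}")
# Year 9 still averages roughly $10m; from year 10 the big loss has aged
# out of the window and the average collapses to $50,000.
```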

Carney: conduct risk failings could spark capital add-ons

By Tom Osborn | News | 29 November 2017

Senior Managers Regime is helping BoE identify cultural weaknesses at individual firms, says governor

Tough new conduct risk rules are making it easier for UK regulators to spot evidence of cultural failings at banks, with repeat offenders likely to see their operational risk capital requirements hiked, the governor of the Bank of England, Mark Carney, has warned.

The Senior Managers Regime – which came into force for banks and prudentially regulated firms last year, and which is due to be expanded to include virtually all firms regulated by the Financial Conduct Authority next year – requires firms to clearly define responsibility for 17 key functions across an institution, such as the role of chief risk officer and head of anti-money laundering controls. It also includes basic requirements for institutions to act with integrity at all times and treat customers fairly.

“For supervisors – us and the FCA – the regime is helping identify weaknesses in governance and accountability. It’s helping us assess the fitness and propriety of senior managers and others in positions of responsibility – and [assess] whether a firm has the appropriate culture and is encouraging the necessary changes,” said Carney, who was speaking at an event held by the Ficc Market Standards Board in London on November 29.  

“If that isn’t the case, in the first instance, widespread or consistent shortcomings would have consequences for the compensation of individuals. More persistent failings could increase the capital that is set aside for operational risk – so it would have consequences for the firm itself. And in the extreme, it could influence our judgements regarding the fitness and propriety of senior managers,” he added.

Carney did not offer an example of what persistent failings could look like – nor how an ad hoc add-on to a bank’s required minimum level of op risk capital would be applied. At present, the UK’s Prudential Regulation Authority uses supervisory judgement to determine how much Pillar 2A capital banks must put aside for conduct risk.

Under a BoE proposal issued in July, banks will also be expected to reveal their total capital requirements across Pillar 1 and 2A – something op risk practitioners say could end the current disparity in disclosures between banks using different approaches to calculate their op risk capital.

The Senior Managers Regime has been the subject of controversy since it was first mooted. Banks complained the pace at which the rules were introduced left them little time to root out bad behaviour; others have reported recruitment difficulties for key functions, as well as uncertainty over what breaches should trigger a disciplinary action. A controversial clause that would have placed the onus on senior managers to demonstrate they had taken all reasonable steps to prevent a bank failing, rather than the regulator having to prove the case, was watered down at the eleventh hour before the rules were implemented, after some fierce lobbying from senior bankers.

But Carney suggested the regime was welcomed by many senior managers for its positive impact on improving risk culture: “We are already seeing encouraging signs that it is making a difference. For firms, it’s clarified [the need for] improving governance, accountability and decision-making processes. Senior managers are increasingly focused on building cultures of risk awareness, openness and ethical behaviour. In the words of one chair, ‘Responsibility for culture has now moved to the top of my agenda’. I’m not sure where it was before that.”

Regulators’ standard response to cases of misconduct that have come to light post-crisis has been to levy large fines. These now total more than $320 billion worldwide, Carney noted – “capital that could otherwise have supported more than $5 trillion of finance to households and businesses across the G20”.

The impact of fines on banks’ capital requirements also lingers on long after the initial payments are made, as they have an outsize influence on the calculation of firms’ required levels of op risk capital.  

Responding to questions, Carney suggested regulators increasingly prefer using enhanced governance and tougher conduct rules to tackle misconduct, rather than the big stick approach of fines.

“We all can sense that… an approach to misconduct which is entirely ex-post punishment of institutions and their shareholders at that time is not the best way to manage that situation,” he said.

Standardized measurement approach extension to integrate insurance deduction into operational risk capital requirement

By Fabio Piacenza, Claudia Belloni | Technical paper | 28 November 2017

How to save op risk modelling

By Risk staff | Opinion | 28 November 2017

Drop loss categories and correlations and adopt simple loss distribution, advises AMA expert

It’s been 20 months since the standardised measurement approach (SMA) for operational risk capital was proposed by the Basel Committee on Banking Supervision. But, despite much soul-searching by both banks and regulators, the method has still not been finalised.

A watered-down version of the original proposal is, at the time of writing, the likeliest successor to current approaches but, given its many flaws, the danger is it will create more problems than it solves. Chief among the problems the SMA aims to solve is the decade-old advanced measurement approach (AMA) itself. It seems obvious, however, that to fix the AMA regulators need to do just that: fix it, rather than replace it with an entirely new and highly controversial method.

First, a new AMA should be applied only to non-conduct operational risks, while conduct risks will need their own, potentially radically different approach. Second, the AMA should not split non-conduct operational losses into further types. It would also benefit from adopting a single simple distribution such as generalised Pareto or lognormal and, lastly, from abandoning correlations as inputs in models.
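
A minimal sketch of what such a stripped-down model could look like – a single lognormal severity distribution, a Poisson frequency, no event-type buckets and no correlation inputs, with all parameters invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

def lda_capital(freq_lambda: float, mu: float, sigma: float,
                n_sims: int = 100_000, quantile: float = 0.999) -> float:
    """Single-distribution loss distribution approach: simulate annual
    aggregate losses and read off the 99.9% quantile as the capital figure."""
    counts = rng.poisson(freq_lambda, size=n_sims)   # losses per simulated year
    annual = np.array([rng.lognormal(mu, sigma, n).sum() for n in counts])
    return float(np.quantile(annual, quantile))

# Illustrative: ~50 losses a year, median loss of about $160,000, heavy tail.
print(f"capital = ${lda_capital(50, 12.0, 2.0) / 1e6:.0f}m")
```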

The embattled AMA requires banks to group operational risk losses by their various event types, such as internal fraud or damage to physical assets, or lines of business – with the resulting clusters affecting the bank’s models and subsequent operational risk capital charge.

However, granular categorisation of operational losses – whether by event types, lines of business or any combination thereof – is of little use. What makes more sense is to distinguish between losses related to conduct risk and those linked to all other operational risks. This observation was made in a series of reports and papers by the European Banking Authority, Bank of England and the UK’s Prudential Regulation Authority, all of which came out after about a decade of AMA implementation.

Doing away with multiple operational risk categories would solve a well-known problem of the AMA: the big differences between banks in the use of correlations

Empirical support for a much simpler categorisation of losses comes from the AMA models banks have been using: of all the loss distributions available, most AMA banks use only a few, with the lognormal and the generalised Pareto distributions among the most common. Given the narrow range of distributions that can describe operational losses, there is little benefit in a detailed classification of the losses that feed the models.

An inherent preference for a less granular method is also apparent in the more detailed studies on the weaknesses of the SMA. All of these works approach operational risk as a single threat requiring an overall capital buffer, rather than a collection of separately capitalised risks.

Doing away with multiple operational risk categories would solve another well-known problem of the AMA: the big differences between banks in the use of correlations – be it between different groups of risks, risk scenarios or individual risks – and a lack of consensus on correlation coefficients. Giving up correlations won’t be much of a sacrifice as there isn’t, to my knowledge, a single convincing empirical study that, firstly, provides any robust correlation measures and, secondly, proves they lead to better estimates of operational risk.

In fact, loss correlations are widely thought to be either too weak or dominated by statistical noise. Therefore, their inclusion in any operational risk model would require an active imagination and massaging of data. And these are the last things anyone would want in a new and improved operational risk capital model, especially after the AMA experience.

This proposal should deliver a modelling approach that’s transparent and simple and, as such, easy to use and easy to understand for all bank risk managers and supervisors. Adopting it will take the industry much closer to being able to make fair comparisons between capital levels at different institutions. And that will pave the way for benchmarking across firms and countries, helping regulators make better-informed decisions.

The missing piece of the puzzle is, of course, conduct risk, and finding a similarly robust way to quantify it is the next challenge. Conduct risk is extremely difficult to model and so may call for a completely novel approach.

Ruben D. Cohen is an independent consultant, currently working in model risk and validation. Previously, he worked for 10 years in AMA model development.

Russian crypto-currency will threaten AML efforts

By Risk staff | Opinion | 24 November 2017

Targeting individuals and companies could be impossible with digital currency, write academics

Russia is preparing to issue a government-backed crypto-currency, CryptoRuble. Unlike decentralised crypto-currencies such as bitcoin and Ethereum, CryptoRuble involves no mining – all transactions are recorded on a blockchain and verified by a centralised government authority.

The implications are worrying: such a system could be a means for a state to exert control or evade controls – including sanctions. Run badly, it could also become a haven for dirty money.

Decentralised crypto-currencies have flourished since the inception of bitcoin in 2009, with total market capitalisation exceeding $170 billion as of October 17, 2017. Anonymity of transactions and decentralisation is appealing to a range of users – including those with nefarious intentions, such as money laundering and tax evasion – but, unsurprisingly, not to central banks or sovereign governments.

Blockchain is not synonymous with decentralisation: it is simply a ledger for keeping a record of all transactions. Like traditional bank ledgers, it can help prevent or reduce fraud, errors, and so on. But this benefit is only the tip of the iceberg. With decentralised crypto-currencies, one of the key concerns of central banks and sovereign governments is ceding control. With government-issued crypto-currencies, central banks and sovereign governments will gain even more control – not less – than they have under the current banking system.

Blockchain maintained by a centralised government authority provides a centralised, un-fragmented ledger of all transactions and associated information, such as meta-data – a tempting prospect for any government.

Added to this, ‘undocumented’ CryptoRubles – those without proof of origin – will reportedly be subject to a 13% tax. This is, whether intentionally or not, effectively a government-sponsored money-laundering machine, and with such a low overhead it should be extremely attractive to all sorts of shady players. Russia will attract not only Russian but also foreign money (including dirty money).

It seems likely CryptoRubles will act as a tax shelter for US and other foreign individuals. One embedded bonus for the Russian government is that it will own the information encoded in blockchain, including potentially shady transactions, giving it unprecedented insight and potential leverage over the transactors.

However, for Russia – or another state at odds with the West – the primary goal of issuing a government crypto-currency is to free their monetary system from the controls exerted by the Federal Reserve, European Central Bank (ECB) and their allied central banks.

CryptoRuble will disrupt the current world monetary system, the Fed, ECB, US and EU policies, their law enforcement operations and many other areas

CryptoRubles create a buffer layer that only the Russian government has control over, with pertinent information inaccessible to foreign nations. Russian elites can launder their money using CryptoRuble, become impervious to (or less affected by) economic sanctions, make their assets currently tied up by US and EU sanctions more liquid, and so on.

For instance, consider an oligarch targeted by sanctions. With the advent of CryptoRuble, this oligarch could set up a new shell company (for example via a convoluted web of trusts) to continue a prior line of business such as exports of Russian oil or some other commodity. Revenues of this shell company can be legally converted into CryptoRubles. These CryptoRubles can then be converted into funds, or property, goods/services, etc, accessible by the oligarch.

Since CryptoRuble uses a centralised blockchain as a non-distributed ledger only available to the Russian government, its conversion into the funds accessible by the oligarch remains hidden from all other observers. The Russian government may tax these funds at 13% if the oligarch does not disclose their origin, but this is a small price to pay. Thus, CryptoRuble will allow various individuals and companies to skirt sanctions. North Korea is already suspected of using bitcoin for precisely this purpose.

Disruption

CryptoRuble will disrupt the current world monetary system, the Fed, ECB, US and EU policies, their law enforcement operations and many other areas. Drug money will be easier to clean. Anti-money laundering (AML) controls and efforts will be adversely affected – with the anticipated 13% haircut, this opportunity is simply too attractive to those with nefarious intents, and will undoubtedly give an incentive to shady players. Money will flow away from the US, EU and UK and into Russia.

Other countries will probably issue their own crypto-currencies, and multiple government-issued crypto-currencies (see table A) will only exacerbate the deterioration of AML controls. Other Brics members (Brazil, China, India and South Africa) are natural candidates, and China, Dubai, Estonia and Kazakhstan are reportedly considering following Russia’s lead.

A key difference between government-issued and decentralised crypto-currencies is that the former do not require mining – the energetically and computationally costly process of verifying transactions, which is rewarded with small payments in newly minted crypto-currency. In this sense, centralised government-issued crypto-currencies are more “efficient”, and should be cheaper to maintain and more readily usable as legal tender in large economies such as Russia.

This disruptive technology – crypto-currencies – will indeed end up disrupting the status quo. However, at least in the mid-term, forward-thinking sovereign states that embrace and adapt it to their advantage will end up being the disruptors as opposed to disrupted. The US is the sovereign state with most to lose in this process, with a clear policy implication: adapt to the changing reality, issue CryptoDollar now, or risk being marginalised.

Stakeholder implications

What about other stakeholders? In the table below, we outline some pros and cons of government-issued crypto-currencies for various interested parties. For example, the populace benefits from lower transaction costs, but the price to pay is diminished privacy and a greater control by the government.

Table A: Potential pros and cons of government-issued crypto-currencies

| Stakeholder | Pros | Cons |
| --- | --- | --- |
| Populace | Low transaction costs | Security and privacy concerns, government control |
| Issuing governments | Control over transactions, information | Cannibalisation of the existing monetary system |
| Other governments | Loss of control (for some) | Lower regulatory burdens |
| Banks | Low error and fraud rates | Lower fees |
| Small businesses (including fintech, but not only) | Higher growth, increased opportunities for new technological developments | Reputational risks (money laundering, fraud, etc) |
| Finance | Higher growth, new asset classes | Destabilisation of the current business models |
| Economy | Higher growth, facilitated internet commerce | Learning curve, reputational risks |
| Non-government organisations | Injecting technology into developing countries | Learning curve |
| Heads of states | Forward-looking/progressive thought leadership, control over information and monetary transactions | Resistance from opposition (to authoritarian tendencies, etc) |
| Marketplace participants | Simpler/more transparent monetary transactions | Increased government control and potential interference |
| Decentralised crypto-currencies (bitcoin, Ethereum, etc) | Creating perception of legitimacy (longer-term) | Even higher volatility than currently (short-term) |

As another example, consider banks and other traditional financial institutions. Government-issued crypto-currencies will diminish some functions of banks since the traditional local-by-nature bank ledgers will become obsolete – as will paper money, with implications for ATMs and the bank fees associated with them.

Generally, transaction costs will be reduced, which is bad news for banks. However, this does not make banks obsolete, as they have a number of other functions, such as provision of credit.

CryptoRuble will have implications for existing decentralised crypto-currencies such as bitcoin and Ethereum. In the short term, higher volatility for decentralised crypto-currencies can be expected due to two competing perceptions: i) government-issued crypto-currencies go against the premise that decentralisation is key to crypto-currencies; and ii) on the other hand, they lend increased credibility to blockchain technology.

In the beginning, there is likely to be a lot of noise and uncertainty, political jostling, security concerns, fear-mongering in press coverage, etc. However, it is feasible that in the longer term, decentralised crypto-currencies will benefit from increased credibility, and even more users will jump on the bandwagon – both those motivated by a major world power issuing its own crypto-currency as well as those perceiving this as sovereign governments attempting to exert more control. Interesting times lie ahead.

Crypto-currencies provide a simple solution for Russia – if anything, it is surprising that it took it so long

At first glance, it might seem unexpected that Russia, of all countries, has emerged as a leader in this process. However, the reason is not technological but geopolitical. Russia has been subject to sanctions and global pressure due to its foreign policy. Dependence on the existing world monetary order is a major stumbling block for the country. Its desire to gain a greater degree of independence from the Fed/ECB is by no means a surprise.

Crypto-currencies provide a simple solution for Russia – if anything, it is surprising that it took it so long.

It appears that government-issued crypto-currencies are a foregone conclusion. Large sovereign states have the technological know-how and means to do this. What about small and/or developing countries? If they are forced to outsource issuance of their government-backed crypto-currencies to larger states, geopolitical and economic implications are evident: less sovereignty and ceding control over crucial information to more powerful countries.

Alternatively, some countries could partner with the private sector, as appears to be the case with Dubai, but this would raise its own host of issues.

A longer version of this paper is available on the Social Sciences Research Network site, https://ssrn.com/abstract=3059330.

Zura Kakushadze is chief executive and co-founder of Quantigic Solutions and a professor at the Free University of Tbilisi. Jim Kyung-Soo Liew is chief executive and founder of SoKat Consulting and an assistant professor at the Carey Business School at Johns Hopkins University

ETFs under scrutiny over liquidity risk

By Risk staff | Features | 21 November 2017

Secondary market trading in funds could freeze up in times of stress, supervisors fear

It is an unwritten rule of financial markets that any security experiencing breakneck growth will attract a proportional increase in regulatory attention. The market for exchange-traded funds, or ETFs, has proved no exception.

Of particular concern is the tracking error that can occur in ETFs covering less liquid underlying markets. Another concern, also liquidity related, centres on the creation and redemption of ETF shares, particularly in stressed market conditions.

Both areas of scrutiny stem from a specific element of the ETF structure: the primary market. Largely invisible to most investors in ETFs, the primary market helps the price of these funds stay in line with the value of their underlying holdings.

The key entities in the primary market are specialist intermediaries called ‘authorised participants’ (APs), who exchange the ETF’s constituents for ETF shares (and vice versa) in wholesale transactions with the fund issuer. APs then interact with other trading firms to help set the secondary market prices of ETFs.

If the primary market functions as intended, the secondary market price of an ETF should stay within a band that reflects the cost to the AP of creating or redeeming the fund. When buying, selling or holding an ETF, other market participants therefore place implicit trust in the ability of APs to fulfil their designated role.
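
That band can be made concrete in a couple of lines of code – a hypothetical sketch, with invented costs: if the price rises above NAV plus the creation cost, an AP can profitably buy the underlying basket, create ETF shares and sell them; below NAV minus the redemption cost, the reverse trade pays.

```python
def arbitrage_band(nav: float, creation_cost: float, redemption_cost: float) -> tuple[float, float]:
    """No-arbitrage band for an ETF's secondary market price: outside it, an
    authorised participant can profit by creating or redeeming fund shares."""
    return nav - redemption_cost, nav + creation_cost

# Illustrative: NAV of 100.00, creation cost of 0.15 and redemption cost of
# 0.20 per share, so the price should trade between 99.80 and 100.15.
low, high = arbitrage_band(100.0, 0.15, 0.20)
print(f"fair-trading band: {low:.2f}-{high:.2f}")
```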

“APs play an important role in ETFs, yet relatively little has been written about this aspect of the operation of ETFs,” BlackRock, the largest global issuer by assets under management, wrote in a white paper earlier this year.

The primary market has helped ETFs to address the inefficiencies inherent both in traditional mutual funds, which allow investors entry and exit at a single daily pricing point, based on a fund’s net asset value (Nav), and in closed-end funds, which trade continuously on stock exchanges, but frequently at large premiums or discounts to Nav.

“ETFs have helped solve the premium and discount problem inherent in closed-end funds like investment trusts,” says Paul Tucker, former deputy governor of the Bank of England.

As the ETF market has grown – the assets invested in ETFs and other exchange-traded products reached $4.6 trillion at the end of October 2017, up from $3 trillion in mid-2016, according to research firm ETFGI – fund issuers have expanded their product range to cover more esoteric underlying exposures.

In a discussion paper published in March 2017, the Central Bank of Ireland highlighted the increasing importance of the primary market in supporting the narrow secondary market bid/offer spreads often advertised as a key selling point of ETFs.

“As ETFs broaden their range of investment exposures, this formula of enhanced ETF liquidity (supported by the creation/redemption mechanism) seems to become more significant, particularly in cases where underlying asset classes are themselves characterised by less robust liquidity,” the Central Bank of Ireland noted.

“Of interest is that while the liquidity features of ETFs are promoted, the potential for underlying assets to be significantly less liquid remains.”

Two-tier liquidity

Although the primary market underpins the functioning of ETFs, in practice relatively few trades in ETFs result in a creation or redemption of fund units.

According to evidence given in September by fund issuer Vanguard to the International Organization of Securities Commissions (Iosco), around 94% of the trading volume in US equity ETFs in 2012–2015 took place between participants in the secondary market and therefore resulted in no primary market creations or redemptions. In US bond ETFs, secondary market trading accounted for 83% of transactions during the same period, Vanguard said.

According to ETF issuers, this secondary market trading layer results in a valuable additional source of liquidity for market participants, as well as representing an operational improvement over a traditional mutual fund. In a traditional fund, investor purchases or sales would result in a cash inflow or outflow, requiring the mutual fund portfolio manager to buy or sell securities to maintain the desired exposure.

“The secondary market trading of ETFs serves as an additional source of intraday liquidity for market participants while intraday market prices reflect valuable information about market conditions…meaning investors can trade a broad portfolio of securities without trading in the underlying market,” Vanguard wrote in its submission to Iosco.

However, regulators have pointed out that this extra secondary market liquidity layer may not be reliable in stressed markets.

Earlier this year, the G20 Financial Stability Board noted that “APs are not obligated to create or redeem ETF shares, and an AP engages in these transactions only when they are in the AP’s best interest given market conditions. This could have potentially negative effects on the ability to trade without accepting significant discounts to the estimated value of the underlying assets if, for example, one or more APs were to pull back from the market in turbulent conditions.”

“ETFs are materially different from a traditional mutual fund,” says Fred Sommers, ETF industry specialist and formerly of Basis Point Group. “APs have no legal obligation to create or redeem. If you buy a mutual fund, you give them money and you know they will invest in the underlying securities. Both the market-makers and APs of an ETF could leave you hanging.”

ETF issuers counter by pointing to strength in the diversity of AP relationships.

“First, if a single AP were to withdraw, other APs can step in to facilitate creations and redemptions of ETF shares,” BlackRock wrote in its white paper. “Importantly, if an economically significant premium or discount (that is in excess of transaction costs) is present, other APs will have a clear economic incentive to step in.”

There have been test cases for these arguments.

In 2013 some US-listed ETFs investing in emerging market stocks and municipal bonds fell to a temporary discount to Nav after AP Citibank said it had run into internal risk limits and stopped handling redemption requests.

“This example is often cited as an area of concern. However, it is actually a case study in how the system can be self-correcting. During this situation, other APs in these products saw this as a profitable opportunity,” wrote BlackRock.

The Central Bank of Ireland noted: “Overall, the impact of ETFs on liquidity seems to be substantial and strongly positive for liquidity in all market situations where the AP mechanism works effectively. However, should the AP mechanism fail, market liquidity could contract quite suddenly, depending on the profile of ETF secondary market investors. We have already noted the potential for correlated stress in collateral counterparties and APs. If collateral counterparties and APs had separate exposures to markets whose liquidity had been enhanced by the activity of ETFs, impactful and complex patterns of contagion could emerge from these linkages.”

Model diversity

In theory, a creation in the ETF primary market involves the AP supplying the basket of shares or bonds held by the ETF to the fund sponsor, who gives ETF shares to the AP in return. A redemption is the same process in reverse.

In practice, there’s a wide variety of primary market models, both between funds operated by the same issuer and between issuers.

For an ETF investing in liquid, large-capitalisation shares, the standard model accurately describes the primary market. For example, BlackRock, the sponsor of the $6.7 billion iShares Core FTSE 100 Ucits ETF (ISF), notifies its APs each morning via a spreadsheet called the portfolio composition file (PCF) what it requires to create shares in the ETF. On November 10 the PCF for ISF contained the 101 line items in the FTSE 100 index, plus the notional amount of each security required to create the fund.

For an ETF tracking an index representing the less liquid high-yield corporate bond market, the make-up of a typical PCF is different. For example, the November 10 PCF for the $6.5 billion iShares Euro High Yield Corp Bond Ucits ETF (IHYG) contained three separate lists of bonds: a 447-bond ‘tracking basket’, representing the holdings of the fund and intended to help APs price the fund during intraday trading; a 32-bond ‘creation basket’, showing the bonds iShares would accept to create units in the ETF; and a 29-bond ‘redemption basket’, showing the bonds iShares would hand out in the case of a redemption request by an AP. The index tracked by IHYG contains around 250 bonds.

This data was supplied to Risk.net by technology firm Ultumus. ETF issuers’ PCFs are normally only sent to APs and are not visible to the broader market.
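The two PCF shapes described above can be sketched as a simple data structure. This is an illustration only – the class, tickers and basket contents below are hypothetical, with only the IHYG basket sizes taken from the article:

```python
# A minimal sketch of the two PCF shapes: a liquid equity ETF's PCF is one
# basket mirroring the index; a high-yield bond ETF's PCF splits into
# tracking, creation and redemption baskets. Names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class PortfolioCompositionFile:
    fund: str
    tracking_basket: list = field(default_factory=list)   # prices the fund intraday
    creation_basket: list = field(default_factory=list)   # accepted to create units
    redemption_basket: list = field(default_factory=list) # delivered on redemption

# Equity case: the three lists are effectively one and the same basket
equity_pcf = PortfolioCompositionFile(
    fund="EquityIndexETF",
    tracking_basket=["AAA LN", "BBB LN", "CCC LN"],
    creation_basket=["AAA LN", "BBB LN", "CCC LN"],
    redemption_basket=["AAA LN", "BBB LN", "CCC LN"],
)

# Bond case: creation and redemption baskets are small subsets of the tracking basket
bond_pcf = PortfolioCompositionFile(
    fund="HighYieldBondETF",
    tracking_basket=[f"BOND{i:03d}" for i in range(447)],
    creation_basket=[f"BOND{i:03d}" for i in range(32)],
    redemption_basket=[f"BOND{i:03d}" for i in range(415, 444)],
)

print(len(bond_pcf.tracking_basket), len(bond_pcf.creation_basket), len(bond_pcf.redemption_basket))
# 447 32 29
```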

There’s no way I’d accept a basket that doesn’t include Italian financials in an investment-grade European corporate bond ETF, for example

Vasiliki Pachatouridi, BlackRock

The development of the ‘three-basket’ PCF has gone hand-in-hand with the evolution of fixed-income ETFs, a rapidly growing part of the market. Using separate creation and redemption baskets, typically smaller than the ETF itself in terms of the number of securities held, enables fixed-income ETF portfolio managers to use primary market activity to help rebalance the fund.

This practice also keeps primary market costs under control: requiring the AP to deliver the fund's full list of holdings could be prohibitively expensive when bonds are the underlying asset class. In turn, these lower primary market costs are likely to be reflected in tighter bid/offer spreads for bond ETFs in the secondary market.

While using creation and redemption baskets that are only a subset of the fund's holdings carries obvious cost advantages, it also carries risks. If the portfolio manager gets the basket make-up wrong, for example, the ETF could suffer unnecessary tracking error versus the index.

When an ETF tracks less liquid underlying assets, the ETF manager therefore faces a potential trade-off between keeping trading costs low (via smaller, customised creation and redemption baskets) and replicating the index accurately.
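Tracking error here is typically measured as the annualised standard deviation of the difference between fund and index returns. The sketch below illustrates that calculation with made-up daily figures:

```python
# Minimal sketch of a common tracking error measure: annualised standard
# deviation of fund-minus-index returns. All return figures are hypothetical.
import math

fund_returns  = [0.0041, -0.0022, 0.0015, 0.0033, -0.0008]   # hypothetical daily NAV returns
index_returns = [0.0040, -0.0025, 0.0019, 0.0030, -0.0011]   # hypothetical index returns

diffs = [f - i for f, i in zip(fund_returns, index_returns)]
mean = sum(diffs) / len(diffs)
variance = sum((d - mean) ** 2 for d in diffs) / (len(diffs) - 1)  # sample variance
tracking_error = math.sqrt(variance) * math.sqrt(252)  # annualised from daily data

print(f"annualised tracking error: {tracking_error:.2%}")
```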

For example, in a recent report, fund research firm Morningstar drew a link between the underperformance of State Street’s SPDR Bloomberg Barclays High Yield Bond ETF (JNK), one of the largest US-listed high-yield bond index trackers, and the size of its primary market baskets.

“[JNK] has exhibited meaningful tracking error, owing to the illiquid nature of the junk-bond market… State Street made some adjustments at the beginning of 2016 to reduce tracking error. It increased the size of the fund’s creation and redemption baskets from around 40 securities to between 100 and 150 bonds, better aligning the fund’s composition with the index,” Morningstar analyst Phillip Yoo wrote in late October.

State Street confirmed that it had changed its portfolio management process for JNK in early 2016 to improve the ETF’s tracking. It also attributed tracking error partly to changes in the benchmark’s rules since the fund’s inception.

Interviews conducted by Risk.net with ETF issuers and APs highlighted that creations and redemptions in bond ETFs may also involve an element of negotiation. For example, an AP may suggest a list of bonds to an ETF issuer as the possible components of a creation basket, while the ETF issuer may propose an alternative list. Horse-trading may continue during the day until the cut-off time for the creation or redemption of a fund.

ETF issuers play down the potential risks associated with the use of negotiated custom baskets and maintain that the discussions involved are always conducted with the interests of fund investors in mind.

“Irrespective of the size of the creation basket, the characteristics of the bonds in the basket always closely resemble those in the benchmark, whether measured by duration, spread duration or sector breakdowns,” says Vasiliki Pachatouridi, fixed-income product strategist at BlackRock, based in London. “There’s no way I’d accept a basket that doesn’t include Italian financials in an investment-grade European corporate bond ETF, for example. But if you have a $5 million creation in an $8 billion fund you have more flexibility in what you accept into the fund than, say, if you are doing a $50 million creation in a $500 million fund.”

“Also, the broader the benchmark, the more choices the portfolio manager has. Ultimately, it’s about doing the right thing for the investors in the fund. In extremis, we reserve the right to request all the bonds in the index as the creation basket,” she adds.

ETFs have multiple APs and they would not hesitate terminating any of them who sought an unfair advantage

Peter Shea, K&L Gates

But the US securities market regulator, the Securities and Exchange Commission (SEC), has acted to limit the use of non-representative primary market baskets. In 2010, in an attempt to safeguard investor interests, it stopped approving ETFs that wished to employ custom primary market creation and redemption baskets, explains Peter Shea, partner at US law firm K&L Gates. ETFs approved before 2010, however, are permitted to continue using custom baskets, creating an uneven playing field for ETF issuers.

“The SEC has a stated concern that large AP firms may attempt to overreach on ETF investors by trying to extract more favourable creation and redemption terms through improperly dictating what the contents of the deposit basket for their transaction will be,” Shea wrote in a 2017 paper he co-authored with colleagues Timothy Bekkers and George Attisano.

The SEC declined to comment.

In an interview with Risk.net, Shea argues that the SEC’s concerns about abusive practice by APs during the process of negotiating ETF creation and redemption baskets may be overstated.

“The ETF sponsors I’ve spoken to have all told me that ETFs have multiple APs and they would not hesitate terminating any of them who sought such an unfair advantage. Also, an ETF sponsor would quickly find themselves with an enormous reputational liability if they acceded to the AP, and any such event could not be kept secret since the event would be revealed in the ETF’s internal compliance environment and elevated to the ETF’s independent directors,” Shea says.

Overlapping rules

One of the major challenges for regulators seeking to supervise the rapidly growing ETF market is that ETFs fall between two stools, combining features of a mutual fund and a listed security. In Europe, for example, this means ETFs are subject to both the Ucits Directive, governing mutual funds, and the Markets in Financial Instruments Directive, governing publicly traded securities.

“A key question in regulatory policy is whether these overlapping regulatory frameworks allow the specific features of ETFs to be appropriately regulated,” the Central Bank of Ireland said in its recent discussion paper.

The primary market is arguably the point where these two concepts – the ETF as fund and the ETF as security – interact most acutely. It is therefore likely to remain at the centre of regulatory scrutiny.

However, the extent and timing of regulators’ responses to the questions they have raised over the ETF primary market remain uncertain. The Irish central bank is still in the middle of a debate over the issues it raises in its discussion paper; Iosco is preparing a final report on liquidity risk in mutual funds, which is likely to touch on ETFs; and the SEC is making slow progress towards a new “ETF rule” based on a new regulatory framework, rather than the current system of offering dispensations from the existing mutual fund rule book.

 

Risk of in-kind model

In the US, creation and redemption in ETFs occurs on an ‘in-kind’ basis. Here, the AP exchanges securities, rather than cash, for ETF units. This model is widely used in the US because it offers tax advantages, reducing the capital gains distributions that funds declare annually to their investors and which investors then have to settle with the US taxman.

The US in-kind model has traditionally been associated with the free-of-payment settlement of trades, particularly in ETFs offering exposure to fixed income or international equities.

Free-of-payment settlement means that, when creating an ETF, the AP does not receive the creation unit until all the securities have been delivered to the ETF. If redeeming an ETF, the AP may not receive the underlying shares until a day or more after delivering the ETF shares to the fund.
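The resulting exposure can be made concrete with a stylised sketch, using entirely hypothetical figures: until the creation unit settles, the AP's credit exposure is everything it has handed over, whether delivered securities or cash collateral posted to the fund's agent bank.

```python
# Stylised free-of-payment creation with hypothetical figures: the AP has
# delivered value (securities, plus any cash collateral posted to the agent
# bank) but has not yet received the creation unit, so until settlement it
# carries a credit exposure equal to everything it has handed over.

securities_delivered = 50_000_000   # market value of securities delivered on day T
collateral_posted = 2_000_000       # cash posted against any undelivered names
creation_unit_received = 0          # creation unit arrives on day T+1

overnight_exposure = securities_delivered + collateral_posted - creation_unit_received
print(f"overnight exposure to the fund: ${overnight_exposure:,.0f}")
```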

While these overnight credit exposures between APs and ETFs are typically collateralised with cash, the US Financial Industry Regulatory Authority, which monitors US broker-dealers, recently argued that some APs may not be accounting adequately for the risks associated with ETF primary market trades.

“In general, we found two issues across the firms we reviewed during 2016,” a spokesperson for Finra says. “Some firms failed to monitor for intraday and overnight exposures resulting from the delivery of collateral to ETF agent banks relative to creations and redemptions of ETFs, and failed to establish counterparty credit risk limits to manage the exposures related to this business line. Second, some firms failed to recognise the charges to net capital resulting from the delivery of collateral to agent banks prior to receiving the ETF or underlying components, relative to international ETFs.”

Finra says US APs must recognise the net capital charges arising from the overnight collateralisation of their exposures to ETFs. The regulatory authority also expects APs to follow its recommendation to establish internal counterparty credit risk limits for the exposures.

In another sign that the operational processes of the US ETF primary market are now under increased scrutiny, from 2018 the SEC will require each US-listed ETF to report publicly a range of new information relating to creations and redemptions, including whether its APs were required to post collateral to the fund or any of its service providers, as well as the cash and percentage amount of creation fees.

In Europe, the primary market has largely mitigated these risks by moving from an in-kind model to a cash model, explains Bhaven Patel, vice-president for ETF capital markets at Deutsche Asset Management. Creations are made on a “delivery-versus-payment basis”.

“And, instead of just using the ETF issuer’s dealing desk to invest the cash from a creation, we’ve moved to a so-called directed cash primary market model. Here, the AP selects a broker from a list pre-approved by the ETF issuer and the cash creation is directed to that broker, resulting in lower dealing costs,” Patel adds.

He also points out that automation has reduced operational risk, with online order management systems for submitting creation and redemption orders superseding the use of faxes.

JP Morgan’s CRO on the bank’s six buckets of risk

By Duncan Wood | Profile | 17 November 2017

Risk30: From loan losses to electromagnetic pulses, JPMorgan Chase has a place for it

This is the last of 10 interviews marking Risk’s 30th anniversary.

There are lots of things Ashley Bacon doesn’t want to talk about.

During the course of the interview, he declines to comment – or refuses to expand on a point – on nine separate occasions. Bacon explains that he needs to be cautious – which, as chief risk officer (CRO) of JPMorgan Chase, is probably part of the job description.

When he does speak, it’s worth listening. Like his boss, Jamie Dimon, he has strong views on operational risk capital. Op risk exposures account for more than a quarter of the bank’s $1.46 trillion in risk-weighted assets, but Bacon points out that some of this comes from discontinued businesses or past missteps.

(One of the things Bacon doesn’t want to talk about is the bank’s most famous recent misstep – the 2012 ‘London Whale’ credit trading losses he played a big part in resolving, a year before his promotion to the CRO’s job.)

He also has concerns about new loan-loss accounting rules – their “intuitively appealing” focus on lifetime credit risk raises awkward questions about the treatment of products such as credit cards, he says. And he believes the basic profile of market risk is changing, with liquidity increasingly likely to be bimodal – abundant in good times and absent in bad – which could magnify future stresses.

But for risk management geeks, one of the most interesting features of the interview is that when describing the bank’s approach to the segmentation of emerging exposures, Bacon barely mentions market, credit and operational risk at all. All three are wrapped up in some way, of course, but the bank has an unusual way of dealing with the sources of its exposure, separating them into six buckets.

We find it helpful to have a framework into which you can put emerging risks because there are just so many

Ashley Bacon

“We find it helpful to have a framework into which you can put emerging risks because there are just so many. We try to think about them in six different, non-overlapping explanatory categories, and they have very different dynamics around each of them,” he says.

Into the first goes everything caused by “adverse economic conditions” – from interest rate sensitivity to loan losses. Banks are in business to take precisely this kind of exposure, Bacon says, so the aim is to make sure JP Morgan has the right amount.

Bucket two covers deliberate wrongdoing by the bank’s employees or its suppliers. This is where you would find Libor or foreign exchange rate manipulation, rogue trading, deliberate mis-selling, sanctions-busting or a host of other horrors. As a group, these are “very topical and potentially very, very dangerous,” Bacon says.

The third category is for mistakes – process failures or errors of judgement, rather than anything malicious.

The fourth reflects JP Morgan’s exposure to regulatory, political or social forces outside its control – from a gold-plated leverage ratio to deglobalisation and trade wars. By definition, these are tough risks for a bank to manage, Bacon notes: “All you can do really is contribute to the public debate in a credible way.”

Category five catches the risk of market dysfunction, but to remain distinct from category one, it only applies to markets that have gone temporarily mad – not those that have broken down under the stress of a financial crisis. “You could think about a flash crash or unusual liquidity flows. Crucially here, it would not be driven by economics – rather markets that are moving fast in an irrational and dysfunctional manner,” he says.


I don’t want to get into the mechanics of exactly how we track and signal levels of concern, but yes – we’re very cognisant of which ones are elevated

Ashley Bacon

Finally, the sixth bucket contains natural and man-made disasters, ranging from the obvious – such as tsunamis or tornadoes – to the exotic, such as electromagnetic pulses.
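As an illustration only, the framework as Bacon describes it could be encoded as a simple classification. The category names and example mappings below are paraphrased from the interview, not JP Morgan's internal labels:

```python
# Illustrative encoding of the six-bucket taxonomy; names and examples are
# paraphrased from the interview, not JP Morgan's own scheme.
from enum import Enum

class RiskBucket(Enum):
    ADVERSE_ECONOMICS     = 1  # rate sensitivity, loan losses
    DELIBERATE_WRONGDOING = 2  # benchmark rigging, rogue trading, mis-selling
    MISTAKES              = 3  # process failures, errors of judgement
    EXTERNAL_FORCES       = 4  # regulatory, political or social pressures
    MARKET_DYSFUNCTION    = 5  # flash crashes, irrational liquidity flows
    DISASTERS             = 6  # tsunamis, tornadoes, electromagnetic pulses

def classify(event: str) -> RiskBucket:
    """Toy mapping of example events to buckets (illustrative only)."""
    examples = {
        "loan losses": RiskBucket.ADVERSE_ECONOMICS,
        "fx rate manipulation": RiskBucket.DELIBERATE_WRONGDOING,
        "fat-finger trade": RiskBucket.MISTAKES,
        "trade war": RiskBucket.EXTERNAL_FORCES,
        "flash crash": RiskBucket.MARKET_DYSFUNCTION,
        "electromagnetic pulse": RiskBucket.DISASTERS,
    }
    return examples[event]

print(classify("flash crash"))  # RiskBucket.MARKET_DYSFUNCTION
```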

In essence, traditional market and credit risk go into the first of these categories, with the biggest sources of op risk taking up the next two. Categories four and five are widely acknowledged as challenges for banks, but are rarely seen as a risk management responsibility.

To be clear, the bank does also look at risk in terms of manifestation – market, credit and operational – but Bacon says it’s helpful to think about the cause as well. The point, he argues, is that each of the categories has its own distinct set of drivers, and focusing on where the risk comes from makes it easier for the bank to think about its defences.

“What matters with all of this analysis is the ‘so what?’ question,” says Bacon. “It makes you focus on the various programmes of ongoing enhancement around the company, and how they map onto risks we identify as most concerning based on the impact or probability.”

Impact has to be considered for each category separately and can’t be reduced to the size of any immediate financial hit, he adds – a hefty loss that results from the actions of a rogue trader would be perceived in “a completely different way” to one that was caused by a credit downturn.

So, how does the bank turn thought into action? Bacon declines to spell it out: “I don’t want to get into the mechanics of exactly how we track and signal levels of concern, but yes – we’re very cognisant of which ones are elevated, and will pay particularly close attention to applying mitigation strategies wherever appropriate for those.”

The risk manager of the future will be someone who knows how to deploy these techniques to the maximum benefit

Ashley Bacon

However it’s done today, it may be done very differently tomorrow. Bacon sees “enormous” potential in the industry’s attempts to bring together vast datasets, on-demand computing power, and machine learning. “I think a great deal will change over the next several years in terms of the most efficient ways to run a bank and provide our services. It is a huge deal,” he says.

For the risk function, he picks out three places where these new technologies could make a difference: spotting anomalies, identifying trends and also processing natural language as part of a bank’s surveillance efforts.

The spread of these tools and techniques will place new demands on those using them.

“The risk manager of the future will be someone who knows how to deploy these techniques to the maximum benefit. That’s a mindset and skillset change that will occur – starting now and increasingly so over the next three to five years,” says Bacon.

He offers a little detail on how this shift is playing out at JP Morgan. The bank has already found some applications for new technology, Bacon says, with more creative and ambitious plans on the drawing-board. Teams of specialists now work on these efforts within the risk function.

“Within the group I’m responsible for, we absolutely have groups of data scientists performing these functions; and there’s a spectrum ranging from applications of big data techniques that are mature enough to be deployed, to some that are highly speculative and closer to R&D efforts. It is very important that all of this goes on at the same time, as we try to figure out the fastest route to responsibly deploy powerful new technologies,” he says.   


Risk management scaled for local volatility is interesting and has a level of importance, but what really matters is the extreme tail event, the big gap

Ashley Bacon

So, what kind of things worry JPMorgan Chase’s chief risk officer? As it turns out, mostly the unknown – changes in the behaviour of the financial system that can only be guessed at in advance.

As one example, in the world of traded risk, Bacon raises a concern that the rise of non-bank market-makers – and the curtailed risk appetite of the banks – will produce extreme volatility during periods of stress. The theory is that non-banks – often perceived as having thin capital bases and little tolerance for losses – will retreat from the market during these periods, and that heavier capital burdens mean banks will be less able to cover the shortfall.

This theory has been largely untested in recent years, when there have been severe intraday moves in some markets, but few sustained periods of volatility.

“Perhaps we end up in more of a bimodal situation where in good times things are especially stable and less volatile, and in bad times they may be far more unpredictable and volatile than we have seen in the past. We haven’t experienced so much of the latter in recent years, but I do have a concern that when it inevitably comes, markets may be thinner and more volatile than we used to experience,” he says.

He chooses his words carefully when asked whether the current popularity of short-volatility strategies could magnify any future reversal. Trend-following and short-term self-fulfilment is nothing new, he says – but “I agree it could be a bit more pronounced now than it was in the last cycle.”

Bacon also worries about CCPs. Again, the system has changed – with more risk being underwritten by a handful of big clearing houses – and it’s hard to know exactly how it will cope with a crisis.

Here, the unknowns pile up quickly. Bacon is not primarily thinking about how CCP margin models will cope when markets get choppy; he’s worried about a sudden gap, or an operational calamity – in other words, an unpredictable event hitting an untested system.

There are concepts around how banks are reasonably required to think about their own resolution that are applicable to an exchange or CCP

Ashley Bacon

He says: “Risk management scaled for local volatility is interesting and has a level of importance, but what really matters is the extreme tail event, the big gap – or perhaps a big cyber threat to an exchange or CCP. Those things need to be considered.”

What would help address these fears, he adds, are well-thought-out resolution plans and financial resources equal to the task of covering a disaster, without “incremental and unexpected assessments of members or taxpayers”.

In May, JP Morgan issued a white paper on CCP resilience, recovery and resolution that – among other things – called for clearing houses to be bolstered by the same kind of bail-in debt that is designed to mop up unexpected losses at banks. Bacon repeats that call here.

“Exchanges are not banks and I don’t think of them as banks, but there are concepts around how banks are reasonably required to think about their own resolution that are applicable to an exchange or CCP. So to have other reliable resources – something like bail-in capital – available to cope with an extreme tail event seems a reasonable request,” he says.

Another unknown is the impact of new accounting rules that require banks to hold reserves equivalent to the expected lifetime loss on their loan books, rather than losses that have already been incurred. In the US, the current expected credit loss standard takes effect from the start of 2020; elsewhere, International Financial Reporting Standard 9 applies as early as 2018.

Bacon agrees with the general thrust of the new rules – he likes “the idea of thinking carefully about how much money could be lost over the life of a loan – it is intuitively appealing to think about it that way.”

I do understand the attraction of proxies, but we have to discuss as an industry the flaws in simplistic proxies

Ashley Bacon

The “but” lingering after that sentence is partly about the interaction between existing capital requirements and the new loss reserves. Bacon says not all loans have a contractual maturity – credit card loans being one obvious example – which makes the concept of an expected lifetime loss more difficult to apply. If this results in conservative levels of reserving, then he suggests capital levels for those portfolios should be reviewed – and presumably cut.

And, finally, a slightly different problem: that of setting op risk capital requirements. In an April letter to shareholders, JPMorgan Chase’s Dimon called for the current approach to be overhauled or scrapped – his criticism being that the bank is having to hold capital to cover the losses of the past, not the risks of the present.

An overhaul of the framework is underway at the Basel Committee on Banking Supervision, with leaks from that process offering good and bad news for JP Morgan and other critics. The good news is that a document circulating among committee members in June suggested national regulators could be allowed to ignore a bank’s loss history when calculating capital. The bad news is that removing this component would make an already blunt proposal – one intended to replace out-of-favour internal models – even blunter. Under the arrangement, capital would be tied to a rough measure of size, which is not likely to help JP Morgan.

Bacon recognises the problem regulators face – it’s not easy to know how much capital to hold against huge, rare losses – but he warns against fudging the calculation: “It is a very hard problem to capital-model – to model tail loss in something with sporadic, episodic big events. So I do understand the attraction of proxies, but we have to discuss as an industry the flaws in simplistic proxies.”

Monthly op risk losses: Aussie banks settle rate rigging claims

By Risk staff | Opinion | 8 November 2017

Breakdown of top five loss events, plus insight on $7bn US consumer protection fines. Data by ORX News

Bank Austria, part of the UniCredit group, was ordered to pay €790 million ($919 million) in the largest operational risk loss in October 2017 – a legal risk loss stemming from a retroactive change in legislation. On October 12, the Austrian Constitutional Court ruled that an increase in the fee charged to transfer employees from Bank Austria’s private pension scheme to the state pension should apply retroactively.

The bank announced it would transfer more than 3,000 employees from its own pension scheme to the Austrian state pension scheme in 2015. In March 2016, the fee charged to complete this transfer was raised from 7% of the employee’s final salary to 22.8%, and was ruled to apply retroactively. Bank Austria appealed but the Constitutional Court dismissed the petition, saying the transfer of the pensions was favourable to the bank.

The second largest loss was incurred by Intesa Sanpaolo, which announced it had allocated €100 million to compensate former customers of Banca Popolare di Vicenza and Veneto Banca who allegedly lost their savings as a result of fraud by the now-defunct Italian banks. Intesa announced it would take over the two banks in June 2017. After reviewing their accounts, Intesa chief executive Carlo Messina said he found the two banks had failed in their fiduciary duties towards their customers.

Both banks are under investigation for allegedly advising retail customers to invest their savings in the banks’ own shares. The customers subsequently lost their savings when the banks were rescued by the Italian state in June 2017.

In third place, Deutsche Bank was ordered to pay €48 million to Postbank investors after it delayed proposing a takeover offer when it began acquiring Postbank in 2008.

The bank paid €57.25 per share for its first share purchase, for a percentage of Postbank just under the threshold at which it would be required to make a formal takeover offer. After buying more shares which took it over this threshold, the bank’s subsequent 2010 takeover offer was for €25 per share. However, shareholders argued that Deutsche Bank had been in de facto control of the bank from the time of its initial share purchase and so should have made an offer at €57.25 per share.

Deutsche Bank denied the allegations, but a Cologne regional court ruled that some shareholders were entitled to additional payments of up to €32.25 per share, to bring the total to €57.25 per share.

In the fourth biggest loss, Bank of America Merrill Lynch was fined £34.5 million ($45.4 million) by the UK Financial Conduct Authority for failing to report 68.5 million exchange-traded derivative transactions. The FCA identified a number of issues with the reporting system of the bank’s subsidiary, Merrill Lynch International – a system set up to comply with the European Market Infrastructure Regulation – that led to the failures.

The fine also took into account two previous reporting failures by Merrill Lynch International, in August 2006 and in April 2015. The most recent of those settlements was, at the time, the UK’s largest transaction reporting-related fine, at £13.3 million.

Finally, RBS agreed to pay $44.1 million to the US Department of Justice to settle allegations that, between 2008 and 2013, it made fraudulent misrepresentations in the purchase, sale and broking of collateralised loan obligations and residential mortgage-backed securities in order to increase its profits on these products.

Spotlight: Rate-rigging settlements down under

National Australia Bank has agreed a settlement of A$50 million ($38 million) with the Australian Securities and Investments Commission over allegations it manipulated the country’s wholesale interbank rate, the bank bill swap rate, along with two other banks, ANZ and Westpac.

The NAB settlement comprises a A$10 million penalty, Asic’s legal costs of A$20 million, and a A$20 million payment to a consumer protection fund.

Asic filed civil penalty proceedings against the three banks in 2016, claiming they manipulated the rate between 2010 and 2012. In addition to NAB, ANZ has also agreed to settle, for an undisclosed amount, while Westpac is taking the allegations to trial, according to local media reports. The three banks, along with Commonwealth Bank of Australia, are four of the 17 global banks accused in a class action lawsuit brought by two US funds of manipulating the bank bill swap rate.

The case echoes ongoing litigation stemming from investigations into the alleged manipulation of Libor and other related benchmarks by major banks worldwide. Authorities and investors in territories such as the US, UK, EU and Switzerland have reached over $11 billion of settlements with banks over the alleged actions of their traders, who are said to have colluded to move the benchmarks in directions more favourable to their trading positions.

Although most of the manipulation is said to have occurred before 2012, losses are continuing. For example, JP Morgan agreed to pay $71 million in July 2017 to settle claims by US investors that it rigged the yen Libor rate between 2006 and 2011.

In focus: CFPB fines cost financial institutions almost $7bn

The Consumer Financial Protection Bureau continues to be an active regulator in the US retail financial markets – though it faces setbacks and legal challenges to its authority. Responsible for protecting US consumers “from unfair, deceptive, or abusive practices”, the CFPB can impose unlimited relief, and civil monetary penalties of up to $1 million per day for violations.

Since its inception in 2011, the CFPB has brought enforcement actions against financial institutions totalling $6.8 billion, at an average of one settlement per month. Of this total, 92% is in restitution, compensation and relief payments; for example, debt relief to consumers making payments on financial products that firms mis-sold. This is in keeping with the regulator’s primary mandate to protect consumers. The remaining 8% is in civil money penalties.

Nearly three-quarters of the relief orders were made in conjunction with other regulators and authorities – particularly the Office of the Comptroller of the Currency and the Federal Deposit Insurance Corporation – as well as state authorities and the US Department of Justice.

The largest single action, at $2.1 billion, is a relief order imposed on Ocwen in 2013 for misconduct during its mortgage servicing process. This settlement, reached with the CFPB, 49 US states and the District of Columbia, required Ocwen to make principal loan reductions of $2 billion and refund $125 million to customers.

Since its peak in 2013, the annual amount of relief has fallen, with the emphasis in recent years shifting towards monetary penalties. Notable fines include the $100 million imposed on Wells Fargo in 2016 for opening unauthorised accounts, which made up two-thirds of the $150 million in penalties imposed by the CFPB that year.

Retail credit products were at the centre of 97% of the CFPB’s fines and settlements. Mortgages topped the list at $3.0 billion, driven by the Ocwen settlement, with credit cards close behind at a total of $2.8 billion across 18 separate events. The majority of these fell in 2014 and 2015, including two settlements of around $770 million – one with Bank of America over allegations of unfair billing practices, and one with Citigroup for allegedly using deceptive marketing for credit card add-on products.

Although the CFPB is continuing to issue fines and reach settlements, a number of initiatives by the bureau’s Obama-appointed director, Richard Cordray, have faced criticism. A rule issued by the bureau in July to bar companies from using arbitration clauses in contracts to avoid class-action lawsuits was repealed in the Senate in October.

An ongoing lawsuit between the CFPB and mortgage service provider PHH has called into question the very existence of the agency. PHH filed the suit after Cordray increased a $6 million fine over mortgage kickbacks by $103 million, claiming the director did not have the authority to do so. In October 2016, the Court of Appeals ruled that the CFPB’s governance structure was unconstitutional, and in March this year the Department of Justice argued that the US president should be able to dismiss the director. PHH went further, arguing the CFPB should be abolished. As things stand, Cordray cannot be removed before the end of his term next July.

So far in 2017, the CFPB has brought 11 enforcement actions for a total of $272.8 million. Again, the majority of these losses relate to retail credit products, with $211.1 million stemming from actions involving student loans.

The largest order by the CFPB so far this year – a $192 million settlement with Aequitas, which allegedly facilitated a predatory student loan scheme – falls into this category. A smaller but still notable $4.6 million fine was imposed on JP Morgan after the regulator found it had failed to provide accurate reports on current accounts to credit agencies.

All information included in this report and held in ORX News comes from public sources only. It does not include any information from other services run by ORX and we have not confirmed any of the information shown with any member of ORX.

While ORX endeavours to provide accurate, complete and up-to-date information, ORX makes no representation as to the accuracy, reliability or completeness of this information.