Whack-A-Mole can be a satisfying game. In the arcade pastime, you wait for a mole, or several moles, to raise their heads, and whack ’em down with your padded bopper. Fast reflexes build points for bopping multiple moles. But bank enterprise risk management is far more challenging than playing games. There are moles you see: risks that are obvious, readily spotted, and handled. But in banking’s version of Whack-A-Mole, the moles that don’t pop up can be just as dangerous to the bank, maybe even more so.
People who track risk (and regulators, the people who track the activities of the people who track risk) recognize that part of the challenge during the financial crisis was the ability to spot, monitor, and control every facet of a particular risk, no matter where in the bank it lived.
Very large institutions, especially those with international platforms, experienced this gap. Not surprisingly, international regulators on the Basel Committee on Banking Supervision issued in January the latest iteration of their thinking about how very large organizations should deal with gaps that can allow risks to escape detection in the big picture. Even for a schematic kind of paper, Principles for Effective Risk Data Aggregation and Risk Reporting gets pretty heavy on details regarding how very large organizations need to improve systems in order to “get to strong.” But the fundamental principles it puts forth, 14 in all, provide food for reflection.
“One of the most significant lessons learned from the global financial crisis that began in 2007 was that banks’ information technology and data architectures were inadequate to support the broad management of financial risks,” the paper states at the outset. “Many banks lacked the ability to aggregate risk exposures and identify concentrations quickly and accurately at the bank group level, across business lines and between legal entities. Some banks were unable to manage their risks properly because of weak data aggregation capabilities and risk reporting practices.”
Implications run up and down the food chain, from the boardroom down through everyone whose actions aggregate into the risks that every banking organization takes.
Institutions recognize their challenges
“Get to strong” is a play on regulatory ratings that has become jargon and shorthand for a multi-prong effort to beef up risk management in order to prevent the likes of another crisis. Bankers, regulators, and consultants have adopted the nomenclature. Among the efforts underway to address elements of the broader risks are proposals that a “legal entity identifier” system be created to identify organizations that engage in financial transactions, whether they involve loans, securities issuance, or other forms of risk. The idea is that no matter where those entities do business, or affect business, in a financial institution, the exposures could be gathered, aggregated, and evaluated organization-wide.
We recently interviewed Kevin Blakely, senior advisor, Deloitte & Touche LLP, following a “getting to strong” webinar held by the firm. In the briefing Blakely and colleagues spoke broadly of the need to beef up techniques, including in the areas of board and senior management governance, policies and procedures, internal controls, and, finally, measurement, monitoring, and reporting.
Is the last as dry as dust? Not to anyone who lived intimately with the results of the financial crisis and the surprise of finding out that their organization had far more exposure to subprime credits, or any of the other toxic assets of the period, than anyone had realized.
Like our theoretical lurking Whack-A-Moles, they bit when no one knew they were there. The aim of the international regulatory effort, and related projects, is to improve things. Deloitte & Touche’s recently released Global Risk Management Study: Setting A Higher Bar, based on an international sample of large financial companies, reported in August that less than one-quarter of institutions surveyed rated their systems as “extremely effective” or “very effective” in data management/maintenance, data process architecture/workflow logic, or data governance. So there is broad recognition that this is a work in progress.
The following highlights from the interview are of interest to bankers from organizations of a range of sizes.
ABABJ: What has made risk tracking and aggregating such a challenge?
Blakely: When you stop and think about how some companies have grown, and how much some companies have grown over the last couple of decades, much has been done through acquisition. And when you put companies together, unless you’re extremely lucky, typically their systems don’t plug right into each other. So over the last couple of decades there’s been lots of patchwork, trying to connect systems to make them talk to each other as best you can.
Another challenge is that the nomenclature that was used to build one system might be very different than the nomenclature used to build another.
Take a simple matter like “non-owner occupied real estate.” One system might be built with a definition of non-owner occupied real estate that’s different from another’s. I cite that as an example of why you have to build on consistent phraseology.
And you have to make sure that there’s not a big data leakage going on somewhere.
But before the crisis, such challenges remained in place. When someone wanted to track, say, subprime exposure, what happened? A knee-jerk reaction was to take a look at your loan portfolio. You might not find any there.
But that wouldn’t necessarily be the whole story. A broader look might have shown that elsewhere you had subprime exposure coming out of your ears. It might have been in your investment portfolio; it could have been in your securities lending portfolio. It could have been in a money market fund sponsored for a client, or in any one of a number of collective investment trusts, or in any one of a number of other areas around the firm that might have created exposure either directly or indirectly.
So there’s been this initiative across the industry to do a much better job of bringing systems up to the level of capability to provide quick information. Reliable information, understandable information that’s consistent no matter how you pull it down, no matter where you pull it down in different parts of the company.
ABABJ: This reminds me of other discussions about credit concentrations. It seems like bankers have been discussing such challenges for years.
Blakely: Lessons were learned. A key lesson that we learned in 2008 when Lehman collapsed was the number of firms that were surprised by the totality of their exposure, because it wasn’t just in their loan portfolio. It was everywhere within their company.
And there was this confusion about, what’s the totality of exposure across the firm? And what’s the totality of our exposure by different legal entities, with different geographies and such? The meltdown in the financial markets shined a light on the shortcomings in firms’ ability to aggregate exposure across the firm.
And we learned that sometimes the line between direct and indirect exposures gets blurred. One example is cases where you don’t have a legal obligation to stand behind something, but there is a moral obligation. But if the firm is going to do that, it had better know what its exposure levels are.
ABABJ: How big a challenge will it be for the industry to get to a good handle on things?
Blakely: Over the last several years there’s been a lot of progress made. And we also know that the regulators are applying indirect pressure, too, for the large players to step up their game on the aggregation and data quality. I don’t think we’re there yet, but we’re better than we were three years ago.
The Basel paper gives banks until 2016 to get things in order. During 2013 they’re supposed to be performing gap analysis, and then from that point forward fixing the gaps. U.S. regulators have chosen not to apply such strict and literal requirements to most U.S. banks, but that doesn’t mean they aren’t bringing pressure. When you see the types of information that the regulators expect to be provided in stress testing, and in other efforts, you can see that the industry is strengthening information data governance.
ABABJ: What impediments remain?
Blakely: In the past everybody was very conscious of budget. Companies would try to set aside enough to improve risk data systems, but it was never really enough to get them to where they needed to be. But in the last three years the pressure has been enormous. And when I say that, it’s coming from inside the firms too. Bankers said, “Holy smokes, we couldn’t even aggregate our exposure across the entity; we’d better get this fixed.”
So we are seeing a lot of initiatives around the industry to raise the bar in data aggregation. Most risk managers are clamoring for this kind of information; in the past they had to settle for whatever budgets they could get. For risk managers, this is an exciting time: they are getting their systems where they need them to be.