The remarkable news this week of Bank of America’s error in calculating its regulatory capital has captured the industry’s attention.
How, many ask, could a bank as sophisticated as this one have erred on such a basic calculation? How could it be that no one in the bank, or among the regulators resident in the bank's headquarters, detected the error? And over an interval of almost five years of stress testing?
So go the questions. What has been missing is perspective, and a sense of the bigger picture.
The explanation that has emerged has several components, all of which have merit in helping us understand how this could have happened:
• The bank is big and its balance sheet is very complicated.
• The error itself, while large in absolute dollars, is relatively small in relation to total capital.
• The resulting capital account appears to be more than adequate for the bank’s current operations and as a buffer against material adversity as contemplated by the specifications of the stress test’s parameters.
So, what’s the big deal?
The biggest negative impact on the bank is probably to its reputation among shareholders and analysts. This is an embarrassment and one that will linger for a while.
But what can other bankers learn from this?
This episode highlights to me a very different issue—and one that we don’t think about very often.
What financial statements really mean
Simply put, accounting is not as precise as it seems. The columns “foot” because balance sheets are supposed to balance. But what do the numbers really tell us?
Here’s a statement that may surprise many people who are not regular users of financial information:
Everything on a business's financial statement, other than cash and contractual liabilities, is an estimate.
That's right—all the other account totals are the results of estimates.
Generally accepted accounting principles (GAAP) are accounting rules promulgated and maintained by a quasi-public board of accounting professionals and academics. Collectively, GAAP is a set of rules on how to assemble, represent, and compare accounting information.
In an accountant's world, books balance and in the end, the debits and credits zero out. That is not to say that some firms or individuals never benefit at the expense of others. Rather, at the end of the day, the world of business in its totality is like a zero-sum game. If the books could be closed, the world of numbers would be in balance.
The process is pretty specific and pretty regimented. But for all that, it’s not exactly precise.
Presenters of financial information, such as a company’s auditors, do not attest to the accuracy of the financial statements so much as the “fairness” of their presentation of the information. “Fair” in this context means consistent in all material respects with GAAP.
The principal hallmarks of GAAP are consistency with the rules and comparability of methodology. In other words, this means the same accounts presented in the same way with consistent and appropriate valuation methodology.
Perhaps one could say that financial statements are “precisely wrong” but are also accurate in the sense of their consistency and adherence to principles. That’s a great deal when you think about it and very valuable to issuers and users of the information alike.
But precise in an absolute sense? No, they are not that.
Consistency remains fundamental
One of GAAP's major tenets, for example, is to book fixed assets at historical cost. A million-dollar building or piece of machinery has an estimated useful life (with the primary exception of raw land), and that estimate must be considered in determining an appropriate depreciation schedule. It makes a big difference whether the useful life is estimated to be 15 years or 35 years. There's broad latitude on many aspects of such issues, although the IRS has established limits in the interests of national tax policy.
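To see how much the useful-life estimate matters, here is a minimal sketch of straight-line depreciation applied to the hypothetical million-dollar asset above, comparing a 15-year and a 35-year life (no salvage value is assumed; the figures are illustrative, not drawn from any actual filing):

```python
def straight_line_book_value(cost: float, useful_life_years: int, age_years: int) -> float:
    """Book value after age_years of straight-line depreciation (no salvage value)."""
    annual_depreciation = cost / useful_life_years
    return max(cost - annual_depreciation * age_years, 0.0)

cost = 1_000_000
for life_years in (15, 35):
    value = straight_line_book_value(cost, life_years, age_years=10)
    print(f"{life_years}-year life: book value after 10 years = ${value:,.0f}")
```

After ten years, the same asset is carried at roughly $333,000 under a 15-year life but about $714,000 under a 35-year life: two very different balance sheets, both fully consistent with GAAP.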
As lenders we are familiar with the methodology of developing an allowance for possible loan and lease losses. This is a valuation reserve for uncollectible amounts of principal, and developing it is almost always a subjective as well as a continuous process.
Increasingly, financial valuation issues require the issuers of financial statements to consider the issues of fair market value. This entered accounting practice in a limited way nearly a generation ago and it too is a process subject to valuation methodology, some of it subjective but almost all of it subject to ultimate uncertainty in the “sharp pencil” sense.
Putting BofA into perspective
So the discussion surrounding the dollar amount of the capital shortfall at Bank of America, produced by its inadvertently flawed methodology (that is, a flawed result arising from the way certain information was included in the calculation), concerns a figure much less precise than it might otherwise appear to some.
I say this to inject a sense of reason into the discussion. We are not dealing with black-and-white issues, nor with calculations where precision should be expected down to the nearest thousand or million dollars.
Actually, bankers are used to this sort of uncertainty.
After all, it’s at the core of what liquidity really means. I learned in the 1980s that liquidity is greatest when it’s needed the least. In other words, liquidity is abundant when no one is thinking about whether or not a counterparty has the capacity to execute the other side of a transaction quickly, simply, and with no material diminishment of value.
Bank of America is a “strong” bank by any reasonable analysis. Going forward, we should consider this episode as another event in our quest for financial consistency and reasonableness. It’s not a catastrophe, but rather an embarrassment, and it speaks louder to the current environment of almost mindless financial complexity than to anything else.