By Justin B. Bakst, managing director, Darling Consulting Group
The Federal Open Market Committee has opened the door for a fed funds rate hike, with increases predicted as early as June 2015. Many will debate the timing and magnitude of Fed policy, but it's clear that short-term interest rates will inevitably rise at some point.
As rates rise, the banking industry will experience dissipation of non-maturity deposits, resulting in heightened levels of interest rate risk and liquidity risk. In addition to elevated risk, banks could see their growth plans and strategic initiatives adversely impacted by this new normal.
Fortunately, banks have time to adopt a game plan—now, while rates are still at historical lows and many remain flush with liquidity.
Deposits, rife with uncertainty, demand an informed plan
It’s no surprise that the most successful strategies of any industry are built upon a solid foundation of data. We’ve seen industries completely revolutionized as they’ve learned to effectively utilize data and analytics to enhance performance.
Regardless of the industry, from sports and casino empires to marketing and internet companies, our technology-infused world is experiencing a transcendent period for accessing and employing data, one that lets each of us make decisions in ways that would have seemed cost-prohibitive, or even implausible, 20 years ago.
Similarly, bank executives are realizing that leveraging data and related in-depth analytics can have a very meaningful impact on balance sheet management and deposit management strategies. Banks are experiencing firsthand how a more disciplined and extensive approach to analyzing data also provides essential clarity in otherwise ambiguous areas—such as understanding non-maturity deposit portfolios.
Deposits are blanketed in uncertainty because they are driven primarily by customer behavior patterns. Those patterns, while at times "predictable," remain difficult to forecast precisely because humans don't always act in "reasonable," rational, or expected ways.
We’ve all certainly witnessed changes within specific customer accounts that seemed to defy economically reasonable explanations. Of course, this unpredictability creates unforeseen interest rate and liquidity risk. Understanding this uncertainty through data-driven insights empowers institutions to forecast likely indicators of impending customer behavior and plan accordingly—while staying ahead of the competition.
Analytics can help stratify your deposit base in a range of ways; the opportunities are virtually endless. In light of that, this article will focus on three specific areas:
• Projected rate sensitivity of deposit accounts.
• Large deposit balance concentrations.
• Checking account trends.
How should you price interest-bearing deposits as rates rise?
Banks have many deposit accounts and a wide web of relationships—many of which will react very differently as rates rise. Some accounts will retain “stickiness,” despite opportunities for customers to earn higher yield. Other depositors will leave as soon as more competitively priced products become available.
Many refer to such rate sensitivity as "beta," an analytical measure of how responsive an institution's deposit pricing is to market rate movements. It's surprising how many banks fail to quantify this sensitivity beyond back-of-the-envelope calculations.
By carefully utilizing data analytics, these same banks have an opportunity to examine rate behavior patterns in a variety of interest rate cycles and by different customer behavior demographics and segmentations.
For example, do depositors who have multiple relationships with the bank exhibit different rate sensitivities than do one-time or single-account customers? If so, what is this difference, on average?
This information can be extremely valuable for interest rate risk and liquidity planning, and it can also shape strategy and tactics in other areas of the institution, such as new product development and the timing and magnitude of actual account rate increases.
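To make the beta concept concrete, here is a minimal sketch of how a deposit beta might be estimated: an ordinary least-squares slope of deposit-rate changes against market-rate changes over a past tightening cycle. The rate series below are hypothetical illustrations, not actual bank data, and a real analysis would segment by product and customer demographic as discussed above.

```python
# Estimate deposit "beta": the slope of deposit-rate changes regressed
# on market-rate changes. A beta of 0.50 means deposit rates moved
# roughly half as much as the market rate.

def deposit_beta(market_rate_changes, deposit_rate_changes):
    """Ordinary least-squares slope of deposit moves vs. market moves."""
    n = len(market_rate_changes)
    mean_x = sum(market_rate_changes) / n
    mean_y = sum(deposit_rate_changes) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(market_rate_changes, deposit_rate_changes))
    var = sum((x - mean_x) ** 2 for x in market_rate_changes)
    return cov / var

# Hypothetical quarterly rate moves (percentage points) from a past cycle
market_moves = [0.25, 0.25, 0.50, 0.25, 0.50]
money_market_moves = [0.05, 0.08, 0.20, 0.10, 0.22]

beta = deposit_beta(market_moves, money_market_moves)
print(f"Estimated money market beta: {beta:.2f}")
```

Running the same calculation separately for single-account customers and multi-relationship households would answer the question posed above about whether those segments carry different sensitivities.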
Understanding your large-balance depositors
The Pareto Principle, more commonly known as the 80-20 Rule, holds that for many events roughly 80% of the effects come from 20% of the causes. The principle takes its name from Vilfredo Pareto, who observed in 1896 that approximately 80% of the land in Italy was owned by 20% of the population.
In analyzing banks' deposit portfolios, the 80-20 rule is as relevant as ever. After studying millions of depositors and hundreds of portfolios, we have consistently found that 15%-20% of accounts control 70%-85% of a bank's non-maturity deposits.
We have also seen this concentration increase since 2008-2009, the beginning of the Great Recession. The relevance of the 80-20 rule to deposit portfolios probably wouldn't surprise most seasoned bankers. However, many don't track, monitor, and analyze this type of information on a regular basis.
Tracking, of course, is just the beginning. Once trends have been identified, further segmentation is a necessity.
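The concentration statistic itself is straightforward to compute once balance data is extracted from the core system. The sketch below, using hypothetical balances, measures what share of total deposits the largest 20% of accounts hold:

```python
# Hypothetical non-maturity deposit balances; a real run would pull
# account-level data from the bank's core system.
balances = [1_200_000, 850_000, 400_000, 90_000, 45_000,
            30_000, 12_000, 8_000, 5_000, 2_500]

def top_share(balances, pct=0.20):
    """Share of total deposits held by the largest `pct` of accounts."""
    ranked = sorted(balances, reverse=True)
    k = max(1, int(len(ranked) * pct))  # number of accounts in the top slice
    return sum(ranked[:k]) / sum(ranked)

share = top_share(balances)
print(f"Top 20% of accounts hold {share:.0%} of deposits")
```

Tracking this figure quarter over quarter, and by segment, is what turns a one-time observation into the kind of ongoing monitoring described above.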
Below are a few questions to ponder as you analyze your large balance relationships:
1. What is the nature of the relationship, and how many other accounts do these depositors have with the bank (including loans)?
2. What is the relationship between the rate paid on the deposit and concentrations of large balances over time?
3. How will these large balances react to rising interest rates?
Large balance relationships may be more vulnerable as rates rise. Building analytics around these specific customers and addressing ways to better serve this demographic and its sub-demographics may be the difference between a funding gap and substantial growth opportunities.
Danger: free checking accounts
The argument for free checking is that it provides a mechanism to capture low-cost deposits and primary banking relationships, in hopes of cross-selling other products, taking in large deposits, and/or recapturing costs through fees (such as overdraft protection and interchange fees).
As the rules have changed regarding automatic enrollment in overdraft protection, the opportunities to offset costs through additional fees have dwindled. But make no mistake—while the rules have changed, the cost to service checking accounts has not.
Banks spend hundreds of dollars annually to maintain a single checking account. The implication is that banks need to offset these costs through cross-selling, fees, or large-balance deposit gathering. Unfortunately, many banks do not capture, track, or analyze this data.
In discussing data analytics in relation to free checking, these are the questions that arise:
1. How many accounts do you have that don’t meet the criteria to offset your costs?
2. What is your strategy to offset this incremental cost and the associated risks?
3. What portion of the bank’s resources are being utilized to service different customer segments?
Keep in mind that free checking account programs can sometimes indirectly impact risk models, with customer churn potentially driving shorter average lives and higher balance volatility.
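Question 1 above, how many accounts fail to offset their servicing cost, lends itself to a simple screen. The sketch below is illustrative only: the $250 annual cost, the 2% earnings-credit spread, and the account data are all hypothetical assumptions, and each bank would substitute its own cost study and criteria.

```python
# Flag checking accounts whose balances and fees fail to cover the
# annual cost of servicing them. All figures are hypothetical.
ANNUAL_SERVICING_COST = 250.0  # assumed per-account cost, not an industry constant

accounts = [
    {"id": "A1", "avg_balance": 15_000, "annual_fees": 0},
    {"id": "A2", "avg_balance": 600,    "annual_fees": 36},
    {"id": "A3", "avg_balance": 2_500,  "annual_fees": 120},
]

def offsets_cost(acct, spread=0.02):
    """Rough test: earnings credit on the balance plus fees vs. cost."""
    earnings_credit = acct["avg_balance"] * spread
    return earnings_credit + acct["annual_fees"] >= ANNUAL_SERVICING_COST

unprofitable = [a["id"] for a in accounts if not offsets_cost(a)]
print("Accounts below cost-offset threshold:", unprofitable)
```

Segmenting the flagged accounts by relationship depth then feeds directly into questions 2 and 3: whether to deepen, reprice, or accept those relationships as a cost of gathering low-cost funding.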
Data, information, and strategy
Collecting data is often the simplest part of the analysis. Transforming that data into information is what truly separates the top performers from the also-rans. The key to successfully implementing data analytics tools and processes is to first identify the question or problem you are trying to solve.
Banks have access to a wide range of data, but without the proper context and interpretation it remains "just data." Many banks are converting this data into valuable, relevant information to improve strategic performance. The best-planned strategies infuse data and analytics to support decisions.
With interest rate hikes on the horizon, what actions will you take today in order to better understand your deposit base through data and analytics?
About the author
Justin Bakst provides risk management education and consultation to financial institutions, leveraging the firm’s data-driven software solutions. His focus is developing and implementing tools that analyze deposit and loan customer behavior patterns to proactively manage embedded risks and drive higher levels of earnings. He is a frequent speaker and author on analytics and asset liability management topics.