Banking Exchange, March 2015

Streamlining banks’ paperwork journey

An integrated approach to managing finance, risk, and reporting

Written by Bart Everaert, Wolters Kluwer
Does daily life in the bank feel like this? Here's a strategy that will at least cut duplication of effort and repurpose efforts already being made somewhere in the bank when new requirements arise.

In the past, banks' efforts to meet finance, risk, and regulatory compliance obligations typically consisted of maintaining a collection of numerous, often non-integrated technologies across many parts of the organization. Regulatory platforms were routinely hampered by inflexible technology, overlapping but inconsistent functionality, and discrete, independent data models, resulting in systems that were costly and inefficient to maintain.

While banks have often found workarounds to these challenges, they have also found that such approaches are becoming increasingly insufficient. Regulators increasingly focus on seeing evidence that banks have developed sound data structures, along with integrated management practices that generate meaningful data assurance, rather than static, “box-ticking” reports.

Today, regulators demand ever-greater integration across business processes, along with demonstrable, vigilant management and reporting efforts. Accordingly, banks are finding “silo” approaches increasingly anachronistic: traditional methods of handling finance, risk, and regulatory reporting must give way to more integrated compliance and reporting approaches.

Especially in the past few years, the evolving regulatory landscape has forced banks to consider a more integrated and forward-looking approach that requires different departments to work more closely than before. Increasingly, they must factor future regulations into the finance, risk, and reporting picture; virtually every banking regulation or guideline released in the last several years reflects this expectation.

Rather than build on internal systems and processes to fit each new regulation in an ad-hoc manner, banks would be well served to adapt their existing processes in ways that address each regulatory change holistically and cohesively. Let’s examine the theory and practice behind this approach.

Don’t reinvent the wheel

Historically, a common practice for many banks was to build out new processes alongside old ones for each new regulation that came about.

As a result, a chain of point solutions was built to address each individual regulation. A more holistic approach, by contrast, allows banks to leverage existing data points, mappings, and processes to meet regulatory requirements. Otherwise, banks can end up with over-engineered processes that lack operational efficiency, strain existing resources, and increase overhead costs.

The expense of implementing and maintaining multiple point solutions is obvious. But there are also other, less-visible indirect costs to this approach.

First, the potential for inconsistency across data sets could increase under the traditional approach.

Second, by building new solutions from scratch, banks risk introducing a time lag for compliance that could lead to punitive measures from regulators, which in turn could cause both financial and reputational damage.

Ultimately, it can prove very challenging to implement a rigorous audit and data governance process in order to document consistency and provide transparency throughout these processes.

Banks often lean toward installing systems that focus on individual risk components or business segments because these systems may be considered “best of breed” for the particular requirement or niche they address.

However, these systems rarely lend themselves to the more holistic approach that regulators have been encouraging banks to adopt.

Using disparate systems and processes generates compartmentalized, inconsistent results that cannot provide an accurate picture of, for example, a bank’s liquidity risk.

Make processes work smarter—not harder

The challenge for many banks lies not only in understanding the current regulatory framework, but also in analyzing the impact of new requirements once they are released. Every new regulation can trigger yet another round of complex change management.

Rather than build ad-hoc internal systems and processes, banks need to learn how their existing processes can be adapted to meet each new regulatory change in an integrated, cohesive manner.

Indeed, there is no need to build a new process every time a regulator introduces a new regulation or a twist to an existing one. A bank’s regulatory processes should be extensible enough to adapt easily to new requirements, while keeping in mind the overarching principles of the Basel Committee on Banking Supervision regulation 239 (BCBS 239)[1]. That rule urges institutions to build a transparent framework in which risk data can be aggregated consistently and efficiently, leaving the bank effectively positioned to manage the next regulatory requirement. A properly designed process will take the Basel principles into account right from the start.

Regulators are also encouraging financial institutions to break down existing silos between departments; such operational isolation goes against almost every principle put forth in BCBS 239. Rather than act independently, the various business lines and functions throughout an enterprise should work cohesively to build a centralized universe of data.

Data should be used in a consistent and effective manner to optimize the different processes triggered by different regulatory requirements. This makes it clear that the data used in one process is exactly the same data used in another, and it is much simpler for a regulator to learn a single, cohesive system than to familiarize themselves with several point solutions.

To illustrate further using a real-life example, our firm recently worked with a Tier 1 financial institution to implement its Comprehensive Capital Analysis and Review (CCAR) and FR Y-14[2] regulatory reporting requirements. The implementation effectively re-used and centralized existing data sources and ensured that the entire process was easily extensible for future regulatory requirements. This approach also allowed the institution to provide a level of transparency over its data and data-enrichment processes that it didn’t have before.

The end result? The institution made much better use of the same data through management reporting, which helped it gain an even better understanding of its business. A regulator subsequently gave positive feedback on this approach, noting that the methodology allowed for a quick understanding of the institution’s processes, whereas gaining insight into peers’ operations required sorting through several separate solutions.

Institutions should apply this example to the liquidity monitoring space, and more specifically to the upcoming Net Stable Funding Ratio (NSFR)[3] measure, learning best practices for reusing every step that the Liquidity Coverage Ratio (LCR)[4] process already requires. The same thinking can also be applied to the FR 2052a[5] requirements in the U.S. and to the NCCF regulation in Canada[6]. Given the NSFR’s looming Jan. 1, 2018, deadline, banks need to realize how advantageous an extensible LCR process will be when implementing the NSFR regulation.

Manage regulatory change as a delta process

Implementing new or updated regulations should not be viewed as a massive undertaking; instead, it should become part of business as usual. True, data may need to be aggregated and classified differently each time, or may require new calculations. However, the process should be fairly straightforward if data are well managed from the start.

Let’s use a standard bank loan as an example.

Contract, customer information, valuations, and other relevant details should be stored in the bank’s centralized data repository, along with all other information relevant to its business operations. Liquidity analysis will require contractual cash-flow generation based on the schedule information of this particular loan contract. These cash flows, together with other contractual information, are then used in the LCR classification process.
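To make that first step concrete, contractual cash flows for a loan can be derived directly from the stored contract terms. The sketch below is purely illustrative: it assumes a simple annuity-style loan, and the function name `loan_cash_flows` is hypothetical, not part of any vendor system.

```python
# Hypothetical sketch: derive contractual cash flows from stored loan terms.
# A simple annuity loan is assumed; real contracts carry richer schedule data.

def loan_cash_flows(principal, annual_rate, months):
    """Return a list of (month, interest, principal_repaid) tuples."""
    r = annual_rate / 12.0
    # Standard annuity payment formula: constant monthly installment
    payment = principal * r / (1 - (1 + r) ** -months)
    flows = []
    balance = principal
    for m in range(1, months + 1):
        interest = balance * r
        principal_repaid = payment - interest
        balance -= principal_repaid
        flows.append((m, round(interest, 2), round(principal_repaid, 2)))
    return flows

flows = loan_cash_flows(100_000, 0.06, 12)
print(flows[0])  # first month's interest/principal split
```

Once generated, each dated cash flow can be passed to the classification step alongside the contract and customer attributes held in the central repository.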

After the data has been enriched with the proper classifications and the appropriate run-off factors have been applied, the LCR is calculated and ready to be reported. For the NSFR process, all steps would be identical, except that a number of additional data points must be fed into the classification engine to correctly quantify the available and required stable funding measures.
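The "delta" between the two ratios can be pictured as one shared classification pipeline driven by different factor tables. The sketch below is illustrative only: the category names and weights are made up for the example and are not actual Basel run-off or stable-funding factors.

```python
# Illustrative sketch of a shared classification pipeline: the same enriched
# positions feed both ratios; only the factor tables differ (the "delta").
# Categories and weights are invented for illustration, not Basel values.

LCR_RUNOFF = {"retail_deposit": 0.05, "wholesale_funding": 0.40}
NSFR_ASF   = {"retail_deposit": 0.90, "wholesale_funding": 0.50}

def weighted_total(positions, factors):
    """Apply per-category factors to classified positions and sum."""
    return sum(amount * factors[category] for category, amount in positions)

liabilities = [("retail_deposit", 800.0), ("wholesale_funding", 200.0)]

hqla = 120.0                      # stock of high-quality liquid assets (assumed)
outflows = weighted_total(liabilities, LCR_RUNOFF)
lcr = hqla / outflows             # must stay at or above 100%

required_stable_funding = 600.0   # assumed RSF for the asset side
asf = weighted_total(liabilities, NSFR_ASF)
nsfr = asf / required_stable_funding

print(f"LCR: {lcr:.0%}, NSFR: {nsfr:.0%}")
```

The point of the sketch is that adding the NSFR on top of an existing LCR process means supplying a second factor table and a few extra data points, not rebuilding the pipeline.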

This same logic can be extrapolated to the liquidity stress-testing exercise defined in the Enhanced Prudential Standards (EPS) guidance[7], or to any regulatory compliance topic, for that matter. Traditional regulatory reporting, Basel capital requirements, CCAR, and even the pending Current Expected Credit Loss (CECL) standard[8] all require the same contractual information, enriched with classifications based on a multitude of additional data points (e.g., contract and customer information).

Thriving in the new regulatory landscape

Banks that thrive in the new regulatory landscape will have built a finance, risk, and reporting process that is capable of being upgraded and of handling rapid regulatory change, without constant re-invention of existing, working processes.

An integrated, holistic approach allows banks to focus on the big picture and long-term stability, rather than always having to adapt to what’s happening in the short term. It supports banks’ ongoing efforts to more easily comply with all interconnected regulatory changes at an enterprise-wide level, and to meet growing regulatory demands for transparent, repeatable, and adaptable processes, as prescribed in the BCBS 239 guidelines.

Banks that can accomplish this will experience a much-needed simplification of their regulatory risk and reporting processes.

By adopting this integrated approach, managing changes in regulation will virtually become business as usual.

About the author

Bart Everaert is a market manager in Wolters Kluwer’s Finance, Risk & Reporting business.


1. BCBS 239 is the Basel Committee on Banking Supervision's regulation number 239, “Principles for Effective Risk Data Aggregation and Risk Reporting.” The regulation’s overall objective is to strengthen banks’ risk data aggregation capabilities and internal risk reporting practices. This, in turn, helps enhance risk management and decision-making processes at banks.

2. CCAR requires a financial institution’s treasury function to work with its risk and regulatory reporting departments to help define the stress-testing scenarios for the balance sheet forecasting exercises. These exercises enable Treasury to produce the annual capital planning and regulatory reporting for the FR Y-14 capital assessments and stress testing reports.

3. The Net Stable Funding Ratio (NSFR) is a Basel III measure intended to reduce liquidity risk in the banking system by requiring institutions to hold sufficient levels of stable funding relative to the liquidity of their assets, derivatives, and commitments over a one-year period.

4. LCR (Liquidity Coverage Ratio) requires large banking organizations to hold a minimum amount of high-quality liquid assets that can be easily and quickly converted into cash to meet net cash outflows over a 30-day stress period.

5. The FR 2052a report—Complex Institution Liquidity Monitoring Report—collects quantitative cash flow information on selected assets, liabilities, funding activities, and contingent liabilities on a consolidated basis and by material entity subsidiary.

6. Canada’s OSFI (Office of the Superintendent of Financial Institutions) issued guidelines based on Basel III liquidity standards and monitoring tools and developed its own tool, the Net Cumulative Cash Flow (NCCF) standard. NCCF measures an institution’s detailed cash flows in order to capture the risk posed by funding mismatches between assets and liabilities after the application of assumptions around the functioning of assets and modified liabilities (i.e. where rollover of certain liabilities is permitted).

7. The EPS (Enhanced Prudential Standards) final rule was approved by the Board of Governors of the Federal Reserve System in February 2014 and implements certain provisions of Section 165 of the Dodd-Frank Act. EPS establishes a number of Enhanced Prudential Standards for large U.S. bank holding companies and foreign banking organizations to help increase the resiliency of their operations. These standards include liquidity, risk management, and capital.

8. The Financial Accounting Standards Board’s new Current Expected Credit Loss impairment standard requires “life of loan” estimates of losses to be recorded for unimpaired loans at origination or purchase. Issued in June 2016, CECL represents a major change to bank accounting practices and poses significant compliance and operational challenges for banks. It is set to take effect in 2020 for SEC registrants, and 2021 for all other banks.
