
The Imperative of Data Management for Banks

Data is the raw material of banking. Data management is the scaffolding required to elevate the value of that data.

Written by Colin Kerr, CTP, corporate banking analyst at Celent

Data is the raw material of banking. Quality data is essential for new product development, innovative services, and for the implementation of plans that rely on data science, analytics, and artificial intelligence and machine learning (AI/ML). Chief data officers (CDOs) and product management teams must consider the future-state data architecture and platforms, data management policies and standards, deployment initiatives, and the tools required to enable growth.

Data management provides the scaffolding within which banks can safely elevate the value of data assets. More than just a data architecture strategy, data management also requires implementation of standards and governance — a mix of people, processes, and technology — with engagement from business leaders. Banks must now identify vendors that help manage data from end to end, paying special attention to critical data pipelines — yet ensuring that their selected solutions facilitate development of their data strategy. Evolving product requirements, data platforms, and public cloud strategy are all important considerations for enterprise data management (EDM) tools and governance.

Prioritizing Data Management & Governance

Effective data management includes technical (e.g., architecture, infrastructure, engineering, controls) and nontechnical (e.g., policies and standards for data capture, movement, and use) characteristics. Three fundamental categories support data quality improvement, consistency, and transparency; each is required for regulatory reporting, advanced analytics, operations, and product development:

  1. Cataloging data assets. Banks must assign data ownership and stewardship, define and catalog data and associated metadata, and prioritize business-critical data.
  2. Implementing data capture, transport/movement, and use standards. Banks must improve data quality (at capture and throughout the data supply chains), document data’s lineage as it moves through systems and transformations, apply data movement controls to prevent data loss, and institute standards that ensure data is accessed and used in a “fit-for-purpose” manner.
  3. Identifying and repairing data issues. Data management and governance requires remediation of persistent data quality issues, including gaps in reporting, analytics warehousing, and supply chains. This requires standards for identifying, quantifying, and prioritizing data quality issues; measuring and tracking quality issues and ensuring accountability for remediation; and providing transparency and auditability into data management operations.
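The third category — identifying, quantifying, and prioritizing data quality issues — can be sketched in code. The following is a minimal illustration only, not a production framework; the record fields, rules, and severity weights are hypothetical:

```python
# Minimal sketch: quantify and prioritize data quality issues.
# Record fields, rules, and severity weights are hypothetical.

records = [
    {"id": 1, "customer_name": "Acme Corp", "country": "US", "tax_id": "12-3456789"},
    {"id": 2, "customer_name": "", "country": "US", "tax_id": "98-7654321"},
    {"id": 3, "customer_name": "Globex", "country": "", "tax_id": ""},
]

# Each rule: (name, severity weight, predicate that returns True when the record passes).
rules = [
    ("missing_customer_name", 3, lambda r: bool(r["customer_name"])),
    ("missing_country",       2, lambda r: bool(r["country"])),
    ("missing_tax_id",        5, lambda r: bool(r["tax_id"])),
]

def assess(records, rules):
    """Return issues sorted by priority = severity * failure rate,
    giving accountability (failed record IDs) and transparency (rates)."""
    issues = []
    for name, severity, passes in rules:
        failed_ids = [r["id"] for r in records if not passes(r)]
        rate = len(failed_ids) / len(records)
        issues.append({"rule": name, "failed_ids": failed_ids,
                       "failure_rate": rate, "priority": severity * rate})
    return sorted(issues, key=lambda i: i["priority"], reverse=True)

report = assess(records, rules)
```

Sorting by a severity-weighted failure rate is one simple way to decide which persistent issues to remediate first; real programs would also track issues over time and assign owners.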

Committing to Data Quality at the Source

For banks to meet the demand for analytics and complex data-led solutions, quality data is essential. Improving data quality and control is about more than meeting regulations; it makes good business sense.

When evaluating business solutions, banks should consider solutions that support improved data capture and that support EDM goals. Use cases include data capture from financial statements, loan application documents, and know your customer (KYC) documentation. Consistent standards across the bank are crucial, though different business areas will face distinct challenges.

The highly automated payments and liquidity management processes of transaction banking and payments business units are typically defined by operational rules for payment networks and more consistently structured data. When data quality issues occur, they usually create processing errors that must be urgently repaired for payments to settle; only rarely do data quality issues surface after the fact. Industry initiatives such as ISO 20022, which “describes a common platform for the development of financial messages,” will help improve data consistency and richness.
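To illustrate the structured richness such messaging standards provide, here is a hedged sketch that parses a simplified XML fragment loosely modeled on ISO 20022-style element names; it is an abbreviated stand-in, not an exact or complete schema:

```python
# Hedged sketch: extract structured fields from a simplified payment fragment
# loosely modeled on ISO 20022 element naming. Not an exact or complete schema.
import xml.etree.ElementTree as ET

fragment = """
<CdtTrfTxInf>
  <IntrBkSttlmAmt Ccy="USD">2500.00</IntrBkSttlmAmt>
  <Dbtr><Nm>Acme Corp</Nm></Dbtr>
  <Cdtr><Nm>Globex Ltd</Nm></Cdtr>
  <RmtInf><Ustrd>Invoice 4711</Ustrd></RmtInf>
</CdtTrfTxInf>
"""

root = ET.fromstring(fragment)
payment = {
    "amount": float(root.findtext("IntrBkSttlmAmt")),
    "currency": root.find("IntrBkSttlmAmt").get("Ccy"),
    "debtor": root.findtext("Dbtr/Nm"),
    "creditor": root.findtext("Cdtr/Nm"),
    "remittance": root.findtext("RmtInf/Ustrd"),
}
```

Because every field arrives in a named, typed element, data quality can be validated at capture rather than repaired downstream — the contrast with freeform lending documents described below.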

Consumer and commercial lending businesses are more prone to data challenges. Freeform text, paper forms, manual data entry, and long-running workflows all contribute to data quality issues. Modernized loan origination systems, paired with natural language processing (NLP) technology, help extract data from paper and digital forms to minimize data entry while improving data quality.
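As a simplified stand-in for the NLP-based extraction described above (production systems use trained models, not handwritten patterns), the idea can be sketched with regular expressions pulling structured fields from the free text of a loan application; the field names and patterns here are hypothetical:

```python
# Simplified stand-in for NLP extraction: pull structured fields from the
# free text of a loan application to minimize manual data entry.
# Field names and patterns are hypothetical; real systems use trained models.
import re

application_text = """
Applicant: Jane Smith
Requested amount: $250,000 for equipment purchase
Annual revenue: $1,200,000
"""

patterns = {
    "applicant": r"Applicant:\s*(.+)",
    "requested_amount": r"Requested amount:\s*\$([\d,]+)",
    "annual_revenue": r"Annual revenue:\s*\$([\d,]+)",
}

extracted = {}
for field, pattern in patterns.items():
    match = re.search(pattern, application_text)
    if match:
        value = match.group(1).strip()
        digits = value.replace(",", "")
        # Normalize monetary fields to integers; keep other fields as text.
        extracted[field] = int(digits) if digits.isdigit() else value
```

Capturing fields as typed values at the point of extraction is what lets downstream quality checks run automatically instead of relying on manual review.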

Emerging Data Architectures

New data architectures are emerging that help support data management objectives:

Cloud-based applications are increasingly embraced for modern banking applications, productivity tools, development of new products and services, and, more recently, for data management and governance. This impacts data management and data strategy: to use cloud-based data management solutions, whether the data remains on-premises or in the cloud, banks must be willing to manage key data assets in a cloud environment. Banks that can’t or won’t trust their data to the cloud (often due to perceived security concerns) must weigh the risks of not adopting cloud solutions, such as the inability to use many contemporary solutions, inefficient data integration processes, and increasingly limited execution capabilities.

Data fabric and data virtualization, in which data in physical databases is virtualized in-memory for analytics, represent a shift in data architecture that was not possible with the traditional data warehouses and data marts common just a decade ago. This approach minimizes data movement and provides centrally managed access controls. Data mesh, which applies product management principles to data management, is the newest generation of architecture: data ownership and use are decentralized (with access federated), allowing product owners with expertise in a specific business unit to develop localized analytics and insights.
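The data mesh pattern — decentralized ownership with federated access — can be sketched as follows; this is a conceptual illustration only, and all product, domain, and role names are hypothetical:

```python
# Sketch of data-mesh-style governance: each domain owns and defines its data
# products, while a central catalog federates access decisions.
# All product, domain, and role names are hypothetical.

class DataProduct:
    def __init__(self, name, owner_domain, allowed_roles):
        self.name = name
        self.owner_domain = owner_domain        # decentralized ownership
        self.allowed_roles = set(allowed_roles) # policy set by the owning domain

class FederatedCatalog:
    """Central catalog: domains define policy; the catalog enforces it."""
    def __init__(self):
        self.products = {}

    def register(self, product):
        self.products[product.name] = product

    def can_access(self, product_name, role):
        product = self.products.get(product_name)
        return product is not None and role in product.allowed_roles

catalog = FederatedCatalog()
catalog.register(DataProduct("lending.delinquency_metrics", "lending",
                             {"risk_analyst", "data_scientist"}))
catalog.register(DataProduct("payments.settlement_volumes", "payments",
                             {"ops_analyst"}))
```

The design choice to keep enforcement central but policy local is what lets product owners in each business unit move quickly without losing enterprise-wide auditability.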

Meeting Changing Data Management Needs

A data architecture strategy alone is not sufficient for a bank to build a robust data foundation. A data management practice is required to protect and elevate the value of data assets. Additionally, a scalable data governance framework gives stakeholders across the organization — developers, data scientists, business analysts, and stewards — a transparent audit trail for the review, validation, and use of data assets. Business leaders have a key role in shaping data management strategy as they seek to monetize data assets to support business growth. That role will only grow as data volumes and complexity increase to support new analytics use cases, innovative products and services, and, of course, AI.

A broad range of technology tools is available to optimize data collection, improve data quality, document lineage, and implement controls on the movement of data. Banks must consider business solutions (client-facing and internal operations), strategic infrastructure and data platforms, and product strategy as they evaluate the many data management software solutions from technology vendors.

