
Winning with Data Management

Managing data for financial services

Written by Ihor Shkarupa, Associate Big Data Architect & Navjot Singh, Marketing Manager, Financial Services

INTRODUCTION

Financial services companies must provide more personalized customer experiences by tapping into the value of their customer data. However, data silos keep businesses from realizing data’s true potential while increasing storage costs.

Extracting true value and profit from the data requires a data management platform and a vigilant data management strategy.

This article provides insights into how financial services companies can gain better accessibility to data with a data management platform.

KEY CHALLENGES

  • Financial services companies are facing a rapidly evolving set of business situations, such as increasing regulatory and risk-management requirements, a more digitally demanding customer base, and the need to leverage growing amounts of information to enable strategic decision-making

  • Financial services companies face significant challenges when capturing and managing data across organizational, geographic, and product information silos

Financial services companies are increasingly dependent on data to make more informed business decisions, optimize operations, and improve profitability. As data becomes a more integral part of their business strategy, financial services executives must ensure data is managed properly throughout the complete data lifecycle—how it is handled, who sees it, and how it is used. To do this, companies must have timely access to accurate and relevant data that describes what is happening within and outside of the organization.

Financial services companies can access, collect, and ingest existing data into a centralized system to drive decision-making, train models to interpret large volumes of incomplete and inaccurate data, and produce outputs that support an accurate, deep understanding of that data.

According to a survey by Gartner, the average costs related to inaccurate, incomplete, or inconsistent data for financial services were $15 million per organization in 2017.

The consequences of poor data management include hampered decision-making, customer dissatisfaction, and regulatory non-compliance. For instance, instead of rewarding users with cashback from selected retailers and financial prizes for using selected products, a bank charges a substantial commission fee for every transaction made because of a faulty software application. This poor data management can lead to many issues for the bank, including damaged brand credibility and negative customer reviews. It also shows how a company’s value can be measured by its data performance: poor data quality leads to financial and productivity costs, as well as reputational damage.

Therefore, building an efficient data management platform that helps to manage, regulate, and increase data accessibility and usability is fundamental to financial services operations and business decisions.

WHY IS DATA MANAGEMENT SO IMPORTANT?

Data management is not a new concept in the financial services industry. Historically, companies manually collected and entered information by directly communicating with customers, with minimal data management required. However, the ever-increasing number of data sources, combined with advancements in data science, has created the need for financial services companies to determine how they can capitalize on data’s potential.

With this evolution of digital technology, financial services companies have developed sophisticated analytics and data management tools. Not only is data entry standardized, but the information gathered is measurable and helps these companies make important decisions. By better understanding data, financial services companies can help solve problems, monitor performance, improve processes, and gain a deeper understanding of what customers really want and how they behave. However, poor data quality and management impact a business negatively.

NEED FOR VALUE-ADDED SERVICES

Financial services customers are demanding more personalized experiences, leading companies to create and provide value-added customer experiences at virtually every stage of the customer lifecycle. With this in mind, financial services companies are constantly challenged to synthesize a comprehensive understanding of each customer. Achieving a complete view of customers requires having the correct technologies for collecting, validating, and analyzing an abundance of information about customer interactions, including buying behavior and purchase outcomes.

However, because the financial services industry is fragmented, legacy infrastructure and systems lock valuable data and insights away in repositories that are inaccessible to those who need them most (i.e., customers and employees). Gaining access to this information is not easy because business complexity is driven by product lines, customer segments, and high transaction volumes. This creates a progressively more complicated enterprise data management landscape for financial services companies, encompassing multiple siloed data environments. These data silos occur when only limited groups of people in an organization have access to a set or source of data. Because data is not seamlessly linked across these silos, it becomes inaccessible.

To solve the challenge of multiple and complex silos, financial services companies must build central data storage—data lakes or data warehouses—of integrated data from one or more disparate sources. Some financial services companies, including asset management and investment firms, have already built in-house data lakes or data warehouses. However, these companies still experience numerous issues because the data is not correctly captured, ingested, and stored in one centralized location. This is why companies must invest in a data management platform that allows for democratized access to data via a unified view of data across the organization.
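
To make the idea of consolidation concrete, below is a minimal sketch of pulling exports from two hypothetical silos into a single queryable store. The file names, column handling, and the choice of pandas with SQLite are illustrative assumptions, not a prescribed architecture.

```python
# Minimal sketch: consolidating data from siloed sources into one queryable store.
# Source file names and the pandas + SQLite stack are illustrative assumptions.
import sqlite3
import pandas as pd

SILO_SOURCES = {
    "retail_banking": "retail_customers.csv",    # hypothetical export from one silo
    "wealth_management": "wealth_clients.csv",   # hypothetical export from another silo
}

def ingest_silos(db_path: str = "central_store.db") -> None:
    """Load each silo export into a single central table with a source label."""
    conn = sqlite3.connect(db_path)
    frames = []
    for source_name, path in SILO_SOURCES.items():
        df = pd.read_csv(path)
        df["source_system"] = source_name        # preserve the origin of every record
        frames.append(df)
    combined = pd.concat(frames, ignore_index=True)
    combined.to_sql("customers_unified", conn, if_exists="replace", index=False)
    conn.close()

if __name__ == "__main__":
    ingest_silos()
```

In practice the same pattern applies whether the target is a data warehouse or a data lake; the key point is that every record carries a source label so it remains traceable after consolidation.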

USE CASE EXAMPLE: ASSET MANAGEMENT FIRMS

Asset management firms are examples of financial services companies experiencing challenges when managing and understanding customer data because of the way the industry works in silos and deals with a myriad of market changes, including new regulatory frameworks.

Recent research from IHS Markit has highlighted major gaps in the way asset management firms capture, store, and use data, with over two-thirds of respondents saying business teams have limited or no visibility of where data came from, who touched it, how it was altered, or how it was used. However, gradual transformation is on the horizon, with nearly 50 percent of asset management firms in Europe and North America set to use a data lake by 2022 to manage new reporting requirements, such as data analytics, data scraping, and wider data solutions. By allowing the data to remain in its native format, asset management firms gain access to a far greater stream of data for analysis and can apply a variety of data management tools to gain vital customer insights into what the data means.

While a data lake may allow asset management firms to store, analyze, and ingest important data, asset management firms will still require data experts to perform a series of interviews with key stakeholders to understand end goals. A data management platform won’t replace humans but will enable them to make better decisions quickly and consistently.

Having a data management platform is not just about consolidating the structures and data from legacy systems or heterogeneous sources, but also about making sure companies have the correct data management strategy. With global regulatory bodies constantly introducing new policies, the pressure on financial services companies to become fully compliant has considerably increased. Some regulations, such as MiFID II, require financial services companies to continue developing and evolving data management platforms and demand a vigilant data management strategy.

BEST PRACTICE: TACKLING DATA MANAGEMENT CHALLENGES

As data usage grows more complex, new regulatory policies make compliance an important part of data management. Including compliance principles in data management policies within a data management platform is imperative to address regulatory obligations throughout the complete data lifecycle. There are two critical components that apply to almost every data management strategy when implementing a data management platform: a metadata catalog and data quality.

APPLY METADATA

A metadata labeling system is the foundation of data management, providing the structure for identifying, defining, and correctly documenting data. Metadata’s benefits are twofold:

Consistency—metadata makes information more consistent and clear. For example, metadata distinguishes between "van" and "minivan," or "money" and "cash." This creates transparency and supports a better structure and understanding of tags.

Clarity—metadata resolves uncertainty when creating systems and structures throughout a data environment. For example, if a data analyst wants to understand a data management structure to analyze productivity, he or she would first look at each data value, and organize the information accordingly. Metadata follows the same logic, dissecting available information into a relational structure.

As a result, metadata ensures data is interpreted consistently across the business by engaged stakeholders. Implementing a solid metadata tagging system creates a data structure. Without metadata, data cannot be properly organized or optimized. Metadata can’t exist in silos, as its context should span the business. All aspects of the data must be reviewed to define the full set of interactions with each data point. Creating and allocating metadata tags whenever data is created, acquired, or altered in any way is one solution.

Metadata, data management, and search tools work together to form a data catalog that allows financial services companies to improve the speed and quality of data analysis.
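
As a simple illustration of how such a catalog can work, the sketch below registers metadata whenever a data asset is created or altered and lets users find assets by business tag. The field names, tags, and in-memory storage are illustrative assumptions; a production catalog would persist entries in a dedicated metadata service.

```python
# Minimal sketch of a metadata catalog entry and tag-based search.
# Field names, tags, and in-memory storage are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DatasetMetadata:
    name: str                                   # logical name of the data asset
    owner: str                                  # accountable team or data steward
    source_system: str                          # where the data originated
    tags: set = field(default_factory=set)      # business terms, e.g. {"cash", "retail"}
    last_updated: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class MetadataCatalog:
    def __init__(self):
        self._entries: dict[str, DatasetMetadata] = {}

    def register(self, meta: DatasetMetadata) -> None:
        """Record or refresh metadata whenever data is created, acquired, or altered."""
        meta.last_updated = datetime.now(timezone.utc)
        self._entries[meta.name] = meta

    def search_by_tag(self, tag: str) -> list[DatasetMetadata]:
        """Let analysts find assets by business term instead of by silo."""
        return [m for m in self._entries.values() if tag in m.tags]

# Example usage
catalog = MetadataCatalog()
catalog.register(DatasetMetadata("card_transactions", "payments-team", "core_banking",
                                 tags={"cash", "transactions"}))
print([m.name for m in catalog.search_by_tag("cash")])
```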

By increasing the visibility of financial services companies’ data assets so they can manage their environments more efficiently, SoftServe manages data at scale with repeatable processes that keep data and metadata up to date.

MONITOR DATA QUALITY

Data quality refers to the accuracy, timeliness, and consistency of data within financial services. High-quality data reflects relevant information that is directly applicable to the business and collected through ethical and credible means.

Data quality suffers when departments working in silos produce inconsistent reports, resulting in differing and inaccurate facts and figures on both sides. Quality data, by contrast, delivers identical reports across the business when departments work with full collaboration and transparency. By working in an agile way, large database initiatives can be broken into smaller projects with a few manageable, deliverable goals rather than working on too many data projects at once. Small, highly focused teams can work on executing shorter initiatives with a high-quality output. Rather than focusing on a long, static project structure, putting effort into smaller iterations allows data teams to become more responsive to user feedback and to manage any additional changes to the database structure efficiently.

Ensuring data quality starts with configuring a management strategy framework to automatically assess incoming data. Since data collection and data requirements vary by organization, these frameworks must be tailored to business specifications. Financial services executives must assess the capabilities of existing solutions to ensure they satisfy organizational needs.
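
As one hedged illustration of such automatic assessment, the sketch below checks each incoming record for completeness, timeliness, and a simple consistency rule. The thresholds, field names, and rules are assumptions that would differ per organization.

```python
# Minimal sketch: assessing incoming records for completeness, timeliness,
# and consistency. Thresholds, field names, and rules are illustrative assumptions.
from datetime import datetime, timedelta, timezone

MAX_AGE = timedelta(days=1)            # assumed freshness requirement
REQUIRED_FIELDS = ("customer_id", "amount", "currency", "timestamp")

def assess_record(record: dict) -> list[str]:
    """Return a list of data quality issues found in a single record."""
    issues = []
    # Completeness: every required field must be present and non-empty
    for f in REQUIRED_FIELDS:
        if record.get(f) in (None, ""):
            issues.append(f"missing field: {f}")
    # Timeliness: data older than the agreed window is flagged
    ts = record.get("timestamp")
    if isinstance(ts, datetime) and datetime.now(timezone.utc) - ts > MAX_AGE:
        issues.append("stale record")
    # Consistency: a simple business rule, e.g. amounts must be non-negative
    if isinstance(record.get("amount"), (int, float)) and record["amount"] < 0:
        issues.append("negative amount")
    return issues

# Example usage
sample = {"customer_id": "C-001", "amount": -10.0, "currency": "USD",
          "timestamp": datetime.now(timezone.utc)}
print(assess_record(sample))   # -> ['negative amount']
```

Records that fail these checks can be quarantined or routed back to the owning team, so errors are corrected at the source rather than propagated into reports.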

With our approach to data management, we employ cutting-edge technology to develop and maintain a scalable, governed, and self-sufficient data environment in an ever-changing data landscape for financial services.

CONCLUSION

Financial services companies are powered by data; however, the true value and profit of that data are not always fully realized. Building modern big data integration solutions is also challenging due to legacy data integration models and technical skill gaps.

A well-constructed data management platform that enables data processing allows data scientists to build smart solutions and provides a solid foundation for business intelligence and ad hoc reporting.


By Ihor Shkarupa, Associate Big Data Architect & Navjot Singh, Marketing Manager, Financial Services
