A survey of financial services IT leaders describes an industry that is broadly familiar with real-time computing technologies and is increasingly adopting them and seeing their value.
A majority—58% of respondents—reported that their company uses in-memory technologies, which provide the level of processing power often indispensable for real-time analytic applications, and 28% reported that they are used in a “mission-critical capacity.”
GridGain Systems, which provides open source and commercial in-memory data fabric solutions, polled close to 200 IT decision-makers about their companies’ attitudes, practices, and challenges around data technology. The focus was the state of the industry’s adoption of real-time analytics technologies. Survey respondents included project managers, network managers, software and business analysts, and other technology professionals working in financial services.
“The number of IT professionals reporting that in-memory is mission-critical isn’t too surprising,” says Max Herrmann, executive vice-president of marketing at GridGain. “Financial services is on the leading edge of adopting technologies that enable hyper-scale processing around functions like risk analysis and high-volume transactions, which involve processing and analyzing increasingly massive and diverse datasets in real time.”
Forty-two percent of respondents identified risk analysis as the area where real-time analytics technologies offer the most value, with cyber-theft and fraud prevention applications second (31%)—a landscape that GridGain believes is poised to shift.
“Currently, people in financial services are approaching real-time technologies mostly from an analytics-focused perspective,” says Herrmann. “But areas like cyber theft and fraud detection will require expanding the view of real-time’s role to include a more integrated picture of analytical and operational processing, allowing companies to dramatically shorten the time between when actionable data is discovered and when relevant action can be taken.”
Only 22% of respondents cited “accessing the necessary data streams” as the biggest obstacle to real-time decision-making; processing speed (40%) and “integrating diverse data streams to form a single picture” (38%) ranked as more significant challenges.
“The data is there, and there’s little problem accessing it,” says Herrmann. “However, the decision-making pipeline is slowed by reliance on traditional disk-based processing, as well as by the variety of data structures and data sources organizations are trying to make sense of in their decision-making process.”
Herrmann predicted that this problem is only likely to worsen as organizations take in more and more data, unless the traditional approach of processing different data types within their own silos is addressed.
Thirty-nine percent of respondents identified scalability and speed as the areas they would most like to see addressed in their next data technology upgrade, outstripping security (19%), cloud-based analytic tools (22%), and improved uptime (18%).
“Despite the industry’s prudent conservatism, we see confirmation that financial services continues to be at the forefront of adopting many of these real-time technologies; the nature of the business demands it,” says Herrmann. “In these findings, we see an industry facing some growing pains, but we also see that it is clearly poised to continue taking the lead in adopting data processing innovations.”