You’d think by now that everybody knows what the “cloud” is in the context of data storage, but a survey by Citrix last year found that more than half of the roughly 1,000 American adults polled believe that stormy weather would interfere with their cloud computing.
Also, more than half claimed never to use the cloud, when in actuality 95% do, including the 65% who bank online.
Banks increasingly are making use of the cloud in many ways, including storing the terabytes of data they collect about their customers. It’s important to remember that every one of those bytes resides in an electronic box somewhere, called a server, which is hooked up in sophisticated ways to the “cloud,” which old-timers have always known as the internet. Storing all that data securely and accessibly is what makes possible other new technological trends, namely big data and, more important, data analytics.
Big data and analytics are exploding in importance because of their potential to provide banks with invaluable insights into their customers’ behaviors and needs, quickly, individually, and personally.
However, just as there are physical limits to how many actual white, fluffy clouds can drift through the earth’s atmosphere—even though that’s a lot—there are limits to how many bytes can be stored on servers—even though that’s also a lot, and even though they keep manufacturing new servers.
A related point: storing those bytes is not free. As the cloud has grown in popularity, its limits are beginning to be felt.
In a recent storage assessment survey, NTP Software found that 61.6% of file data stored on primary storage systems had not been accessed in more than six months, leading to inefficient storage use, ongoing costs, and potential compliance issues. Meanwhile, DataCore Software surveyed 477 IT professionals from a range of industries, 44% of whom said storage-related costs are a serious obstacle preventing them from storing data in the cloud, or, as they put it, “virtualizing more of their workloads.”
“It’s staggering to see just how many stale, duplicate, and empty files are being kept on expensive primary data by organizations of every size across a variety of industries,” says Bruce Backa, CEO, NTP Software. “Files that have not been accessed in six months or more are prime archiving candidates.”
Similarly, says George Teixeira, CEO, DataCore Software, “Storage is the big problem IT pros must solve today. The value and need to virtualize critical applications is now well recognized, but soaring storage costs and unpredictable performance workloads associated with virtualization and consolidation projects continue to impede our progress.”
Still, the data keeps pouring in. TwinStrata’s own survey found that storage capacity requirements are quickly outstripping IT’s ability to support them, with 60% of respondents agreeing that “It seems like we are always running out of storage.”
Which gets back to the reason why all that data is being collected and stored in the first place: to milk it of valuable insights to further the bank’s strategic and tactical goals. IBM, which knows something about data, has this to say in a blog by Kenneth Muckenhaupt, executive IT specialist: “Most [banks] are missing the boat when it comes to the hidden component of successful risk management—effective information management.”
He goes on to say: “Well-managed means adopting and adhering to an information management strategy that incorporates structured and unstructured data. It means that banks must recognize that warehousing data is only the first step to effective risk management. The second, and probably most critical step, is establishing mechanisms to access and make sense of information for different users. This is where analytics comes into play.”
In other words, you’re just wasting your money if you shove all that data into the cloud and have no way to make use of it in a timely fashion. So how do you do that?
Booz Allen Hamilton says it takes a team made up not only of computer scientists, mathematicians, and statisticians, but also people who actually know the business, all working hand in hand. The idea is for the subject-matter experts to ask insightful questions, which the data crunchers then figure out how to answer.
“For instance, for a financial services firm seeking to deter fraud, this may involve examining connections between payment records and exchange rates,” says Booz Allen.
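To make the Booz Allen example concrete, a toy sketch of what “examining connections between payment records and exchange rates” might look like in practice. The data, the field names, and the 5% tolerance are all invented for illustration; a real fraud model would be far more sophisticated:

```python
# Cross-check each payment's implied FX rate against the market rate and
# flag transactions that deviate beyond a tolerance. All values invented.

MARKET_RATES = {("USD", "EUR"): 0.92, ("USD", "GBP"): 0.79}

def flag_suspicious_payments(payments, tolerance=0.05):
    """Flag payments whose implied exchange rate deviates from the
    market rate by more than `tolerance` (a relative fraction)."""
    flagged = []
    for p in payments:
        market = MARKET_RATES.get((p["from_ccy"], p["to_ccy"]))
        if market is None or p["amount_sent"] == 0:
            continue  # no benchmark rate, or degenerate payment
        implied = p["amount_received"] / p["amount_sent"]
        if abs(implied - market) / market > tolerance:
            flagged.append(p["id"])
    return flagged

payments = [
    {"id": "tx1", "from_ccy": "USD", "to_ccy": "EUR",
     "amount_sent": 1000.0, "amount_received": 920.0},  # matches market
    {"id": "tx2", "from_ccy": "USD", "to_ccy": "EUR",
     "amount_sent": 1000.0, "amount_received": 700.0},  # far off market
]

print(flag_suspicious_payments(payments))  # prints ['tx2']
```

The point of the exercise is the division of labor: a fraud analyst supplies the question (“are any payments settling at rates far from market?”), and the data team turns it into a query over the stored records.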
Aite says much the same thing in a report it did on the need for data maturity and governance for institutional and retail asset managers: “Firms have realized the merits of keeping a handle on data-quality metrics, engaging downstream business users in data stewardship, and moving overall data management responsibility and strategy to C-level functions.”
Back to the Citrix cloudy-weather study. It found that, even if they don’t know what the cloud is, most people know it’s a good thing. Two thirds, after having the cloud explained to them, recognized its economic benefits. Chief among these: its ability to lower costs (35%), improve consumer engagement for businesses (35%), and act as a catalyst for small business growth (32%).
As Booz Allen Hamilton puts it, though, you have to push past the hype of big data and get to big analytics. Otherwise, all those terabytes are just taking up expensive space.