Data is everywhere and its impact is tangible. So fundamental is data to our society that, by 2245, half the world’s mass is projected to consist of data and information. The quantity of data we’re dealing with now, and the speed at which it is growing, are astonishing. Each person generates 1.7 megabytes of data per second, or roughly 146.8 gigabytes per day. 
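The daily figure follows directly from the per-second rate; a quick back-of-envelope check (assuming decimal units, 1 GB = 1,000 MB, so the result may differ slightly from the cited figure depending on rounding):

```python
# Back-of-envelope check of the per-person data figures above.
mb_per_second = 1.7                     # the article's own figure
seconds_per_day = 24 * 60 * 60          # 86,400 seconds
gb_per_day = mb_per_second * seconds_per_day / 1000
print(f"{gb_per_day:.2f} GB per day")   # 146.88 GB, in line with the ~146.8 GB cited
```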

Central role for data

A good example of this kind of data is email. Every day we send 306.4 billion emails, an average of 40 emails per person. Meanwhile, Google processes 3.5 billion search requests daily. All this data has a profound impact on the way we live. More importantly, data is playing an increasingly important role in the corporate domain. 

Take Netflix, for example. Not only does it provide users with relevant suggestions on what to watch next, it also saves around $1 billion per year by using data for customer retention. This may sound like a complex undertaking, but in practice all Netflix does is analyze the data its customers generate themselves. Put differently, the financial gains to be made from proper data use are enormous relative to the effort required. 

While a lot can be achieved using data, huge losses can also be sustained as a result of poor data management. A recent study showed that the American economy is losing approximately $3.1 trillion annually to poor data management. The numbers are hard to ignore, and it is estimated that all combined data will be worth $229.4 billion by 2025. 

Fundamental changes

More important than the financial value of this data is how it is transforming our lives at a fundamental level. From speech recognition to self-driving cars, our dependence on data will continue to grow. Every time data helps us choose the best route to work or which movie to watch, we leave a trail of user data. This data is incredibly valuable for corporate decisions, provided it is used properly. Data at this scale brings huge benefits but equally huge risks. 

The volume of data grows daily and can easily become overwhelming. Businesses can therefore no longer approach data passively: those that don’t actively monitor and control incoming data quickly lose oversight. It is crucial for these businesses to gather and store data in a coherent manner. Yet even in a data-driven world, this remains a major challenge for many of them, and perhaps nowhere is that challenge more tangible than in the financial services industry.  

Whether it be getting a mortgage or acquiring capital to start a business, we’ve all had to deal with the cycle of getting loans approved. Financial institutions gather a lot of sensitive data on us in this process, including IDs, salary statements and information on collateral. Alongside this sensitive data, financial institutions also continuously gather less sensitive data about users: data generated when online purchases are made, cash is withdrawn, card payments are made or someone sends a smartphone payment request after a night out.  

Effective data management

The problems that come with managing this data are immediately apparent. Most financial institutions work with an organically grown IT landscape consisting of legacy systems that don’t integrate well with one another. This siloed structure leads to incoherent data collection and storage, which in turn leads to what’s known as “outlier data”: data that is incomplete, incorrect or stored in the wrong place. Once such data separates from the rest, the risks become clear. Imagine, for example, that client A asks to see the data a financial institution has gathered on him or her, a common request these days under the GDPR. If data from client B is erroneously stored in client A’s file, the institution is dealing with what’s known as a data leak. Such leaks lead not only to catastrophic reputational damage but also to astronomical fines and increased scrutiny from regulatory authorities.  
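The outlier-data check described above can be made mechanical. A minimal sketch in Python, using hypothetical record and field names (no real institution’s schema) to flag incomplete or misfiled records in a client’s file before it is returned in a subject-access request:

```python
# Sketch: flag "outlier data" in a client's file before disclosure.
# All field names below are illustrative assumptions.
REQUIRED_FIELDS = {"client_id", "document_type", "stored_at"}

def find_outliers(client_id, records):
    """Return (record, reason) pairs for incomplete or misfiled records."""
    outliers = []
    for record in records:
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            outliers.append((record, f"incomplete: missing {sorted(missing)}"))
        elif record["client_id"] != client_id:
            outliers.append((record, "misfiled: belongs to another client"))
    return outliers

file_for_client_a = [
    {"client_id": "A", "document_type": "salary_statement", "stored_at": "2021-03-01"},
    {"client_id": "B", "document_type": "id_scan", "stored_at": "2020-11-15"},  # data-leak risk
    {"client_id": "A", "document_type": "mortgage_offer"},  # incomplete record
]

for record, reason in find_outliers("A", file_for_client_a):
    print(reason)
```

Running this flags client B’s misfiled ID scan, exactly the kind of record that would turn a routine GDPR access request into a data leak if disclosed unchecked.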

Averting risks

With these risks in mind, it should come as no surprise that financial institutions often have entire departments dedicated to mitigating them. Additionally, the role of CPO, or Chief Privacy Officer, is becoming increasingly important in the Netherlands. Aegon and ING are two examples of organizations where data protection is one of the highest priorities. This makes sense when one considers the uphill battle such financial institutions are fighting. In 2019, financial institutions were fined $36 billion for facilitating an estimated $2.1 trillion circulating in financial crime. On average, financial institutions spend $48 million per year on combating financial crime, but the contrast with what’s available to criminals is stark: for every euro invested in combating financial crime, two euros remain available for financial crime itself.  

It has become clear that reactive approaches by financial institutions are not sufficient. Many organizations only conduct proper customer due diligence once the damage is done, an audit is underway or a fine has been issued. This reactive approach has given rise to an entirely new market segment known as customer due diligence remediation. Essentially, this form of remediation entails clearing an information backlog that should have been dealt with earlier.  

Insight into historical customer data

A recent Hyarchis report shows that only 15% of financial institutions have complete insight into their historical customer data. This is a shocking statistic when one considers the sensitive nature of such data. As mentioned, this data doesn’t just encompass customer identity but also things like income, medical data and marital status. In the other 85% of cases, there is no coherent insight into this customer data, even as these files are supplemented daily with large amounts of additional sensitive customer data. 

This problem often arises because more data comes in than can be processed efficiently. In other words, the problem grows with every passing day. This phenomenon is driving a paradigm shift in which CPOs (Chief Privacy Officers) become crucial to organizations. Compliance as a “must” imposed by legislation and regulation is giving way to a more proactive approach, in which the emphasis is not so much on averting risks as on creating added value from static customer data. Initiatives that were previously deployed reactively, with the aim of knowing your customer, increasingly yield the insights a holistic customer view makes possible, such as better customer service or more targeted upselling. 

Artificial intelligence

The recent Hyarchis research report shows that half of financial institutions are using artificial intelligence to get a proper grip on their customer data. This rapidly evolving technology is helping to reform the traditional siloed IT landscape and to structure chaotic data into coherent customer insight. In the financial services industry, this is leading to a bona fide AI race: at present, 35% of financial institutions use artificial intelligence at a functional level, a figure predicted to grow to 95% within the next three years.  

Artificial intelligence has thus gone from a technology that was “nice to keep in mind” to a mission-critical competency. It is increasingly the answer to the overwhelming quantity of data currently being generated. Not only does artificial intelligence help ensure that customer data is managed properly, it also helps organizations gain proactive control of historical customer data. This not only helps avoid audits and increasingly stringent supervision but also unlocks the full value of customer data, allowing financial institutions to mitigate risks while actively discovering financial advantages.  
