The role of data in business has evolved rapidly as vast changes in technology have increased the volume, velocity and variability of data gathered by businesses. Yet at the same time, the potential value of this data is often restricted by outdated legacy system architectures and the absence of a proper data management framework. To realise value from data, banks must get a better understanding of their data ecosystem and put in place a robust data governance framework that sets clear roles and responsibilities.
In many cases, banks have large stores of data which go unused. These are essentially wasted resources which could be analysed for insights to drive cost and time reduction, new product development, optimised offerings and smarter decision making. And whilst the most common use of data management is for increasing the understanding of customer behaviour, data is also vital for effective risk management and maintaining regulatory compliance.
Regulations such as BCBS 239 are putting pressure on banks to add traceability and quality assurance to their data infrastructure. For example, the BCBS 239 principle of adaptability requires banks to have a data management framework that can meet ad hoc data requests for assessing emerging risks and conducting better risk management for forecasting, stress testing and scenario analyses (including in times of crisis, when speed is critical).
There are three steps for realising value from data. The first concerns how the data is sourced. For banks, the biggest issue is not how much data they have, but finding where it is. Often, it is distributed across multiple systems in different formats, and it can take days, weeks or even months just to find and validate the data. Questions to ask include: how do you know that the data in system X is up to date? Where has the data come from, and is the source a valid one?
The ideal solution is to have an ‘authoritative source’ for each data domain which centralises the data with unified quality controls. Banks can then implement strict quality assurance on the data at the point where it is gathered and help ensure that this is maintained wherever the data is used.
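As an illustration of quality assurance applied at the point where data is gathered, the sketch below validates records before they enter an authoritative source. The field names, rules and staleness threshold are invented for the example, not taken from any particular bank's data model:

```python
from datetime import datetime, timedelta

# Hypothetical quality rules enforced at the point of ingestion into an
# authoritative source; field names and thresholds are illustrative only.
REQUIRED_FIELDS = {"trade_id", "counterparty", "notional", "as_of_date"}
MAX_STALENESS = timedelta(days=1)

def validate_record(record: dict, now: datetime) -> list:
    """Return a list of quality issues; an empty list means the record passes."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append("missing fields: %s" % sorted(missing))
    if "notional" in record and record["notional"] < 0:
        issues.append("negative notional")
    if "as_of_date" in record and now - record["as_of_date"] > MAX_STALENESS:
        issues.append("stale record")
    return issues
```

Rejecting or flagging records at this single entry point is what keeps downstream consumers from each having to re-validate the same data.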
The second step is to process data into a format that is ready to be analysed. Again, the current issue with decentralised data is that it is often stored in different formats that must be adjusted to be used more widely. Furthermore, it is necessary to align data producers and data consumers in understanding the key data that is relevant to the bank’s objective. An authoritative source of data coupled with a consumer-driven processing and formatting strategy which takes into account key data points would dramatically reduce processing time.
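The processing step above can be sketched as a simple mapping from each source system's layout onto a single schema agreed with data consumers. The system names and field mappings here are invented purely to illustrate the idea:

```python
# Illustrative sketch: normalising records from two hypothetical source
# systems into one consumer-agreed schema. Names and mappings are invented.
FIELD_MAPS = {
    "system_a": {"cust_id": "customer_id", "bal": "balance_gbp"},
    "system_b": {"CustomerRef": "customer_id", "Balance": "balance_gbp"},
}

def normalise(record: dict, source: str) -> dict:
    """Rename a source record's fields to the shared schema, dropping the rest."""
    mapping = FIELD_MAPS[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}
```

Keeping the mapping in one place, rather than in each consumer's own extraction code, is what lets the authoritative source drive down processing time.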
This brings us to the last and most important step: analysis. Banks should be spending most of their time and effort on extracting insights from data. By reducing the time spent on sourcing and processing, they can create a larger window to focus on smart analysis and thus extract value, ultimately transforming data into knowledge that improves decision making.
According to Forrester, “the number one challenge for business decision-makers is the lack of business competency to deal with data that is messy, diverse, or large.”1
To date, data management in most banks has been focussed on meeting regulatory demands with the minimum possible effort: data lineage and data dictionaries are captured as one-off projects, creating an effective tax on the organisation that must be paid again when poor maintenance lets them fall out of date.
Banks should seize the opportunity to turn a regulatory requirement into a value-add proposition. Implementing an evolved data management framework can add business value by shortening the time it takes to source and format data, allowing a greater proportion of time and resources to be spent on analysing it with newer, more sophisticated models that support better business decisions. Banks need to change their approach and start embedding data management into the culture of their organisations.
1 The Forrester Wave™: Master Data Management, Q1 2016: The 12 MDM Providers That Matter Most And How They Stack Up, by Michele Goetz, March 16, 2016
Charles Evans is a Senior Consultant with Capco’s Data Management & Governance Practice, based in London. He is currently leading a data management project for BCBS 239 at a Tier 1 bank.
Chris Probert is a Managing Principal at Capco London and Head of Capco’s Data Management & Governance Practice. Chris has run several data management programmes at Tier 1 banks and has a proven track record of establishing Data Management and Chief Data Offices.
The content and opinions posted on this blog and any corresponding comments are the personal opinions of the original authors, not those of Capco.