Part 5: Replacing Anecdata with Real Insights
The Belfast-born mathematician, physicist, and engineer Lord Kelvin left us with numerous scientific contributions and these striking words of wisdom: “What is not defined cannot be measured. What is not measured cannot be improved. What is not improved is always degraded.”
In the previous four installments, we made the case that successful transformation should be viewed not as a linear, one-time change but as a cyclical endeavor that delivers incremental, measurable value and is agile enough to course-correct for changing conditions. In this final installment, we look at how a structured, intentional approach to data, reporting, and empirical decision making can align organizational realities with strategic imperatives and drive the transformation agenda.
Many financial institutions have formalized strategic planning and goal-setting infrastructure, budgeting and investment-planning processes, and agile delivery frameworks. Yet these processes often remain inadequate, and they lack a common pillar that brings them together.
That pillar measures the health of the organization using hard data with as little lag as possible. Despite widespread understanding of how important data is to an organization’s strategy, information for decision making is typically gathered in one of two ways:
- Anecdata. Organizations are often driven by pressures generated by clients or internal stakeholders. While client service is an admirable goal, a disorganized or fragmented approach to deciding whom to serve first can be disruptive. These organizations end up prioritizing the loudest voices in the room instead of the neediest. Initiatives are undertaken with ill-defined goals and a poorly understood ROI. Once they are complete, victory is declared based on the execution of milestones or project-management tollgates rather than an objective assessment of business outcomes and performance data.
- Ad-hoc data. It’s common in financial services for managers to be asked to quickly pull together presentations on the latest issue or topic du jour. But there is potential trouble ahead. Because they rely on “point-in-time” data gathered hastily, these presentations ignore the adverse impact that incomplete or out-of-context data can have on decision making and strategic planning. This type of data typically comes in one of two forms:
- Production data extracts provided by application teams to show the current state of a specific system, product, or user journey. This data carries its own risks and gaps, including a lack of business context, unclear size and sampling characteristics, source-data obfuscation, and latency. These gaps create significant confusion and distraction while the correct dataset is identified and gathered.
- Incident or problem data sourced from production support teams, representing a historical snapshot of events that met certain operational criteria. This information is often incomplete and prone to embellishment through survivorship and confirmation biases. The records show where time and resources were invested to solve production challenges, but they often obscure the root cause.
Both approaches divert resources into short-circuiting a more robust monitoring and measurement capability. More concerning, the level of human intervention required invites distortion of the data, whether through differing definitions of key data points or discomfort with the core message the data delivers.
In both cases, the work needed to derive meaningful information from the data, and the risk of misinterpreting it, make this a low-value proposition for financial institutions looking to be innovation leaders. Inherently rearward-facing, this approach forces the organization to steer the car by looking only in the rearview mirror.
A common misconception is that the lack of structured data can be solved by relying on specific tools such as Tableau or Microsoft Power BI. In reality, the issues cut much deeper than a shortage of analytics or visualization tooling; they extend from the earliest stages of the strategic planning process, through delivery, and into business-as-usual activity.
In our experience, successful organizations develop high levels of proficiency in the following areas to build reliable monitoring and measuring capabilities:
- Measuring what matters. Prevailing market conditions, customer expectations, emerging technologies, competitive disruption, and regulatory change create a continuously shifting operating landscape for financial institutions. It is critical to define forward-looking objectives and key performance indicators that validate decision making and enable more adaptive business planning.
This means requiring more than a simple five-year revenue or cost-cutting forecast before approving a new initiative. It means creating top-to-bottom connectivity between the organization’s strategic objectives and the work of delivery and operational teams. This framework establishes the very core of a financial institution’s monitoring and measuring capability and cannot be circumvented.
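To make that top-to-bottom connectivity concrete, some organizations simply record, for each strategic objective, the key results it depends on and the operational metrics that evidence them. The sketch below is purely illustrative; the objective names, key results, and metric identifiers are hypothetical rather than drawn from any particular institution.

```python
# Hypothetical mapping from strategic objectives to key results to the
# operational metrics that delivery teams must instrument.
OBJECTIVES = {
    "Reduce post-trade operational risk": {
        "Cut median trade confirmation time by 30%": ["confirmation_time_mins"],
        "Reduce settlement fails by 20%": ["settlement_fail_rate_pct"],
    },
}

def metrics_to_instrument(objectives):
    """Return every operational metric required for leadership to track
    progress against the stated strategy."""
    return {
        metric
        for key_results in objectives.values()
        for metrics in key_results.values()
        for metric in metrics
    }

print(metrics_to_instrument(OBJECTIVES))
```

Even a mapping this simple makes gaps visible: any metric that appears here but is not produced by a delivery team is a blind spot in the monitoring framework.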
- Data engineering and analytics. Before building dashboards, the groundwork must be laid to identify all sources of data and catalogue the data points that drive the relevant business metrics. It is also critical for all stakeholders to understand what the data will be used for and how it feeds the metrics they need. For example: is confirmation time measured from the time a trade is booked, or from the time it enters the confirmation stack? Settling these definitions up front prevents confusion and reduces rework; a brief sketch of encoding such a definition follows. This process builds incrementally on the framework established above and represents the physical data models and infrastructure required to monitor and substantiate the organization’s strategic objectives.
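As an illustration only (the file, column names, and use of pandas are assumptions, not a prescription), the following shows how an agreed metric definition can be encoded once and reused by every downstream report rather than re-derived ad hoc:

```python
import pandas as pd

# Hypothetical trade extract; the column names are invented for illustration.
trades = pd.read_csv(
    "trades_extract.csv",
    parse_dates=["booking_time", "confirm_queue_time", "confirmed_time"],
)

# Agreed definition, encoded once: confirmation time runs from the moment a
# trade enters the confirmation stack, not from the moment it is booked.
trades["confirmation_time_mins"] = (
    trades["confirmed_time"] - trades["confirm_queue_time"]
).dt.total_seconds() / 60

# Downstream dashboards consume this column instead of recomputing it.
print(trades["confirmation_time_mins"].describe())
```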
- Data governance. All data sets must conform to organizational data policies. While these vary widely with business model, clientele, and product set, the key tenets of effective data governance are consistent, and they always put the business need first. Questions to consider include:
• Data availability. At what granularity and frequency is data required to support the business’s measuring and monitoring objectives? Dashboards perform best on high-level data, but aggregated data does not lend itself to root-cause analysis because individual transactions cannot be identified, so each organization must intentionally select and design the architecture that best suits its needs. Care must also be taken in deciding how often data should be refreshed: key risk indicators (KRIs) are typically real time or updated daily, while key performance indicators (KPIs) can be refreshed at a slower cadence. Faster is not necessarily better once infrastructure costs and performance are taken into account.
• Data integrity. Who owns a specific data source, and where will that data live within the organization’s data infrastructure? Strategic decision making is eroded when an organization cannot assure consumers that they are accessing the right data from the right sources. Anti-patterns emerge when lines of business organically build their own data and analytics capabilities, each with its own methods for sourcing and storing data. Clear ownership and accountability for data, combined with centrally defined roles and responsibilities, are critical success factors.
• Data security. What can an organization do to ensure that data privacy and security rules are in place and broadly adhered to? A governance model that restricts sensitive business information to people with an operational need to know can, if handled poorly, erect unneeded barriers. Successful transformation organizations recognize this tension and centralize many functions of data collection, obfuscation, and visualization. This is especially important when dealing with transaction-level data that provides insight into client financial activity and personally identifiable information. A lightweight sketch of how the answers to these questions can be recorded appears below.
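Purely as an illustration, with every dataset name, owner, and field invented for the example, a lightweight catalogue can record the answers to these governance questions alongside the data itself:

```python
from dataclasses import dataclass

@dataclass
class DatasetPolicy:
    owner: str        # accountable line-of-business owner
    refresh: str      # cadence consumers can rely on ("real-time", "daily", ...)
    grain: str        # level of aggregation at which the data is published
    pii_masked: bool  # sensitive identifiers obfuscated before publication

CATALOGUE = {
    "trade_confirmations":   DatasetPolicy("Post-Trade Operations", "real-time", "trade", True),
    "ops_incidents":         DatasetPolicy("Production Support", "daily", "incident", True),
    "kpi_confirmation_time": DatasetPolicy("Post-Trade Operations", "daily", "desk", True),
}

def publishable_to_bi(name: str) -> bool:
    """A dataset reaches the visualization layer only with a named owner
    and with sensitive fields masked at source."""
    policy = CATALOGUE.get(name)
    return policy is not None and bool(policy.owner) and policy.pii_masked

print([name for name in CATALOGUE if publishable_to_bi(name)])
```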
- Business intelligence culture. This is the user-facing element of data science, and it typically garners the most attention. Promoting a culture in which users actively work with previously inaccessible information opens a world of possibilities for analyzing and enhancing organizational performance. Unfortunately, most such tools are not used as intended but only after the fact, to analyze issues. Organizations must position analytics as proactive performance-management tools that anticipate trends rather than merely explain them in hindsight.
The key is to identify distinct use cases and build multiple layers of analytics for different user bases. Typically, middle-level managers need more detail across a narrower set of functions, while senior management needs higher-level metrics across the whole business, as in the sketch below. Aligning the data, KPIs, visualization, and organizational design is what creates a culture of data-driven decision making and agility.
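Continuing the hypothetical trade extract from the earlier sketch (the desk column and aggregation choices are again assumptions), the same underlying dataset can serve both audiences:

```python
import pandas as pd

# Same hypothetical extract as before, with a "desk" column identifying the
# business area that owns each trade.
trades = pd.read_csv(
    "trades_extract.csv",
    parse_dates=["confirm_queue_time", "confirmed_time"],
)
trades["confirmation_time_mins"] = (
    trades["confirmed_time"] - trades["confirm_queue_time"]
).dt.total_seconds() / 60

# Middle management: more detail across a narrower slice of the business.
desk_view = trades.groupby("desk")["confirmation_time_mins"].agg(["median", "count"])

# Senior management: a single headline KPI across the whole business.
firm_median = trades["confirmation_time_mins"].median()

print(desk_view)
print(f"Firm-wide median confirmation time: {firm_median:.1f} minutes")
```

Because both views derive from the same governed dataset and the same metric definition, the numbers a desk head sees reconcile with the numbers the executive committee sees.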
In conclusion, once these capabilities are available across the organization, they pay off in multiple ways. Leadership teams can pinpoint the areas of their business best suited for, or most in need of, transformation. Transformation teams can track the outcomes of their efforts in near real time. And the two ends of the spectrum can be seamlessly linked by a well-thought-out framework of objectives and key results (OKRs).
Ultimately, a progressive approach to monitoring and measuring – enabling a nimble, data-driven business model – is what sets many of the most successful transformation organizations apart. They use their data and a culture of agility to make the best decisions for what lies ahead in today’s ultra-competitive and quickly shifting business environment.