Financial institutions have been using techniques such as robotic process automation to optimize their processes, with the aim of improving efficiency and ensuring the near-instant responses that customers and markets now demand. However, optimization is itself a process, and getting it right begins with building an accurate picture of the institution’s process flows so that projects and pain points can be prioritized. It’s easy for institutions to stumble at this first step.
When we speak of mapping process flows, the image that comes to mind is an old-school flow chart. This remains the most common, and easiest, way to communicate a process flow, since it needs only pen and paper. To chart process flows more systematically, and for practical purposes such as capturing the data flow between process steps, we can use standards such as Business Process Model and Notation (BPMN 2.0) and the tools built around them. But poor practices and the lack of a more modern, data-led approach can still lead to a range of shortcomings in process discovery.
Shortcomings of current approaches
If you don’t work in Operations, you may be surprised to find out that process designs are not always centrally stored and that the original designs might only exist in the memory of long-term employees. For some COOs, having a standardized tool to store all the process flows is already a big step forward.
However, even when the designs are well managed using standards like BPMN, this only records the design at a point in time. Over time, variations creep into the process flow, usually caused by the workarounds that operations staff adopt to ease bottlenecks and make a process, well, work.
The current approach to identifying potential improvements is to obtain the process design in whatever format it exists, talk to subject-matter experts to gain more insight, and then look at the process performance data stored in the system. This approach depends on the expertise of the people involved, however; their observations may not tell the whole story and can be erroneous.
Process mining as the first step towards process optimization
A more objective and efficient way to conduct process discovery is to use process mining. According to research firm Gartner, process mining is “a technique designed to discover, monitor, and improve real processes (i.e., not assumed processes) by extracting readily available knowledge from the event logs of information systems.”
The first time process mining is conducted, most platforms require the user to stitch together system logs and the converted digital footprints of any manual work not captured in the system. This information gathering closely resembles the ETL (Extract, Transform and Load) task in data pipelines: once a job is set up, subsequent updates can be scheduled to run automatically, with the results made available to management in a well-designed dashboard so that process performance can be easily observed.
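To make the analogy concrete, here is a minimal sketch of such a job in Python with pandas. Everything in it is illustrative: the file names, the column names (ticket_id, step, completed_at) and the target schema are assumptions, and commercial platforms ship their own connectors for this step.

```python
import pandas as pd

# Extract: pull raw events from two hypothetical sources. The file and column
# names (ticket_id, step, completed_at) are illustrative only.
core_events = pd.read_csv("core_system_log.csv")
manual_events = pd.read_csv("manual_work_footprints.csv")

# Transform: normalize both sources into the minimal event-log schema that
# process mining expects: a case id, an activity name and a timestamp.
events = pd.concat([core_events, manual_events], ignore_index=True)
events = events.rename(columns={"ticket_id": "case_id",
                                "step": "activity",
                                "completed_at": "timestamp"})
events["timestamp"] = pd.to_datetime(events["timestamp"])
events = events.sort_values(["case_id", "timestamp"])

# Load: write the stitched log to wherever the dashboard or mining tool reads it.
events.to_csv("event_log.csv", index=False)
```

Once a job of this shape exists, a scheduler such as cron or an orchestration tool can rerun it on whatever cadence the dashboard requires.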
This approach has several benefits:
1) Objective performance data for a process
All the process performance data is captured through this approach, with any variations shown in the dashboard. It is not uncommon to find that the path taken most often does not, in fact, match the original design for the process flow. Various performance metrics can be calculated to reveal bottlenecks, the time spent at each step, or how performance compares to the average (the first sketch after this list shows one way to compute such metrics).
2) Compare to best practice
When the same process runs across different countries or departments, it becomes possible to compare them and identify the optimal path, which may indicate a better approach than the corresponding process design (the second sketch below illustrates such a comparison).
3) Working without the original design
Because the process mining platform shows the variations of a process and how they compare in terms of performance, having access to the original process design matters less. This helps when there has been no systematic approach to retaining original designs (the third sketch below reconstructs the de facto paths directly from the event log).
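To illustrate the first benefit, here is a minimal sketch of bottleneck metrics computed from an event log with pandas; the cases, activities and timestamps are invented for the example.

```python
import pandas as pd

# A toy event log; in practice this is the output of the ETL job above.
events = pd.DataFrame({
    "case_id":   [1, 1, 1, 2, 2, 2],
    "activity":  ["Receive", "Review", "Settle"] * 2,
    "timestamp": pd.to_datetime([
        "2024-01-01 09:00", "2024-01-01 09:10", "2024-01-02 15:00",
        "2024-01-01 10:00", "2024-01-01 11:30", "2024-01-03 09:00",
    ]),
}).sort_values(["case_id", "timestamp"])

# Waiting time before each activity: the gap since the previous event in the
# same case (the first event of a case has no predecessor, hence NaT).
events["wait"] = events.groupby("case_id")["timestamp"].diff()

# Mean waiting time per activity; the largest values point at bottlenecks.
print(events.groupby("activity")["wait"].mean().sort_values(ascending=False))
```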
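For the second benefit, a cross-country comparison can be as simple as grouping per-case cycle times; the countries, variants and figures below are again hypothetical.

```python
import pandas as pd

# Hypothetical per-case cycle times for the same process in three countries.
cases = pd.DataFrame({
    "country": ["SG", "SG", "HK", "HK", "UK", "UK"],
    "variant": ["A", "B", "A", "A", "B", "B"],
    "cycle_time_hours": [4.0, 9.5, 3.5, 4.2, 10.1, 8.8],
})

# Rank country/variant combinations by average cycle time; the fastest path
# is a candidate best practice for the other locations to adopt.
summary = (cases.groupby(["country", "variant"])["cycle_time_hours"]
                .agg(["count", "mean"])
                .sort_values("mean"))
print(summary)
```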
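And for the third benefit, the de facto process variants can be reconstructed directly from the event log with no reference to the original design; this sketch assumes the event_log.csv produced by the ETL example above.

```python
import pandas as pd

# Reads the stitched log produced by the ETL sketch earlier in this article.
events = pd.read_csv("event_log.csv", parse_dates=["timestamp"])

# Reconstruct each case's path as the ordered sequence of activities visited.
paths = (events.sort_values("timestamp")
               .groupby("case_id")["activity"]
               .agg(" -> ".join))

# Count how often each variant occurs. The most frequent path is the de facto
# design, whether or not it matches what was originally drawn up.
print(paths.value_counts().head())
```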
Despite these obvious benefits, process mining platforms are less widely adopted than comparable data mining tools, such as those used for Business Intelligence. One reason is that, until recently, many companies had not put holistic optimization programs in place; another is that until automation technologies matured, there were fewer opportunities to improve processes and realize cost savings. It is difficult for businesses to invest in something that might not deliver a meaningful return.
However, with automation technologies becoming more commonplace, process mining provides a very efficient way to identify bottlenecks and potential areas for improvement and to create a continuously updated candidate list for investigation.