05 Oct 2020

William Cooley
Broadridge

William Cooley of Broadridge says the pandemic has only underlined the importance of operational resiliency in reconciliations

What are the market drivers forcing firms to re-evaluate their reconciliation strategy?

Reconciliations are imperative to the risk and control functions of an organisation. Accurate and timely reconciliations ensure that companies always have a grasp on operational problems that could be affecting their bottom line.

The pandemic has only underlined the importance of operational resiliency in reconciliation processes, which have been profoundly affected by increases in the volume, variety and velocity of data as a result of higher trading volumes during this period. Additionally, firms are under increased cost pressure given complicated economic conditions, so there is a very clear appetite for efficiency gains.

What issues around reconciliations has the global pandemic brought to light?

Regulation and control have been hot topics over the past few years, but COVID-19 has placed additional emphasis on control points because it’s critical that firms are certain about their financial exposure during the pandemic. The emphasis is not necessarily on new control points, but on ensuring that existing control points function fully in a business continuity scenario with reduced staff, or at least reduced staff interaction. Some counterparties might not have been as successful in rolling out business continuity strategies, which makes the timely resolution of problems more important than ever, because uncertain economic conditions could lead to failures. Firms need to ensure that their bottom line is covered.

What has also become clear is that firms need a system flexible enough to stand up control points quickly and able to handle spikes in trading volumes, such as those seen when lockdowns began earlier in the year. At the same time, the system must remain economical, maintaining a minimal cost base in such circumstances.

The pandemic has been a catalyst for the adoption of artificial intelligence (AI), with many firms accelerating their digital strategy. Our research has shown that 34 percent of firms now plan to increase the scope of these initiatives in light of current conditions. Those firms already progressing with technologies like AI will be much better prepared for future business disruptions.

How has reconciliation technology evolved in recent years to handle these new challenges?

Originally, reconciliations were performed with pen and paper, or, at more forward-looking firms, with a complicated set of Microsoft Excel macros that automated processes such as data acquisition and manual matching, much as some reconciliation systems do today. The first enterprise systems built upon these workflows by industrialising the reconciliation process in a productised application; however, the tools had a very limited set of functions and limited processing power, simply due to a lack of advanced technology. A simple task that we would routinely perform today, such as matching 50,000 to 100,000 records with a relatively basic set of matching rules, might not have been possible at all given those technology limitations. Creating these control points was time consuming and expensive, as the reconciliations usually had to be modified to fit the tool instead of the tool being customised to fit the data.
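To make the idea concrete, here is a minimal sketch of the kind of rules-based matching described above, written in Python with pandas. The feed layouts, field names and tolerance are hypothetical assumptions for illustration, not details from any particular product:

```python
# A minimal sketch of rules-based reconciliation matching. The two feeds,
# their column names and the amount tolerance are all hypothetical.
import pandas as pd

def match_records(internal: pd.DataFrame, external: pd.DataFrame,
                  amount_tolerance: float = 0.01) -> pd.DataFrame:
    """Join the two sides on reference fields, then flag breaks where
    the settlement amounts differ by more than the tolerance."""
    merged = internal.merge(external, on=["trade_ref", "isin"],
                            how="outer", suffixes=("_int", "_ext"),
                            indicator=True)
    merged["status"] = "matched"
    # Records present on only one side are unmatched exceptions.
    merged.loc[merged["_merge"] != "both", "status"] = "unmatched"
    # Records present on both sides but with diverging amounts are breaks.
    amount_break = (merged["_merge"] == "both") & (
        (merged["amount_int"] - merged["amount_ext"]).abs() > amount_tolerance)
    merged.loc[amount_break, "status"] = "amount_break"
    return merged

internal = pd.DataFrame({"trade_ref": ["T1", "T2"], "isin": ["US0001", "US0002"],
                         "amount": [1000.00, 250.50]})
external = pd.DataFrame({"trade_ref": ["T1", "T3"], "isin": ["US0001", "US0003"],
                         "amount": [1000.00, 75.25]})
print(match_records(internal, external)[["trade_ref", "status"]])
```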

Second-generation systems are where the reconciliation software industry really started to hit its stride. These systems were more of a product-based toolkit than the first-generation systems, allowing almost any type of reconciliation to be addressed. They were much more powerful and customisable, which led to some very impressive workflows being built across the industry, but the problem was that they were essentially bespoke software builds, and not easily or cost-effectively maintainable. The solutions finally had the power to process large datasets, but the underlying hardware and licensed third-party software were relatively expensive.

Market solutions often lacked portability between clients, even when using the same feeder systems, and it was difficult to build any sort of economies of scale as the installation base fragmented.

These second-generation systems power most reconciliations within financial services today but there is significant interest in where the industry goes next.

Due to advancements in big data and cloud technologies, computing costs and availability have reached a point where these innovations are more widely accessible, which has led to the third generation of reconciliation solutions based on machine learning. Machine learning is not a new concept; however, it is only now that the balance of cost and computing power has reached the point where these technologies are feasible options for the reconciliations industry.

These next-generation solutions are built on an AI/machine learning foundation and offer significant benefits over the previous generation of products when it comes to the power and features required to deal with the changing volume, variety and velocity of data that we are seeing today.

How can AI and machine learning help in transforming the reconciliation process?

There are plenty of applications for AI in the reconciliations world – three areas receiving a lot of attention right now are accelerated onboarding of reconciliations; the improvement of already high match rates; and the automated allocation of exceptions to achieve productivity gains and reduce risk.

When onboarding a new reconciliation, a machine learning-based system can automatically analyse files and find relationships in the data to build a matching scheme. As the matching scheme is based on a statistical model, it can automatically update itself as it receives more data, boosting match rates. Previously, this was a task that would have taken cycles of business analysis and testing to complete.
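The interview does not spell out the underlying algorithm, so the following is only an illustrative sketch of one way a system might propose a matching scheme from the data itself: score every column pair across two files by how strongly their values overlap, and keep the strongest pairings as candidate match keys. The function name, threshold and Jaccard-style score are assumptions:

```python
# An illustrative sketch (not any vendor's actual algorithm) of inferring a
# matching scheme: score column pairs across two files by value overlap.
import pandas as pd

def infer_matching_scheme(left: pd.DataFrame, right: pd.DataFrame,
                          threshold: float = 0.5) -> list[tuple[str, str, float]]:
    scheme = []
    for lcol in left.columns:
        for rcol in right.columns:
            lvals, rvals = set(left[lcol].dropna()), set(right[rcol].dropna())
            if not lvals or not rvals:
                continue
            # Jaccard-style overlap as a crude "these columns correspond" score.
            overlap = len(lvals & rvals) / len(lvals | rvals)
            if overlap >= threshold:
                scheme.append((lcol, rcol, round(overlap, 2)))
    # Strongest pairings first: these become the candidate match keys.
    return sorted(scheme, key=lambda s: -s[2])
```

Re-running such a scoring pass as new files arrive is one simple sense in which a statistical matching scheme can update itself with more data.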

As the system does this work automatically, clear efficiency gains can be realised, and managers can be comfortable in the knowledge that they are always getting the best results because the matching schemes stay up to date.

Cutting onboarding time and keeping matching schemes up to date are very important features that will have a real impact on a business, but I believe that the real value in adding machine learning to a solution has to do with the efficiency gains realised through the day-to-day management of exceptions.

Systems based on machine learning have the ability to predict the department to which a problem belongs, the reason for the problem and even the steps required to resolve the problem by examining the history of similar issues and reaching out to other systems for supplemental information. This might seem out of reach, but given the size of the datasets that are examined and the power of machine learning algorithms like XGBoost, this is easily within the realm of possibility.
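As a rough sketch of this kind of exception routing, the example below trains an XGBoost classifier to predict the owning department for a break from a labelled history of past exceptions. The file name, feature columns and model parameters are hypothetical, chosen purely for illustration:

```python
# A minimal sketch of exception routing with XGBoost, assuming a hypothetical
# labelled history of past breaks; features here are illustrative only.
import pandas as pd
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

history = pd.read_csv("exception_history.csv")  # hypothetical extract
# One-hot encode the categorical break attributes; numeric columns pass through.
features = pd.get_dummies(history[["break_type", "counterparty", "amount_diff"]])
labels = history["owning_department"].astype("category").cat.codes

X_train, X_test, y_train, y_test = train_test_split(features, labels,
                                                    test_size=0.2)
model = XGBClassifier(n_estimators=200, max_depth=4)
model.fit(X_train, y_train)
print(f"routing accuracy: {model.score(X_test, y_test):.2f}")
```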

The user is then free to focus on the real issues instead of just moving an issue to the next step in a manual workflow. We have seen overall efficiency gains of 30 to 40 percent by implementing this sort of technology.
