Regulators see the bigger picture
July 2023
Nick Moss of MarketAxess discusses the advantage regulators have when assessing data quality and how the industry can adopt a similar approach
ESMA's recent paper on data quality highlights the increasingly data-driven approach being taken by regulators, and the extent to which they are using real-time metrics and sharing data across jurisdictions to help identify data quality issues.
The regulatory data sets required under MiFIR, SFTR, EMIR and other G20 regulations are complex.
A single regulation can require more than 150 fields per record, combined with the pressure of hundreds of millions of records reported daily.
To fulfil these requirements, data needs to be drawn from multiple upstream systems within a firm and transformed to meet specific regulatory needs.
To compound the problem further, change is constant within organisations, and a small alteration to any upstream system can create significant problems once the data flows into the regulatory report.
Effectively analysing and monitoring this data requires advanced techniques, combined with a full reporting data set and smart technologies. Unfortunately, individual firms often lack these capabilities, leaving them vulnerable to fines and reputational damage.
The traditional approaches to ensuring regulatory data accuracy are to perform end-to-end reconciliations (a requirement under RTS 22) or periodic sample-based control reviews. On the surface, this control framework appears satisfactory, but dig a little deeper and three inherent weaknesses emerge.
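Before examining those weaknesses, it helps to picture the traditional control concretely. Below is a minimal sketch of an end-to-end reconciliation between front-office records and a submitted report; the field names and record structure are hypothetical illustrations, not the schema of any particular regulation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TradeRecord:
    trade_id: str
    isin: str
    quantity: float
    price: float
    timestamp: str  # execution time as an ISO 8601 string

def reconcile(front_office: dict[str, TradeRecord],
              reported: dict[str, TradeRecord]) -> dict[str, list]:
    """Compare two trade populations keyed by trade_id and collect breaks."""
    issues: dict[str, list] = {"unreported": [], "over_reported": [], "field_breaks": []}
    for trade_id, source in front_office.items():
        report = reported.get(trade_id)
        if report is None:
            issues["unreported"].append(trade_id)  # in the books but never reported
            continue
        # Field-by-field comparison of the shared record
        for field in ("isin", "quantity", "price", "timestamp"):
            if getattr(source, field) != getattr(report, field):
                issues["field_breaks"].append((trade_id, field))
    # Reported trades with no corresponding front-office record
    issues["over_reported"] = [t for t in reported if t not in front_office]
    return issues
```

Even a check as simple as this only ever sees one firm's two internal populations, which is precisely the limitation the first weakness describes.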
Context
Without access to the same industry-wide data set as regulators, these controls become too internally focused and omit key market context. Institutions may ask: "How is everyone else reporting this type of trade?" Or: "Are all my counterparties reporting the same trade timestamp as I am?" A check of the latter kind is sketched below, after the three weaknesses.
Coverage
By definition, these controls don’t cover the whole population of reported trades, meaning, at best, it can take between three and six months to identify an error. At worst, it could be missed altogether.
This can mean a relatively small issue escalates into one that requires a significant amount of back reporting to resolve.
Cost
Traditional data quality reviews are often resource-intensive, requiring significant manual effort or the budget to bring in third-party experts. Cost is also often the enemy of coverage.
Given the backdrop of increased regulatory scrutiny, the techniques regulators are employing to monitor the market and the deficiencies of traditional models, it is important to ask how you can improve your data and proactively highlight potential errors. In essence: how can you start employing the same techniques your regulator uses?
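As one concrete example of such a technique, below is a minimal sketch of the cross-party check raised under Context above, the kind of comparison only an industry-wide data set enables: pairing both sides' reports of the same trade and flagging execution timestamps that disagree beyond a tolerance. The matching key and tolerance are hypothetical, not any regulator's actual methodology.

```python
from datetime import datetime, timedelta

def timestamp_breaks(side_a: dict[str, str],
                     side_b: dict[str, str],
                     tolerance: timedelta = timedelta(seconds=1)) -> list[str]:
    """Return keys of trades reported by both parties whose execution
    timestamps (ISO 8601 strings) differ by more than the tolerance."""
    breaks = []
    for key, ts_a in side_a.items():
        ts_b = side_b.get(key)
        if ts_b is None:
            continue  # only one side visible: a coverage question, not a timestamp break
        delta = abs(datetime.fromisoformat(ts_a) - datetime.fromisoformat(ts_b))
        if delta > tolerance:
            breaks.append(key)
    return breaks

# Flags "T1": the two sides disagree by five seconds on the same trade.
print(timestamp_breaks({"T1": "2023-07-03T09:15:00+00:00"},
                       {"T1": "2023-07-03T09:15:05+00:00"}))
```

No single firm can run this check on its own, because it never sees its counterparties' submissions; a regulator, or a provider with an industry-wide data set, can.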
Having listened carefully to how regulators use their data sets to monitor accuracy, we realised that we could apply innovative data techniques to our unique industry-wide data set, powered by our network of more than 950 clients, to help firms undertake similar checks and address the limitations of existing models.
MarketAxess' solution, SensAI, allows you to assess every field of every transaction you report, providing feedback on anomalies before they become issues. Machines are only as good as the data that feeds them, and that is where the real value of SensAI lies: in its ability to provide context. Our regulatory reporting data set gives us a view similar to that of your regulator, which means we can monitor market trends and behaviours to spot outliers in your data that it would not otherwise be possible to detect. If the whole industry is reporting a certain transaction type in a particular way, but you are doing it differently, SensAI will spot that.
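To make the idea concrete, here is a minimal sketch of a context-based check of that kind: comparing how one firm populates a field for a given transaction type against how the rest of the market populates it, and flagging deviation from a dominant convention. The function name, inputs and threshold are assumptions for illustration, not SensAI's actual methodology.

```python
from collections import Counter

def flag_convention_outlier(firm_values: list[str],
                            market_values: list[str],
                            threshold: float = 0.9) -> str | None:
    """If at least `threshold` of the market uses one value for this field
    and the firm's most common value differs, describe the discrepancy."""
    market_mode, market_count = Counter(market_values).most_common(1)[0]
    market_share = market_count / len(market_values)
    if market_share < threshold:
        return None  # no dominant market convention to compare against
    firm_mode, _ = Counter(firm_values).most_common(1)[0]
    if firm_mode == market_mode:
        return None
    return (f"Firm reports '{firm_mode}' where {market_share:.0%} "
            f"of the market reports '{market_mode}'")
```

The essential ingredient is the market-wide population passed in as the baseline; the statistical test itself can be simple once that context exists.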
SensAI’s highly automated approach to analysis means cost is no longer a prohibitive factor. The solution is designed to streamline your internal regulatory controls and checks, making them more efficient, more accurate and more consistent. SensAI is here to help you safely navigate the deep and murky waters of regulatory reporting.