Data’s gonna date
07 September 2015
Information is more important than ever, but Mark Davis of Avox says that if firms take their eye off the ball, they could end up relying on data that is already out of date
What have you been focusing on since DTCC acquired Avox?
Initially, DTCC made the acquisition because of the increased importance of company information—legal entity reference data and legal entity identifiers. However, we have found that there are many more points related to regulatory reporting and risk management that also require significant levels of data, and so Avox is becoming an important part of many of DTCC’s services and utilities.
We have a team of 450 analysts who help to support those utilities, and we can therefore contribute a lot of knowledge, expertise and experience to DTCC’s new services. We may be a small part of the family, but we are an increasingly important one.
Our team uses a combination of client and external data. Clients identify a list of firms that they’re interested in, issuers, guarantors and custodians, for example, and Avox takes on the management of information about them, letting clients know as and when any of the information changes.
Avox keeps an eye on all data feeds, including data changes prompted by mergers, changes of company names, bankruptcy, or if any new codes are issued. We remove the operational overheads of investment managers and banks that would normally have to maintain that information themselves.
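The monitoring described above can be thought of as a diff between the last known state of each entity record and a fresh snapshot, with every changed field reported to clients. The sketch below illustrates the idea; the record structure and LEI values are invented for illustration, not Avox's actual data model.

```python
# Hypothetical sketch of entity-data change monitoring: compare a fresh
# snapshot of entity records against the previously stored state and
# report every field that has changed.

def detect_changes(previous, current):
    """Return a list of (entity_id, field, old_value, new_value) tuples."""
    changes = []
    for entity_id, new_record in current.items():
        old_record = previous.get(entity_id, {})
        for field, new_value in new_record.items():
            old_value = old_record.get(field)
            if old_value != new_value:
                changes.append((entity_id, field, old_value, new_value))
    return changes

# Invented example: a company renamed after a corporate action.
previous = {"LEI-001": {"name": "Acme Holdings Ltd", "status": "active"}}
current = {"LEI-001": {"name": "Acme Group plc", "status": "active"}}

for entity_id, field, old, new in detect_changes(previous, current):
    print(f"{entity_id}: {field} changed from {old!r} to {new!r}")
```

A production service would of course layer validation, sourcing and audit trails on top, but the core client-facing output is exactly this stream of field-level changes.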
Data management requires considerable time and effort, and most financial organisations, especially those operating across multiple countries, prefer to outsource data management functions, as maintaining data in a timely and efficient manner can become quite a burden.
Asset managers might have an operational team based in San Francisco that specialises in researching US or UK companies, but they may struggle with maintaining information pertinent to, for example, Japan.
It is often these international variations that trip firms up, and that is when they have to choose whether to build an internal team of people with different language skills or to work with a specialised provider.
Once you’ve compiled the data, what can clients do with it?
Essentially, firms can use that information however they choose, as it is helpful for various purposes. Clients historically used this kind of content for their trading and settlement processes, and for risk management. Now, however, regulation and risk have increasingly become the main drivers of this type of service.
The data we manage generally covers names, addresses, corporate hierarchy, and industry sectors. It is the core information that is used for risk management, compliance and market data. If you were to analyse internal processes in an organisation, you would likely find the same sets of data being used over and over again for multiple processes—some of the large global organisations push out these same data points to over 1,000 internal systems.
If a firm has one set of information in different formats across 500 different internal systems, it becomes challenging to know which data can be trusted and which might be outdated or simply wrong.
Using incorrect data for reporting can have damaging repercussions not only for the firm itself, but also for the wider financial system. Rechecking data to ensure effective risk management, however, takes a lot of time.
Focusing on risk management in particular, the timeliness with which firms need to manage their risk is becoming more and more important. If there is an incident in a particular geographical location, firms have to be able to react quickly and understand what that means for their existing risk exposure. They have to know how many trades they have in that region, and how many bonds they are holding that are issued by the regional firms.
Accessing that kind of information should be straightforward, but because of the high volume of information passing through asset managers and banks it is often difficult to identify it quickly.
Firms wishing to effectively manage their own risk must have access to such information in real time.
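Answering a question like "how much exposure do we have to issuers in this region?" reduces to joining trade records against clean entity reference data and summing by country. The sketch below shows that join in miniature; the trades, notionals and LEI-to-country mapping are all hypothetical.

```python
# Minimal sketch of regional exposure aggregation: trades reference
# entities by identifier, and the entity reference data supplies the
# country, so exposure per region is a keyed sum. All data is invented.
from collections import defaultdict

# entity_id -> country of incorporation, from the reference-data store
entities = {
    "LEI-001": "JP",
    "LEI-002": "JP",
    "LEI-003": "GB",
}

# (entity_id, notional) pairs representing open positions
trades = [
    ("LEI-001", 5_000_000),
    ("LEI-002", 2_500_000),
    ("LEI-003", 1_000_000),
]

exposure_by_country = defaultdict(int)
for entity_id, notional in trades:
    exposure_by_country[entities[entity_id]] += notional

print(exposure_by_country["JP"])  # total exposure to Japanese issuers
```

The aggregation itself is trivial; the hard part, as the passage above notes, is keeping the entity-to-country mapping accurate and current so the answer can be trusted when an incident occurs.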
How can you make sure clients trust the data you’re providing?
Quality is really the most important thing. Accessing information in a timely manner only to find that the information is wrong is no help, and may create a bigger problem.
Of all entity data, 40 percent changes every year as companies go through corporate change. Some of our clients are dealing with 100,000 legal entities.
If 40 percent of that information changes it causes a high volume of data updates that need to be processed in order to stay active and current. We pride ourselves on getting that quality right.
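The scale of those figures is worth making concrete. Taking the numbers quoted above at face value, and assuming roughly 250 business days in a year (an assumption of mine, not a figure from the interview):

```python
# Rough arithmetic on the change volume quoted above: 40 percent of a
# 100,000-entity portfolio changing each year.
entities = 100_000
change_rate = 0.40
business_days = 250  # assumed working days per year

updates_per_year = int(entities * change_rate)
updates_per_day = updates_per_year / business_days

print(updates_per_year)  # 40000
print(updates_per_day)   # 160.0
```

That is on the order of 40,000 updates a year, or around 160 every business day, each of which must be verified before it reaches downstream systems.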
As the same ownership rules apply to UK entities and, for example, entities in the Cayman Islands, firms tend to demand a consistent approach to the way their data is managed.
We try to build our processes and package the information in a consistent and repeatable way, so when clients make a request they can know exactly what they are going to get.
Due to our experience, our processes are well established and continuously evolving. We keep on top of any new developments in the market, such as the launch of a new stock exchange or a market regulator outside the financial services sector, and assess any pertinent data to establish its trustworthiness and value.
Do you see firms getting complacent about their data management, as it’s not their core line of business?
That has certainly been the case in the past. The sell side of the industry is more involved in data management practices, and these firms probably have to be, simply because of the volume of systems and information they are dealing with.
On the buy side, any complacency would be potentially caused by the fact that these firms are directly linked to their counterparts, which can generally provide access to any documentation needed.
Now, however, the challenges lie elsewhere. New regulations such as the Alternative Investment Fund Managers Directive, Solvency II, the European Market Infrastructure Regulation and the Foreign Account Tax Compliance Act are all very relevant to the buy side. These are causing huge change as, just a couple of years ago, regulation did not mandate keeping so much data.
On the other hand, every buy-side firm wants to manage its own risk. They naturally have to comply with regulation, but the most important thing for them is staying competitive. Beyond that, we see some firms treating regulatory compliance as a tick-box exercise, while others use it as an opportunity to improve their internal processes and become more efficient.
Either way, the need to be up to date from a risk-management perspective is a key reason driving firms to address issues pertinent to their data management.
We are seeing a change to the entire discipline of maintaining information to ensure high standards, with firms finally moving away from spreadsheet solutions. We also see appointments of chief data officers, people whose primary responsibility is the quality of the information that flows through an organisation.
Does that go hand-in-hand with the evolution of technology?
That is difficult to say. It is important to recognise these as two separate challenges that firms have to address, and it is important for firms to have good systems in place for correct analysis, trade settlement and regulatory reporting.
However, having good technology in place populated with bad information will lead to inaccurate trade reporting and an inability to manage risk.