Masters of the data-verse
03 August 2016
Data volumes are growing relentlessly, and asset managers will need to master and control them if they want to remain competitive
Data has existed for as long as asset managers themselves, but, whether as a result of the digital revolution or as a side effect of the regulatory environment, it is suddenly on the agenda at every conference and industry event. It is also increasingly cited as a factor in asset management success.
The buy-side space is ever more focused on regulatory compliance, efficiency and, ultimately, cost savings. According to Bill Blythe, global business development director at Gresham, new regulations mean buy-side firms now bear more responsibility for providing data to regulators on their underlying firms. With the advent of the second Markets in Financial Instruments Directive (MiFID II), the number of fields for transaction reporting has increased from 23 (under MiFID I) to 65.
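To give a flavour of the scale of that change, the sketch below models a handful of the fields a transaction report must now carry and checks that none have been left empty before submission. It is a simplified, hypothetical subset written for illustration, not the regulatory schema itself.

```python
from dataclasses import dataclass, fields
from datetime import datetime

# Hypothetical, simplified subset of MiFID II transaction-reporting fields.
# The full report runs to 65 fields; this sketch only illustrates why
# spreadsheet-based, fixed-schema processes struggle to keep pace.
@dataclass
class TransactionReport:
    transaction_reference: str   # firm-assigned reference number
    executing_entity_lei: str    # Legal Entity Identifier of the executing firm
    buyer_id: str
    seller_id: str
    trading_datetime: datetime
    instrument_isin: str
    price: float
    quantity: float
    venue: str                   # market identifier code of the trading venue
    # ...dozens of further fields in the full report

def missing_fields(report: TransactionReport) -> list[str]:
    """Return the names of any fields left empty before submission."""
    return [f.name for f in fields(report)
            if getattr(report, f.name) in (None, "")]
```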
He suggests that the rise of the ‘chief data officer’ proves these data reporting issues are no longer merely the preserve of IT teams. “The knowledge is now much more closely tied to the people that understand it and work with it, rather than just process it,” he says.
However, with the swathes of data coming from exchanges, public news feeds, vendors, counterparties and clients, asset managers have to first figure out how to make it useful, and then how to make it work in a competitive capacity.
Mohit Dwivedi, recently appointed principal data architect at technology consultancy GFT, suggests that asset managers tend to focus on collecting market data, macro-economic data, and other data points that help them to identify trends and “enable them to make investment decisions”.
Similarly, Yousaf Hafeez, head of business development for BT’s financial services arm, Radianz Services, agrees that this data should indeed be used for making portfolio and investment decisions, as well as for reconciliation reporting and post-trade analysis. Used properly, Hafeez says, data can help firms “find the elusive source of alpha”.
Data and, more importantly, a thorough understanding of it can help firms to improve transparency and make portfolio processes more easily auditable, bringing an advantage in regulatory compliance while also reducing trade risk. Equally, improved data analysis can lead to better customer insight and, therefore, better performance.
Data lakes
Despite the many advantages to be gleaned, for asset managers, actually gaining any meaningful information and insight from the data available to them is a different issue altogether.
According to Peter Hill, managing director for the UK, Ireland and the Middle East at SimCorp, trading systems feature price discovery, risk monitoring, position keeping, trading, settlement, accounting and recordkeeping, while data requirements span the whole lifecycle of a trade, so the “core data streams are extensive”.
He adds that some firms will store huge amounts of data—everything they ever create or acquire—in ‘data lakes’. “But the cost of managing that data, measured against the benefits, tends not to add up. Asset managers need to look closely at their data and be more strategic and specific in what they store, and consider what benefits they can gain from it.”
SimCorp’s head of technology and product management Anders Kirkeby builds on this, suggesting that such firms often struggle to identify exactly what data they are in possession of, let alone how they are using it or how they intend to use it.
To manage this, and to differentiate themselves from the competition, Kirkeby suggests that asset managers should vary the ways in which data is stored, created and used. While some firms use unique models to gain competitive advantage, “others seem more like leftovers from previous systems and processes”, he says.
“Variation is a requirement in a competitive space but it is not always obvious that the cost of the extra variation is reflected in the performance of the fund.”
“Given the meagre fee performance ratios we often see, I believe a number of asset managers could do well to rationalise their data handling and usage. More data may bring more insight, but if you cannot turn the added insight into profit you are simply wasting resources.”
Kirkeby adds, however, that large asset managers are likely to have a greater wealth of data, which, when properly managed, could help them stand out from a data perspective.
“A data lake with sufficiently rich metadata or consolidation of systems will make it easier to leverage existing data assets.”
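As a rough illustration of what “sufficiently rich metadata” might look like in practice, the sketch below catalogues data-lake assets with owner, source, schema and lineage information so that existing data can actually be found and reused. The structure and field names are invented for illustration rather than taken from any particular product.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical metadata catalogue entry for a data-lake asset. Rich metadata
# like this is what turns a lake of raw files into something whose contents
# can be discovered, trusted and reused.
@dataclass
class DatasetRecord:
    name: str                  # e.g. "eod_equity_prices"
    owner: str                 # team accountable for the data
    source: str                # upstream system or vendor feed
    last_updated: date
    schema: dict[str, str]     # column name -> type
    lineage: list[str] = field(default_factory=list)  # upstream datasets

catalogue: dict[str, DatasetRecord] = {}

def register(record: DatasetRecord) -> None:
    catalogue[record.name] = record

def find_by_owner(owner: str) -> list[DatasetRecord]:
    """Answer the basic question: what data do we hold, and whose is it?"""
    return [r for r in catalogue.values() if r.owner == owner]
```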
Model behaviour
Of course, it’s not just the amount of data within an institution that’s the problem. According to Hafeez, as increasing numbers of asset managers use market data feeds, the bandwidth required to access them has grown as well—this is particularly noticeable in the US, he says, where volumes are highest.
Hafeez says: “This means that firms are experiencing increased connectivity costs to access the data, rising costs in processing data feeds and growing costs of handling changes to data feeds.”
He suggests that the use of a shared infrastructure or managed service model could help firms to reduce the cost of data ownership, “as they do not have to build, manage and maintain their own networks, leaving them to focus on their core business”.
A shared infrastructure model could remove the competitive nature of data collection. Hafeez himself notes that asset managers tend to focus on the data feeds that help them generate the best returns for both themselves and their clients.
He says: “Asset managers are looking to market data as a source of competitive differentiation that allows them to demonstrate they have met or exceeded the customer objectives. In addition, they are conscious of meeting their compliance, risk and regulatory objectives.”
Consequently, they’re increasingly looking to new ‘raw’ data feeds and other non-traditional data collection methods. He also notes: “What is useless info to one person may be critical to another person.”
Blythe also points out that spreadsheets and other manual processes, often still used in the back office, are no longer coping with the increasing volumes and volatility of data.
He suggests, however, that asset managers are being forced to review their own processes and their approach to data integrity, both in order to comply with regulation and to encourage further product innovation in-house.
Blythe says: “As asset managers onboard more business, and trades become less ‘vanilla’, and more structured and sophisticated, many firms are realising that the existing legacy platforms are no longer sufficient.”
“These more traditional systems were brought in to manage cash and equity data, and were never intended to deal with the myriad new asset classes and products we see today. These require the processing of much wider data sets that do not fit well into fixed-schema systems, which are often batch-based and take forever to modify when implementing new data controls.”
Bridging the gaps
The point remains, however, that none of this matters unless the data is meaningful in the first place.
Blythe explains: “It is often overlooked or taken for granted that meaningful information can only be obtained from data if it is clean, accurate, and can be properly validated and checked against other sources of information.”
Without this integrity, data becomes just a string of meaningless figures, which “provides minimal insight, or zero potential to run advanced analytics”.
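A minimal sketch of the kind of cross-checking Blythe describes: comparing one source’s prices against a second source and flagging anything that disagrees beyond a tolerance. The field names, identifiers and the 0.5 percent tolerance are illustrative assumptions, not a reference to any particular vendor’s tooling.

```python
# Minimal sketch of cross-source validation: compare a primary price feed
# against a secondary source and flag breaks that exceed a tolerance.
# Identifiers and the 0.5% tolerance are illustrative assumptions.

def validate_prices(primary: dict[str, float],
                    secondary: dict[str, float],
                    tolerance: float = 0.005) -> list[tuple[str, float, float]]:
    """Return (instrument, primary price, secondary price) for every break."""
    breaks = []
    for isin, price in primary.items():
        other = secondary.get(isin)
        if other is None:
            breaks.append((isin, price, float("nan")))   # missing in second source
        elif abs(price - other) / other > tolerance:
            breaks.append((isin, price, other))          # prices disagree
    return breaks

# Example: one matching price, one break, one instrument missing entirely.
primary = {"GB00B03MLX29": 101.2, "US0378331005": 98.4, "DE0007164600": 55.0}
secondary = {"GB00B03MLX29": 101.2, "US0378331005": 99.9}
print(validate_prices(primary, secondary))
```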
No matter how many automated processes, systems and technologies are in place, there is currently no remedy for data that is bad in the first place, and this is where firms are starting to struggle.
Dwivedi suggests that because of bad data and gaps in data streams, “bad decisions are made”—not necessarily because of poor decision making, but because the information the decisions are based on is poor.
He says: “Data management is now becoming a burden on companies. Lack of standards or governance for managing data is creating business challenges.”
“Volume and variety of data is also a problem and firms are facing challenges on how to process huge volumes of data from different sources,” Dwivedi adds.
On the one hand, these errors can be difficult to manage, because when even a small portion of data is handled manually it is inevitable that mistakes will creep in.
On the other, Hafeez notes that, while raw data feeds have to be cleaned and reviewed in order to remove erroneous data and close data gaps, manual intervention remains something of a requirement.
Hafeez says: “Much of this can be automated today but a layer of manual checking may be still required.”
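As a rough sketch of that layered approach, the code below applies automated checks to a raw feed and parks only the records that fail them in a queue for manual review. The specific checks and field names are hypothetical.

```python
# Hypothetical sketch of 'automate most of it, keep a manual layer': records
# that pass automated checks flow straight through; anything suspicious is
# parked for a human to review.

def automated_checks(record: dict) -> list[str]:
    """Return a list of problems found; an empty list means the record is clean."""
    problems = []
    if not record.get("isin"):
        problems.append("missing instrument identifier")
    if record.get("price", 0) <= 0:
        problems.append("non-positive price")
    if record.get("quantity", 0) <= 0:
        problems.append("non-positive quantity")
    return problems

def process_feed(records: list[dict]) -> tuple[list[dict], list[tuple[dict, list[str]]]]:
    clean, review_queue = [], []
    for record in records:
        problems = automated_checks(record)
        if problems:
            review_queue.append((record, problems))   # manual checking layer
        else:
            clean.append(record)
    return clean, review_queue
```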
Firms appear to be mistrustful of both manual efforts and machines, so, while bad data remains a problem, the key will be to put a data strategy in place in order to gain a clearer view of what is available, and what is trustworthy.
Hill suggests that institutions should take a more strategic and data-centric approach to their asset management operations as a whole, noting that creating an investment book of record is a good start.
He says: “By removing point-to-point integration and providing a single hub of timely, reliable and accurate data, firms will remove the need for departments to source and amend critical data and instead invest their time in more business-critical activities.”
Front-office applications must be able to “speak to each other in the same language”, as well as ‘speaking’ to other supporting systems, Hill says.
“More importantly, these front-office applications need to contain a ‘golden copy’ of every piece of information that passes through the back, middle and front office without lengthy system reconciliations that can delay the investment process.”
“Inconsistencies and discrepancies that arise in an order management system or risk analytics package can have multiple—all negative—consequences.”
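A stripped-down illustration of the ‘golden copy’ idea Hill describes: several systems hold their own view of the same record, and a single consolidated version is built according to an agreed source hierarchy. The source names and precedence order below are invented for illustration; in practice they would be set by the firm’s data governance.

```python
# Illustrative sketch of building a 'golden copy' of a security record from
# several internal systems. The precedence order is an assumption made for
# this example; real firms would agree it as part of data governance.

SOURCE_PRECEDENCE = ["accounting", "order_management", "risk"]  # most trusted first

def golden_copy(versions: dict[str, dict]) -> dict:
    """Merge per-system versions of a record, taking each field from the
    most trusted source that actually populates it."""
    merged: dict = {}
    for source in reversed(SOURCE_PRECEDENCE):        # least trusted applied first
        merged.update({k: v for k, v in versions.get(source, {}).items()
                       if v not in (None, "")})       # more trusted sources overwrite
    return merged

versions = {
    "risk":             {"isin": "XS0000000001", "rating": "BBB"},
    "order_management": {"isin": "XS0000000001", "price": 101.3},
    "accounting":       {"isin": "XS0000000001", "price": 101.25, "position": 5000},
}
print(golden_copy(versions))
# {'isin': 'XS0000000001', 'rating': 'BBB', 'price': 101.25, 'position': 5000}
```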