Time to talk about data
16 October 2013
A historically disproportionate focus on knitting data together will be replaced by a clear governing policy, says Adam Cottingham of SmartStream
Improvements in information infrastructure have borne many IT initiatives that promise to solve all things data in financial institutions. As the market takes stock and looks back on the consequences of these initiatives, it is becoming increasingly apparent that many of these projects were misconceived. They placed a disproportionate focus on knitting data together and ignored what, in retrospect, was an obvious requirement: to implement not just technology but a clear and universal policy for governing every element of the data supply chain, from sourcing and timeliness to quality, control, integration, storage and distribution.
As well as failing to solve the problems they were meant to address, these failures in governance have created a labyrinth of disparate databases, multiple master files and bewildering licences that are costing the industry millions of pounds a year in data fees, software licences, maintenance contracts and fixed staff costs, before one even takes into account the greater knock-on costs of regulatory compliance and repairing data-related downstream issues. As the industry attempts to move, tentatively, into the post-crunch world, this is a good time to talk about data and its effective governance.
Many firms now have at least a nominal governance policy in place, but all too frequently these policies fall down in their application. Successfully aligning data sourcing and flow constraints with not only instrument lifecycles but also internal infrastructures across an entire organisation is a very complex undertaking. This complexity is exacerbated by notions, born of a less cost-conscious historical environment, that the way to solve data issues is to throw more sources of information together to create a ‘better’ golden copy, and that all data can become a profit centre in the form of proprietary rules and pricing routines.
The reality is rather simpler. Data, for the most part, is absolute and is a cost centre for firms. Implementing solutions that add to this cost only makes cost reduction strategies (strategies meant to deliver economies of 40 percent or more for most firms) seem even further out of reach. Instead of launching more IT projects, firms are looking to apply meaningful metrics to the elements of their data that they can affect: coverage, timeliness, quality and cost. Imposing a governance policy that not only measures these metrics but also enforces targets against them leads to a tangible, results-driven framework, and because these elements are all linked, an improvement in one should deliver an improvement in the others. Implementing such a policy means firms can take a step back from localised issues, accurately assess their data resources, and then put in place an efficient enterprise-wide policy that treats all data as part of a single ecosystem. This approach recognises the interdependencies between assets that have previously been viewed as microcosms, from legal entities to securities, corporate actions, derivatives and prices.
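As a purely hypothetical sketch of how such a framework might enforce its targets (the metric names, figures and thresholds below are invented, not drawn from any particular firm or product), the principle is simply to measure each metric against an agreed target and flag any breach:

    # Hypothetical sketch only: checking governance targets for the four metric
    # areas named above (coverage, timeliness, quality and cost).
    # All names, values and targets are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class MetricReading:
        name: str
        value: float              # measured value for the reporting period
        target: float             # target agreed under the governance policy
        higher_is_better: bool = True

        def compliant(self) -> bool:
            # A reading complies if it meets its target in the right direction.
            if self.higher_is_better:
                return self.value >= self.target
            return self.value <= self.target

    readings = [
        MetricReading("coverage (% of instruments mastered)", 97.2, 99.0),
        MetricReading("timeliness (% delivered on time)", 99.1, 98.0),
        MetricReading("quality (% passing validation)", 99.6, 99.5),
        MetricReading("cost per instrument (GBP)", 1.40, 1.10, higher_is_better=False),
    ]

    for r in readings:
        status = "OK" if r.compliant() else "TARGET BREACHED"
        print(f"{r.name:40s} value={r.value:>6} target={r.target:>6} {status}")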
To an extent, the focus on firm-wide governance is being brought about by external forces as much as internal imperatives. Closer supervision and more timely reporting requirements, imposed by regulators in order to monitor and mitigate systemic risk, are supported by a global drive towards greater transparency and accuracy. Initiatives such as the US Dodd-Frank Act, Basel III, the Markets in Financial Instruments Directive (MiFID) review and the Foreign Account Tax Compliance Act (FATCA) are all putting greater reporting requirements onto firms, as well as helping to instigate industry change in their own right.
To take one example currently very much in the spotlight, the Legal Entity Identifier (LEI) project is being effectively mandated by legislation on both sides of the Atlantic. It is a rare instance of a regulatory burden that is intended not only to reduce the market’s overall risk, but also to allow individual participants to free up much-needed resources and capital, by applying a freely available and relatively straightforward standard to an opaque and expensive problem.
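To give a sense of how straightforward the standard is (this is an independent illustrative sketch, not the LEI project’s own tooling): an LEI is a 20-character alphanumeric code whose final two characters are check digits, verifiable with the same MOD 97-10 arithmetic used for IBANs.

    # Illustrative sketch only: a structural check of an LEI (ISO 17442).
    # An LEI is 20 alphanumeric characters; the last two are check digits
    # validated with ISO/IEC 7064 MOD 97-10, the scheme also used for IBANs.
    def lei_checksum_ok(lei):
        lei = lei.strip().upper()
        if len(lei) != 20 or not all(c.isdigit() or "A" <= c <= "Z" for c in lei):
            return False
        # Letters become two-digit numbers (A=10 ... Z=35); digits stay as they are.
        numeric = "".join(str(int(c, 36)) for c in lei)
        return int(numeric) % 97 == 1

    # A sample 20-character code of the right shape, used purely as an example:
    print(lei_checksum_ok("5493001KJTIIGC8Y1R12"))  # True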
The LEI is not, of course, the answer to all of the industry’s problems, nor is it going to be cost effective or easy to implement, but it is a step in the direction the industry must take in order not only to survive but to prosper. That direction is towards increased standardisation and proactive collaboration between market participants, and to grease the wheels of this movement a radical new solution is attracting interest: the centralised data utility.
Collaboration is key to the central data utility (CDU) delivering what any technology offering must: efficiencies, and therefore cost savings, for the firms that use it. The CDU is itself a collaboration between SmartStream and Euroclear Bank, blending Euroclear’s operational staff and global reach with SmartStream’s purpose-built technology, research analysts and operations to import, cleanse, package and deliver data from direct and indirect sources to its client base. In addition to offering strict guarantees to customers in the form of customised service level agreements (SLAs), it is specifically designed to address a situation in which all firms were receiving and managing broadly the same data sets without ever really getting the end results they required. By centralising the operation, the CDU does the heavy lifting just once and delivers the results to all customers at a fraction of the cost and at a vastly higher quality level. To achieve this it depends on input and feedback from its entire client base, allowing all firms to benefit from the collaboration and pushing the quality of the data inexorably higher. Furthermore, the breadth and depth of the data covered, from legal entities to securities, corporate actions, prices, calendars and comprehensive identifier cross-referencing, means not only a greater ability to apply rules, thanks to the technology’s and staff’s holistic view of all data coming into the platform, but also much greater flexibility in data sourcing, with quicker and cheaper new feed implementation for customers.
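As a hypothetical illustration of what identifier cross-referencing means in practice (the identifiers, records and field names below are invented and do not describe the CDU’s actual data model), the principle is that any of an instrument’s identifiers resolves to a single consolidated master record:

    # Hypothetical sketch of identifier cross-referencing: any known identifier
    # (ISIN, SEDOL, CUSIP and so on) resolves to one consolidated master record.
    # The identifiers and records below are invented for illustration only.
    master_records = {
        "SEC-000123": {"name": "Example 5% Bond 2020", "currency": "GBP"},
    }

    cross_reference = {
        ("ISIN", "GB00B0000005"): "SEC-000123",
        ("SEDOL", "B000000"): "SEC-000123",
        ("CUSIP", "000000AA1"): "SEC-000123",
    }

    def resolve(scheme, identifier):
        """Return the master record for a given identifier, or None if unknown."""
        key = cross_reference.get((scheme.upper(), identifier.upper()))
        return master_records.get(key) if key else None

    print(resolve("sedol", "B000000"))      # the single consolidated record
    print(resolve("ISIN", "XS0000000000"))  # None: not yet cross-referenced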
So yes, it’s time to talk about data. There has never been a better opportunity for firms to stand together and replace costly, ineffective patches to a constantly growing problem with a single, effective cure. The CDU could be that cure, if the market is willing to work together, work with regulators and work with the platform to deliver results that strengthen the collective and benefit every individual participant.