Securities Lending Times
Leading the Way

Global Asset Servicing News and Commentary
Feature

A gold mine of data


05 August 2020

With the asset servicing industry holding an abundance of information, reference data is becoming increasingly important and is increasingly being made available to other beneficial users

Image: whitemocca/shutterstock.com
Data has a wide variety of use cases: it helps organisations determine the cause of problems more effectively, and it lets them visualise relationships between what is happening in different locations, departments and systems.

Reference data is an important part of the asset servicing industry, but challenges arise when processes are manual. Technological intervention and automated processes are paving the way to greater efficiency in this space.

Riccardo Lamanna, head of State Street Global Exchange, Europe, Middle East and Africa, explains that reference data is usually referred to as static information related to securities, counterparts, issuers, or a combination of those. Asset servicing industry participants are users of reference data in their usual processes, from fund administration to risk management to performance services. However, Lamanna suggests that the industry is “progressively becoming a very important vehicle through which reference data is being made available to other beneficial users, such as its clients, as well as becoming a provider of reference data itself”.

Whenever a trader is executing a trade, counterparty and security identifiers are required. Neill Vanlint, managing director of global sales and client operations at GoldenSource, adds that this counterparty and security information is what the industry means by reference data.

Vanlint says that for asset servicers, consistent data utilised across systems is “of paramount importance”. He also highlights that custody services would be “impossible to manage” without reference data.

Affirming its importance, Yann Bloch, vice president of product management at NeoXam, stresses that reference data is the foundation which the rest of the data management processes rely on.

Reference data encompasses generic data domains, such as currencies, exchanges, countries and business day calendars. In the context of asset servicing, it extends to securities terms and conditions, entities (or parties), portfolios, accounts, clients, products, etc.
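To make these domains concrete, here is a minimal sketch of what one security master record might hold. The class and field names are illustrative assumptions, not an industry-standard schema; the identifier standards named in the comments (ISO 6166, ISO 4217, ISO 10383, ISO 17442) are the commonly used ones for each field.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Illustrative sketch only: field names are assumptions, not a standard schema.
@dataclass(frozen=True)
class SecurityMasterRecord:
    isin: str                        # security identifier (ISO 6166)
    currency: str                    # ISO 4217 currency code, e.g. "USD"
    country: str                     # ISO 3166-1 alpha-2 country of issue
    exchange_mic: str                # market identifier code (ISO 10383)
    issuer_lei: str                  # issuer legal entity identifier (ISO 17442)
    maturity: Optional[date] = None  # populated for fixed income instruments

record = SecurityMasterRecord(
    isin="US0378331005", currency="USD", country="US",
    exchange_mic="XNAS", issuer_lei="EXAMPLELEI0000000000",  # placeholder LEI
)
```

A frozen dataclass is used here so that, once sourced and cleansed, a record cannot be silently mutated downstream — one small way code can mirror the “golden copy” discipline the industry aims for.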

Bloch notes that even though data does not change often, “it is growing in scope, volume, complexity and importance”.

The scope of data will continue to grow as financial institutions diversify their assets. Moving into alternative assets, for example, brings its share of new types of reference data to manage.

Bloch says: “With consolidation in the banking and asset management sectors, the volume of data has mechanically grown, in terms of the number of clients and products for example. Regulation, such as the Securities Financing Transactions Regulation (SFTR), the second Markets in Financial Instruments Directive (MiFID II) or the Fundamental Review of the Trading Book (FRTB), also mandates more complex data. Hence, dependable reference data is becoming more and more critical for regulatory compliance and business decision-making.”

A ripple effect

As industry experts have suggested, correct reference data is crucially important to businesses, and keeping that data as accurate as possible is difficult without automated processes.

Within any large financial institution there are multiple lines of business, such as consumer banking, corporate banking, asset management, asset servicing, and asset lending. Across these business units, firms have to report their assets, liabilities, counterparties, deposits, liquidity and capital positions on the balance sheet.

Harry Chopra, chief client officer at AxiomSL, says: “You must have clean sets of data that reconcile across the multiple schedules. For instance, counterparty and securities exposures should line up on the balance sheet and individual reporting schedules. Also, all of these organisations are trying to price risk based on insight from this data – and if the data is wrong, there are huge consequences.”

Some of the main challenges arise when reference data is passed along through a manual process, either from the custodian to an investment manager or from an investment manager to a broker dealer, according to DTCC’s executive director, institutional trade processing product management, Bill Meenaghan.

Each step carries manual processing risk which can be eliminated by using an automated process, according to Meenaghan. He says: “Manual processes also don’t usually have any standards which may lead to a counterparty interpreting the data incorrectly, thus increasing the risk of trade failure.”
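One concrete example of the kind of automated check that replaces manual interpretation is identifier validation. An ISIN (ISO 6166) carries a Luhn check digit, so a mistyped or corrupted identifier can be rejected mechanically before it propagates between counterparties. A minimal sketch:

```python
def isin_is_valid(isin: str) -> bool:
    """Validate an ISIN (ISO 6166): 2-letter country code, 9 alphanumeric
    characters, and a Luhn check digit over the letter-expanded string."""
    if len(isin) != 12 or not isin[:2].isalpha() or not isin.isalnum():
        return False
    # Expand letters to two-digit numbers (A=10 ... Z=35); digits stay as-is
    digits = "".join(str(int(c, 36)) for c in isin.upper())
    # Luhn checksum: double every second digit from the right,
    # subtracting 9 when doubling produces a two-digit number
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

assert isin_is_valid("US0378331005")       # valid ISIN passes
assert not isin_is_valid("US0378331006")   # wrong check digit is rejected
```

Checks like this catch transcription errors at the point of entry, which is exactly the class of manual-processing risk Meenaghan describes.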

However, reference data management needs to be seen as an organisation-wide effort. Bloch highlights: “It cannot be the initiative of just one department, since reference data involves multiple IT, compliance and risk teams, as it is being sourced, transformed, cleansed, stored and consumed. A successful approach of reference data management works when people, processes and technology are aligned in delivering a well-identified objective.”

Protocols in place

When it comes to the type of protocols in place to enable secure data transfer between counterparties, AxiomSL’s Chopra says: “We took it upon ourselves to create a cloud-based service for our product, which took us a couple of years to figure out. We were granted ISO 20001 certification, which is difficult to get for providing a service, as well as ISO 27017. We found that you must have key management in the process.”

According to Chopra, this process works with a set of encrypted keys that allow you to make sure that when data is moving from one organisation to another, and accessing secured cloud facilities, it is encrypted and arrives at the correct end points. He notes: “That worked well, and we now have multiple financial institutions of asset managers and banks that use the service.”
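The key-managed pipeline Chopra describes is proprietary, but one generic building block of any such scheme — confirming that a payload was not altered in transit between organisations — can be sketched with a keyed hash, assuming a pre-shared key between the two endpoints:

```python
import hashlib
import hmac

def sign(payload: bytes, key: bytes) -> str:
    """Tag a payload with HMAC-SHA256 under a pre-shared key."""
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, key: bytes, tag: str) -> bool:
    """Constant-time check that the payload matches its tag."""
    return hmac.compare_digest(sign(payload, key), tag)

key = b"pre-shared-key"  # in practice, issued and rotated by a key service
tag = sign(b"reference data file", key)
assert verify(b"reference data file", key, tag)   # intact payload accepted
assert not verify(b"tampered file", key, tag)     # modified payload rejected
```

This only covers integrity and authenticity; the encryption in transit and at rest that Chopra mentions would sit alongside it, handled by the cloud platform and its key management.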

The industry has been relying a lot on established protocols such as SWIFT or FIX; essentially, any types of files exchanged over Secure FTP servers are commonplace. But in the last few years, Bloch says NeoXam has observed an increased adoption of blockchain-based protocols, although he notes “it would be a stretch to say that we’ve reached the mainstream stage yet”.

Simplifying the process

While it is very difficult to find and agree on industry-wide solutions that would make reference data easier to manage for industry participants, Lamanna argues that data management solutions are an important factor.

Industry participants can outsource a significant part of their data management, and when linked to reference data, service providers can deliver solid security master file management, saving significant resources and effort in areas such as reconciliations, manual interventions to overwrite wrong information and classification, according to Lamanna.

“Clients can then focus on their core business processes: asset allocation, investment activity and distribution of products,” he adds.

In terms of how to simplify the process to make the data easier to manage, Lamanna notes that data management outsourcing is certainly a way to help at the level of each market participant.

However, Lamanna explains that this must be based on a solid platform that can easily operate with multiple providers which inject data, and use applications that consume and in turn enrich that data.

Weighing in on this, Meenaghan says: “Where possible, reference data should be electronically managed by the source in a centralised database. That database should enrich transaction systems directly.”

“Transaction systems should also agree on which place of settlement (PSET) should be used to ensure the correct standing settlement instruction is selected.”
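The PSET-driven selection Meenaghan describes can be pictured as a lookup keyed on counterparty and place of settlement. All names and codes below are illustrative assumptions, not real instructions:

```python
# Hypothetical SSI book keyed by (counterparty, place of settlement).
# Counterparty names and SSI identifiers are made up for illustration.
ssi_book = {
    ("BROKER-A", "DTCYUS33"): "SSI-US-001",  # settle in the US via DTC
    ("BROKER-A", "CEDELULL"): "SSI-EU-002",  # settle via Clearstream
}

def select_ssi(counterparty: str, pset: str) -> str:
    """Return the standing settlement instruction agreed for this PSET."""
    try:
        return ssi_book[(counterparty, pset)]
    except KeyError:
        raise LookupError(f"no SSI agreed for {counterparty} at PSET {pset}")
```

When both counterparties' systems resolve the same (counterparty, PSET) key against the same centrally managed book, the ambiguity that causes settlement fails disappears.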

Meanwhile, Vanlint comments: “The biggest favour that financial institutions can do themselves, to navigate the current compliance minefield, is to ensure that all their reference data is managed in one centralised repository.”

The average firm has multiple different sources of data and this kind of disparity is impossible to sustain as “markets become increasingly complex and as merger and acquisition results in the need to fold one firm’s data architecture into another’s”, according to Vanlint.

Vanlint adds: “Firms that do not want to get caught out have to be confident that they can answer any challenges from the regulator or events in the market, so scrambling to compile piecemeal data is simply not an option. You need to know where all the necessary information is, which is why a central source is so important.”

Standardising data

Creating standardisation when it comes to reference data is something the industry is moving towards, as it will help simplify the process. Chopra says: “This idea of entities, industries, and sectors marks that we are making huge progress. Granted, there are three or four different standards but at least it’s not 50.”

There is still a little way to go on issuer relationships on the fixed income side of things, though Chopra notes that this is “not a massive problem”.

He says: “Trading is a little difficult because there are so many trading systems and so many exchanges, but it is a manageable problem.”

Also agreeing that a standardised approach could make life easier for market participants, State Street’s Lamanna says: “Any effort aiming to standardise reference data structures will make it easier for market participants and data management providers to deliver consistent and significant data which eliminates duplication, inconsistencies and reconciliation activities.”

However, Lamanna cautions that data users will always face challenges from their internal applications and processes when integrating reference data until they decouple data from those processes in favour of a solid data management infrastructure.

Chopra concludes: “The asset servicing industry sits upon a wealth of data, including the legal entity structure, investor managers, and trading positions. There is a unique opportunity to help these organisations with their global shareholding disclosure ownership.”

“That would be a very good way to think about where the asset servicing industry could go next by helping their clients to manage this particular regulatory reporting exposure.”