Feature

Balancing the books


25 May 2022

Brian Bollen looks at the current state of reference data for asset servicers, the problems presented by legacy systems, and the opportunities offered by the growing popularity of cloud services

Image: anttoniart/stock.adobe.com
“It’s not like it was in my day.” Take a glance at the international financial markets of mid-2022, and we might as well be inhabiting an alternative universe — at least, that is the way it seems to this child of the 1960s.

In 2022, the authorisation and execution of a single clean payment, or using Uber to get home from an office function, likely involves exposure to more data gathering and exploitation in a microsecond or two than was routinely experienced in any number of earlier lifetimes combined.

However, for asset servicers there is a melee of concepts to grasp: everything from exchange data and pricing data to securities description data, yield curve data and complex risk calculations, not to mention the data needed to price complex derivatives and settled trades.

These are just a few examples cited by Jeremy Katzeff, head of buy-side solutions at GoldenSource, who joined the company in January 2021, charged with accelerating business growth on the buy-side as investment managers extend their purchases of data.

He previously worked at BlackRock, JP Morgan Asset Management and Broadridge, giving him an extensive background in the demand for reference data and the uses to which it can be put.

“The scope of reference data in financial services is expanding well beyond the core, into alternative data such as cell phone location data and ESG data. Much of the latter goes beyond the original scope, to include carbon footprint and gender diversity data.”

“The amount of data consumed on the buy-side has increased exponentially in recent years,” he adds. “That poses its own questions: How do you make sure the data you are planning to use is clean? What data can you trust? And, what data can you even understand?”
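
One basic example of what “clean” means in practice is a well-formed identifier. The sketch below is purely illustrative Python, not anything drawn from GoldenSource: it validates the check digit of an ISIN using the standard Luhn scheme, one of the simplest hygiene tests applied to incoming reference data.

```python
def isin_is_valid(isin: str) -> bool:
    """Check a 12-character ISIN against its Luhn check digit."""
    if len(isin) != 12 or not isin[:2].isalpha() or not isin.isalnum():
        return False
    # Expand letters to two digits (A=10 ... Z=35); digits stay as they are
    expanded = "".join(str(int(ch, 36)) for ch in isin.upper())
    total = 0
    # Luhn: working right to left, double every second digit
    for position, ch in enumerate(reversed(expanded)):
        digit = int(ch)
        if position % 2 == 1:
            digit *= 2
            if digit > 9:
                digit -= 9
        total += digit
    return total % 10 == 0

print(isin_is_valid("US0378331005"))  # True: check digit matches
print(isin_is_valid("US0378331006"))  # False: check digit does not match
```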

Which data providers are good, and which not so good, he might have said, but diplomatically did not. Katzeff did, though, avail himself of the opportunity to remind readers that the answers come at a cost, although with the rise of cloud data infrastructure those costs have fallen dramatically, enabling market participants to better manage their data within their budgets.

Modern job descriptions for front-office roles at financial institutions specify an understanding of computer science that would have been unthinkable until relatively recently. Some aspiring market traders might find the heightened demands overwhelming; some practising portfolio managers might find themselves falling short of that level were they applying for a position today.

Reference data is an area that is still ripe for improvement, states Pardeep Cassells, head of securities and claims at AccessFintech. “Storage, access, maintenance and transparency are all factors that could be enhanced,” she adds.

“The last few years have seen various changes that have attempted to introduce best practice for static data; an example is the enforced use of legal entity identifiers. The biggest challenge, though, surely continues to be standing settlement instructions (SSIs), which, when errant, incorrect or outdated, are still a leading cause of failed securities trades across all markets.”

To address this problem, a key future need is the implementation of an aggregation layer, enabling all organisations to store and manage SSIs through their chosen platform while ensuring a level of normalisation and translation, Cassells outlines.

She goes on to say: “More generally, data best practice needs to be defined and enforced. Data dictionaries, normalised values, and ease of review of data fields across counterparties may also be welcome steps towards enhancing static data quality and smoothing out some of the obstacles that will trip up organisations as the market steamrolls towards T+1 settlement.”
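
To make the idea of an aggregation layer with normalisation and translation more concrete, here is a minimal Python sketch; the field names, aliases and canonical layout are invented for illustration and do not describe AccessFintech’s platform or any industry standard.

```python
from typing import Dict

# Hypothetical aliases different counterparties might use for the same SSI attribute
FIELD_ALIASES = {
    "agent_bic": ("agent_bic", "AGENT_BIC", "settlementAgent"),
    "safekeeping_account": ("safekeeping_account", "SAFE_ACC", "safekeepingAccount"),
    "currency": ("currency", "CCY", "settlementCurrency"),
    "place_of_settlement": ("place_of_settlement", "PSET", "placeOfSettlement"),
}

def normalise_ssi(raw: Dict[str, str]) -> Dict[str, str]:
    """Translate one counterparty's SSI record into a single canonical layout."""
    canonical: Dict[str, str] = {}
    for target, aliases in FIELD_ALIASES.items():
        for alias in aliases:
            value = raw.get(alias, "").strip()
            if value:
                canonical[target] = value.upper()
                break
    missing = [field for field in FIELD_ALIASES if field not in canonical]
    if missing:
        raise ValueError(f"Incomplete SSI, missing fields: {missing}")
    return canonical

# Two differently shaped inputs normalise to the same layout
print(normalise_ssi({"AGENT_BIC": "deutdeff", "SAFE_ACC": "12345",
                     "CCY": "eur", "PSET": "CEDELULL"}))
```

Once every record is held in the same shape, comparing SSIs across counterparties and spotting the errant or outdated ones becomes a straightforward field-by-field check.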

ESG data

What will be the continuing important elements in the industry’s short-, medium- and long-term future? GoldenSource’s Katzeff points to data warehousing, industry consolidation and regulatory change.

A recent addition to GoldenSource’s capabilities and data sources is its ESG Impact solution, promising greater access to raw ESG data, more nuanced portfolio screening and tuning, advanced analytics features including greenhouse gas scenarios, and further support for the European ESG Template (EET).

The launch announcement describes how the new capabilities offer enhanced analytics on top of the ESG data mastering, aggregation, standardisation and storage features of GoldenSource ESG Impact.

This includes the addition of ‘what if’ analytics, meaning that users will be able to analyse specific outcomes of potential investment shifts, such as swapping out or adding investments and the implications this would have for the overall portfolio ESG score.
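
As a simple illustration of the arithmetic behind such a ‘what if’ swap, the short Python sketch below recomputes a market-value-weighted portfolio ESG score after one holding is replaced; the weighting approach and the figures are assumptions made for the example, not a description of GoldenSource’s scoring methodology.

```python
def portfolio_esg_score(holdings):
    """Market-value-weighted average ESG score.

    holdings: list of (market_value, esg_score) pairs.
    """
    total_value = sum(value for value, _ in holdings)
    return sum(value * score for value, score in holdings) / total_value

current = [(4_000_000, 62.0), (3_000_000, 48.0), (3_000_000, 71.0)]
# 'What if' the lowest-scoring position is swapped for a higher-scoring candidate?
proposed = [(4_000_000, 62.0), (3_000_000, 75.0), (3_000_000, 71.0)]

print(f"Current portfolio score:  {portfolio_esg_score(current):.1f}")   # 60.5
print(f"Proposed portfolio score: {portfolio_esg_score(proposed):.1f}")  # 68.6
```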

Topic tagging and template tagging simplify the generation of the EET for a firm’s products and the selection of products that fit a specific EET preference.

Based on the same tagging, GoldenSource also replicates the Sustainable Finance Disclosure Regulation (SFDR) Principal Adverse Impacts template, accelerating compliance with disclosure requirements.

Alireza Dorfard, head of market data and services at Deutsche Börse, highlights how critical data is to the industry, with a special emphasis on transparency, a key consideration in the development of a new product launched in early May.

Bond Liquidity Data, based on Clearstream’s range of post-trade services, will improve the ability of traders, managers and others to value fixed-income portfolios, measure risk, assess liquidity and decide whether their operating models are optimal, says the German exchange group.

“The service is based on aggregating anonymised Clearstream data to provide a view that would not otherwise be available,” says Dorfard, while conceding that Euroclear already offers a similar product. “We might have arrived at the party a bit later, but our new presence completes the picture,” he adds.

Deutsche Börse and its post-trade service provider Clearstream will offer transparency and insights into the highly fragmented fixed-income marketplace.

Leveraging aggregated data of settlement instructions for international securities from Clearstream’s international central securities depository, Bond Liquidity Data enables investors to fairly evaluate fixed-income assets and portfolios, measure liquidity and systemic risk of bond issuers and estimate execution prices for secondary trades.
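
A toy sketch of the underlying idea follows: anonymised settlement records are rolled up per ISIN into simple volume and price indicators. The record schema and the chosen metrics are illustrative assumptions, not Clearstream’s actual methodology.

```python
from collections import defaultdict
from statistics import median

def aggregate_liquidity(settlements):
    """Roll up anonymised settlement records into per-ISIN liquidity indicators.

    settlements: iterable of dicts with 'isin', 'nominal' and 'price' keys
    (an invented schema for this example).
    """
    by_isin = defaultdict(list)
    for record in settlements:
        by_isin[record["isin"]].append(record)

    summary = {}
    for isin, rows in by_isin.items():
        prices = [row["price"] for row in rows]
        summary[isin] = {
            "settlement_count": len(rows),
            "total_nominal": sum(row["nominal"] for row in rows),
            "median_price": median(prices),  # crude fair-value indicator
        }
    return summary
```

An investor could then compare the indicators for two bonds from the same issuer to judge which is likelier to settle close to its estimated price.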

“European trading in fixed-income instruments is highly fragmented and non-transparent, as most transactions are over-the-counter and different data is stored with different market participants.”

“Our customers have, therefore, a strong demand for high-quality data on fixed-income instrument prices and volumes. With this new offering, we further support them in making an informed assessment and decision,” outlines Dorfard.

Technology issues

In the early 1960s in central Scotland, data was the weekly family grocery order, written down in the tatty, lined notebook supplied by the local Co-operative store.

The data was supplemented by remembrance of the family’s Co-op number, essential for the claiming and payment of the mythical dividend. The number, for those interested, is deeply ingrained — 31114.

This paper-based memory sits light-years away from the complex technology issues that international financial markets currently face around their data.

Gary West, general manager, Europe, Middle East and Africa at data warehouse specialist Yellowbrick, dissects the problems presented by legacy systems and the opportunities offered by the cloud in an online blog, which Yellowbrick gave Asset Servicing Times permission to cite.

“Every financial institution I talk to is reconsidering some aspect of their data infrastructure. Renewal is, of course, an ever-present feature of the financial technology landscape, with firms racing to compete,” West indicates.

“For many, the journey to becoming a data-driven enterprise is more a marathon than a sprint. As they explore new opportunities for leveraging their substantial data inventories, firms are finding hurdles and bottlenecks in the form of data quality issues and integration challenges.”

He highlights: “Often, the fragmented nature of the institution — the legacy of relentless mergers and acquisitions activity in financial services — means different operational functions have different requirements from the same datasets.”

“This is hindering firms’ ability to derive true insight from their data sets, in the form of advanced analytics that can drive new innovations, new products and new areas of business.”

As a hint of what is yet to come, this bears a striking resemblance to prognostications being dispensed a decade or two ago, suggesting that little has changed, other than the scale and pace of underlying developments.

For some readers, this will surely reinforce the notion that human behaviour and the consequences thereof are immutable.

At a time when the data has become more difficult to decipher than the simple five digits of 31114, the technology might change, but not the carbon-based life forms operating it.