Feature

The lifeblood of financial markets


23 Jun 2021

Reference data is the lifeblood of financial markets, and industry participants are looking to standardise, enhance and improve its quality for now and in the future

Image: digilife/stock.adobe.com
In the asset servicing industry, reference data refers to the data used to support a transaction. This information provides details about the instrument being traded, the entities involved and the particulars of the transaction itself.

Data can be either static or dynamic. Experts say that static data is any data that does not change over the course of the transaction. This can include the information above, together with details about the financial product and its terms (in the case of bonds, for example).

Meanwhile, dynamic data is any data that can change throughout the lifecycle of the transaction such as pricing, exchange rates, interest rates and credit ratings.

Sarah Carver, head of digital at Delta Capita, explains: “Reference data covers a wide variety of different information that is relevant to all financial transactions and includes the specific data for each individual type of asset class. This means that reference data is relevant for all asset classes.”

For example, equity transactions require details of market makers and pricing, whereas bond transactions require details of the coupon rate and term.
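
To make the distinction concrete, the sketch below (in Python, with purely illustrative field names and values rather than any industry schema) separates the static attributes of a bond from the dynamic values that are revised over a transaction’s lifecycle.

```python
from dataclasses import dataclass
from datetime import date

# Static reference data: fixed for the life of the transaction.
@dataclass(frozen=True)
class BondReferenceData:
    isin: str            # instrument identifier
    issuer_lei: str      # legal entity identifier of the issuer
    coupon_rate: float   # e.g. 0.025 for a 2.5 per cent coupon
    maturity: date       # term of the bond
    currency: str

# Dynamic data: revised throughout the transaction lifecycle.
@dataclass
class MarketSnapshot:
    price: float
    fx_rate: float       # exchange rate to the reporting currency
    credit_rating: str
    as_of: date

# Dummy values for illustration only.
bond = BondReferenceData("XS0000000001", "EXAMPLELEI0000000000", 0.025,
                         date(2031, 6, 23), "EUR")
snapshot = MarketSnapshot(price=101.42, fx_rate=1.10,
                          credit_rating="AA", as_of=date(2021, 6, 23))
print(bond.isin, snapshot.price)
```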

Martijn Groot, vice president of strategy at Alveo, stipulates: “Reference data is a significant spend for financial services firms as they buy it from various data providers, employ staff to manage it and check its quality and often store it multiple times.”

With this in mind, reference data plays a vital role in the industry and participants are looking to enhance, standardise and improve the quality of reference data.

The lifeblood of the financial markets

Reference data can benefit the market in a number of ways and it is growing in scope, volume, complexity and importance.

“Reference data is the lifeblood of financial markets and describes the environment in which transactions take place,” affirms Groot.

Experts suggest the scope of data grows as financial institutions diversify their assets. Moving into alternative assets, for example, comes with its share of new types of reference data to manage.

“With consolidation in the banking and asset management sectors, the volume of data has mechanically grown, in terms of the number of clients and products for example,” comments NeoXam’s Bloch.

Regulations such as the Securities Financing Transactions Regulation (SFTR), the second Markets in Financial Instruments Directive (MiFID II), and the Fundamental Review of the Trading Book (FRTB) require complex data.

This is why dependable reference data is becoming critical for regulatory compliance and business decision-making.

According to Carver, as reference data contains information about the instrument and the connected parties in a transaction, the correctness and completeness of this data assists the smooth execution of trades across the industry.

However, when this data is incorrect or missing, settlement delays or failed transactions can result.

Indeed, one of the key challenges of reference data is the number and variety of different sources of information, which can lead to errors.

Due to the importance of reference data and the benefits it can have on the markets, many participants are looking to mitigate some of these challenges by simplifying the reference data management process.

Reference data management is complex because of the number of terms products can have (from a few to many thousands), the variety of execution venues and tracking of what products can be sold to whom.

Additionally, there are different identification schemes used for products, which means firms often need to cross-reference them.
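
As a rough illustration of that cross-referencing step, the sketch below maps several external identifier schemes onto a single internal master identifier. The identifiers and internal ID shown are illustrative only; production symbology services are far richer.

```python
# Minimal cross-referencing sketch: map external identifiers (ISIN, CUSIP,
# SEDOL) onto one internal master identifier. Values are illustrative.
XREF = {
    ("ISIN",  "US0378331005"): "INSTR-000123",
    ("CUSIP", "037833100"):    "INSTR-000123",
    ("SEDOL", "2046251"):      "INSTR-000123",
}

def resolve(scheme: str, identifier: str) -> str:
    """Return the internal master ID for a known external identifier."""
    try:
        return XREF[(scheme.upper(), identifier)]
    except KeyError:
        raise LookupError(f"no cross-reference for {scheme}:{identifier}")

print(resolve("isin", "US0378331005"))  # -> INSTR-000123
```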

Groot explains that the problem has been exacerbated because many firms have historically managed reference data in silos, i.e. they have kept multiple copies in standalone databases or in applications.

This has meant they often buy data multiple times, and there can be discrepancies from store to store, leading to uncertainty. More importantly, Groot highlights that storing data multiple times means the cost of change to cope with new requirements is high. Regulators increasingly require more information on transactions and put demands on the data sourcing and preparation processes too.

“Streamlining the acquisition of data to source it once, cross-referencing and putting it into a common format before distributing it to end users and business applications will reduce existing operational cost but also prepare the business to better handle future requirements,” Groot says.

“Having in place clear and robust data management processes ensures that you have confidence in the reference data that you are relying on to complete transactions,” says Carver.

Carver highlights that using technology to assist in the management of this data is important, but so is having the right individuals in place, usually a data analyst and/or data engineer who can take ownership of this reference data.

Similarly, Bloch suggests that using proven tools, such as enterprise data management platforms with robust data models and the right blend of built-in best practices and flexibility to adapt, can streamline the overall process.

Speaking the same language

As well as simplifying the process, industry participants are looking to improve the quality of reference data and to standardise it, in order to reap its benefits more effectively.

Reference data is meant to be exchanged between all the parties in a buy/sell trade, between a financial institution and its clients, or between a firm and its regulators.

Bloch notes that when exchanging information it is important to speak the same language, and that is why standards such as the international securities identification number (ISIN), market identifier code (MIC) and legal entity identifier (LEI) have been introduced.
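
For illustration, the ISIN standard even carries its own consistency check: letters are converted to numbers and a Luhn-style checksum determines the final digit. The sketch below follows that published rule; it is included here purely as an example of a shared identifier standard, not as anything the interviewees describe.

```python
def isin_check_digit(body: str) -> int:
    """Compute the check digit for an 11-character ISIN body."""
    # Convert letters to numbers (A=10 ... Z=35); digits stay as they are.
    digits = "".join(str(int(c, 36)) for c in body.upper())
    total = 0
    # Luhn-style sum: starting from the rightmost digit, double every second one.
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 0:
            d = d * 2
            d = d - 9 if d > 9 else d
        total += d
    return (10 - total % 10) % 10

def is_valid_isin(isin: str) -> bool:
    return (len(isin) == 12 and isin[:2].isalpha() and isin[-1].isdigit()
            and isin_check_digit(isin[:11]) == int(isin[-1]))

print(is_valid_isin("US0378331005"))  # Apple Inc. ISIN -> True
```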

“Regulation will typically mandate the use of these standards, and contribute to their industry-wide adoption, as is the case for LEI, which was introduced by the G20 after the 2008 crisis, and further enforced through MiFID II and now SFTR,” Bloch states.

Reinforcing the idea that regulators are taking an interest in the reference data used by companies to complete financial transactions, Carver notes that there is a clear emphasis from regulators that financial transactions are properly monitored, that counterparties and entities are correctly identified, and that information is clear and robust throughout the lifecycle of the transaction.

Having incomplete or false data can lead to a loss of capital and adverse effects for clients, requiring restitution to be made and costing the company both time and money.

When it comes to improving the quality of the data, experts say this can be done by minimising the number of data acquisition channels and databases.

According to Groot, having an operations group overseeing this, and using data management technology to compare data sources, signal discrepancies, monitor quality levels and track data usage, will help a firm make the most of the data it buys.
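
A minimal sketch of that kind of comparison is shown below: it takes the same field from two hypothetical vendor feeds, flags values that diverge beyond a tolerance and notes records missing from either source, so an operations team can review the exceptions. The feed contents and tolerance are assumptions for illustration.

```python
# Hypothetical sketch: compare the same field from two vendor feeds, flag
# values that diverge beyond a tolerance and records missing from one source.
def compare_feeds(feed_a: dict, feed_b: dict, tolerance: float = 0.005) -> list:
    exceptions = []
    for key in sorted(set(feed_a) | set(feed_b)):
        a, b = feed_a.get(key), feed_b.get(key)
        if a is None or b is None:
            exceptions.append((key, "missing in one source", a, b))
        elif abs(a - b) / max(abs(a), abs(b)) > tolerance:
            exceptions.append((key, "value discrepancy", a, b))
    return exceptions

# Dummy vendor prices keyed by (illustrative) ISIN.
vendor_a = {"XS0000000001": 101.42, "XS0000000002": 99.80}
vendor_b = {"XS0000000001": 101.40, "XS0000000003": 87.25}
for exception in compare_feeds(vendor_a, vendor_b):
    print(exception)  # each exception goes to the operations team for review
```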

Bloch explains that proper data management and governance is what ensures that banks, as well as other financial institutions, can trust their reference data.

Through well-defined, auditable processes and specialised software platforms for enterprise data management, firms can build a single point of truth for reference data and market data, which can then be used enterprise-wide.

“Failing to have this data management and governance in place may lead to reporting errors and uninformed decisions. It often also leads to more workload to avoid these bad outcomes, when teams research and fix each and every data point on a report, simply because they didn’t trust the reference data sources it came from,” states Bloch.

Meanwhile, new technologies, fintechs and partnerships are making a significant impact on the management of reference data.

Just over the past few months, a number of partnerships have been formed in this space. For example, in April, Alveo, a market data integration and analytics solutions provider for financial services, partnered with Upskills, a Murex consultancy for the financial markets, to address market data aggregation, quality management and analytics challenges.

The partnership will include improving the data quality of reference data, valuation data and risk factor data fed into Murex and other trading and risk systems.

The same month, NeoXam signed a strategic agreement with Market Data Professionals (MDP), experts in market data management. Speaking at the time of the announcement, NeoXam said its DataHub platform enables financial institutions to better understand and tackle a wider breadth of market and reference data challenges, such as data cost, multi-vendor strategies, reconciling various sources and service level agreement management.

Carver notes that new technologies and fintechs can provide on-demand or bulk data requests from a single source. This is because they amalgamate data from many sources, including alternative datasets, to increase the richness of the information.

“They then provide data validation checking using Six Sigma techniques, which track the accuracy, timeliness and completeness of the data,” Carver concludes.
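
As a hypothetical example of the kind of metrics such checks track, the sketch below computes simple completeness and timeliness scores over a handful of dummy records; the required fields, freshness window and data are all assumptions for illustration.

```python
from datetime import date

# Hypothetical quality metrics: completeness = share of required fields that
# are populated; timeliness = share of records refreshed within a set window.
REQUIRED_FIELDS = ("isin", "lei", "price")

def completeness(records: list) -> float:
    filled = sum(1 for r in records for f in REQUIRED_FIELDS if r.get(f) is not None)
    return filled / (len(records) * len(REQUIRED_FIELDS))

def timeliness(records: list, as_of: date, max_age_days: int = 1) -> float:
    fresh = sum(1 for r in records if (as_of - r["updated"]).days <= max_age_days)
    return fresh / len(records)

# Dummy records for illustration only.
records = [
    {"isin": "XS0000000001", "lei": "EXAMPLELEI0000000000",
     "price": 101.4, "updated": date(2021, 6, 22)},
    {"isin": "XS0000000002", "lei": None,
     "price": 99.8, "updated": date(2021, 6, 18)},
]
as_of = date(2021, 6, 23)
print(f"completeness: {completeness(records):.0%}, "
      f"timeliness: {timeliness(records, as_of):.0%}")
```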