
04 April 2018

The heart of the operation

Data is a strategic asset for financial services companies and banks. With major new regulations taking effect in the financial services industry this year, reference data is more important than ever: firms must be able to quickly process customer requests, identify holdings and positions, assess and adjust risk levels, maximise operational efficiency, and control and optimise capital, all while remaining compliant with regulatory requirements. Because of this, the data management industry is looking to standardise, add reliability and gain efficiencies in the way it manages data.
Financial institutions rely on reference data to drive enterprise applications across trading, settlement, accounting and reporting.

Tim Lind, managing director of DTCC Data Services, explains: “Reference data standards are critical to enable disparate enterprise applications to commonly identify transactions and entities through the lifecycle of events that occur in the investment process.”

Reference data can include commercially available data, such as security static data, prices and corporate actions, as well as proprietary data, such as structure.

According to Yann Bloch, vice president of product management at NeoXam, reference data is important in any industry, but in asset servicing, reference data is really at the heart of operations.
Bloch says: “We are not talking about a ‘big data’ approach here. What matters is precision, audit and traceability, because asset servicers are accountable for the data that they use.” He explains that putting in place correct data governance, with data ownership and stewardship roles as well as quality management, is a “crucial step”. For a firm to implement such a programme, it needs to have the right tools in place; Bloch suggests an organisation should have an enterprise data management system where such policies can be implemented.
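
To make the governance idea concrete, here is a minimal sketch, assuming a simple in-house enterprise data management model, of how ownership and stewardship roles and quality rules might be attached to a data domain. The role names, rule and record fields are hypothetical, not a description of any vendor’s product.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Callable

class Role(Enum):
    OWNER = "owner"      # accountable for the domain's data
    STEWARD = "steward"  # maintains day-to-day quality

@dataclass
class QualityRule:
    name: str
    check: Callable[[dict], bool]  # returns True if a record passes

@dataclass
class DataDomain:
    """A governed reference data domain, e.g. the security master."""
    name: str
    assignments: dict = field(default_factory=dict)  # Role -> person or team
    rules: list = field(default_factory=list)

    def assign(self, role: Role, who: str) -> None:
        self.assignments[role] = who

    def audit(self, record: dict) -> list:
        """Names of the quality rules this record fails."""
        return [r.name for r in self.rules if not r.check(record)]

domain = DataDomain(name="security master")
domain.assign(Role.OWNER, "head-of-market-data")
domain.assign(Role.STEWARD, "data-ops-team")
domain.rules.append(QualityRule("isin_present", lambda r: bool(r.get("isin"))))

print(domain.audit({"name": "Example 5% 2030", "isin": ""}))  # ['isin_present']
```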

The implementation of the second Markets in Financial Instruments Directive (MiFID II) in January turned the spotlight onto reference data. As part of the reporting process under the directive, local regulators demand an increased level of transparency. Firms that operate trading venues or systematic internalisers are now required to report reference data for all instruments traded or admitted to trading on those venues.
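
As a rough illustration of what such an instrument reference data submission involves, the sketch below models a handful of the fields a venue reports; this is an illustrative subset only, with made-up values, and the actual RTS 23/FIRDS record carries many more fields.

```python
from dataclasses import dataclass

@dataclass
class InstrumentReferenceData:
    """Illustrative subset of the instrument reference data a trading
    venue or systematic internaliser submits under MiFID II; the real
    RTS 23/FIRDS record carries many more fields."""
    isin: str               # ISO 6166 instrument identification code
    full_name: str
    cfi_code: str           # ISO 10962 classification
    notional_currency: str
    venue_mic: str          # ISO 10383 market identifier code
    issuer_lei: str         # ISO 17442 legal entity identifier

record = InstrumentReferenceData(
    isin="XS0000000001",    # all values below are made up
    full_name="Example 2% Note 2025",
    cfi_code="DBFTFB",
    notional_currency="EUR",
    venue_mic="XETR",
    issuer_lei="5299000EXAMPLELEI000",
)
print(record)
```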

Rob Kirby, senior product manager at SIX, explains that with regulations such as MiFID II and the Packaged Retail and Insurance-based Investment Products (PRIIPs) regulation now in play, more information needs to be exchanged between market players in a completely new way.

According to Kirby, manufacturers of financial products need to think in detail about how they will distribute data and documents, and how they can receive sales information outside their target market.

He states: “For many, the temptation is to scramble for minimum viable compliance and do whatever it takes to keep the regulator happy. While this seems like a sensible option now, it’s unlikely to service future requirements and actually goes against the spirit of the regulations.”
To achieve sustainable long-term compliance, Kirby suggests, firms cannot afford to keep adding to the vast array of information already housed across multiple systems every time a new rule is enforced.

He adds: “After all, regardless of the rule in question, most of them require similar sets of data. Instead, industry participants need to clean up the siloed information scattered across the business and consolidate their approach.”

The financial industry has to deal with data from numerous sources, including vendors, clients and partners, as well as internally produced data. When it comes to simplifying the reference data management process, much therefore depends on the maturity of each company’s reference data management practice.

Bloch suggests that the process of bringing all of these sources together into a common representation, known as normalisation, is one of the main challenges of reference data management.
He explains: “When choosing a reference data management tool, the capabilities surrounding normalisation into a common data model, ability to reconcile various sources and build a ‘golden copy’ should be thoroughly analysed.”
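
As a minimal sketch of what that normalisation and golden-copy reconciliation might look like, assume two hypothetical vendor feeds and a per-field source-priority rule; all field names and priorities here are invented for illustration.

```python
COMMON_FIELDS = ["isin", "name", "currency", "maturity"]

# Per-field preference order among sources (highest priority first).
SOURCE_PRIORITY = {
    "isin": ["vendor_a", "vendor_b"],
    "name": ["vendor_b", "vendor_a"],
    "currency": ["vendor_a", "vendor_b"],
    "maturity": ["vendor_b", "vendor_a"],
}

def normalise(source: str, raw: dict) -> dict:
    """Map a vendor-specific record onto the common data model."""
    mapping = {
        "vendor_a": {"isin": "ID_ISIN", "name": "SEC_NAME",
                     "currency": "CCY", "maturity": "MAT_DT"},
        "vendor_b": {"isin": "isin", "name": "long_name",
                     "currency": "ccy", "maturity": "maturity_date"},
    }[source]
    return {f: raw.get(src) for f, src in mapping.items()}

def golden_copy(records: dict) -> dict:
    """records: source name -> normalised record; pick each field
    from the highest-priority source that supplies it."""
    golden = {}
    for f in COMMON_FIELDS:
        for source in SOURCE_PRIORITY[f]:
            value = records.get(source, {}).get(f)
            if value is not None:
                golden[f] = value
                break
    return golden

feeds = {
    "vendor_a": normalise("vendor_a", {"ID_ISIN": "XS0000000001",
                                       "CCY": "USD", "MAT_DT": "2030-06-01"}),
    "vendor_b": normalise("vendor_b", {"isin": "XS0000000001",
                                       "long_name": "Example 5% 2030",
                                       "ccy": "USD",
                                       "maturity_date": "2030-06-01"}),
}
print(golden_copy(feeds))
```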

According to Lind, some firms are more mature than others in their attempt to streamline and centralise the management of reference data across the enterprise. However, Lind suggests that industry standards play a “significant role”.

He explains: “Much of the inefficiency of reference data management is the ongoing calibrations needed to ‘re-interpret’ and/or ‘map’ data across disparate systems both internally across an enterprise and externally between enterprises. Having standards that help industry participants bridge legacy taxonomies to standard taxonomies that only change at the margin will serve to simplify the process.”
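
The bridging Lind describes can be pictured as a single translation table between a firm’s legacy taxonomy and the standard one, so that when either side moves, only the table changes rather than every downstream system. The codes below are hypothetical.

```python
# Legacy asset-class codes (left) bridged to a standard taxonomy (right).
LEGACY_TO_STANDARD = {
    "EQ-DOM": "EQUITY",
    "EQ-INTL": "EQUITY",
    "FI-GOV": "GOVT_BOND",
    "FI-CORP": "CORP_BOND",
}

def to_standard(legacy_code: str) -> str:
    try:
        return LEGACY_TO_STANDARD[legacy_code]
    except KeyError:
        # Surface unmapped codes instead of guessing: these are the
        # calibrations 'at the margin' that need a human decision.
        raise ValueError(f"no standard mapping for {legacy_code!r}")

print(to_standard("FI-GOV"))  # GOVT_BOND
```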

Also weighing in, Tom Stock, senior vice president of product management at GoldenSource, says: “The biggest favour that financial institutions can do themselves, to navigate the current compliance minefield, is to ensure that all their reference data is managed in one centralised repository.”

According to Stock, the average firm has multiple sources of data, and this kind of disparity is “impossible to sustain post-MiFID II”.

He adds: “Firms that don’t want to get caught out have to be confident that they can answer any challenges from the regulator, so scrambling for data is simply not an option. You need to know where all the necessary information is, which is why a central source is so important.”

Now that MiFID II has come into play, the financial services industry is going to be more accountable than ever to the regulator. Stock suggests that it is “vital” for firms to make sure they can show they have achieved best execution.

He says: “The biggest barrier to this is the disparate (and often multiple) systems, which can operate in isolation throughout an organisation and use inconsistent identifiers and classifications, causing confusion and breaking the smooth flow of trade and transparency reporting.”
According to Stock, the centralisation of data ensures that all the necessary information is brought together and linked, so that “if the regulator comes knocking, you won’t be left searching for answers”.
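
In practice, that linking often comes down to a cross-reference index in the central repository: any alias a query uses resolves to the same master record. A minimal sketch, with hypothetical identifiers:

```python
MASTER = {
    "SEC-000042": {"name": "Example 5% 2030", "class": "CORP_BOND"},
}

XREF = {  # alias -> master key
    "XS0000000001": "SEC-000042",  # ISIN
    "B000ABC": "SEC-000042",       # SEDOL
    "42": "SEC-000042",            # legacy internal id
}

def resolve(identifier: str) -> dict:
    """Aliases and master keys both resolve to the one master record."""
    key = XREF.get(identifier, identifier)
    return MASTER[key]

assert resolve("XS0000000001") is resolve("B000ABC") is resolve("SEC-000042")
```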

The data industry has tried over the years to come up with various standards and interoperability initiatives to help the reference data management process by providing automated ways to reconcile data.

However, with an expanding regulatory environment, asset servicers need to work with new types of data, which have not yet been commoditised in the same way as plain security reference data.
Bloch explains that, for these new data domains, each company needs to come up with internal standards and have the tools and methods in place to map this diverse source data to those standards.

He says: “Failing to do so will result in an exponential increase in manual processes and associated costs, not to mention operational risk.”
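
One way to keep those manual processes visible and bounded is to make mapping failures explicit: source records for a new domain that cannot be mapped to the internal standard land in a review queue rather than being patched ad hoc. The sketch below uses a hypothetical ESG-ratings feed as the new domain.

```python
from typing import Optional

INTERNAL_SCALE = {"AAA": 1, "AA": 2, "A": 3, "BBB": 4}  # firm's internal standard

def map_esg(record: dict, review_queue: list) -> Optional[dict]:
    """Map one provider record to the internal standard, or queue it."""
    rating = record.get("provider_rating")
    if rating not in INTERNAL_SCALE:
        review_queue.append(record)  # an explicit exception, not an ad hoc fix
        return None
    return {"isin": record["isin"], "esg_score": INTERNAL_SCALE[rating]}

queue: list = []
print(map_esg({"isin": "XS0000000001", "provider_rating": "AA"}, queue))
print(map_esg({"isin": "XS0000000002", "provider_rating": "B+"}, queue))
print(len(queue))  # 1 record awaiting a mapping decision
```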

The implementation of standards creates a community among banks that can converge on better ways for systems to communicate, according to Lind.

He suggests: “It helps identify gaps in the data as new and more exotic financial instruments enter the market. In a world where trading margins are being put under pressure, reference data standards allow enterprise applications to integrate data more effectively for transaction processing and improve vital functions, such as the ability to aggregate risk and exposure.”

Kirby agrees that embracing a standardised, more scalable data service, enabling firms to extract the reference and pricing information needed for each regulation, is “an obvious next step”.
The crossover between MiFID II and the recently implemented PRIIPs is a prime case in point, Kirby suggests. He says: “A lot of the data market participants are currently distributing for MiFID II is already reflected under PRIIPs.”

According to Kirby, the natural solution is to look at the role of mutualised approaches to the challenge, through which key industry players can bring the required information together. He concludes: “This is why firms grappling with this regulatory onslaught should be challenging their data partners to provide exactly what they need right across the business, in the form they need it. After all, the age of ready-to-consume data has very much arrived, so there should be nothing stopping financial institutions from using it.”
