Leaps of evolution
13 Nov 2024
Clelia Frondaroli explores the challenges and opportunities that lie ahead for financial institutions on the path to tokenisation
To tokenise or not to tokenise?
That is the question facing financial institutions today. As the industry enters the next phase of digitisation, tokenisation looks increasingly like a viable way to expand service offerings.
The numbers support the idea: in a 2020 study titled ‘The 10x Potential of Tokenisation’, HSBC predicted that tokenised markets could be worth over US$24 trillion by 2027.
Despite this, widespread adoption has been slow, with many financial institutions and jurisdictions hesitant to embrace tokenisation fully.
Why, then, does the industry still grapple with uncertainty surrounding the implementation of tokenisation, and what opportunities, or challenges, could it bring?
A widely known conundrum
Deborah Algeo, head of global enterprise at Zodia Custody, frames the current landscape of tokenisation as ‘exciting yet complex’. The complexity, she explains, lies in the regulatory challenges that currently place constraints on the running of pilot programmes for tokenised assets — tests that banks are hesitant to conduct for fear of breaching compliance requirements.
She continues: “Cross-border compliance further complicates adoption, as differing regulations across jurisdictions create barriers to global tokenisation efforts.” Alignment of regulatory frameworks, she emphasises, is therefore “paramount to give institutions the clarity needed to confidently engage in this space”.
As for the excitement?
Algeo eagerly highlights: “Tokenisation can significantly enhance the efficiency of financial transactions, particularly with instant cross-border settlements, and enables broader access to markets previously excluded from traditional finance.”
Listing a number of other benefits, such as the use of on-chain traceability to improve transparency and accountability — particularly against fraud and financial crimes — Algeo is clear about one thing: “[Tokenisation] will redefine how value is created, transferred, and recorded globally.”
Her enthusiasm is shared by Michele Curtoni, head of strategy at SIX Digital Exchange. He characterises tokenisation as a “transformative opportunity for capital markets”, with faster settlement processes and reduced risk among the potential benefits he says it can offer.
On the subject of widespread adoption, Curtoni suggests: “The key is to promote new models of asset creation and transaction into institutional-grade systems that are proven to work in financial systems today.”
However, “innovation,” he warns, “should not come at the expense of financial stability.” Safeguarding mechanisms such as securities depositories and regulated exchanges, he explains, offer a secure way to adopt new technologies like tokenisation gradually, without risking financial imbalance or harm to the industry.
Curtoni also offers a different perspective on the challenges facing the adoption of tokenisation. He sees it as a “widely known conundrum of open innovation and incentives,” where in financial markets, “risk management is the bread and butter of its participants [and] first mover advantage is not always an advantage.”
He stresses that banks go through a multitude of discussions, experiments, and proofs of concept before even considering adoption, “so, first, one needs to be comfortable with the technology and, subsequently, the benefits [will] be drawn out.”
The creation of “platforms that can enable players to innovate and find new use cases within their acceptable risk boundaries”, along with coordination and a rigorous set of standards, is therefore key, Curtoni maintains, to combating current risk aversion and encouraging financial institutions towards adoption.
Building bridges
So what does the industry need to prioritise — better regulatory guidelines or interoperability — in order to drive the expansion of tokenisation?
In Curtoni’s eyes, neither should take precedence over the other. Rather, he suggests: “Both interoperability and better regulatory guidelines are essential to the growth of tokenisation”.
He highlights that, if and when regulators and central banks adopt a unified approach to managing tokenisation projects, market participants will be more inclined to explore the benefits available.
The same applies to interoperability. Curtoni explains: “As markets move towards the blockchain, it will undoubtedly [require] a mixture of technologies that will constitute the best route forward. Creating bridges between traditional and digital exchanges enables this ongoing migration.”
Algeo sees it differently. “Enhanced regulatory guidelines should be the priority,” she states, with standardised rules laying the groundwork and giving institutions the assurance they need to launch pilot programmes. Only once these frameworks have been established, Algeo argues, “can the focus shift to scaling and ensuring systems are interoperable”.
“Interoperability is key to creating a global, liquid market for tokenised assets, but it must build upon a strong regulatory foundation,” she advises firmly.
The path to democracy?
One incentive for tokenisation has been its potentially democratising effect on high-value assets. Fractionalisation, an integral part of tokenisation, is the process of dividing ownership of an asset into smaller ‘fragments’. In theory, this means that tokenisation can allow multiple people to own parts of a single high-value asset, as the sketch below illustrates.
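As a rough illustration of the mechanics, the following sketch models a fractionalised asset as a simple ledger of token units. It is a minimal, hypothetical Python example; the class, names, and figures are invented for illustration and do not reflect how any platform mentioned in this article actually works.

```python
from dataclasses import dataclass, field


@dataclass
class FractionalisedAsset:
    """A high-value asset divided into equal fractional units (tokens)."""
    name: str
    total_units: int
    # Maps each owner to the number of units held; the issuer starts with all.
    holdings: dict = field(default_factory=dict)

    def __post_init__(self) -> None:
        self.holdings["issuer"] = self.total_units

    def transfer(self, sender: str, recipient: str, units: int) -> None:
        """Move units between owners, rejecting transfers the sender cannot cover."""
        if self.holdings.get(sender, 0) < units:
            raise ValueError(f"{sender} holds fewer than {units} units")
        self.holdings[sender] -= units
        self.holdings[recipient] = self.holdings.get(recipient, 0) + units

    def ownership_share(self, owner: str) -> float:
        """The fraction of the asset an owner holds, between 0 and 1."""
        return self.holdings.get(owner, 0) / self.total_units


# A property divided into 10,000 units: two investors each buy a small stake.
property_token = FractionalisedAsset(name="12 Example Street", total_units=10_000)
property_token.transfer("issuer", "alice", 500)  # alice now owns 5 per cent
property_token.transfer("issuer", "bob", 250)    # bob now owns 2.5 per cent
print(property_token.ownership_share("alice"))   # 0.05
```

The point of the toy model is simply that ownership becomes a count of interchangeable units rather than a single indivisible title, which is what lowers the minimum ticket size for investors.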
Algeo explains: “Fractionalisation is transforming investment by opening traditionally exclusive asset classes — like art, real estate, and luxury goods — to a wider range of investors. Platforms such as RealT and Fraction allow investors to buy fractional shares in properties, lowering entry barriers and democratising access.”
She underscores her earlier point that tokenisation enables broader access to markets, and draws attention to the very real possibility of tokenisation pulling down barriers for smaller investors.
Not everyone, however, believes this to be the case. Curtoni suggests: “A technology solution alone doesn’t automatically address regulatory constraints, investors’ eligibility requirements, or suitability standards.”
Despite this, he acknowledges the potential tokenisation has to equalise access to ‘luxury’ investments, stating: “Fractional tokenisation, when integrated with platforms (such as a central securities depository) that offer straight-through processing, can streamline the distribution of fractional holdings [and allow] wealth managers to efficiently allocate portions of high-value assets to a broader client base.”
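Curtoni’s point about streamlined distribution can be made concrete with a small extension of the sketch above. The pro-rata allocation below is again purely hypothetical (real straight-through processing platforms and central securities depositories expose nothing like this toy interface), but it shows the idea of a manager splitting a single holding across a client base in one automated pass.

```python
def allocate_pro_rata(asset: FractionalisedAsset, manager: str,
                      orders: dict[str, float]) -> None:
    """Split the manager's holding across clients in proportion to their orders.

    `orders` maps each client to the cash amount committed; units lost to
    rounding simply remain with the manager.
    """
    available = asset.holdings.get(manager, 0)
    total_cash = sum(orders.values())
    for client, cash in orders.items():
        units = int(available * cash / total_cash)
        asset.transfer(manager, client, units)


# The manager takes on 1,000 units and fills three client orders in one pass.
property_token.transfer("issuer", "manager", 1_000)
allocate_pro_rata(property_token, "manager",
                  {"client_a": 50_000, "client_b": 30_000, "client_c": 20_000})
```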
Evolution or revolution?
Both Algeo and Curtoni see tokenisation as an evolutionary process. They highlight that a phased approach to integrating digital assets will allow institutions to adjust processes and infrastructure gradually, ensuring interoperability as the industry continues to bridge the divide between traditional and digital systems.
Gradual though that evolution may be, Algeo stresses: “The cumulative impact of tokenisation could be revolutionary.”
Pointing to how tokenisation can set new standards for transparency, enable instant settlement, and enhance liquidity, she reaffirms that “[it] has the potential to reshape global financial markets.”
Curtoni agrees. “Success,” he concludes, “will be when tokenisation will not be a term anymore, but the norm in asset creation.”