
Asset tokenization: what, why and when? A primer on the technological disruption of capital markets

Written by Elliot Hentov, Head of Macro Policy Research at State Street Global Advisors

August 14, 2024

The term ‘asset tokenization’ is in great fashion but used liberally and across contexts. In this regard, it is helpful to consider an overview of asset tokenization across the main pillars of global capital markets. The transition from existing market structures to a world where the representation of financial instruments is digitally native (in token form) is a major step, but not equally impactful across all asset classes. Hence, it is worthwhile to highlight which asset classes are likely to undergo the transition earlier or later; where the transition will meaningfully transform the market structure by changing supply/demand dynamics; and where future challenges or limits lie for asset tokenization.

Asset tokenization across time and asset classes

Fig. 1 illustrates what that transition from the status quo may look like. We conclude that mass adoption will come first in bonds (incl. money market funds), commodities and private equity funds, with the most transformative impact in the nearer term probably on private equity funds and the bond industry. The prospect of tokenizing and fractionalising ownership of individual illiquid private-market assets would be revolutionary, but outside the fund structure this is likely to remain very far in the future.

Fig. 1: Relative assessment of time to mass adoption and transformative impact on market

Tokenization is a big deal, but rollout and adoption will be slow

The appeal of asset tokenization is tantalising and the advantages over existing market operations can appear obvious. The most compelling benefits are improvements to the operation of markets that should make them faster, cheaper, more transparent and more accessible. These include faster settlement, in some cases ‘atomic’, i.e. the transfer of the security and the payment happen simultaneously. This post-trade automation would do away with multi-day settlement times and enable 24/7 trading which, in some markets, could improve liquidity by bringing in additional supply and demand. Similarly, the shift to technological solutions and smart contracting implies fewer intermediaries and lower fees for their services. The unique ability of blockchain technology to permanently record transactions could also greatly improve market transparency. And finally, for certain financial instruments, smart contracts allow for automation and reduced operational complexity by embedding certain corporate actions, such as scheduled coupon or dividend payments, into the token itself.

All of the above is attainable but requires time to materialise, because most of the efficiencies only come to fruition once the entire ecosystem is built out. Tokenization is ultimately a network innovation, which requires all parts of the network to modernise, including issuance, trading and custody. For this to occur, investors and providers need regulatory clarity and standardised norms. There are also technological challenges, such as cybersecurity, network protection, scalability and interoperability among different blockchains. These may all seem daunting but are eminently solvable - again, it will just take time. And of course, investors will need to gain comfort with the purchase, trading and usage of tokenized assets, which requires operational and governance changes in their organisations too. Disrupted intermediaries are likely to exaggerate all these challenges, which will delay broad-based adoption but will not ultimately prevent it, given the compelling benefits.
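To make the idea of embedded corporate actions concrete, the Python sketch below shows a bond token that carries its own coupon schedule and pays any due coupons automatically. This is a minimal illustration under invented assumptions: the `TokenizedBond` class, its semi-annual schedule and its single-holder registry are hypothetical, not any real platform's API or smart-contract standard.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# Hypothetical sketch: a bond token whose coupon schedule is embedded in
# the instrument itself, so payments can be triggered automatically
# rather than processed through a chain of intermediaries.

@dataclass
class TokenizedBond:
    face_value: float     # principal per token
    coupon_rate: float    # annual coupon rate, e.g. 0.05 for 5%
    issue_date: date
    maturity: date
    holder: str           # simplified single-holder registry
    ledger: list = field(default_factory=list)  # permanent payment record

    def coupon_dates(self):
        """Semi-annual coupon dates derived from the token's own terms."""
        d, dates = self.issue_date, []
        while d < self.maturity:
            d = d + timedelta(days=182)
            dates.append(min(d, self.maturity))
        return dates

    def run_corporate_actions(self, today: date):
        """Automatically pay any coupons due on or before 'today'."""
        payment = self.face_value * self.coupon_rate / 2
        paid = [e["date"] for e in self.ledger]
        for due in self.coupon_dates():
            if due <= today and due not in paid:
                # In a live system this transfer would settle atomically
                # on-chain; here we simply append to an append-only log.
                self.ledger.append({"date": due, "to": self.holder,
                                    "amount": payment})
        return self.ledger

bond = TokenizedBond(1000.0, 0.05, date(2024, 1, 1),
                     date(2025, 1, 1), "investor-A")
history = bond.run_corporate_actions(date(2024, 12, 31))
```

Running the example pays the two coupons that have fallen due by end-2024, with the full payment history preserved on the token's own ledger.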

Systemic implications

Of course, there are caveats to the benefits of tokenization, chief among them that they do not apply equally to all asset classes and markets. In fact, the benefits are greatest where current market inefficiencies are large. In some asset classes, such as equities, the current digital representations of shareholder ownership work relatively well and, while tokenization offers an efficiency gain, it is much smaller than for bond or private markets.

· lower barriers to entry

Asset tokenization is therefore likely to lower barriers to entry for both issuers and investors, and it will do so most meaningfully in markets where operational inefficiencies have the greatest impact on market participation. The macro consequences could be greater efficiency in capital allocation across an economy, with access to capital drawing in or amplifying smaller economic actors. In those instances, notably in bond and private markets, the technological and market structure shifts associated with tokenization could also open up new economic models. Wide swathes of assets could be bundled into new forms of securitisation, leading to the expansion of capital market products. For example, tokenized bonds could transform the repo market by allowing deal maturities of hours or even minutes, whereas overnight is the shortest term available today. Similarly, the programmability of tokenized securities would allow for new links and less reliance on intermediaries, thus creating new opportunities.
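The intraday repo point above can be sketched in a few lines of Python. This is a stylised, hypothetical example: the `IntradayRepo` class, its atomic open leg and its programmed unwind are invented for illustration, and the interest calculation uses a simple actual/365 minute fraction rather than any market convention.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of an intraday repo made possible by tokenized
# collateral: both legs settle atomically, and the unwind is programmed
# into the agreement itself, so maturities of minutes become feasible.

class IntradayRepo:
    def __init__(self, cash: float, rate_per_year: float, minutes: int):
        self.cash = cash
        self.minutes = minutes
        # Interest accrues for minutes, not today's overnight minimum.
        year_fraction = minutes / (365 * 24 * 60)
        self.repurchase_price = cash * (1 + rate_per_year * year_fraction)
        self.start = None

    def open(self, now: datetime):
        """Atomic leg: cash and collateral tokens change hands together."""
        self.start = now

    def matured(self, now: datetime) -> bool:
        return now >= self.start + timedelta(minutes=self.minutes)

    def unwind(self, now: datetime) -> float:
        """Programmed unwind: returns the repurchase amount once matured."""
        if not self.matured(now):
            raise ValueError("repo has not yet matured")
        return self.repurchase_price

# A one-hour repo of 1 million cash at a 5% annualised rate.
repo = IntradayRepo(cash=1_000_000.0, rate_per_year=0.05, minutes=60)
t0 = datetime(2024, 8, 14, 9, 0)
repo.open(t0)
payoff = repo.unwind(t0 + timedelta(minutes=60))
```

The repurchase price exceeds the cash leg by roughly 5.7 units of interest for the single hour, which is the kind of granularity an overnight-minimum market cannot offer today.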

· fungibility of assets

Tokenization will also enable greater fungibility across asset classes and markets, facilitating new economic linkages and businesses. In many ways, tokenization is just another technological leap toward more integrated capital markets, further simplifying cross-asset allocation and trading. This has important implications for how capital markets function and for the scope of systemic financial linkages. In this context, the tokenization of private markets - initially just as tokenized private equity, real estate or private credit funds - will further correlate them with public markets, accelerating a trend that has been occurring in recent decades. Tokenization should make access to each asset class easier, lower the barriers to entry and simplify trading. These features should make global asset allocation nimbler, with reduced friction and costs when shifting assets between markets, asset classes and geographies.

· integration with AI

Finally, the arrival of tokenization - converting financial assets into programmable code - aligns with the rise of AI and its integration into finance. As the industry builds AI into its various processes, a rising share of portfolios will consist of digital assets with built-in, automated programming. In plain English, AI-guided portfolio management will have machine-readable digital assets as a counterpart, allowing computers and software to drive the less strategic parts of portfolio management and trading. While this clearly poses new forms of systemic risk, the potential productivity gains appear enormous, especially as they could reduce the costliest, least productive work within the financial industry. There are a number of questions for us to ponder. How will financial markets transition to tokenized assets, and how will this shift affect supply and demand dynamics? What impact will increased asset fungibility have on global capital allocation and trading? What are the potential risks and productivity gains from AI-guided portfolio management with tokenized assets? Traditional financial institutions will need to adapt their operations and governance to embrace tokenization, but how will they support investors in becoming comfortable with tokenized assets to ensure adoption?

When considering the long-term vision, one might ponder what the financial ecosystem will look like once all parts of the network are modernized to support tokenization, and how it will reshape global capital markets in terms of efficiency, transparency and accessibility. These questions aim to provoke thought and discussion on the multifaceted aspects of asset tokenization, its potential benefits, challenges and long-term implications for the financial industry. But it is difficult to see how tokenization will be stopped; it seems a matter of when, not if, financial markets will embrace asset tokenization. State Street will be publishing a paper later this year that examines some of the points raised in this article in greater detail.

 

This article first appeared in Digital Bytes (13th of August, 2024), a weekly newsletter by Jonny Fry of Team Blockchain.