Original title: So that's what it is! What is tokenization, and how does one tokenize, according to the Deputy General Manager of the Bank for International Settlements (BIS)?

Author: Zhang Feng

At the Singapore FinTech Festival in November 2025, Andréa M. Maechler, Deputy General Manager of the Bank for International Settlements (BIS) and acting head of the BIS Innovation Hub, delivered a thought-provoking speech on "tokenization", the key trend now reshaping the global financial system.

In her speech, she not only clarified the definition and mechanism of tokenization but also revealed how it drives innovation in payment systems and fosters new business models through programmable platforms. We believe her speech also provides an important framework for regulators to consider.

 

 

I. What Is Tokenization? From Static Records to Dynamically Programmable Assets

 

Tokenization, in short, is the process of transforming the static ownership records of financial assets, such as deposits, bonds, and bills, into verifiable, transferable, and programmable digital tokens. This is not mere digitization: through blockchain or other distributed ledger technology, the asset is given a digital identity in the form of a "token", making it divisible, traceable, and programmable.

 

In the traditional financial system, asset records are often centralized, siloed, and updated with a lag. A cross-border payment, for example, may pass through multiple banks and clearing systems and several stages of messaging, reconciliation, and settlement, taking days and incurring high costs. Tokenization, by contrast, creates digital tokens that correspond one-to-one with the underlying assets and are cryptographically secured, so that assets can move and interact in real time on an open, programmable platform.

 

Programmability is one of the core characteristics of tokenization. This means that tokens can not only represent value but also embed smart contracts—automatically executed code logic. For example, tokens can be programmed to automatically pay interest at specific times or automatically transfer ownership when certain conditions are met. This programmability brings unprecedented automation and precision to financial transactions.
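To make programmability concrete, here is a minimal illustrative sketch in Python rather than any real smart-contract language; the class and member names (ProgrammableToken, coupon_dates, and so on) are hypothetical and only stand in for whatever a production platform would provide.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ProgrammableToken:
    """Hypothetical token whose behaviour is defined by embedded rules."""
    asset_id: str                 # reference to the underlying asset
    face_value: float             # value the token represents
    owner: str                    # current holder
    coupon_rate: float            # annual interest rate, e.g. 0.03
    coupon_dates: list = field(default_factory=list)

    def pay_interest(self, today: date, credit) -> None:
        # Rule 1: automatically pay interest on each scheduled date.
        if today in self.coupon_dates:
            credit(self.owner, self.face_value * self.coupon_rate)

    def transfer(self, new_owner: str, condition_met: bool) -> bool:
        # Rule 2: change ownership only when the agreed condition holds.
        if condition_met:
            self.owner = new_owner
        return condition_met

# Example: pay 3% interest on 30 June 2026, then transfer ownership once a
# delivery confirmation (the embedded condition) is received.
token = ProgrammableToken("BOND-001", 1_000.0, "alice", 0.03, [date(2026, 6, 30)])
token.pay_interest(date(2026, 6, 30), credit=lambda who, amount: print(who, amount))
token.transfer("bob", condition_met=True)
```

The point of the sketch is that the payment and transfer logic travels with the token itself, rather than living in a separate back-office process.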

 

II. How to Tokenize? Mechanisms, Platforms, and Typical Cases

 

The implementation of tokenization is not a simple process; it relies on several key elements: asset on-chaining, token minting, platform integration, and a compliance framework. In her speech, Maechler vividly illustrated the practical path of tokenization in cross-border payments using Project Agorá from the BIS Innovation Hub as an example.

 

(I) The Technical Implementation Path of Tokenization

 

Asset identification and anchoring. First, the asset to be tokenized must be identified and a reliable, auditable correspondence established between it and the digital token. This is typically ensured through a combination of legal agreements and technical credentials.

 

Token minting and issuance. Tokens representing the asset are then issued on a qualified programmable platform, such as a permissioned blockchain, and must comply with applicable financial and securities regulations.

 

Platform integration and interaction. Tokens must run on a platform that supports smart contracts, so that they can interoperate with other tokens, payment systems, and traditional ledgers.

 

Automated clearing and settlement. By using smart contracts to link transaction instructions, asset transfers, and fund settlements, "atomic settlement" is achieved—meaning all steps either succeed or fail simultaneously, eliminating counterparty risk.
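The four steps above can be pulled together in a minimal sketch. This is ordinary Python, not a real ledger API, and every name (Platform, mint, atomic_settle) is hypothetical; it simply illustrates anchoring an asset to a token record, minting it on a permissioned platform, and settling delivery versus payment as a single all-or-nothing operation.

```python
class SettlementError(Exception):
    pass

class Platform:
    """Toy permissioned platform: a token registry plus cash balances."""
    def __init__(self):
        self.tokens = {}        # token_id -> {"anchor": ..., "owner": ...}
        self.cash = {}          # participant -> balance

    def mint(self, token_id, legal_anchor, owner):
        # Steps 1-2: anchor the underlying asset and issue the token.
        self.tokens[token_id] = {"anchor": legal_anchor, "owner": owner}

    def atomic_settle(self, token_id, seller, buyer, price):
        # Step 4: delivery-versus-payment as one all-or-nothing operation.
        snapshot = (dict(self.tokens[token_id]), dict(self.cash))
        try:
            if self.cash.get(buyer, 0) < price:
                raise SettlementError("buyer cannot fund the purchase")
            if self.tokens[token_id]["owner"] != seller:
                raise SettlementError("seller does not hold the token")
            self.cash[buyer] -= price
            self.cash[seller] = self.cash.get(seller, 0) + price
            self.tokens[token_id]["owner"] = buyer
        except SettlementError:
            # Roll back to the snapshot so neither leg settles on its own.
            self.tokens[token_id], self.cash = snapshot
            raise

# Usage: mint a token anchored to a legal agreement, then settle DvP.
p = Platform()
p.cash = {"buyer-bank": 500.0}
p.mint("BILL-42", legal_anchor="trade bill, custody agreement #17", owner="seller-bank")
p.atomic_settle("BILL-42", seller="seller-bank", buyer="buyer-bank", price=500.0)
```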

 

(II) Project Agorá: How Tokenization is Reshaping Cross-Border Payments

 

Project Agorá is a flagship initiative spearheaded by the BIS in conjunction with seven central banks and over 40 financial institutions. This project integrates tokenized deposits (digital representations of commercial bank deposits) and tokenized reserves (digital representations of central bank currency) onto a single programmable platform. In cross-border payment scenarios, the payer's tokenized deposits and the payee's tokenized deposits can be instantly exchanged via smart contracts, with final settlement completed in real-time using central bank currency—the entire process compressed into a single atomic operation.

 

This approach significantly reduces the delays, costs, and risks associated with cross-border payments, while enhancing transparency and traceability. Maechler points out that such experiments provide a crucial technological and governance model for future large-scale applications.
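Under the same caveats, an Agorá-style cross-border payment can be sketched as two kinds of tokens on one platform: customer deposit tokens move at the two commercial banks while the interbank obligation settles in tokenized central bank reserves, all within a single step. This is a hypothetical illustration, not the project's actual design, and the names are invented.

```python
def cross_border_payment(deposits, reserves, payer, payer_bank,
                         payee, payee_bank, amount):
    """One step: debit the payer's deposit tokens, credit the payee's,
    and settle the interbank leg in tokenized central bank reserves."""
    if deposits[payer] < amount or reserves[payer_bank] < amount:
        raise ValueError("insufficient deposit tokens or reserves")
    deposits[payer] -= amount          # payer's tokenized deposit at payer_bank
    deposits[payee] += amount          # payee's tokenized deposit at payee_bank
    reserves[payer_bank] -= amount     # final settlement in central bank money
    reserves[payee_bank] += amount

# Hypothetical balances: deposit tokens and central bank reserve tokens.
deposits = {"importer": 1_000.0, "exporter": 0.0}
reserves = {"bank-A": 5_000.0, "bank-B": 5_000.0}
cross_border_payment(deposits, reserves, "importer", "bank-A",
                     "exporter", "bank-B", 800.0)
```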

 

III. The Value of Tokenization: Efficiency Improvement and the Emergence of New Business Models

 

Tokenization is highly valued by international institutions such as the BIS precisely because it can create value along multiple dimensions:

 

(I) Efficiency Improvement and Cost Reduction

 

In traditional financial transactions, reconciliation, clearing, and settlement involve a significant amount of manual and intermediary work. Tokenization, through automation and atomic settlement, drastically reduces processing time and lowers operational risks and compliance costs. Maechler emphasizes that, especially in a cross-border context, this efficiency improvement can bring "enormous systemic benefits."

 

(II) New Business Models and Financial Products

 

Tokenization is opening up a range of unprecedented application scenarios:

 

Tokenization of the bond market. The global government bond market is worth approximately $80 trillion. Tokenization can automate the entire lifecycle of issuance, trading, interest payment, and redemption, improving liquidity and lowering barriers to entry (a minimal sketch of such a lifecycle follows these examples).

 

Artificial intelligence and IoT payments. Programmable tokens can be combined with AI agents to enable real-time, high-frequency micropayments between machines (such as automatic payments for electric-vehicle charging) or to automatically execute invoice settlement in trade finance.

 

Digitalization of traditional instruments. For example, Project Promissa, a joint initiative of the BIS and the World Bank, aims to tokenize the paper promissory notes that governments use to contribute capital to multilateral development banks, thereby improving the efficiency and transparency of fund disbursement.
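A minimal, hypothetical sketch of the bond scenario (plain Python, not any real issuance platform, with invented names) shows how coupon payment and redemption reduce to programmed triggers on a tokenized bond; the same trigger pattern is what would let an AI agent or a charging station settle machine-to-machine micropayments automatically.

```python
from datetime import date

class TokenizedBond:
    """Hypothetical tokenized bond: issuance, coupons, and redemption
    are driven by code rather than manual back-office steps."""
    def __init__(self, face, coupon_rate, coupon_dates, maturity, holder):
        self.face = face
        self.coupon_rate = coupon_rate
        self.coupon_dates = coupon_dates
        self.maturity = maturity
        self.holder = holder
        self.redeemed = False

    def on_date(self, today, pay):
        # Programmed trigger: pay the coupon on each scheduled date,
        # then repay principal and retire the token at maturity.
        if not self.redeemed and today in self.coupon_dates:
            pay(self.holder, self.face * self.coupon_rate)
        if not self.redeemed and today >= self.maturity:
            pay(self.holder, self.face)
            self.redeemed = True

# Usage: issue a two-year token paying 4% annually, then replay the calendar.
bond = TokenizedBond(1_000.0, 0.04,
                     coupon_dates=[date(2026, 12, 31), date(2027, 12, 31)],
                     maturity=date(2027, 12, 31), holder="investor-1")
for d in [date(2026, 12, 31), date(2027, 12, 31)]:
    bond.on_date(d, pay=lambda who, amt: print(f"pay {who} {amt:.2f}"))
```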

 

(III) Financial Inclusion and Market Integrity

 

Tokenization can also reach areas underserved by traditional finance. By reducing transaction costs and increasing credibility, it can make it easier for SMEs and individual investors to participate in global financial markets, while enhancing anti-money laundering and anti-corruption capabilities through traceability.

 

IV. The Profound Significance of the BIS Deputy General Manager's Remarks: Logic, Operation, Authenticity Verification, and Regulation

 

Maechler's speech was not only a description of technological trends; it also implied a complete framework for thinking about tokenization:

 

First, it reveals the underlying logic of tokenization. She points out clearly that the essence of tokenization is to restructure financial processes through programmability and composability. This involves not only a technological upgrade but also a systematic rethinking of the role of financial intermediaries, the form of money, and the way contracts are executed.

 

Second, it clarifies the operational path of asset tokenization. As projects such as Agorá show, successful tokenization currently requires central bank money as the ultimate settlement asset to secure the foundation of trust; it must run on regulated, interoperable platforms; and it must emphasize integration with traditional systems to avoid fragmentation.

 

Third, it proposes a mechanism for verifying the authenticity and quality of tokenized assets. Tokenization does not automatically solve the trust problem. Maechler suggests that the authenticity and quality of tokens must be ensured through sound legal documentation and asset backing; transparent issuance and redemption mechanisms; independent auditing and on-chain verification tools; and regulatory oversight of issuers and platforms.
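As a rough illustration of the verification point (hypothetical names and figures, not a prescribed method), a backing check might simply compare the outstanding token supply against independently attested reserves and flag any shortfall for auditors and supervisors:

```python
def check_backing(token_supply: float, attested_reserves: float,
                  tolerance: float = 0.0) -> dict:
    """Compare tokens in circulation with independently attested backing
    and report whether the issuer is fully collateralized."""
    shortfall = token_supply - attested_reserves
    return {
        "fully_backed": shortfall <= tolerance,
        "coverage_ratio": attested_reserves / token_supply if token_supply else 1.0,
        "shortfall": max(shortfall, 0.0),
    }

# Usage with hypothetical audit figures: 10.2m tokens vs 10.0m in reserves.
report = check_backing(token_supply=10_200_000.0, attested_reserves=10_000_000.0)
print(report)   # {'fully_backed': False, 'coverage_ratio': 0.98..., 'shortfall': 200000.0}
```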

 

Fourth, it provides a reference point for regulating tokenized assets. She noted that recent regulatory progress on tokenized money (such as stablecoins) in various jurisdictions has provided a legal basis for broader asset tokenization. Regulation should focus on clarifying the legal status of tokens and protecting investors, preventing fragmentation and systemic risks, and encouraging cross-jurisdictional cooperation and standardization.

 

V. Challenges and Future Prospects

 

Despite the strong momentum behind tokenization, Maechler soberly points out that the transformation is still in its early stages. Actual deployment of tokenized deposits remains limited, and large-scale adoption faces multiple challenges, including technical interoperability, legal certainty, and regulatory coordination. Furthermore, how to balance innovation with financial stability, and how to design a platform architecture that is both open and secure, remain unresolved questions.

 

However, the direction is clear. Tokenization points toward a more efficient, transparent, and inclusive financial future. As Maechler emphasized, this is not merely a technological evolution but a paradigm shift in financial infrastructure. Central banks, commercial banks, technology companies, and regulators must explore it together and proceed cautiously, to ensure the transformation truly serves the stability and development of the global economy.

 

The Bank for International Settlements, through the forward-looking statements and innovative experiments of its senior leadership, has provided an authoritative and clear roadmap for understanding tokenization. Tokenization is not a distant science-fiction scenario but a financial reality in progress. It redefines how assets flow, restructures trust mechanisms, and reshapes the boundaries of financial services. For policymakers, financial institutions, and market participants, understanding what tokenization is and how to implement it has become required reading for embracing the next financial era.