Tokens have been around for thousands of years, but only recently have we seen the rise of digital tokens. Now, cryptographic tokens offer us an opportunity to redesign value streams and hence existing ecosystems. A well-designed token ecosystem unlocks value by bringing parties together in new ways and stimulates the target behaviour by using cryptographic tokens as built-in incentives. Tokens matter and offer a chance to redesign existing and new ecosystems.
On January 14, 2020, we had the second round table session organised by the 2Tokens initiative. The 2Tokens project aims to clarify the path to realising value from tokenization. During the first round table session, we discussed why we need tokenization, what is required to achieve value from tokenization, and how we should move ahead with it.
The objective of the second round table discussion was to understand the challenges faced when designing new, token-driven ecosystems and what is needed to enable purpose-driven tokenization. This comes down to token engineering: the practice of using tokens as the foundation for designing value flows and, ultimately, economic systems.
The event took place at YES!Delft and, with over 70 thought leaders, innovation drivers and representatives from enterprises, law firms, the regulator and the Dutch government, it was a great success. The attendees had interesting discussions on seven different topics related to purpose-driven tokenization:
- Shared understanding – how to align interests and motivations to collaborate and make progress in ecosystems?
- Innovative funding – how to fund ecosystems beyond traditional VC financing or (bank) loans?
- Change management – how to understand and facilitate the change implied by new ways of interacting?
- Messaging and engagement – should the narrative around tokenization change to enable innovative ecosystems?
- Knowledge and skills – what skills do organisations need to transition to tokenized ecosystems?
- Problem-solution fit – how to ensure we address real problems where tokenization can help realise new solutions?
- Tokenization and the law – what are the legal requirements around purpose-driven tokenization?
The objective of these seven tables was to understand how to enable tokenized ecosystems from the perspective of purpose-driven tokenization. Below are summaries of the discussions that took place at each table:
Shared Understanding
Tokenization, tokenomics and token engineering are new concepts and, for many, difficult to understand, especially since these terms are often used differently in different contexts. What do these concepts mean, and how can an ecosystem benefit from them? Without a shared understanding among all stakeholders involved in an ecosystem, it becomes difficult for people from multiple disciplines to collaborate effectively and build a tokenized ecosystem.
A shared understanding is not only relevant for building tokenized ecosystems, but it will also enable regulators and policymakers to address regulatory concerns in the right way. As such, it will foster a healthy public debate around tokenization.
A shared understanding consists of precise terminology and taxonomy, international standards, useful metaphors to share with wider audiences, and clear legal and regulatory frameworks. Since the field of tokenization is very much alive, ongoing coordination, education and engagement among all stakeholders is essential. A Token Coalition, or an organisation such as the 2Tokens project, can manage this to ensure all stakeholders' requirements are met.
We define the terminology as follows:
- Tokens: the digital representation of value (e.g. an asset) on a blockchain.
- Tokenization: the process of converting value (e.g. an asset) into its digital representation.
- Tokenomics: the study of the emerging field of designing crypto tokens and related digital assets using economic incentives, game theory, cryptography and computer science.
- Token engineering: the practice of using tokens as the foundation for designing value flows and, ultimately, economic systems.
- Purpose-driven tokenization: leveraging the exchange of value to drive the behaviour of an ecosystem towards a particular goal.
Innovative Funding
For any ecosystem, funding is important. Tokenization, however, changes how this funding can be achieved, going beyond traditional financing from venture capitalists or financial institutions. When an ecosystem plans to use tokens for funding, it can benefit from easy access to capital, anywhere in the world. Compared to traditional financing such as an IPO, the costs of financing are lower, and it is easier to scale as more people have access to the investment opportunity. After all, tokens do not know borders.
Tokens offer multiple technical advantages over traditional funding. First of all, they are programmable, which means that governance and rules can be embedded within the token; for example, the longer you hold a token, the more dividend you receive. This allows you to steer the behaviour of your investors while raising funds. In addition, tokens are transparent, secure and traceable, giving regulators more control to ensure correct behaviour.
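To make the idea of a programmable rule concrete, here is a minimal, hypothetical sketch in Python of the dividend example above: holders who keep their tokens longer receive a larger payout. The names (TokenHolding, base_rate, loyalty_bonus) are illustrative assumptions, not part of any specific platform; in practice such rules would typically be encoded in a smart contract on the blockchain rather than in off-chain code.

```python
from dataclasses import dataclass

# Hypothetical illustration only: a real programmable token would encode this
# rule in a smart contract, not in off-chain Python.

@dataclass
class TokenHolding:
    holder: str
    amount: float       # number of tokens held
    months_held: int    # how long the tokens have been held

def dividend_share(holding: TokenHolding,
                   base_rate: float = 0.01,
                   loyalty_bonus: float = 0.001) -> float:
    """Dividend grows with holding duration: the longer you hold, the more you receive."""
    rate = base_rate + loyalty_bonus * holding.months_held
    return holding.amount * rate

# Example: two holders with the same stake but different holding periods.
short_term = TokenHolding("alice", amount=1000, months_held=2)
long_term = TokenHolding("bob", amount=1000, months_held=24)

print(dividend_share(short_term))  # 1000 * (0.01 + 0.001 * 2)  = 12.0
print(dividend_share(long_term))   # 1000 * (0.01 + 0.001 * 24) = 34.0
```

The point of the sketch is simply that the incentive logic lives inside the token's own rules, so holding behaviour can be rewarded automatically and transparently.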
If tokenized funding becomes the norm in the coming years, we can expect a shift from ownership to temporary ownership, as exchanging assets will become easy. As a result, previously illiquid assets will become liquid, thereby drastically changing economies. Anything can be tokenized and made liquid, including real estate (fractional ownership), CO2 rights, mobility, futures, art, or even entire clubs and sports contracts to increase fan engagement.
Key to tokenized funding is the right infrastructure. This includes secondary markets to easily exchange security tokens, clear regulations so companies know what they have to comply with, and an intuitive user interface to facilitate ease of investment. When the right tools are available, tokens will revolutionise funding opportunities.
Change Management
Developing an ecosystem is one thing; getting people to use it is a different challenge. Although tokens can drive behaviour, people will need to change their behaviour to participate in tokenized ecosystems. We can expect resistance to change because people don't understand the new ways of interacting (why is the new ecosystem better?), don't see the urgency (why do we need to change now?) or don't see what is different compared to traditional ecosystems (is the status quo not good enough, or even better?). In addition, ecosystem participants might fear that large parties will determine the rules or reap all the rewards.
To tackle this resistance, it is crucial for ecosystem owners to create awareness and understanding of the change implied by tokenization: what is it, why is it important, how will it change the ecosystem and what are the benefits? In addition, it is important to define and communicate the scale of the change and to eliminate certain (wrong) assumptions. Starting with a minimum viable ecosystem to build engagement, and showing why a tokenized approach is important, can help with market adoption.
When talking about tokenization of an ecosystem, or multiple ecosystems for that matter, we should also take into account the following subjects:
- Responsibility: define who is responsible for what, and who might be responsible for possible negative consequences of tokenizing the ecosystem.
- Weaker parties: weaker parties might need a helping hand to participate in the tokenization of ecosystems.
- Ethics: is the tokenized ecosystem designed with ethics in mind?
- Skin in the game: who are the parties at risk when tokenizing an ecosystem? Who can quickly adapt and who might need more help? It is important to try to keep everyone on board, and diversity within and between ecosystems could add a lot of value.
With the above components in place, it becomes easier to design and grow a tokenized ecosystem.
Messaging and Engagement
Apart from designing a tokenized ecosystem, communication around your ecosystem is also vital to drive change. For many, the concept of tokenization is not clear, let alone the benefits of purpose-driven tokenization. Therefore, a clear message explaining why a token is used and what the benefits are for the ecosystem as a whole is important to ensure adoption. This includes not only the right marketing material but also seeing the ecosystem in production, so users can experience the benefits of a tokenized ecosystem for themselves. A demo or proof of concept will not be sufficient; only real applications will get tokens out of the taboo sphere.
However, it is not sufficient for each tokenized ecosystem to explain tokens and the benefits of tokenization on its own. There must be clear messaging and engagement at a higher level, which could benefit all tokenized ecosystems. This links back to the need for a shared understanding, standards and clear definitions. It would allow us to explain what tokenization can do for society, which would benefit all tokenized ecosystems and speed up adoption. Leveraging success stories as showcases would certainly help, but showcasing failures can also contribute, as we can learn from our mistakes; in other words, we need to educate people on tokenization.
Tokenization, if done right, can have broad benefits for society; the challenge is getting the implications and the opportunity widely understood.
Knowledge and Skills
Education of consumers, companies, regulators and policymakers is vital for tokenized ecosystems to succeed. Many applications, such as decentralised autonomous organisations (DAOs), are too complicated and technical for most people to understand, which limits effective governance. In addition, crypto has a reputation problem, partly due to the many scams the world has seen in recent years. Therefore, for tokenized ecosystems to succeed, we need to educate and increase the community's knowledge of tokenization to create trust. This can only be achieved when different industry players, such as regulators, policymakers, startups and investors, actively collaborate when designing the ecosystem and share their insights with the broader community.
Unfortunately, regulatory clarity will likely take too long to arrive, which could mean that organisations need to take ownership and move ahead regardless. This can only work if the organisations moving forward ensure trust in their product by adhering to ethical standards: education through action.
Problem-Solution Fit
When designing and developing tokenized ecosystems, it is important that there is a real problem that can be fixed with tokenization. As with any startup, validation is, therefore, vital for success. Are you designing and developing the right ecosystem in the right way?
What can tokens bring to the table that cannot be achieved without them? To build trust in the wider community, it needs to be clear to (potential) users, clients and regulators why a token is necessary and how it will be used. What economic value will the token bring to the ecosystem? These are important questions that need to be answered, and shared with relevant stakeholders, prior to building a tokenized ecosystem. Since tokenization is so new, this can only be achieved through active collaboration with all stakeholders before developing the ecosystem.
Tokenization and the Law
When designing tokenized ecosystems, legal compliance is essential to stand apart from unethical and fraudulent counterparts. Therefore, legal design thinking belongs at the core of every project. It is important to have a clear view of the legal aspects of new developments, to obtain regulatory cooperation and to collaborate closely with regulators and policymakers (for example, using sandboxes) when designing the tokenized ecosystem. While the community should welcome regulation, regulators and policymakers need to develop regulations that do not stifle innovation. This requires more cooperation and open dialogue between innovators and regulators.
Conclusion
Standardisation of terminology, education of the wider community and collaboration with relevant stakeholders such as regulators and policymakers are important preconditions for developing successful tokenized ecosystems. It is up to regulators and policymakers to establish clear regulations that enable innovation instead of limiting it, especially since tokens do not adhere to borders.
The opportunity is there if we can get the right environment for tokenization in place. If this does not happen, startups and ecosystems looking to leverage the benefits of tokenization will move elsewhere, as those benefits are too big to ignore.
This second round table session was again a great success, showing what the preconditions are for developing tokenized ecosystems. Purpose-driven tokenization can drive and change behaviour, but all stakeholders must be involved from the start, which is precisely what we are doing in developing the 2Tokens ecosystem.
The final round table will take place on February 11, when we will take a deep dive into all aspects of Token Financing. If you would like to contribute and/or engage on this topic, please indicate your interest to join; you can register here.