A guide through Security Token Offerings (STO) — part 1: key benefits

Pascal Egloff
6 min read · Jan 15, 2021


The following article is the first part of a series of three. It addresses interested investors, issuers, banking and legal experts, and tech-savvy individuals alike.

Intro: Tokenization of assets / securities

Tokenization is based on distributed ledger technology (DLT), of which blockchain is the most prominent variant. In recent years, these promising technologies and their practical applications have evolved continuously. Knowledge among academics as well as practitioners has grown steadily, and several use cases have been developed and tested. After a first phase of initial coin offerings (ICOs) that sometimes led to disappointment among market participants, a new era of tokenization seems to be emerging. Instead of tokenizing utility rights, the focus now lies on assets and securities. This results, for example, in equity or debt tokens, which represent traditional financial instruments wrapped in a new, decentralised and (more than ever) digitalised form.

[Figure: Evolution of digital assets (security tokens), showing three kinds of conversions from "traditional" to "digital" assets by tokenization]

To make this work (and eventually make it worthwhile), the entire traditional financing and investing process, as well as the respective infrastructure, must be digitalised too. The challenge lies in combining knowledge of capital-market mechanics, legal and regulatory requirements and technological possibilities into a profitable business model that benefits all participants of the underlying ecosystem.

This intro is part of a series of short articles pointing out some of the important aspects of tokenization, its possibilities, and its challenges in changing the current capital markets. The articles are part of a research project by the Eastern Switzerland University of Applied Sciences. The project aims to assess and combine the needs of all relevant market participants into a framework and to standardise the tokenization process. This shall eventually also define a certification process for future issuances.

The current project partners represent the various stakeholders of a tokenization: financial industry (fedafin, Hypothekarbank Lenzburg, PFLab by PostFinance), legal (Kaiser Odermatt & Partner, VISCHER, WalderWyss) and technology (Drakkensberg, KORE Technologies). The research project is co-financed by Innosuisse and open for additional project partners (details: www.digital-assets.ch).

The goal is to unite as many industry leaders as possible to build a broadly supported standard for (truly) digital assets. This standard shall also include existing guidelines, handbooks and benchmarks.

This series of articles consists of the following three parts:

Part 1: The key benefits of tokenizing assets / securities
Outlining the benefits of tokenization and some of its challenges

Part 2: The tokenization process
Explaining the different phases of the tokenization process and describing the aspects that need to be addressed by a tokenization project.

Part 3: The importance of interoperability in tokenization
Highlighting the essential role of «interoperability» in the process of building a tokenization market.

The series is based on the assessment of more than 50 interviews with legal experts, financial institutions, secondary markets professionals, tokenization technology providers, security token issuers and investors as well as on the analysis of articles and guidelines from market organisations and academia.

Part 1: The key benefits of tokenizing assets / securities

In the early days of digital assets, the expected marketing effect was often one of the drivers for a tokenization; recently, however, this effect has diminished. Fundamentally, the main benefits of tokenization are directly linked to the advantages of distributed ledger technology (DLT). This is why efficiency gains, higher liquidity and more transparency are probably the most frequently cited benefits of tokenizing assets and securities. While all three factors make sense from a high-level perspective, it is important to take a closer look at what exactly they mean.

1. Efficiency gains

The transfer of assets via a DLT solution is carried out by smart contracts (more on this in part 3). One can think of this as an automation of actions. While traditional financial markets require lengthy settlement and clearing processes, with security tokens this step is carried out automatically by the smart contract underlying the respective tokens. This is often referred to as atomic settlement: the delivery of the token and the payment happen at the exact same time, and each is conditional on the other.
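The idea of atomic (delivery-versus-payment) settlement can be illustrated with a minimal sketch. This is plain Python and purely illustrative; real security tokens implement this logic in on-chain smart-contract code, and all names here are hypothetical:

```python
# Illustrative sketch of atomic settlement: both legs of the trade
# (token delivery and cash payment) succeed together or not at all.

class SettlementError(Exception):
    pass

def atomic_settle(token_ledger, cash_ledger, seller, buyer, qty, price):
    """Transfer `qty` tokens and `price` cash in one all-or-nothing step."""
    # Check both preconditions before touching either ledger.
    if token_ledger.get(seller, 0) < qty:
        raise SettlementError("seller lacks tokens")
    if cash_ledger.get(buyer, 0) < price:
        raise SettlementError("buyer lacks funds")
    # Both legs execute together; neither party bears settlement risk.
    token_ledger[seller] -= qty
    token_ledger[buyer] = token_ledger.get(buyer, 0) + qty
    cash_ledger[buyer] -= price
    cash_ledger[seller] = cash_ledger.get(seller, 0) + price

tokens = {"alice": 100}
cash = {"bob": 5000}
atomic_settle(tokens, cash, "alice", "bob", qty=10, price=1200)
print(tokens, cash)  # {'alice': 90, 'bob': 10} {'bob': 3800, 'alice': 1200}
```

The point of the all-or-nothing structure is that, unlike a multi-day clearing chain, there is no moment at which one party has delivered and the other has not.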

On top of the basic issuance and transfer of assets in a tokenized, digital form (hence "digital assets"), many tokenization platforms and their technology providers also offer digitally enabled functions for corporate actions or reporting requirements (e.g. digital voting). This automation and digitalisation of processes can certainly yield efficiency gains, which in turn can enable a broader use of additional functionalities (e.g. the issuance of employee shares). Depending on the cost of building a tokenization solution, or the fees for using one for oneself or one's clients, these efficiency gains can also translate into monetary gains. This, however, has to be assessed on a case-by-case basis.
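As a toy illustration of such a corporate action, consider a token-weighted shareholder vote. The sketch below is illustrative Python, not actual platform code; the function and account names are hypothetical:

```python
# Illustrative sketch: a digital shareholder vote automated over a token
# ledger, with each holder's vote weighted by their token balance.

def tally_vote(token_balances, votes):
    """Return yes/no totals, weighting each vote by token holdings."""
    result = {"yes": 0, "no": 0}
    for holder, choice in votes.items():
        result[choice] += token_balances.get(holder, 0)
    return result

balances = {"alice": 60, "bob": 40}
print(tally_vote(balances, {"alice": "yes", "bob": "no"}))
# {'yes': 60, 'no': 40}
```

Because the token ledger already records who holds how much, the vote weighting comes "for free"; this is the kind of follow-on functionality the digitalised infrastructure enables.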

2. Higher liquidity

Tokenizing assets, and thereby digitalizing them, is meant to make them accessible to a broader range of investors and ultimately lead to higher liquidity of the underlying. In particular, the prospect of establishing some kind of secondary market fuels this assumption. In the short term, these options are limited: merely making historically illiquid assets easier to trade does not necessarily create a liquid market as we know it from blue chips (equity tokens of SMEs, for example, will hardly ever be traded as regularly as SMI stocks).

Nevertheless, this might not be necessary. There are plenty of private placements that could benefit from even slightly higher liquidity. And who knows: over time, as more and more assets are tokenized, new products may evolve, new investors may enter the markets, and lower transaction fees could well allow more active trading in general. Time will tell.

3. More transparency

The legal rights and duties attached to a token can be embedded in its smart contract. Thus, the security token itself provides, at a minimum, an immutable record of ownership. Other features (for example, the right to call for an extraordinary general assembly) can either be included in the code of the smart contract [on-chain] or facilitated outside of it [off-chain].

This differentiation between information/logic that is being kept on the blockchain — on-chain — and information/logic that is being kept in a separate database — off-chain — is crucial for blockchain applications.

In the case of security tokens, any personal data about investors and issuers is normally kept off-chain. The reason is the need to comply with privacy laws such as the GDPR: information integrated into a smart contract, and therefore part of a blockchain, would be publicly available (if it is a public blockchain; more on that in part three of this series), and since a blockchain is immutable, it could not properly comply with GDPR requirements.
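One common pattern for this split, sketched here in illustrative Python (the dictionaries stand in for a private database and a public blockchain, and all names are hypothetical), keeps personal data in an erasable off-chain store and puts only a cryptographic fingerprint of it on-chain:

```python
# Illustrative sketch of the on-chain / off-chain data split.
import hashlib
import json

off_chain_db = {}       # private, erasable database: personal data
on_chain_registry = {}  # public, immutable ledger: balances + hashes only

def register_investor(address, personal_data, balance):
    """Store personal data off-chain; record only its hash on-chain."""
    record = json.dumps(personal_data, sort_keys=True).encode()
    digest = hashlib.sha256(record).hexdigest()
    off_chain_db[address] = personal_data  # can be deleted on request
    on_chain_registry[address] = {"balance": balance, "kyc_hash": digest}

register_investor("0xABC", {"name": "A. Investor", "country": "CH"}, 50)
```

Deleting the off-chain record then removes the personal data itself, while the on-chain hash and balance remain intact; whether a bare hash fully satisfies GDPR erasure is itself debated among legal experts, which is exactly why such design choices need the kind of legal review described in this series.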

Still, a tokenization (on a public blockchain) would i) allow issuers to reconcile their investor registry instantly and on a continuous basis and ii) grant investors immutable proof of their ownership rights. This intermediary-free transparency should strengthen trust in tokenized assets, which in turn is in line with the basic concept of every DLT application.

The challenge going forward

To sum up, all of the benefits explained above can genuinely come into play; some have already partly materialized. However, to really leverage them, the whole tokenization ecosystem must grow. In this process, more important than sheer scale is the compatibility of the various approaches and tokenization solutions. Hence, it is a question of interoperability. This is why the last part of the series will go into detail about what interoperability exactly means and how it could be achieved.

Stay tuned and let us know your thoughts on the topic at hand.

Authors:

Pascal Egloff, lecturer and project manager for Banking & Finance at the Eastern Switzerland University of Applied Sciences in St.Gallen. He is active in the fields of digital assets, corporate and project finance, financial technologies and innovation in banking.

Prof. Ernesto Turnes, head of the Competence Centre for Banking & Finance at the Eastern Switzerland University of Applied Sciences in St.Gallen, where his focus lies in the areas of digital assets, asset management, financial instruments, and innovations in banking. He is also chairman of a Swiss asset management company and a board member of a pension fund.
