The Need for Interoperability in Security Token Protocols – Thought Leaders


A Short History of Tokens

It’s been over 10 years since Bitcoin first introduced blockchain technology to the world. In that time, the list of potential use cases for distributed ledgers has expanded rapidly, from digital currencies to supply chains to identity management. At their core, however, many of these use cases share a similar structure: they enable users to hold and transfer digital assets on a peer-to-peer basis. Put simply, we can now trade and track digital assets without needing a central trusted authority to manage the process.

This evolution of the space naturally led to the invention of “tokens” – digital assets on a blockchain that are ownable and transferable between individuals. Tokens are split into two main categories: those that represent a natively digital asset, and those that represent an underlying real-world asset. Leveraging this new paradigm, hundreds of thousands of different tokens have already been created on Ethereum alone, with a combined market cap of over $15 billion at the time of writing.

One of the most promising applications of tokens is the representation of real-world securities on-chain, which allows traditionally illiquid assets like commercial real estate to be fractionalized and transferred peer-to-peer. This process, known as “tokenization”, has gained significant mindshare from both legacy institutions and new start-ups, due to its potential to alleviate many existing pain points within the capital markets.

Regulatory Compliance

While blockchain can make it easier to transfer ownership in a technical sense, security tokens are still subject to the same laws and regulations as traditional securities. Ensuring security tokens comply with regulation is therefore critical to any tokenization effort, and has been a barrier to adoption to date. Regulatory uncertainty is widely considered the largest barrier to blockchain adoption.

Numerous projects have emerged in the blockchain space, each designing a protocol that attempts to simplify and standardize how security tokens are regulated, traded, and managed. On Ethereum alone, projects that have published standards tackling this problem include Securitize, Harbor, Polymath, and more. Ultimately, however, without modifications to how these protocols are currently designed, investors and exchanges will continue to experience significant friction when buying and selling tokenized securities. Why is this? Interoperability.

Interoperability is Crucial

Interoperability is one of the most significant benefits of tokenization. It allows an entire ecosystem of capital markets applications and products to integrate with one another because they share common software standards. However, to enable interoperability at the application and product level, it must begin at the lowest level: with the tokens themselves. In the security token space, interoperability is essential for two key parties: exchanges and investors.

As an exchange, you want to be able to authorize investors for the purchase of any security token that they are eligible to buy – no matter which company created the token. This means not having a bespoke integration with each security token, but a simple and generic integration that is uniform across all security tokens.

As an investor, you want the onboarding process to be as simple and frictionless as possible. Currently, when investors want to purchase shares from multiple places, they have to provide their personal information time and time again in a process called Know Your Customer (“KYC”). Blockchain has the potential to transform this process by storing this information immutably on-chain, where it can then be referenced by all security tokens. This would mean not having to repeatedly provide the same personal information every time you wish to purchase a new token; instead, only supplementary or updated information would be required after the initial registration. However, this will only be possible if interoperability between security tokens is designed into the standards that govern the system.

The Protocols

Three of Ethereum’s leading security token protocols were published by Securitize, Harbor, and Polymath. All three of these protocols are built upon Ethereum’s ERC-20 token standard, which they extend to enforce compliance on every trade of the security token. This is achieved by querying a second contract on the legality of each trade at the time it happens.

Whilst named differently in each protocol, the use of a second contract is consistent across all three and achieves the same result: preventing non-compliant trades. This second ‘Regulator’ contract is kept up to date with users’ KYC and accreditation information by off-chain services that are authorized to do so – for example an exchange, or the token’s issuer.
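
To make the pattern concrete, here is a minimal sketch in TypeScript rather than Solidity: the token holds balances, and every transfer first asks a separate Regulator whether the trade is allowed. The names used (`Regulator`, `checkTransfer`, `RegulatedToken`) are illustrative assumptions and do not correspond to any one protocol’s actual contracts.

```typescript
// Minimal model of a regulated token: every transfer is checked against a
// separate Regulator contract before balances move.
interface Regulator {
  // Should return true only if the proposed trade is compliant, e.g. both
  // parties have passed KYC and any holding periods or holder limits are met.
  checkTransfer(token: string, from: string, to: string, amount: bigint): boolean;
}

class RegulatedToken {
  private balances = new Map<string, bigint>();

  constructor(private readonly address: string, private readonly regulator: Regulator) {}

  balanceOf(owner: string): bigint {
    return this.balances.get(owner) ?? 0n;
  }

  mint(to: string, amount: bigint): void {
    this.balances.set(to, this.balanceOf(to) + amount);
  }

  // ERC-20-style transfer, extended with the compliance check.
  transfer(from: string, to: string, amount: bigint): void {
    if (!this.regulator.checkTransfer(this.address, from, to, amount)) {
      throw new Error("Transfer rejected: trade is not compliant");
    }
    if (this.balanceOf(from) < amount) {
      throw new Error("Insufficient balance");
    }
    this.balances.set(from, this.balanceOf(from) - amount);
    this.balances.set(to, this.balanceOf(to) + amount);
  }
}
```

In the live protocols this check sits inside the token’s Solidity transfer functions, so a non-compliant trade simply reverts on-chain; the model above only mirrors that control flow.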

Although these three components (the token contract, the Regulator contract, and the authorized off-chain services) may seem like everything you need to regulate a security token (and in the simplest form, they are), it is how the components are programmed that really determines interoperability. Sadly, the protocols lack interoperability in two key areas, which will continue to cause friction and slow adoption of this technology:

 

  1. How do authorized parties update on-chain information about users?

 

Harbor

Harbor declares in its whitepaper that it will be the only party authorized to update user information on-chain for the time being. The centralization of this role means that exchanges cannot update any data referenced by the Regulator, and therefore cannot approve new recipients of the token, preventing tokens from being easily traded outside of the Harbor platform.

 

Securitize

Securitize have already implemented a system whereby multiple parties can be authorized, meaning investors can register their compliance information in multiple places and are not required to go through Securitize themselves. The on-chain data is then updated directly by the authorized party, and can be viewed by all of Securitize’s tokens. Furthermore, to prevent investors from having to provide information multiple times, Securitize have designed an API to allow authorized parties to access the private information about investors that is stored off-chain, enabling them to easily determine whether an individual is compliant or if more information is needed.

 

Polymath

Polymath has a native digital utility token called POLY that is required throughout their platform to perform various tasks, including getting an authorized party to update your on-chain data. For individuals to KYC themselves, they must first purchase POLY tokens; since there is no liquid fiat-to-POLY market, they must buy another cryptocurrency such as Ethereum’s “ether” (ETH) with fiat and then exchange it for POLY. The tokens can then be used on Polymath’s KYC marketplace to make a bid to a KYC provider. If the KYC provider accepts the bid, they are paid in POLY tokens to perform the KYC check for the individual. This process is clearly a significant onboarding friction on the Polymath platform, and makes the process more complex than necessary.

 

  2. How is this information about users then stored and accessed on-chain?

 

Harbor

From looking at the whitepaper and the smart contracts on GitHub, it is technically possible for many of Harbor’s tokens to share one common Regulator contract, and with it one common source of user data; however, this is unlikely due to the differences in regulation between tokens. With no Harbor tokens live on Ethereum, it remains unclear whether this is their intention, or whether each token will be deployed with its own Regulator.

 

Securitize

Securitize’s protocol is designed such that their Regulator contract queries a third smart contract which stores user information. This enables each token to have unique regulations encoded in its own individual Regulator, whilst still sharing a common source of user data in the third contract. As a result, when a user completes KYC for one Securitize token, their information is already stored and ready for them to buy future tokens.
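
The sketch below models this three-contract layout: a single registry of investor records, written only by authorized parties, with each token’s Regulator reading from it while applying its own rules. The class and field names (`InvestorRegistry`, `TokenRegulator`, `kycApproved`, `accredited`) are assumptions for illustration, not Securitize’s actual contract interfaces.

```typescript
// Self-contained sketch of a shared on-chain data store behind per-token Regulators.
interface InvestorRecord {
  kycApproved: boolean;
  accredited: boolean;
}

// A single shared source of investor data, writable only by authorized parties.
class InvestorRegistry {
  private records = new Map<string, InvestorRecord>();
  private authorized = new Set<string>();

  constructor(owner: string) {
    this.authorized.add(owner);
  }

  authorize(caller: string, party: string): void {
    if (!this.authorized.has(caller)) throw new Error("Not authorized");
    this.authorized.add(party);
  }

  update(caller: string, investor: string, record: InvestorRecord): void {
    if (!this.authorized.has(caller)) throw new Error("Not authorized");
    this.records.set(investor, record);
  }

  get(investor: string): InvestorRecord | undefined {
    return this.records.get(investor);
  }
}

// Each token deploys its own Regulator carrying token-specific rules, but all
// Regulators read the same shared registry, so one KYC serves every token.
class TokenRegulator {
  constructor(
    private readonly registry: InvestorRegistry,
    private readonly requireAccreditation: boolean,
  ) {}

  checkRecipient(to: string): boolean {
    const record = this.registry.get(to);
    if (!record || !record.kycApproved) return false;
    return this.requireAccreditation ? record.accredited : true;
  }
}
```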

 

Polymath

It’s not explicitly stated in their whitepaper whether Polymath has a central source of compliance data stored on-chain that each Regulator then interacts with, or if tokens have their own local source of information. However, based on Polymath’s sample contracts, it appears that each token uses a local source of information, which is not shared between different tokens. While this may have advantages, this setup risks data redundancy and inconsistencies.

Take the following example: Bob has expressed interest in two Polymath security tokens, ABC and DEF, and has been approved as an investor for each of them. This information is sent to the Regulator contract for each of the tokens. A month later, Bob tries to purchase further DEF tokens but it is found that he is no longer accredited. This information is sent to DEF’s Regulator to update Bob’s investor status to be non-accredited. Now, on-chain, there is conflicting information: ABC thinks that Bob is a verified investor, however DEF disagrees. It is easy to see that having a central source of information would prevent such discrepancies from occurring.
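
The same example in code: with per-token data stores, the same fact about Bob is recorded twice and the two copies can drift apart. The `LocalInvestorStore` class below is an illustrative stand-in for each token’s Regulator data, not Polymath’s actual contract.

```typescript
// Illustration of the inconsistency risk with per-token compliance data.
class LocalInvestorStore {
  private accredited = new Map<string, boolean>();

  setAccredited(investor: string, status: boolean): void {
    this.accredited.set(investor, status);
  }

  isAccredited(investor: string): boolean {
    return this.accredited.get(investor) ?? false;
  }
}

const abcStore = new LocalInvestorStore(); // ABC token's local data
const defStore = new LocalInvestorStore(); // DEF token's local data

// Bob is approved as an accredited investor for both tokens.
abcStore.setAccredited("bob", true);
defStore.setAccredited("bob", true);

// A month later Bob's accreditation lapses, but only DEF's store is updated.
defStore.setAccredited("bob", false);

console.log(abcStore.isAccredited("bob")); // true  -> stale view held by ABC
console.log(defStore.isAccredited("bob")); // false -> current view held by DEF
// A single shared registry would have made the second update visible to both tokens.
```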

Interoperability of the Protocols

As discussed previously, there are two main parties involved in the issuance and exchange of security tokens to whom interoperability will matter greatly: exchanges and investors. Both of these parties desire a smooth experience when interacting with different security tokens. So, if using the protocols as-is, let’s take a look at how exchanges and users will be affected.

Exchanges

As an exchange, integrating these protocols for the purposes of transfer is easy: all of the tokens utilize the ERC-20 token standard, providing a uniform interface for invoking transfers, approvals, and balance checks. However, further integration with the compliance aspect of each protocol becomes far more complex. You’ll remember it is not currently possible for a trusted party to become authorized on Harbor’s protocol – they will instead have to direct users to Harbor to KYC themselves. To integrate with Securitize’s protocol, the trusted party must be authorized by Securitize, which then allows them to access investor KYC data through the off-chain API and to update the records held in the on-chain data store.
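
For reference, the uniform surface mentioned above is the standard ERC-20 function set, shown here as a TypeScript interface over an asynchronous contract wrapper; the standard itself is defined for Solidity contracts (EIP-20), so this is just the shape an exchange integrates against.

```typescript
// The uniform ERC-20 surface every security token exposes for transfers.
interface ERC20 {
  totalSupply(): Promise<bigint>;
  balanceOf(owner: string): Promise<bigint>;
  transfer(to: string, value: bigint): Promise<boolean>;
  transferFrom(from: string, to: string, value: bigint): Promise<boolean>;
  approve(spender: string, value: bigint): Promise<boolean>;
  allowance(owner: string, spender: string): Promise<bigint>;
}

// Because every token exposes this same interface, one generic routine can
// settle any of them; the protocol-specific compliance check happens inside
// the token's own transfer logic and simply causes the call to fail if blocked.
async function settle(token: ERC20, buyer: string, amount: bigint): Promise<boolean> {
  return token.transfer(buyer, amount);
}
```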

Integrating with Polymath’s protocol is likely the most complex. The trusted party must register as a KYC provider on Polymath’s KYC marketplace and set themselves up to receive bids in POLY tokens in return for providing KYC services. In providing KYC services to investors, the trusted party must then organize a way to ensure that the duplicative on-chain data stored about a user in each security’s Regulator does not become inconsistent.

Not only do the protocols have different interfaces that the trusted party must integrate with, but each protocol also has a different way of reporting errors to the exchange. When building an interface, it is important to be able to translate any errors that occur into something understandable by users. For example, if a user cannot purchase a token, this could be for a wide variety of reasons: the security may have a holding period that has not yet been satisfied, or may restrict the maximum number of permissible holders. To communicate these messages to users, the exchange would have to integrate with a different method of error reporting for each protocol.
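
A brief sketch of what that burden looks like in practice: one protocol might surface failures as numeric codes, another as revert reason strings, so the exchange ends up writing one adapter per protocol. Both protocols here are hypothetical, and the error codes and reason strings are invented purely for illustration.

```typescript
// One adapter per protocol to turn compliance failures into user-facing messages.
type UserMessage = string;

// Hypothetical protocol A: failures arrive as numeric codes.
function translateNumericError(code: number): UserMessage {
  switch (code) {
    case 1: return "You have not completed identity verification for this token.";
    case 2: return "This security is still within its required holding period.";
    case 3: return "This security has reached its maximum number of holders.";
    default: return "This transfer is not currently permitted.";
  }
}

// Hypothetical protocol B: failures arrive as revert reason strings.
function translateReasonString(reason: string): UserMessage {
  if (reason.includes("KYC")) return "You have not completed identity verification for this token.";
  if (reason.includes("LOCKUP")) return "This security is still within its required holding period.";
  if (reason.includes("HOLDER_LIMIT")) return "This security has reached its maximum number of holders.";
  return "This transfer is not currently permitted.";
}
```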

Investors 

The different ways in which investor onboarding is currently designed in the protocols mean that investors will likely have to provide personal information many times, to different platforms, and in different ways. This is caused by the fact that Harbor have not authorized any other parties, and that Polymath require investors to bid for KYC processes using POLY tokens. The friction caused by the enforcement of these compliance methods may render investors unwilling or unable to purchase securities they would otherwise purchase.

The scale of this protocol-induced friction on investors may be somewhat alleviated by the manner in which exchanges go about integrating each of the protocols. For example, if an investor chooses to KYC on an exchange to purchase a Polymath token, that exchange, if authorized, could choose to update Securitize’s data storage at the same time. This would mean the investor’s information is on-chain in case it is needed in the future. However, if no changes are made to the current protocol designs, then the process of registering and purchasing securities will remain daunting.

Solutions

The solution to this problem need not be complex. In fact, it is possible to introduce certain solutions without changing any tokens that are already live on Ethereum. An ideal solution – one that results in minimal friction for both exchanges and investors, and that prevents the data inconsistencies caused by having many different sources of compliance data – would closely resemble Securitize’s centralized on-chain data store; however, any such set-up must then be adopted on an industry-wide scale.

By having a central source of information on-chain, the risk of data inconsistencies is removed, and investors are able to purchase different securities after just one compliance verification. This central contract would verify that a transfer is compliant for any security token, and the transfer would then proceed or revert accordingly. An off-chain API accessible to all authorized exchanges means that investor compliance information can be communicated to exchanges, reducing the number of times investors must be asked to provide data. Together, these aspects also massively reduce the amount of integration work required by exchanges.
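
A minimal sketch of the investor-facing side of such a set-up, assuming a shared registry fronted by an off-chain API for authorized exchanges: the exchange checks the investor’s status first and only collects whatever the registry is still missing. The API shape, method names, and fields (`getStatus`, `submit`, `missingFields`) are assumptions made for illustration.

```typescript
// Exchange-side onboarding against an industry-wide compliance registry.
interface ComplianceStatus {
  kycApproved: boolean;
  missingFields: string[]; // data the registry still needs for this investor
}

interface ComplianceAPI {
  getStatus(investor: string): Promise<ComplianceStatus>;
  submit(investor: string, fields: Record<string, string>): Promise<void>;
}

// An authorized exchange onboards an investor once; if the investor already
// registered elsewhere, only missing or updated information is requested.
async function onboard(
  api: ComplianceAPI,
  investor: string,
  collected: Record<string, string>,
): Promise<boolean> {
  const status = await api.getStatus(investor);
  if (status.kycApproved) return true; // already verified via another venue

  const toSubmit: Record<string, string> = {};
  for (const field of status.missingFields) {
    if (collected[field] !== undefined) toSubmit[field] = collected[field];
  }
  await api.submit(investor, toSubmit);

  return (await api.getStatus(investor)).kycApproved;
}
```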

The introduction of a new system like this clearly causes some complications, and a number of issues would still have to be ironed out – for example, the design of how each exchange becomes authorized: who decides that an exchange should be trusted? Time must be taken to design a system that allows consensus to be reached.

Conclusion

The tokenization of securities is still early in its development and adoption, in part due to the complexities of regulatory compliance. While the published protocols simplify compliance with many of these regulations by enabling them to be enforced in the execution of every transfer, there is still a long way to go before this is a seamless process. Until the protocols agree on how investor information is stored and updated, both on-chain and off-chain, there will remain significant friction throughout the registration and investment processes for all parties involved.

Alice Henshaw is a smart contract engineer at Fluidity. Fluidity are a New York-based company working on DeFi, and are best known for creating the decentralized exchange Airswap. Previously Alice worked at ConsenSys, where she designed and implemented smart contract systems responsible for over $100M in transaction volume. She is a graduate of Oxford University with a degree in Computer Science.