Logical Pillars of Token Utility

Crypto projects more often than not propose obscure and overly complex protocols and frameworks for the industries they target. Some of these mental gymnastics are done to loop the project's own ERC20 token into the solution they are conducting an ICO for. There is nothing wrong with forcefully looping a token into a protocol and thereby rationalizing the funding of development costs. After all, code doesn't manifest itself overnight. It takes time, money and continuous testing and improvement to become reality.

However, if the mechanism by which the token is looped into the protocol is redundant or impractical, there is a significant risk that the token's use will be abstracted away as time passes and actors' knowledge of blockchain grows. After all, the inherent benefit of cryptocurrencies lies in the fact that they allow frictionless transfer, ownership and agency that can't be centrally controlled (i.e. you can fork open source code and improve it). Due to these characteristics, open blockchain protocols such as the GET Protocol aren't able to prevent actors from rewriting the contracts to make them cheaper to use (nor should we even want to control that — if anything, this is what makes open source great in the first place!).

Don’t try to control something that is inherently free bro.

With that in mind, it is crucial that when designing an open protocol with a native token, one considers not only what utility this token provides, but also how much sense the addition of the token really makes. I suggest we go on a short randomish tangent and introduce a mental model that can help us make this assessment.

The razor of William Occam

“Entia non sunt multiplicanda praeter necessitatem” — William of Occam

When one casually reads some medieval Latin literature on a regular Sunday afternoon, like I sometimes do (lie), one might have come into contact with the writings of the English Franciscan friar, scholastic philosopher, and theologian William of Occam. This scholar is credited with the now prevalent thought that “entities are not to be multiplied beyond necessity.”

While commonly used as a mental framework for preferring the simpler explanation of a phenomenon over complex and interconnected ones, I find the model helps to assess the logic of a proposed solution as well. It is rather easy to overcomplicate and over-engineer a solution to the point that it loses all practical applicability — but hey, at least it covers all edge cases imaginable (guiltyyyyy).

With the razor in mind, the question one should ask is: when assuming that a blockchain is needed to solve ticket scalping and fraud, why add the abstraction layer of an ERC20 token? What does an ERC20 token add to the solution offered by the blockchain? Why not use Ether instead? At its core, what benefit does this token really add to the overall solution of the problem the protocol is solving? Does the token create a network effect, or does it in some way reduce costs (fee pooling) or mitigate collective risk (like insurance)?

A token as a means of raising funds for the development of the protocol is, by the way, valid at its core. After all, a protocol can, as owner/developer of the infrastructure used, force users to pay fees in the native token for contract usage. In the short term, you're in the clear utility-wise. In the long term, however, it would be shortsighted and naive to assume this mechanism remains effective. Forced token usage for transaction costs is a limiting measure, and it creates friction however you look at it. As such, it's not unthinkable that this friction will be abstracted away by smart developers over time (by forking the open source code, removing the friction and using Ether instead of the native token).
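The economics of that forking incentive can be sketched in a few lines. This is a purely illustrative toy model with made-up numbers (the fee, the swap spread and the function name are all hypothetical): forcing fees in a native token means users must first swap Ether for the token, and that swap costs a spread, so a fork that accepts Ether directly is strictly cheaper.

```python
# Toy model (hypothetical numbers) of the fee-friction argument:
# paying protocol fees in a forced native token means swapping
# Ether -> token first, and every swap costs a spread on top.

def cost_in_eth(fee_eth: float, swap_spread: float, forced_native_token: bool) -> float:
    """Total cost in ETH of paying one protocol fee.

    If the protocol forces payment in its native token, the user
    pays the swap spread on top of the fee itself.
    """
    if forced_native_token:
        return fee_eth * (1 + swap_spread)
    # A fork that accepts Ether directly charges only the fee.
    return fee_eth

original = cost_in_eth(fee_eth=0.01, swap_spread=0.003, forced_native_token=True)
fork = cost_in_eth(fee_eth=0.01, swap_spread=0.003, forced_native_token=False)

# The fork is strictly cheaper, so the friction itself invites forking.
assert fork < original
print(f"original: {original:.5f} ETH, fork: {fork:.5f} ETH")
```

However small the spread, the fork wins on price — which is exactly why utility that rests only on forced fee payment erodes over time.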

If you want to ensure token utility in the long term, there need to be token functionalities that cannot be abstracted away — functionalities where using GET adds value in a way that Ether can't. How? I'll tell you.