From mining to utility

Why do most major cryptocurrencies include mining, whether as Proof of Work or Proof of Stake? What is its purpose? Is it needed?

Depending on who you ask, you'll get a few typical answers.

“It compensates you for ensuring the integrity of the ledger.”

“It helps spread wealth, avoiding centralization, because everyone can participate.”

For the first of these, about ensuring the integrity of the ledger, you could equally argue that transaction fees could finance it. You might reply that transaction fees would then need to be prohibitively high to keep the ledger operating smoothly, but why should running the hardware be so expensive in the first place? Just because it's expensive to run Bitcoin or some other Proof of Work ledger, there's no reason to assume this is how it must be.

If mining is required for the safekeeping of the ledger, you should be seriously worried about the consequences for Bitcoin once new coins are no longer issued.

With regard to spreading wealth, well, that's primarily a political argument. It assumes there's intrinsic value in a cryptocurrency coin, and that even with such value tied to the coin, centralization of this wealth is somehow avoided over time. I've seen no proof of that happening.

The result of mining is neither a material to work upon nor a tool to work with.

Bitcoin, with its mining setup, might have been a solution to a problem that didn’t exist. But it sparked a global brainstorming session, which is still ongoing. What can we do with this technology?

Imagine three stages so far in the history of blockchains and DLTs.

Stage 1

In early 2009, the Bitcoin ledger started operating. It didn't do much, but it solved a few practical problems around maintaining a distributed ledger, ensuring that said ledger operates as described in a setting without trust.

Other copies of the same technical solution emerged quickly, each with a slightly different take on how PoW should be done. Litecoin comes to mind.

These variations still don't differ much. They require different setups for mining new coins, and are basically just money grabs by people seeing an opportunity to jump on a trend. No fundamental business case development is represented among the various stage 1 ledgers.

All you can really do on a stage 1 ledger is move the so-called native asset between accounts on that ledger, with little concern for efficiency.

Stage 2

Some people see endless opportunity. But the thought of having to generate a completely separate ledger for each use case seems very wasteful. In an attempt to cater for every need, a full-blown Turing-complete system is put in place.

This allows everyone to run code on a global machine, where state is modified by immutable code stored on the ledger. The most prominent example of a stage 2 ledger is of course Ethereum, released in 2015.

Since the release of Ethereum we've seen many useful applications, and quite a few useless ones. But because it's not always obvious what is useful, a playground is needed. And from that playground, patterns start emerging. These patterns are what lead us to stage 3.

Examples of patterns that emerge:

Introduction of non-native assets, through a process of tokenization. Tokenization allows us to track who owns a share of this non-native asset, just like we can with the native cryptocurrency associated with the ledger.

Distributed trading venues, where there's no centralized exchange. These allow us to trade the tokenized non-native assets, exchanging them for something else: either a different non-native asset on the same ledger or the ledger's native asset.

A tokenized asset can be anything, really. It can be a barrel of oil or it can be purely digital. And we can also program derivatives and options as smart contracts, where the smart contract results in some payout given some condition.
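To make these patterns concrete, here is a minimal sketch in Python, not tied to any real ledger: a balance table for a tokenized non-native asset, plus a toy option-style payout. All the names (`TokenLedger`, `call_option_payout`, the accounts) are invented for illustration; a real system would add signatures, consensus and persistence.

```python
# Toy illustration of two stage 2 patterns: tokenized ownership
# and a conditional (option-style) payout. All names are hypothetical.

class TokenLedger:
    """Tracks who owns how many units of a non-native, tokenized asset."""

    def __init__(self, asset: str):
        self.asset = asset
        self.balances: dict[str, int] = {}

    def issue(self, account: str, amount: int) -> None:
        """Create new units of the tokenized asset for an account."""
        self.balances[account] = self.balances.get(account, 0) + amount

    def transfer(self, sender: str, receiver: str, amount: int) -> None:
        """Move units between accounts, refusing overdrafts."""
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount


def call_option_payout(spot: float, strike: float, units: int) -> float:
    """Toy derivative: pay max(spot - strike, 0) per unit held."""
    return max(spot - strike, 0.0) * units


oil = TokenLedger("OIL-BARREL")
oil.issue("alice", 100)           # tokenize 100 barrels for alice
oil.transfer("alice", "bob", 40)  # trade 40 of them to bob

print(oil.balances)                        # {'alice': 60, 'bob': 40}
print(call_option_payout(75.0, 70.0, 40))  # 200.0
```

The point is only that ownership tracking and conditional payouts are plain bookkeeping once expressed as code, which is why they recur as patterns across ledgers.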

Stage 3

While being able to run code on a global distributed machine is a technically sexy idea, it leads to a great deal of complexity. Without complicated sharding mechanisms, scaling such a machine becomes near impossible, and any form of sharding will just further complicate the setup.

If from stage 2 we see a set of clear use case patterns, why don’t we just natively support these instead? This is what stage 3 represents.

In short, we have a system that supports use cases like trading non-native assets, with a sufficient level of smart contract capability out of the box. And because it's built in, these operations run far more efficiently: faster and safer, while still on a distributed ledger.

Maybe the most prominent examples of stage 3 systems are Ripple and Stellar. These ledgers allow for any asset to be traded and exchanged natively within seconds, rather than minutes, at a fraction of the cost of a stage 2 system.

A stage 3 system doesn't let you run code in an on-chain Turing-complete machine, but it does provide building blocks for smart contracts. The difference lies in where execution happens: on or off ledger. In both cases the result is recorded on ledger, with all the verification guarantees you'd wish for.
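One way to picture the off-ledger model, as a sketch rather than how Ripple or Stellar actually implement it: a deterministic computation runs off ledger, and what gets recorded on ledger is the result together with a hash of the inputs, so anyone can re-run the computation and verify the recorded entry. All function names and the record format below are hypothetical.

```python
import hashlib
import json

# Sketch of off-ledger execution with on-ledger verification.
# Hypothetical structure, not any real ledger's format.

def execute_off_ledger(inputs: dict) -> dict:
    """Deterministic 'contract' run off ledger: settle a simple trade."""
    return {
        "buyer_gets": inputs["amount"],
        "seller_gets": inputs["amount"] * inputs["price"],
    }

def record_entry(inputs: dict, result: dict) -> dict:
    """What gets written on ledger: a hash of the inputs plus the result."""
    digest = hashlib.sha256(json.dumps(inputs, sort_keys=True).encode()).hexdigest()
    return {"inputs_hash": digest, "result": result}

def verify(entry: dict, inputs: dict) -> bool:
    """Anyone can re-run the computation and check the recorded entry."""
    return record_entry(inputs, execute_off_ledger(inputs)) == entry

trade = {"amount": 10, "price": 7.5}
entry = record_entry(trade, execute_off_ledger(trade))
print(verify(entry, trade))  # True
```

Because the computation is deterministic, verification is just re-execution plus comparison, which is what lets the heavy lifting move off ledger without giving up auditability.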

Overall trend

As the understanding of cryptocurrencies, blockchains and DLTs matures, it becomes more obvious which use cases fit this technology. DLTs should allow us to transfer ownership and liability as easily as we send an email. Accomplishing this should be practically free, and it gets easier to do as the software evolves.

During the process of achieving this, inefficiencies like mining get rooted out, because we realize this is not where the value potential belongs. It also raises interesting questions about how we should value native ledger assets, and whether they are really needed when we can put non-native assets on the ledger anyway.

One of the more interesting areas of research within DLTs today is the consensus protocol. Build a better and more efficient consensus protocol, and the benefits become available to any application built on a DLT that uses it.