The big Bitcoin news out this week (other than the fact that Segregated Witness appears to be on the verge of implementation) is that ViaBTC intends to block the rollout of the segwit soft fork. As I told a colleague of mine, this would make segwit dead on arrival.

{Don’t miss future essays by subscribing to Crypto Brief, my weekly newsletter about bitcoin, digital currencies, and the blockchain.}

I reached out to ViaBTC to learn more about their intentions and have an upcoming story for CoinDesk on the way. But suffice it to say, it appears that we’re at a standstill. This concerns me primarily because we are very close to a release that would fix transaction malleability, provide a modest capacity increase, and make possible many of the more advanced smart contracts we all dream of.

Nonetheless, this piece isn’t about ViaBTC blocking segwit or, truly, about the technological innovation itself.

Instead, I want to talk about scaling more holistically. Last week, I focused on fungibility and how important it is that 1 bitcoin is equal to 1 bitcoin. And I talked, in detail, about Schnorr signatures, which result in two benefits: improved scale and improved security. Naturally, we’re still some time away from implementing this, but I feel it is important to talk about what scaling bitcoin actually means.

Two Ways to Scale

I used to have a colleague who was really great at selling. And he launched a product that, in its glory days, was bringing in millions of dollars a year in revenue with immense profit. The operational process for managing that product became bloated, and costs increased significantly. But none of that mattered because revenue was doing wonderfully.

When I took this product over, revenue was down quite significantly and we were in jeopardy of having to close the product down. What we found was that there were a ton of costs that, if we removed them, would make the product viable again. So we made those cuts while also working to increase the revenue.

The moral of this story is that there are two ways to achieve profit. You could do what my colleague did, which was just continue increasing revenue. Or you could do what we did: spend time looking at everything in the middle of the P&L statement, cut costs, and work on increasing the revenue.

Scaling operates the same way. There are two ways in which you can achieve scale on the blockchain.

The “easy” answer is to just increase the overall block size from 1MB to whatever — that’s the approach my colleague took with revenue.

The “hard” answer is to focus on the size of the data being transmitted — focusing on everything in the middle of the P&L — with an understanding that an increase to the block size will have to happen one day.

The reason I put easy and hard in quotation marks is that it’s not so cut-and-dried that increasing the block size is easy or that segregated witness and other ideas are hard. Some might argue that it’s actually the opposite; that initiating a hard fork to increase the block size is actually the truly difficult feat, especially after what we saw with the first (of many) Ethereum hard forks.

What I believe is that we should focus on the size of the transaction and the speed at which that transaction transfers rather than just making the block larger. Using the P&L metaphor, if we can reduce the costs to run the business — shrink transaction sizes — that can have as much of an impact as increasing the revenue — increasing the block size.

Talking About Big Blocks

To many, the right way to scale the blockchain is the Lightning Network, which has had its first complete transactions on the Bitcoin testnet. While these took place in a very controlled environment, it’s exciting to see such transactions happen.

Naturally, this makes the main Bitcoin blockchain more of a settlement layer — settling all of those Lightning Network transactions — rather than a true transaction layer. Even Hal Finney, the very first person to have faith in Satoshi’s vision, believed this would be the reality.

In a post going all the way back to December 30, 2010, he said:

Bitcoin itself cannot scale to have every single financial transaction in the world be broadcast to everyone and included in the block chain. There needs to be a secondary level of payment systems which is lighter weight and more efficient. Likewise, the time needed for Bitcoin transactions to finalize will be impractical for medium to large value purchases.

He recognized that scaling Bitcoin to act as the transaction processor for all financial exchanges would be virtually impossible. Bitcoin XT (BIP 101), on the other hand, proposed raising the block size to 8MB and then doubling it every two years until it reached just above 8GB in 2036.
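That doubling schedule is easy to sketch out. A quick sanity check (the 2016 start year is my assumption, matching the proposal’s activation timeframe):

```python
# A rough sketch of BIP 101's growth curve: an 8MB cap that
# doubles every two years, reaching ~8GB twenty years later.
def bip101_cap_mb(year, start_year=2016, start_mb=8):
    """Block size cap in MB for a given year under the doubling schedule."""
    doublings = (year - start_year) // 2
    return start_mb * 2 ** doublings

for year in (2016, 2020, 2028, 2036):
    print(year, bip101_cap_mb(year), "MB")
```

Ten doublings take the cap from 8MB to 8,192MB — just over 8GB, as the proposal described.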

I was quite fond of this approach because it would have allowed everything to be done on the blockchain — that was the point after all, right?

But I put more thought into it and grew wary. We already have problems with block propagation, which decreases miner profitability through increased orphan rates. If blocks grew significantly larger, orphan rates could rise further (though there are wonderful efforts underway to mitigate the problem), reducing miner profitability even more. This would be a centralizing effect.

There is also debate that it would become increasingly more expensive to run a full node, which is what helps to relay transactions across the network. Right now, the blockchain is a little over 86 GB. If we increased the block size, that would make the blockchain far larger. Would there be a drop-off in the number of people running nodes if that occurred?

Remember … Part of the reason Bitcoin is so useful is because of its trustless nature. As nodes and miners drop off and the network becomes more centralized, we have to trust parties more. We already have to do this with mining becoming centralized; why make it worse?

One argument that I have heard — and even believed — is that, if we were to increase the size of the blocks, more people would use it and that, in turn, would mean that businesses would have to support the blockchain through running full nodes and mining hardware. That would counter any drop-off in node operators.

It’s a fair argument, but I don’t believe that we are anywhere near reaching mass adoption for bitcoin; therefore, we might see an instance where node operators drop off due to the growth in the blockchain before businesses are inclined to support the network. The inability to predict this transfer of resources makes scaling the block size risky.

Another argument I have heard from “big blockers” is that, because of full blocks, people are not using bitcoin. Either their transactions are taking far too long to complete or the fees are becoming too costly.

According to research by Rusty Russell back in June, the average cost per byte (to be the second cheapest transaction in a block) was only 5 cents. I sent 6 BTC today and my fee was a little over 700 bits, or $0.45. I don’t know about you, but the ability to send over $3,000 for $0.45 is absolutely incredible.
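As a sanity check on those numbers, here is the arithmetic. The ~$640/BTC exchange rate is my assumption, chosen to match the dollar figures quoted:

```python
# Back-of-the-envelope fee check.
# 1 bit = 100 satoshis = 1e-6 BTC.
BTC_USD = 640.0            # assumed exchange rate (not from the text)
fee_bits = 700
fee_btc = fee_bits * 1e-6  # 700 bits = 0.0007 BTC
fee_usd = fee_btc * BTC_USD
sent_usd = 6 * BTC_USD     # value of the 6 BTC transfer

print(f"fee: ${fee_usd:.2f} to move ${sent_usd:,.0f}")
```

At that rate, 700 bits works out to roughly $0.45 on a transfer worth well over $3,000 — consistent with the figures above.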

So really, are the costs that prohibitive for people to use Bitcoin?

What this ultimately means is that we do have time. Bitcoin is not failing. While it is true that blocks are becoming more full, I have seen no empirical evidence that people are not using bitcoin (or are, instead, using an altcoin) because bitcoin can’t support them. The only real reason altcoins even exist today is because of how easy ICOs are to roll out.

But time is not endless. And while segwit is close and Schnorr signatures could be here in another year or two, the secondary-layer applications need to get here, and they need to come with an advanced UX so that the average person can use bitcoin. Some of the companies I am most interested in, such as OpenBazaar, need to know that transactions will complete quickly and inexpensively. And if we don’t find a solution, these businesses could suffer.

We Need to Be Smart

Here’s the thing … Even though I am quite bullish on segregated witness, Schnorr signatures, the Lightning Network, and so many other wonderful ideas, none of it will matter over the long term because 1MB blocks are too small.

According to the Lightning Network White Paper:

If we presume that a decentralized payment network exists and one person will make 3 blockchain transactions per year on average, Bitcoin will be able to support over 35 million users with 1MB blocks in ideal circumstances (assuming 2000 transactions per MB). This is quite limited, and an increase of the block size may be necessary to support everyone in the world using Bitcoin.

That just doesn’t cut it. 35 million users making only 3 transactions a year is abysmal. With a projected world population of 8.5 billion in 2030, being able to support 0.4% of it is quite sad.
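The white paper’s estimate is straightforward to reproduce from its stated assumptions; the only figure I have added is one block roughly every ten minutes:

```python
# Reproducing the Lightning white paper's capacity estimate:
# 2,000 transactions per 1MB block, one block every ~10 minutes,
# three on-chain transactions per user per year.
TX_PER_BLOCK = 2000
BLOCKS_PER_YEAR = 6 * 24 * 365      # one block per ~10 minutes -> 52,560
TX_PER_USER_PER_YEAR = 3

tx_per_year = TX_PER_BLOCK * BLOCKS_PER_YEAR
users = tx_per_year // TX_PER_USER_PER_YEAR
print(f"{users:,} users supported per year")

world_2030 = 8.5e9                  # projected 2030 population
print(f"{users / world_2030:.2%} of projected 2030 population")
```

That comes out to roughly 35 million users, or about 0.4% of the projected population — matching the figures above.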

Even with all of the other scaling options, the reality is obvious: we will most likely need to scale the block size at some point. But that’s not today, which goes back to my point above about businesses adopting bitcoin. Perhaps in five years, as these other scaling options roll out, more businesses will begin supporting bitcoin. If that happens, decentralization could still be achieved while scaling the block size.

But here’s why we need to be smart …

When the time comes that we need to actually increase the block size, which will necessitate a hard fork, we need to do it the right way. We are talking about people’s money. If we get the rollout wrong, we might be stuck with two disparate chains, the same way that Ethereum is stuck with two. ETH and ETC are worth less than ETH was before the hard fork.

That means that we might need to give ample time — multiple months rather than weeks — to ensure that the rollout is seamless. Fortunately, if a hard fork does happen and someone doesn’t update, their bitcoin isn’t lost; they are just not on the network anymore until they update.

But that hard fork will have to come. And if the current development team doesn’t support that — and the market truly wants it — then another development team will release a hard fork that the markets support. But we’re not at the point where we need that. And, the reality is simple: the market does not support bigger blocks right now.

This is the reason that I don’t support a hard fork to 2MB right now. It’s not because we won’t need it at some point; it’s because it won’t actually be enough. If we go to 2MB, when will we need to do another hard fork to 4MB? Or 8MB? Or 100MB? How many hard forks will we need to roll out?

If we’re going to do a hard fork, I want it to be the right hard fork. And I want it to be a hard fork that analyzes where we are after many of the other scalability methods (segwit, lightning, Schnorr, etc.) are achieved. This will give us better data to work with and help us determine where we need to be from a block size perspective.

Bitcoin’s Future is Still Bright

I had a conversation with one of the Core Developers about the roll out of segregated witness. And one of the things he said to me really stuck with me: “Changing consensus rules is hard by design.”

Bitcoin is not just one thing for everyone. For me, it’s a store of value. For other people, it’s their primary transaction mechanism. Making changes has to be hard to ensure that neither the majority nor the minority can tyrannize the other.

But this doesn’t mean that Bitcoin doesn’t have a bright future. More people are learning about it. Scaling options are getting closer. Fungibility is being taken seriously. While all this politicking sucks, we’re talking about a revolutionary technology that is valued at over $10 billion. What’d you expect?

I am confident that we will get beyond these problems. And when the inevitable hard fork does happen, I believe that the network will remain decentralized.

{This is a new series of essays that I will be writing each week about topics that are of significant importance regarding bitcoin, digital currencies, and the blockchain. Subscribe to my weekly newsletter, Crypto Brief, so you don’t miss any future essays!}