There is a brewing argument and potential divide in the Bitcoin Cash community, but I think a lot of it stems from over-simplifications of how the Bitcoin system functions from a holistic point of view.

Why is a 1MB BSL (block size limit) bad?

The answer to this may seem obvious to many in the BCH community, but it's worth just covering it again to give a reference point for the other arguments I will put forward in the rest of this article.

To put it simply, a 1MB BSL (block size limit) is bad because the vast majority of the hardware connected to the Bitcoin network can handle much more than 1MB blocks without issue. This is 'free' scaling. That is, you can increase the capacity of the network some amount above 1MB and pretty much everything will stay the same: the same nodes will remain on the network, and so on.

1MB seems to have been chosen as a round value that wasn't too big, so as to prevent a bloat attack on the network (a malicious miner rapidly bloating the blockchain at zero cost, thanks to zero fees in the early days), but was also big enough that the capacity would last for a long time (six years, in fact).

In 2015 this capacity limit was reached and the scaling wars started, with one side never wanting to increase the block size limit and the other side wanting to increase it. The scaling wars ended in 2017 with Bitcoin being forked into two versions, Bitcoin Cash and Bitcoin Core.

The argument for keeping the block size limit at 1MB (or ~2MB assuming 100% use of SegWit) was that zero nodes should be kicked off the network, and that any increase in the block size would increase the computing resources required. This increase would kick the weakest nodes off the network.

Recently, a new scaling debate started when nChain announced that they would be creating a new client called BitcoinSV and that it would be forking the network to a 128MB BSL.

At first glance this debate seems to have many arguments that echo the previous scaling wars, but the situation is different this time...

Why is a 128MB block size limit bad?

A key argument I hear brought up a lot is 'Shouldn't we increase the block size limit as large as possible?'. The answer to that, in my opinion, is 'yes and no'. As far as I can tell, everyone in the Bitcoin Cash community supports massive on-chain scaling as soon as possible, and that includes all development teams. The question is, 'How do we achieve that?'. I can already hear some of you saying 'But isn't this exactly what Core also says?'. Yes, they said similar things in 2015, and their statements to this effect, when taken at face value, were correct. Many of the things they said were technically correct when looked at as individual statements, as opposed to within the context of their overall plan. Ryan Charles alluded to this in one of his excellent vlogs. This is part of the reason they managed to convince people to stick with their plan.

When understanding Bitcoin you have to view it from a very wide angle. Bitcoin is greater than the sum of its parts, and each part is fundamental. This is the reason why an individual argument may be technically correct without any extra context, but completely false when understood within the context of the system as a whole.

So, what's wrong with a 128MB BSL (or any scaling achieved by simply increasing the BSL)? As mentioned earlier, all else being equal, as blocks increase in size, the amount of resources required to process them also increases. Every node on the network has a specific point at which it will no longer be able to keep up with the resource requirements of the network. The first nodes to fall will be the weakest and most irrelevant ones. This might be some Raspberry Pis that cost $10. Who cares about them, right? Then it might be old desktop computers that cost $100. Well, they are only hobbyists who don't add anything to the network, so who cares. Then it would be high-end desktop computers that cost $1,000. Well, these guys should really just upgrade if they want to stick with the network. Then it would be fairly specialised $10,000 machines that would be kicked off. OK, these are only small businesses being kicked off now, and they should really upgrade to compete. Then it would be $100,000. Whoops, I think we kicked everyone off.
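To make the resource argument concrete, here is a rough back-of-the-envelope calculation of raw blockchain storage growth at different block size limits, assuming every block is full and arrives every ten minutes. This is a deliberate simplification: it ignores bandwidth bursts, the UTXO set, and indexes, so real node requirements would be higher.

```python
# Back-of-the-envelope storage growth for consistently full blocks,
# assuming one block every 10 minutes. Ignores UTXO growth, indexes,
# and bandwidth peaks, so this is a lower bound on node requirements.

BLOCKS_PER_YEAR = 365 * 24 * 6  # one block every 10 minutes = 52,560/year

def storage_per_year_gb(block_size_mb):
    """Raw blockchain growth per year in GB for a given block size."""
    return block_size_mb * BLOCKS_PER_YEAR / 1000  # MB -> GB

for size_mb in (1, 8, 32, 128):
    print(f"{size_mb:>4} MB blocks -> ~{storage_per_year_gb(size_mb):,.0f} GB/year")
# 1MB blocks add roughly 53 GB/year; 128MB blocks add roughly 6,700 GB/year.
```

Even this crude sketch shows the linear escalation: each step up the BSL ladder multiplies the minimum storage (and, in practice, bandwidth and processing) a node must commit to.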

(When I say 'node' here, I mean any mining or non-mining node.)

OK, so what the hell is the correct BSL then?!

This is a tough question, and there is unlikely to be any simple and direct answer like 'it should never be increased' or 'it should be completely unlimited'. There is no 'correct' BSL. It's a trade-off, and one that the miners and wider ecosystem will need to grapple with. At every increase in resource requirements the network will kick off some nodes, and maybe those nodes are pointless (e.g. a Raspberry Pi just used to check a watch-only address), but some will be valuable to the network.

I have watched over the past year as the Bitcoin Cash community managed not only to reorganise itself from the edge of obliteration, but also to thrive even during a bear market. This included a vast number of exciting projects built on top of Bitcoin Cash: for example, Gabriel Cardona's excellent Bitbox service (now integrated into Bitcoin.com), txhighway.com and txstreet.com, or memo.cash. Would any of these great projects have materialised had a BCH node cost $10,000? Could a competitor to Bitpay be established if a node cost $50,000 upfront and another $50,000 per year to run? Would smaller exchanges even bother listing Bitcoin Cash pairs if the cost was that high? What happens is that the barrier to entry is raised. The ladder is pulled up.

Someone may say 'But the BSL is not the same as the block size.' This is correct, but blocks of any size below the hard limit can start appearing at any time, without notice. This could mean you build a service that then gets kicked off the network without any warning. The only protection against this is to have hardware capable of processing blocks at the full block size limit. So even if the hardware actually required to handle the current average block size might only cost $50, you as a business may be forced to purchase $10,000 of hardware upfront, just in case. The reality is that many businesses, entrepreneurs and developers would simply opt out of using Bitcoin Cash if costs became too high.

Another, potentially even bigger, issue is the risk of constant forks occurring between the miners. In the same way that businesses could have their services kicked off the network, miners could also be kicked off if they are not able to keep up. This presents an even bigger problem, as Bitcoin relies on miners for its security, and any exchanges or other businesses might fork off as well. Not only would this reduce security, but it would also increase centralisation. Decentralisation is only a tool to be wielded in pursuit of digital cash for every person on the planet, but it is a necessary tool. So a BSL that allows this to happen actually creates instability and unpredictability in the system for its participants. For this reason the BSL does need to be chosen carefully to balance these trade-offs.

So you're saying Bitcoin can't scale on-chain?

No, I'm definitely not saying that Bitcoin can't scale on-chain. Bitcoin can scale massively on-chain.

One way we get more 'free' scaling is through Moore's Law and time. Technology improves exponentially with time, so as time goes on the cost of running a node comes down exponentially, and we get this completely for free.
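As a toy model of this effect, suppose (hypothetically) that the cost of a given amount of node throughput halves every two years. The two-year halving period is my assumption for illustration; real improvement rates vary by component (CPU, storage, bandwidth) and are not this clean in practice.

```python
# Toy model of 'free' scaling from hardware improvement. The assumption
# that cost-per-throughput halves every two years is illustrative only;
# real rates differ across CPU, storage, and bandwidth.

def relative_cost(years, halving_period_years=2.0):
    """Cost of running today's workload after `years`, relative to today (1.0)."""
    return 0.5 ** (years / halving_period_years)

def free_capacity_multiple(years, halving_period_years=2.0):
    """How much more capacity the same spend buys after `years`."""
    return 1 / relative_cost(years, halving_period_years)

for years in (2, 4, 10):
    print(f"after {years:>2} years: the same node budget buys "
          f"~{free_capacity_multiple(years):.0f}x the capacity")
# Under this assumption: 2x after 2 years, 4x after 4, 32x after 10.
```

The point of the sketch is simply that capacity gained this way comes with no change in who can afford to run a node, unlike raising the BSL ahead of hardware.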

The other way we can scale is through the hard work of developers finding efficiency improvements, of which there are many. Examples include Xthin and Graphene by Bitcoin Unlimited. Efficiency improvements allow us to increase the capacity of the network without increasing the barriers to entry.
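The intuition behind Xthin- and Graphene-style block propagation is that nodes have usually already seen most of a block's transactions in their mempool, so a block can be relayed as short identifiers rather than full transactions. The sketch below estimates the bandwidth saving; the byte sizes and hit rate are illustrative assumptions of mine, not the actual protocol encodings.

```python
# Illustrative estimate of bandwidth saved by relaying short tx IDs
# instead of full transactions. The sizes below are assumptions for
# illustration, not the real Xthin/Graphene wire formats.

AVG_TX_SIZE_BYTES = 400   # assumed average transaction size
SHORT_ID_BYTES = 8        # assumed short identifier size

def relay_sizes(num_txs, hit_rate=0.99):
    """Bytes to relay a block naively vs. with short IDs.

    hit_rate: fraction of the block's transactions the peer already
    holds in its mempool; misses must still be sent in full.
    """
    naive = num_txs * AVG_TX_SIZE_BYTES
    misses = int(num_txs * (1 - hit_rate))
    compact = num_txs * SHORT_ID_BYTES + misses * AVG_TX_SIZE_BYTES
    return naive, compact

naive, compact = relay_sizes(250_000)  # roughly a 100MB block's worth of txs
print(f"naive: {naive / 1e6:.0f} MB, compact: {compact / 1e6:.1f} MB")
# Under these assumptions, ~100 MB of block data relays in ~3 MB.
```

This is exactly the kind of gain that raises effective capacity without raising the hardware bar, which is why efficiency work is preferable to simply lifting the BSL.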

The final way we can scale is the easiest, but has the worst trade-offs in my opinion. We can simply increase the BSL and increase the resource requirements, thereby kicking some potentially economically beneficial nodes off the network and making it more difficult/expensive for possibly valuable developers and businesses to join the ecosystem.

It seems to me that we should exhaust all 'free' scaling options before we increase the barriers to entry of the system, and if we must increase capacity further, research and analysis should be done to show the effects of doing so (e.g. what would be the realistic cost of building and running a node with 128MB blocks?).

I hope the community finds this article insightful, and if there is anything anyone feels should be added or corrected, please comment below.