The Bitcoin Cash community is very attached to free markets, and for good reason: free markets are incredibly efficient economically and have consistently been shown to outperform central planning, due to the economic calculation problem.

While this is correct, Bitcoin Cash supporters often make the mistake of thinking that the various variables controlling the protocol, such as the block size, op_return size, script length, dust limit or sigops count, need to be left to the market, and the market only. This comes from a fundamental misunderstanding of how the market finds its optimum. Markets do not spawn some magic fairy to tell you what the right value is. They work by ruthlessly destroying solutions that pick the wrong values and directing resources toward the ones that pick good values.

The market always finds a solution. There is no guarantee that the solution the market picks includes Bitcoin Cash.

The blockchain's content is immutable

It bears repeating: what's in the blockchain cannot be changed. Anything introduced into it is there forever and needs to be supported forever. If insecure parameters or rules are picked, the market cannot course correct the chain; instead, it will gut it to redirect resources toward another chain that did not make the same mistake.

A very noticeable example of such an event is the Ethereum blockchain. Its philosophy is to move fast and break things, and boy do they break things. While this approach also has its advantages, it is clear that the teams working on scaling Ethereum are facing serious challenges, as the chain state has grown to a size that current hardware and software have trouble managing. This places the chain in a do-or-die position: they need to improve the software to support a larger state faster than the state grows, or face certain death. The people working on it are very talented, so maybe they'll make it happen. What is sure is that they cannot go back to clear the state, or magically create solutions that do not require as much state.

The 0-conf problem

We established that centrally picking values for various protocol variables is suboptimal. We also established that it isn't a given that the market will pick a proper value within the Bitcoin Cash ecosystem; it may instead decide to destroy it and move elsewhere. This leaves us with the following logical conclusion: we need to create mechanisms for the market to discover these values, and where no such mechanism exists, we'll need to accept being inefficient and agree on some value, as being inefficient still beats the hell out of being dead.

Creating such a market has proven challenging for a very simple reason: Bitcoin Cash values 0-conf very much. If the policies of various actors differ on one of these variables, say the size of op_return for instance, it comes at the cost of destroying the confidence one can have in 0-conf. This is because a transaction with an op_return of a given size will be accepted by some nodes and rejected by others, leaving the opportunity to double spend it with a conflicting transaction carrying a different op_return. This was demonstrated by Vin Armani on Dash.
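The mechanism can be sketched in a few lines. This is a hypothetical simulation (the `Node` class, policy limits and transaction shapes are invented for illustration, not actual node code): two nodes enforce different op_return relay policies, so a payment accepted by one is rejected by the other, and the rejecting node happily accepts a conflicting double spend under its first-seen rule.

```python
class Node:
    """Toy mempool with a per-node op_return policy and a first-seen rule."""

    def __init__(self, name, max_op_return):
        self.name = name
        self.max_op_return = max_op_return
        self.mempool = {}  # spent input -> transaction

    def accept(self, tx):
        # Policy check: reject op_return payloads above this node's limit.
        if tx["op_return_size"] > self.max_op_return:
            return False
        # First-seen rule: reject any tx that conflicts with a known one.
        if tx["input"] in self.mempool:
            return False
        self.mempool[tx["input"]] = tx
        return True

node_a = Node("A", max_op_return=220)  # permissive policy
node_b = Node("B", max_op_return=80)   # restrictive policy

# The merchant's payment carries a large op_return payload.
payment = {"id": "tx1", "input": "utxo:0", "op_return_size": 200}
# The attacker's double spend consumes the same input, small payload.
double_spend = {"id": "tx2", "input": "utxo:0", "op_return_size": 40}

a_payment = node_a.accept(payment)      # accepted: within A's limit
b_payment = node_b.accept(payment)      # rejected: exceeds B's limit
b_double = node_b.accept(double_spend)  # accepted: B never saw tx1

print(a_payment, b_payment, b_double)
```

Both transactions now sit in different mempools, and whichever miner finds the next block decides who gets paid; the merchant's 0-conf confidence is gone.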

The primary goal of Bitcoin Cash is to be digital cash for the world. This implies that its most important characteristics are high-confidence 0-conf, scaling capabilities and fungibility, each of which is being actively worked on. Considering that, left to itself, the market will unavoidably settle on an equilibrium where various actors run with different policy values, and that doing so comes at the cost of destroying 0-conf, we find ourselves having to go with the suboptimal route of picking values manually.

Pre-consensus to the rescue

Fortunately, I think there is a way out. Bitcoin Cash already allows some variables to be determined by the market, by picking limits significantly above what the market demands. This is, for instance, the case for the block size. This in effect gives miners the freedom to produce whatever block size they deem appropriate. While not perfect, it seems to have worked really well in practice.

But we can do better. Even before Bitcoin Cash was a thing, I was promoting the idea of pre-consensus. This refers to a set of technologies allowing network participants to agree as much as possible on what the next block is going to look like. Done well, this provides significantly stronger 0-conf guarantees than we currently have, while also allowing greater scale by moving work out of the critical path (if a node knows what the next block is going to look like, a lot of the validation work can be done ahead of time). As it turns out, pre-consensus has the added bonus that it allows us to delegate to the market the responsibility of picking various values that are currently centrally planned. Actors who use different policies will be able to reconcile their differences on a time scale that is compatible with 0-conf.
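The critical-path argument can be illustrated with a minimal sketch (the function names and the string stand-in for validation are invented for illustration; real pre-consensus designs are far more involved): if nodes pre-validate an agreed-upon template for the next block, then when the actual block arrives, only the transactions that deviate from the template need to be validated.

```python
def validate(tx):
    # Stand-in for expensive script and signature validation.
    return tx.startswith("tx")

def prevalidate_template(template):
    # Done ahead of time, off the critical path, while waiting for the block.
    return {tx for tx in template if validate(tx)}

def accept_block(block, prevalidated):
    # On the critical path, only the delta from the template needs checking.
    delta = [tx for tx in block if tx not in prevalidated]
    return all(validate(tx) for tx in delta), len(delta)

# Participants agreed on a template of four transactions for the next block.
template = ["tx1", "tx2", "tx3", "tx4"]
prevalidated = prevalidate_template(template)

# The actual block matches the template except for one late transaction.
block = ["tx1", "tx2", "tx3", "tx4", "tx5"]
ok, critical_path_work = accept_block(block, prevalidated)
print(ok, critical_path_work)
```

In this toy run, only one of the five transactions is validated after the block arrives; the closer miners stick to the pre-agreed template, the less work sits between receiving a block and accepting it.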

Going forward