The Bitcoin developers are experienced programmers and cryptographers, and they’re most certainly not ignorant of economics. They have spent years studying every line of the source code, developing new features, and understanding the economic ramifications of those changes. As much as you think you know about Bitcoin, these guys know and have considered more. You can have an hours-long conversation with someone like Peter Todd or Eric Lombrozo on things like UTXOs, Merkle Trees, Block Headers, or other dense topics.



Telling this select group of developers why the blocksize should be raised is like lecturing Tom Brady on how to throw a football.

Saw these two similar small-blocker articles that came out recently. (Spoiler alert: the reason we can't increase the block size limit is ... "decentralization.") BTW, I think my favorite part is the passage quoted above -- telling the developers why the blocksize should be raised "is like lecturing Tom Brady on how to throw a football." There are no words.

Anyways, I've already written at length about why I think this basic argument is garbage, but I had a few additional thoughts I wanted to share. The small-blockists seem to assume that an increase in the block size limit translates into a decrease in "decentralization" (however the hell that's defined). After thinking about it some more, I think there's an easy way to demonstrate pretty convincingly that this is false. More specifically, I think you can show that, past a certain point, a smaller block size limit makes Bitcoin less decentralized, not more.

The first thing to note is that the "smallness" of a particular block size limit should be viewed as being relative to the amount of transactional demand that exists. (Thus, the 1-MB limit wasn't particularly "small" at the time it was put in place, but it is becoming increasingly "small" as transactional demand continues to grow.)

With that in mind, imagine a block size limit that would allow for only a single (non-Coinbase) transaction in each block. I don't know exactly how small that would be while still allowing for the Coinbase transaction and the basic block overhead (maybe 1.5 kB?), but it doesn't really matter for the point I want to make. The point is, we're talking about absurdly tiny blocks and a Bitcoin main chain that would allow for a maximum of about 50,000 transactions per year. Now imagine that transactional demand is simultaneously huge -- all 7 billion people on this planet are attempting to use the Bitcoin blockchain as the backbone for the global financial system. It should be pretty apparent that in this scenario, the Bitcoin ecosystem (if we assume for the moment that it could somehow survive under these absurd conditions) would be massively "centralized." The LN would obviously be a complete non-starter.
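For the curious, here's the back-of-the-envelope capacity arithmetic. The numbers are my own illustrative assumptions: ~144 blocks per day (one every ten minutes), a ~250-byte average transaction, and placeholder per-block overhead figures.

```python
# Rough annual on-chain transaction capacity at a given block size limit.
# Assumptions (mine, for illustration): ~144 blocks/day and an average
# transaction size of ~250 bytes.
BLOCKS_PER_DAY = 144
AVG_TX_BYTES = 250

def tx_per_year(block_size_bytes, overhead_bytes=0):
    """Approximate transactions per year, after subtracting the
    Coinbase transaction and block overhead from each block."""
    usable = max(block_size_bytes - overhead_bytes, 0)
    tx_per_block = usable // AVG_TX_BYTES
    return tx_per_block * BLOCKS_PER_DAY * 365

# The extreme case: ~1.5-kB blocks with room for roughly one
# non-Coinbase transaction (assumed ~1,250 bytes of overhead).
print(tx_per_year(1_500, overhead_bytes=1_250))  # -> 52560 (~50,000/yr)

# The 1-MB limit under the same assumptions: ~210 million/yr, i.e.
# roughly 6-7 transactions per second.
print(tx_per_year(1_000_000, overhead_bytes=1_000))
```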
If the world's population formed a line to make a single on-chain transaction to open a LN payment channel, it would take us around 140,000 years to work our way through that line. That seems unworkable, no? I suppose you could imagine a traditional banking model built on top of the main chain, but clearly the only real use for the actual blockchain would be as a ridiculously-expensive interbank settlement network. Maybe the billionaires of the world could hold some of their wealth on-chain, but everyone else would never touch anything other than Bitcoin IOUs.

The goal of "decentralization" is supposedly "censorship resistance." The point behind my extreme example is that, below some threshold, shrinking the block size limit destroys the very censorship resistance that "decentralization" is supposed to provide: almost no one can transact on-chain, and everyone else is pushed into eminently censorable IOUs. In other words, you can amputate Bitcoin's capacity in the name of protecting it. ("I know. If I cut off my arms and legs, I'll make for a smaller target.")

I'm sure the small-blockists would argue: "well, but your example is so extreme; obviously at 1-MB we're above the 'threshold' you're referring to, such that further increases in the block size limit would result in decreased 'decentralization.'" Sorry, but no. First of all, of course my example is extreme. It's intended to be, because that can be a useful way of illustrating a principle. And no, it's not at all clear to me that we're "at a point on the curve" where increasing the limit would result in decreased "decentralization." (In fact, my strong intuition is that the opposite is true.) So... prove it.

I think my extreme example is also useful for highlighting the fact that the cost of running a full node, looked at in isolation, is obviously NOT a "reasonable metric for 'decentralization.'" It's ridiculously simplistic and one-dimensional. Just as an aside, the small-blockists probably like this metric because it seems like the one that should favor the conclusion that "smaller blocks = moar decentralization," but I think that's only true if you take a static view of things.
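The waiting-line figure above is just simple division, using the source's own numbers (7 billion people, ~50,000 transactions per year):

```python
# Years for 7 billion people to each make one on-chain transaction
# when the chain can process only ~50,000 transactions per year.
POPULATION = 7_000_000_000
TX_PER_YEAR = 50_000
years_in_line = POPULATION / TX_PER_YEAR
print(years_in_line)  # -> 140000.0
```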
To the extent that larger blocks and a non-crippled Bitcoin fuel much higher levels of adoption and make many more people want to run full nodes, that demand should incentivize more businesses to innovate to offer solutions that bring down the cost of doing so. (I admit that argument might not be entirely convincing to the extent that all or most of the costs associated with running a full node involve "off-the-shelf"-type components. In other words, there are presumably already huge incentives for entrepreneurs to bring down the costs of storage, bandwidth, etc. But in any case, I think it's still kind of interesting theoretically as a reminder of the importance of not viewing things from a static perspective.)

So what's the real takeaway from all this? I mean, who the hell knows where we're at "on the curve" and what the optimal block size limit is, i.e., the one that most perfectly balances all of the supposed tradeoffs? And who knows what it will be tomorrow, since we have to keep in mind that we're dealing with a constantly-shifting target? I think the likely answer is: no one. Not even Gregory Maxwell, the Tom Brady of cryptocurrency. "When it is realized that the problem is impossible, the solution becomes simple." (Ok, I just made that quote up, but it sounds good.) What I mean is that the "solution" here is to stop treating the block size limit as a hard-coded consensus parameter and instead allow the limit to be determined in a flexible, emergent (and decentralized) manner by adopting a Bitcoin Unlimited-type approach.
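To make the "emergent limit" idea concrete, here's a toy sketch of Bitcoin Unlimited-style signaling. This is my own simplification for illustration, not BU's actual implementation: each node configures an Excessive Block size (EB) and an Acceptance Depth (AD), refuses blocks over its EB at first, but capitulates once the network has buried such a block AD deep.

```python
# Toy sketch of "emergent consensus" a la Bitcoin Unlimited -- my own
# simplification, NOT BU's actual implementation.
# Each node configures:
#   EB (Excessive Block size): largest block it considers acceptable.
#   AD (Acceptance Depth): how deeply an "excessive" block must be
#       buried before the node follows the chain containing it anyway.

class ToyNode:
    def __init__(self, eb_bytes, ad):
        self.eb_bytes = eb_bytes
        self.ad = ad

    def best_height(self, chain):
        """Height of the chain tip this node will follow.

        `chain` is a list of block sizes in bytes, from height 1 up.
        """
        for height, size in enumerate(chain, start=1):
            if size > self.eb_bytes:
                depth = len(chain) - height + 1
                if depth < self.ad:
                    # Excessive and not yet buried AD deep: refuse this
                    # block and everything built on top of it.
                    return height - 1
                # Otherwise the network has buried it; capitulate.
        return len(chain)

# Block 3 is 2 MB -- over this node's 1-MB EB -- but six more blocks
# were mined on top of it, exceeding AD=4, so the node follows along.
chain = [900_000, 950_000, 2_000_000, 800_000, 900_000, 1_000_000,
         700_000, 850_000, 950_000]
node = ToyNode(eb_bytes=1_000_000, ad=4)
print(node.best_height(chain))                          # -> 9
print(node.best_height([900_000, 2_000_000, 800_000]))  # -> 1
```

The effective block size limit then emerges from the EB/AD settings that nodes and miners actually choose, rather than from a constant hard-coded into the protocol.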