A little late on this, but I was thinking more about orphaning risk in the context of block size. It seems to me that there are, in a sense, two kinds of “orphaning risk” associated with larger blocks:

1. propagation-delay-based (“natural”) orphaning risk – This is the idea that a larger block will propagate to the rest of the network more slowly, increasing the likelihood that it will be orphaned by another block found in the interim.

2. consensus-based (“artificial”) orphaning risk – This is exemplified by the current 1-MB “block size limit.” Why aren’t miners creating blocks larger than 1 MB right now? Because they’re confident that, if they did, those blocks would simply be orphaned. So, for now, the 99.9999%-plus estimated probability that a larger-than-1-MB block will be orphaned greatly outweighs the modest marginal fee revenue they could theoretically collect by stuffing in a few more transactions.
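To put a rough number on the “natural” kind: a standard back-of-the-envelope model (my addition here, purely for illustration) treats block discovery as a Poisson process with a 600-second mean interval, so a block that takes τ seconds to reach the rest of the hashpower gets orphaned with probability about 1 − e^(−τ/600). The delay figures below are hypothetical, not measurements:

```python
import math

BLOCK_INTERVAL = 600.0  # mean seconds between blocks (Poisson assumption)

def orphan_prob(prop_delay_s: float) -> float:
    """Chance a competing block is found while ours is still propagating."""
    return 1.0 - math.exp(-prop_delay_s / BLOCK_INTERVAL)

# Hypothetical propagation delays for increasingly large blocks.
for delay in (1, 5, 15, 60):
    print(f"{delay:>3} s delay -> ~{orphan_prob(delay):.2%} natural orphan risk")
```

Even a full minute of propagation delay only gets you into the high single digits of orphan risk under this model, which is why the “natural” restraint alone strikes some people as weak.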
I don’t think the small-blockists necessarily dispute that the first type of orphaning risk can act as a restraint on block size. They just doubt that it’s a sufficient restraint. In other words, they think that relying solely on “natural” orphaning risk would lead to dangerous levels of centralization, and that we therefore need to create some “artificial” orphaning risk (or really, orphaning certainty) with a block-size-limit consensus rule. And I have to say, that argument doesn’t strike me as crazy.

My intuition is that it’s a question of where Bitcoin is operating at a particular time in terms of (a) the demand for block space versus (b) the network’s “technological capacity.” Right now, the former seems relatively low compared to the latter. In other words, if the 1-MB limit were removed entirely today, miners could essentially empty the order books for block space by creating 1.5-MB blocks, no? And blocks of that size can be propagated and validated quickly enough today that the “self-propagation” / reduced-orphan-risk advantage of large miners (a miner’s own blocks reach its own hashpower instantly, so it bears less of this risk) and thus its centralizing impact is likely modest.

But as demand increases (and as transaction fees become more important to miners’ bottom lines thanks to a shrinking block reward), large miners’ self-propagation advantage could become more significant (perhaps dangerously so), assuming that increases in the Bitcoin network’s technological capacity don’t keep pace. If that situation were to materialize, it may make sense for the network to impose some degree of artificial orphaning risk in response.
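The fee-versus-orphan-risk tradeoff above can be sketched numerically. To be clear, every figure here (the per-byte propagation delay, the value at stake, the transaction size) is an assumption I’m making up for illustration, reusing the simple Poisson orphan model with a 600-second mean block interval:

```python
import math

BLOCK_INTERVAL = 600.0   # mean seconds between blocks (Poisson assumption)
BLOCK_VALUE_BTC = 12.5   # illustrative value at stake if the block is orphaned
DELAY_PER_BYTE = 30e-6   # assumed extra propagation delay per byte, in seconds

def orphan_prob(delay_s: float) -> float:
    """P(a competing block appears while ours is still propagating)."""
    return 1.0 - math.exp(-delay_s / BLOCK_INTERVAL)

def marginal_orphan_cost(block_bytes: int, tx_bytes: int = 250) -> float:
    """Expected BTC lost by appending one more tx_bytes-sized transaction."""
    before = orphan_prob(block_bytes * DELAY_PER_BYTE)
    after = orphan_prob((block_bytes + tx_bytes) * DELAY_PER_BYTE)
    return BLOCK_VALUE_BTC * (after - before)

# A profit-maximizing miner adds a transaction only if its fee
# exceeds this marginal expected loss.
for size_mb in (0.5, 1.0, 2.0):
    cost = marginal_orphan_cost(int(size_mb * 1_000_000))
    print(f"{size_mb} MB block: break-even fee ~ {cost:.8f} BTC per 250-byte tx")
```

Under these made-up parameters the break-even fee is tiny, which illustrates the worry: if propagation is cheap relative to fees, “natural” orphaning risk barely restrains block size at all, and the restraint weakens further for miners with a large share of the hashpower.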
But even if that’s so, it doesn’t support the Blockstream / Core position. For starters, it seems pretty obvious that 1 MB is too low right now and is not the magic number that gets the supposed tradeoffs just right. But more fundamentally, the questions of whether we need a block size limit and, if so, what that limit should be are less important than the question of how that limit should be determined (assuming one is needed). And to me, it’s obvious that an approach like that of Bitcoin Unlimited, where the effective limit emerges from settings chosen by node operators and miners themselves, is far superior to simply following the top-down diktat of a handful of interest-conflicted developers.