A panel on Bitcoin governance at last month’s Bitcoin Pacifica 2015 featured two Bitcoin Core contributors who do not see eye to eye on many Bitcoin-related issues: Bitcoin Foundation Chief Scientist Gavin Andresen and Perpetual Bitcoin Researcher Peter Todd. Although areas of disagreement were certainly explored during the panel, the discussion also turned to some of the common ground found at the recent Scaling Bitcoin workshop in Montreal. Near the end of the discussion on Bitcoin governance, Andresen asked Chaincode Labs Founder Suhas Daftuar, who moderated the panel, to describe the “rough consensus” reached at the Montreal workshop.

Increasing the Block Size Over a Short Timeframe

When describing the compromises reached at Scaling Bitcoin Montreal, Daftuar first noted that there seems to be rough consensus on increasing the block size limit over a short timeframe. He stated:

“It seemed like there may be consensus among kind of the core developers to — I don’t want to put words in people’s mouths, so I’m just going to try to describe it as I recall it too — to maybe support a plan that would allow for an increase in the block size over a relatively short timeframe. People are talking in the four-year timeframe-ish.”

It’s possible that many Bitcoin Core contributors prefer the short timeframe because of an unwillingness to predict future hardware developments that could alter the Bitcoin network’s ability to handle a larger or smaller block size limit. Multiple Bitcoin Core contributors have articulated this concern in the past.

New Metrics for the Block Size Limit

According to Daftuar, the idea of changing how a block’s size is measured was also discussed at Scaling Bitcoin Montreal. Rather than treating the block size limit as a solitary metric, Bitcoin Core developers and contributors seem interested in taking a closer look at how the transactions in a block can affect the entire network. Daftuar explained:

“To do so in a way that wasn’t just to look at the block size directly as a single parameter of just making blocks bigger, but also maybe restate or take into account into the way you decide how big a block can be — kind of the effect of the transactions in a block on the overall network’s health. The idea being that there can be costs — like unexpected costs from unexpected transaction patterns that you want to minimize.”

Daftuar then gave an example of a metric that could better measure the impact of the transactions in a block:

“For example, the growth of the number of unspent transaction outputs — some goal to keep that from growing just crazily large if somebody were to try to attack the network in that way. So, the idea of using a different metric to decide how to measure the block size was something that was discussed.”
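The kind of metric Daftuar describes can be sketched as a cost function that penalizes net growth of the unspent transaction output (UTXO) set alongside raw block size. The function name, weights, and formula below are purely illustrative assumptions for the sake of the example, not part of any actual Bitcoin Core proposal:

```python
# Hypothetical sketch: scoring a block by a combination of raw size and
# net UTXO-set growth, rather than by byte size alone. All names and
# weights here are illustrative assumptions.

def block_cost(block_bytes, utxos_created, utxos_spent,
               size_weight=1.0, utxo_weight=50.0):
    """Combine raw size with net UTXO growth into a single 'cost' metric."""
    # Only penalize blocks that grow the UTXO set; consolidating blocks
    # (spending more outputs than they create) add no UTXO penalty.
    net_utxo_growth = max(0, utxos_created - utxos_spent)
    return size_weight * block_bytes + utxo_weight * net_utxo_growth

# Two blocks of identical byte size: one bloats the UTXO set, one
# consolidates outputs. The bloating block scores a higher cost.
bloating = block_cost(1_000_000, utxos_created=5000, utxos_spent=100)
consolidating = block_cost(1_000_000, utxos_created=100, utxos_spent=5000)
assert bloating > consolidating
```

Under a scheme like this, an attacker who tried to inflate the UTXO set with many tiny outputs would hit the limit sooner than a user creating ordinary transactions, which is the attack pattern Daftuar alludes to.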

A BIP and Testing Before Scaling Bitcoin Hong Kong

After Daftuar finished his account of the Montreal discussions, an audience member asked about a specific timeframe for implementing a higher block size limit in Bitcoin Core. Gavin Andresen responded:

“The plan is to have a BIP [Bitcoin Improvement Proposal], an implementation, and some testing done before [Scaling Bitcoin] Hong Kong.”

While the Montreal workshop was mainly a platform for discussing Bitcoin’s scalability problems, Scaling Bitcoin Hong Kong will feature reviews and test results of specific technical proposals. Judging from other accounts of the Montreal discussions, an increase in the block size limit to 2 MB or 4 MB could be in the works. Whether this change is eventually implemented in Bitcoin Core will likely depend on the testing that takes place before the Hong Kong workshop in December.

Image courtesy of twitter.com/panteracapital