The discussion has been ongoing since late 2014. What is the ultimate solution to the block size problem? Hindsight is 20/20, but foresight is golden, or so it seems now that Gavin Andresen has made a commit on his personal fork of the Bitcoin Core repository that states, simply:
Allows any block with a timestamp on or after 1 March 2016 00:00:00 UTC to be up to 20,000,000 bytes big (serialized).
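The rule described in the commit is a timestamp-gated cap: blocks dated on or after the cutover may be up to 20,000,000 serialized bytes, while earlier blocks keep the existing 1,000,000-byte limit. A minimal sketch of that logic, written here in Python purely for illustration (the actual change lives in Bitcoin Core's C++ consensus code, and these function names are hypothetical):

```python
from datetime import datetime, timezone

# Illustrative constants based on the commit description (not Bitcoin Core code):
OLD_MAX_BLOCK_SIZE = 1_000_000    # current 1 MB serialized limit
NEW_MAX_BLOCK_SIZE = 20_000_000   # proposed 20 MB serialized limit
CUTOVER = int(datetime(2016, 3, 1, tzinfo=timezone.utc).timestamp())

def max_block_size(block_timestamp: int) -> int:
    """Return the maximum serialized block size allowed for a block
    with the given Unix timestamp, under the proposed rule."""
    if block_timestamp >= CUTOVER:
        return NEW_MAX_BLOCK_SIZE
    return OLD_MAX_BLOCK_SIZE

def block_size_ok(serialized_size: int, block_timestamp: int) -> bool:
    """Check a block's serialized size against the applicable limit."""
    return serialized_size <= max_block_size(block_timestamp)
```

Because the switch keys off the block's own timestamp rather than a block height, every node that adopts the change applies the same limit to the same block without needing any further coordination.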
This won’t come as a surprise to anyone who’s been paying attention. The discussion has been widely circulated throughout the cryptocurrency space. Some users have said that a simplistic approach will not yield the best results, while others have said that the decreasing cost of storage space will compensate for the increase in block sizes.
Andresen has been talking about the problem since last October, when he wrote in a blog post:
My best guess is that we’ll run into the 1 megabyte block size limit during the next price bubble, and that is one of the reasons I’ve been spending time working on implementing floating transaction fees for Bitcoin Core. Most users would rather pay a few cents more in transaction fees rather than waiting hours or days (or never!) for their transactions to confirm because the network is running into the hard-coded blocksize limit.
While the “floating transaction fee” feature was introduced in February, it does not guarantee, as Andresen has acknowledged, that all transactions will find their way into a block. Larger blocks, however, do accomplish the goal of a “simple solution.”
Not Everyone Happy With The 20MB Idea
The debate will, of course, continue on. The comments on the commit implied as much, with one user saying:
Seriously Gavin, who do you think you are? Stop parading around the globe trying to dupe everyone into accepting this change. Do you seriously think you have the authority to take away my (and every other current holder of bitcoin’s) rightfully deserved privilege of getting a transaction in before others? What makes you think I want to or have the means to purchase more storage space and processing power to run my full node? What makes you think everyone has the bandwidth to handle this update? You’re losing any shred of credibility you once had by pursuing this nonsense.
It is hard to determine if this user was being serious, of course. Proposed changes do not necessarily make it into the final code. The nature of open source is that the majority will rule: as with other changes instituted by Bitcoin developers in the past, what matters is how many people actually install them. Other users felt the changes were needed, and that Andresen at least fleshing out the code that would allow them is a step in the right direction:
Great work Gavin. Your earlier tests had me sold. Glad someone finally taking a stand and pushing for this needed change.
I think this is a great change. By March 2016 the cost of hardware will continue to go down and I think the barrier to entry to running a full node will go down to the $50-$80 dollar level. Couple this with near-unlimited bandwidth given by home internet providers and we’ll have a situation where we’ll be able to a) support the network growth and b) still have a healthy degree of decentralisation due to consumer nodes. For everything else we’ll still have pruning and off-chain transactions.
Certainly Bitcoin Core development has a responsibility to keep the protocol competitive with other large payment processors. While some feel the core developer, who has been working intimately with Bitcoin since the days of Satoshi Nakamoto, is being heavy-handed, others feel that someone has to say it, and it might as well be the most authoritative voice to do so.
What do you think? Can the network live with slow transaction times, or is this a move in the right direction?