RealBitcoin (Hero Member, Activity: 854, Merit: 1000)
The Blocksize Debate & Concerns
June 20, 2016, 11:08:48 PM

Last edit: June 21, 2016, 05:29:15 PM by RealBitcoin

#1

Hello, I've been in Bitcoin for 3+ years now, and I would like to share my opinion about big blocks (2 MB, 8 MB, etc.), the Bitcoin hardfork, Bitcoin in general, and the concerns we need to watch out for.



There have been many shills on both sides, so to clear that out: I want rational arguments for and against big blocks and hardforking Bitcoin. I am personally anti-hardfork and therefore anti-big-blocks, and I will demonstrate here why I think that is the best choice. So stop shilling, and let's debate this like civilized adults. I'll present my arguments and then you can respond to them. The thread will not be moderated, so that I cannot be accused of being a shill, but I hope troll posts and low-effort posts (usually one-liners) will be removed by forum moderators. Also remember these are only my opinions based on the knowledge I have, so if I'm wrong, feel free to correct me; I'm open to criticism!







1) The sacred Bitcoin Protocol



The Bitcoin protocol is like a constitution: once you start making changes to it, you will soon find yourself with financial tyranny, abuse of power, and corruption. Whenever things can be changed easily, they will be; if there is no resistance to change, changes will always happen. So if the Bitcoin protocol gets changed even a small amount, some people will abuse that opportunity and gradually add more changes, until Bitcoin drifts off its original mission and becomes centralized, whether through some sort of filter/blacklist/censoring system or some other oppressive patch installed into Bitcoin. And then Bitcoin will fail.



Of course, like any constitution, changes have to happen eventually. You can't have the barbarian constitutions of the Middle Ages rule modern society, even if in the Middle Ages those were the sacred norm. Change is inevitable, but it has to be natural, well thought out, and based on real evidence and extensive testing.



So in my opinion any change should take at least a generation. You don't want to change Bitcoin too fast when it's not even widely adopted yet. Industry and technology change very fast nowadays, but you have to slow down so that users can catch up. A hard fork is definitely not necessary right now (nor do I think it will be in the foreseeable future), because I fear the community is still too young and prone to manipulation, so using this 'ring of power' is not an option.





2) Segwit



Segwit is good and necessary to fix fungibility and malleability. As a side effect, it grows the block capacity as well. It is only a temporary solution in terms of scaling block capacity, because that is not its goal; its goal is to fix malleability, which it probably will to some degree.





3) Sidechains + Lightning Network



This is the scaling path the Core team chose, and I think it's a very elegant, decentralized approach. It is one way of doing it, and it's the easiest, most immediate, and fairest way. Not to mention the most secure.



It's not a giant blob of code like ETH + smart contracts; it's actually isolated from the Bitcoin protocol. Security is only good if one sector is isolated from another. Think of it as a submarine: if the compartments are not isolated and the hull is breached, the water will sink the entire sub.



From a transaction perspective, it can scale Bitcoin massively while adding little or no centralization. The other, 'classical' way would be to add an intermediary issuing IOU bitcoins that could be traded off-chain. We know that is not an option for decentralization, so the LN is really cutting-edge innovation, with little or no centralization cost to the community.



Transactions will be instant, and decentralized (or semi-decentralized if you don't host your own node), which would be the optimal choice given the current circumstances.



Therefore it is good to have this elegant, secure, and innovative approach to scaling. It also enables new features for Bitcoin that can be used to expand business, the economy, and other functions.



It is truly a piece of software art, crafted by some of the smartest people on the planet.





4) Urgent Hardfork



If a hardfork becomes a true 100% emergency, for example if ECDSA gets broken, then unfortunately it will have to be implemented. People will not like it, but in a true emergency they will have no choice. However, the probability of this is very low.



More worrisome is: once the devs have tasted the power of the hardfork, then what? What qualifies as an emergency? Can they just roll out a hardfork every week when an 'emergency' becomes available? We are back to point 1), and how, if things get too crazy, it can end Bitcoin.



These questions have to be answered, and the community should always stay skeptical.





5) Quantum Computers



I hear this argument many times, and it's mostly fearmongering: what if quantum computers become reality and can break Bitcoin?



Personally, I don't believe in large-scale quantum computers; I think they won't ever exist, and there is plenty of evidence supporting this. Yes, some theoretical laboratory computers might exist, but there are hard thermodynamic limits that prevent the existence of large ones.



It's the same nonsense as the free-energy advocates who think there is some hidden energy source, when 7th-grade physics demonstrates otherwise.



However, classical computers might one day become powerful enough to find some exploit in the crypto algorithms (since brute-forcing is obviously near-impossible).



And for that we should react as in point 4), but not sooner. Bitcoin doesn't need 'what if' patches. Bitcoin doesn't need insurance against alien invasion; we can deal with a threat when it becomes sizeable.





6) Corruptible miners



What if the miners become untrustworthy? Low probability again, and it would be temporary. Game theory tells us that people working for incentives won't do things that undermine those incentives; in other words, people don't bite the hand that feeds them. If miners become shady, then either they will run out of money before they can do too much evil, or they will be replaced by honest ones who would earn more by mining honestly. Not to mention that if they really anger the community, everyone will just sell their coins, and the mining business, as well as Bitcoin, will be over. If they implement capital controls to slow down the dumping, the price will crash to zero because nobody can buy, just as nobody can sell. And so on with many more theoretical attacks, but all end in disaster for the attacker. So a miner betrayal would only be temporary at best.





7) Corruptible devs



So far we have had very honest, reputable developers. What if that changes and they get replaced by corrupt ones?



Well, as the saying goes, you reap what you sow. If the Bitcoin community is made up of programmers and IT experts, then you will always find the best ones among them to work on the code. If the Bitcoin community becomes made up of morons, then Bitcoin will end long before that matters.



So it's more important to keep the community healthy, because the community will supply the developers and bitcoin experts.



Also, Bitcoin development is decentralized: even if the Core team has control over the GitHub repository, anyone can host Bitcoin's source code on their own server and work on the code.



The enforcing mechanism in Bitcoin is the miners, and I've explained in point 6) why miners are not prone to betrayal. A dev's incentive is to show talent, gain fame, get donations, and build a career. The miners enforce the rules, while the community decides whether it likes them or not.



So if a dev becomes corrupt, he will achieve little but lose everything, especially his reputation.





8) Corruptible community



Devs might be corrupt, miners might be, but all of them would be temporary issues. What if the community becomes corrupt?



Yes, this is one of the most concerning factors: if the old guard leaves or gets bored of Bitcoin and is replaced by morons, illiterate people, or simply opportunists who don't care about Bitcoin and only want quick profit, then what?



Well, this is one of the biggest and most realistic threats to Bitcoin. If the community is dumb, it can be manipulated, and divided to be controlled. Everyone will try to steer the community toward their own incentives, and whoever can fool the most people will control Bitcoin. It is a scary possibility.



My suggestion would be moderate adoption, with many tutorials for newbies, teaching them about Bitcoin and its values. People need to feel at home in Bitcoin and appreciate its qualities.



Also, the old guard should never leave; if they do, they are cowards, and whatever happens to Bitcoin will be partially their fault. So people should educate newbies and teach them why Bitcoin is the way to go.



9) Hardfork not fair for long term savers



Most of the Bitcoin old guard, as well as new investors, want to save bitcoin for the long term, usually in a secure wallet. If you roll out a hard fork every year, they will constantly have to upgrade their wallets and handle sensitive information, which exposes them to unnecessary risk.



If they don't, they can lose their money. What if the fork isn't announced to them? What if they don't keep up with Bitcoin news? You can't push changes this fast, because people will lose confidence in Bitcoin's long-term future.



If a person wants to put coins away for his kid's 18th birthday and forget about that wallet for many years (while keeping the private keys backed up and secure), he doesn't want to have to stay up to date with every fix and patch.



Even other software usually has support for a given version for many years, so why can't Bitcoin be the same? It is, and that is how it should be!



So the SET & FORGET approach that most people feel comfortable with will be violated, and that will upset many Bitcoin whales.



10) Hardfork risks the entire network



Currently only 37% of the network runs the latest software, which is good, because if a bug gets discovered, the other 63% will act as a backbone.



If you want to force a hardfork, everyone has to upgrade before the grace period runs out, so the whole network will run the same version. If a bug or malware gets discovered in the hardforked client, the entire Bitcoin network could be destroyed.



Therefore hardforks are ultra-vulnerable to 0-day (undiscovered) bugs. Decentralization should mean that you cannot be 100% confident in the developers, because we are all human and we make mistakes. Even if the devs have checked the code many times, you still can't risk the network, because 0-day bugs are frequent even in huge projects (*cough* Ethereum *cough*).

RocketSingh (Legendary, Activity: 1625, Merit: 1032)
Re: The Blocksize Debate & Concerns
June 20, 2016, 11:17:15 PM

Last edit: June 20, 2016, 11:41:41 PM by RocketSingh

#2

1) The sacred Bitcoin Protocol: I think the currently proposed HF from 1 MB to 2 MB won't make or break anything. But it will set a precedent that the Bitcoin protocol can be changed according to the demand of a cheering crowd. In the future, there may be a cheering crowd demanding to raise the 21M supply cap. The question is what we do then.
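As an aside, the 21M cap isn't written anywhere as a single number; it falls out of the subsidy-halving schedule. A back-of-the-envelope sketch (the function name is hypothetical, but the constants are the real consensus parameters):

```python
# Sketch: why Bitcoin's supply converges to just under 21 million coins.
# The block subsidy starts at 50 BTC and halves every 210,000 blocks;
# summing the series with integer halving gives the familiar cap.

SATOSHI = 100_000_000          # satoshis per BTC
HALVING_INTERVAL = 210_000     # blocks between halvings

def total_supply_sats() -> int:
    subsidy = 50 * SATOSHI
    total = 0
    while subsidy > 0:
        total += subsidy * HALVING_INTERVAL
        subsidy //= 2          # integer halving, as consensus code does
    return total

print(total_supply_sats() / SATOSHI)  # ~20,999,999.9769 BTC
```

Raising the cap would mean changing one of these consensus constants, which is exactly the precedent being debated.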



2) Segwit: Malleability is already fixed; Segwit does not fix it further. Segwit aims to create space within the 1 MB block size and to create scope for Sidechains and the Lightning Network. But Segwit is complicated; it might become a disaster like The DAO. Hence, since Segwit is a soft fork, there should always be non-Segwit nodes and applications doing non-Segwit transactions, so that in case of a disaster a part of the network survives, and hence Bitcoin.



3) Sidechains + Lightning Network: Secondary layer. No problem.



4) Urgent Hardfork: Very unlikely for a robust network like Bitcoin.



5) Quantum Computers: FIAT money sitting on banking networks is under much greater threat than the Bitcoin network. We are ~10 billion USD, while they are in the trillions. We might as well sit back and enjoy the show.



7) Corruptible devs: Devs cannot immediately enforce any change to the network. They can hold back the existing working system and resist change. So dev-level corruption is no big threat.



6 & 8) Corruptible miners & corruptible community: Corruption is a relative word; in a way, a part will always be corrupted. But in reality, only those matter who have deep skin in the system itself. In reality, the community is miners and holders. If there are 1000 corrupted holders holding 0.01 BTC each, they will have less effect on the network than 10 holders holding 10 BTC each, because if at any point holders need to decide the winning chain, the coin-holding majority will decide which chain survives by selling the other chain's coins. A simple headcount majority of holders is not important. And to serve their own interest, the majority holders won't go against the best health of the Bitcoin network, corrupted or not.



9) Hardfork not fair for long term savers: Already discussed in above points.

RealBitcoin (Hero Member, Activity: 854, Merit: 1000)
Re: The Blocksize Debate & Concerns
June 20, 2016, 11:20:21 PM

Last edit: June 21, 2016, 12:04:25 AM by RealBitcoin

#3

Quote from: RocketSingh on June 20, 2016, 11:17:15 PM
1) The sacred Bitcoin Protocol: I think the current proposed HF from 1mb to 2mb wont make or break anything. But, it'll set a precedent that bitcoin protocol can be changed according to the demand of a cheering crowd. In future, there may be a cheering crowd to raise 21m supply cap. Question is what we do then?



Yes, combine 1) with 8): if the community gets hijacked by socialists, they will demand a wealth-redistribution mechanism be implemented in the protocol.



It's an extreme example, but it is plausible. That is why change has to be very hard to make, and well tested and thought out before it happens.



What will we do then? Nothing; by then the community is already destroyed. We need to prevent this from happening.



2) I meant fungibility; I always mix the two. I'll correct it.



5) Yes, as soon as such a machine existed, nations would start hacking each other and it would be chaos, but I don't think it can exist.



7) Yes, but delaying good improvements is not incentivized. They can earn money and fame by helping Bitcoin, so it's against their own incentive to do that.



8) Yes, but that is why I wish the old guard to stay. So far they have the most coins and they are the most reputable people in Bitcoin, so if they don't sell their coins, we will always have good people controlling the coins and influencing the community with a pro-Bitcoin spirit.




franky1 (Legendary, Activity: 2884, Merit: 1751)
Re: The Blocksize Debate & Concerns
June 21, 2016, 12:28:07 AM

#4

I actually think the OP's post is quite unbiased and reasonable.



Here is my insight on three of the points: (sacred protocol), (corruptible devs), (corruptible pools).

Though things should change to adapt to the changing reality of users' needs, we need to accept that even soft forks should require consensus just to be activated, along with independent testing and a reading of every line of code.

This is because a soft fork, by its nature of NOT needing close to 100% adoption, allows the risk of bad code being added to change things on a whim.

I'm glad mining pools are not going to run any new versions until they have done their own checks and tests, even if greenlit by Core.


gmaxwell (Staff, Legendary, Activity: 3178, Merit: 4301)
Re: The Blocksize Debate & Concerns
June 21, 2016, 07:05:29 AM

#5

Quote from: RocketSingh on June 20, 2016, 11:17:15 PM
Malleability is already fixed. Segwit does not fix it further.

Technical point: This is very much not the case: Malleability is blocked in the relay network for a narrow class of transactions but anything clever is exposed, multisig is exposed to other-signer malleation, and all transactions are exposed to malleation by miners. Segwit fixes it in a deep and profound way for all purely segwit transactions.
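The structural nature of the segwit fix can be illustrated with a toy sketch (this is not real Bitcoin serialization; the byte strings are made up): a legacy txid hashes the signature along with the rest of the transaction, so anyone who can re-encode a signature without invalidating it changes the txid, whereas a segwit txid commits only to the non-witness data.

```python
# Toy illustration (not real Bitcoin serialization) of why segwit's
# malleability fix is deep: the legacy txid covers the signature bytes,
# while the segwit txid keeps witness data out of the hash entirely.
import hashlib

def dhash(data: bytes) -> str:
    """Double SHA-256, as Bitcoin uses for transaction ids."""
    return hashlib.sha256(hashlib.sha256(data).digest()).hexdigest()

body = b"inputs|outputs|locktime"   # stand-in for non-witness tx data
sig_a = b"sig-encoding-A"
sig_b = b"sig-encoding-B"           # same valid signature, re-encoded

legacy_a, legacy_b = dhash(body + sig_a), dhash(body + sig_b)
segwit_a = segwit_b = dhash(body)   # witness excluded from the txid

print(legacy_a == legacy_b)   # False: legacy txid malleates
print(segwit_a == segwit_b)   # True: segwit txid is stable
```

This is why the fix holds "for all purely segwit transactions": no third party, other signer, or miner can change the txid by fiddling with witness data.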



Quote from: MicroGuy on June 21, 2016, 01:43:33 AM doesn't say anything about a 1MB block size in the Bitcoin whitepaper. Changing it to 2 MB (like we did) does not change the protocol

The whitepaper doesn't say anything about 21 million coins-- or a great many other very important consensus rules. If the rule was intended to _simply_ be temporary it could have had an automatic phase out, but it didn't. If it was intended to go up as it filled, it could have-- just as difficulty adjusts, the mechanism is all there.
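The adjustment mechanism alluded to above ("just as difficulty adjusts") can be sketched in a few lines. This is a simplified, illustrative model (real consensus code works on compact 256-bit targets, not floats, and has an off-by-one quirk in the timespan measurement): every 2016 blocks the difficulty is rescaled by the ratio of expected to actual elapsed time, clamped to a factor of four.

```python
# Simplified sketch of Bitcoin's difficulty-retarget rule.
# Every 2016 blocks, scale difficulty by expected/actual time, clamped 4x.

TARGET_SPACING = 600                         # seconds per block
RETARGET_BLOCKS = 2016
EXPECTED = TARGET_SPACING * RETARGET_BLOCKS  # two weeks, in seconds

def next_difficulty(old_difficulty: float, actual_seconds: int) -> float:
    # Clamp the adjustment to a factor of 4 in either direction,
    # as the consensus rules do.
    actual = max(EXPECTED // 4, min(EXPECTED * 4, actual_seconds))
    return old_difficulty * EXPECTED / actual

# Blocks came twice as fast as intended -> difficulty doubles.
print(next_difficulty(1.0, EXPECTED // 2))  # 2.0
```

The point being made is that an analogous automatic mechanism for block size could have been specified from the start, and wasn't.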



Already Ethereum users make the incorrect argument that Bitcoin was hardforked in the past to fix the value overflow bug (it wasn't) and thus it's okay for them to manually tamper with the ledger to claw back the funds the DAO lost and hand them over to other users. You're seeing a first-hand demonstration of how quickly people cling to arguments of convenience.



The rule-by-math element of Bitcoin is essential to the value proposition; great care should be taken not to erode it based on short-term interests.

MeteoImpact (Member, Activity: 97, Merit: 10)
Re: The Blocksize Debate & Concerns
June 21, 2016, 08:50:36 AM

#6

Quote from: gmaxwell on June 21, 2016, 07:05:29 AM
Quote from: MicroGuy on June 21, 2016, 01:43:33 AM
doesn't say anything about a 1MB block size in the Bitcoin whitepaper. Changing it to 2 MB (like we did) does not change the protocol

The whitepaper doesn't say anything about 21 million coins-- or a great many other very important consensus rules. If the rule was intended to _simply_ be temporary it could have had an automatic phase out, but it didn't. If it was intended to go up as it filled, it could have-- just as difficulty adjusts, the mechanism is all there.

Out of curiosity, would you be willing to state your personal opinion on under what conditions it would be appropriate and/or beneficial to raise the blocksize limit? What particular concerns, technical or otherwise, do you consider most important when it comes to considering alterations to this particular aspect of the system? Of course, blocksize is only one aspect of capacity (and a horribly inefficient method of scaling capacity), but I'm curious whether there are any particular conditions under which you feel raising the blocksize would be the appropriate solution. Is it only a last-resort method of increasing capacity? And when it comes to capacity, to what extent should it be increased? Is "infinite" capacity even something we should be targeting?

It is my understanding that Core's implementation of segwit will also include an overall "blocksize" increase (between both the witness and transaction blocks), though with a few scalability improvements that should make the increase less demanding on the system (linear verification scaling with segwit and Compactblocks come to mind). Do you personally support this particular instance of increasing the overall "blocksize"?

To be clear, I'm just asking as someone who'd like to hear your informed opinion. In the short term I'm not exactly worried about transaction capacity--certainly not hardfork-worried--as with segwit on the way and the potential for LN or similar mechanisms to follow, short-term capacity issues could well be on their way out (really, all I want short term is an easier way to flag and use RBF so I can fix my fees during unexpected volume spikes). What I'm more curious about at this point are your views on the long term--the "133MB infinite transactions" stage.

Of course, you're free to keep your opinions to yourself or only answer as much as you're comfortable divulging; it's not my intent to have you say anything that would encourage people to sling even more shit at you than they already have been lately.

gmaxwell (Staff, Legendary, Activity: 3178, Merit: 4301)
Re: The Blocksize Debate & Concerns
June 21, 2016, 09:50:47 AM
Last edit: June 21, 2016, 10:05:47 AM by gmaxwell

#7

Quote from: MeteoImpact on June 21, 2016, 08:50:36 AM
Out of curiosity, would you be willing to state your personal opinion on under what conditions it would be appropriate and/or beneficial to raise the blocksize limit?

I'd be glad to, and have many times in the past (also on the […])



Quote
What particular concerns, technical or otherwise, do you consider most important when it comes to considering alterations to this particular aspect of the system? Of course, blocksize is only one aspect of capacity (and a horribly inefficient method of scaling capacity), but I'm curious if there are any particular conditions under which you feel raising the blocksize would be the appropriate solution. Is it only a last-resort method of increasing capacity? And when it comes to capacity, to what extent should it be increased? Is "infinite" capacity even something we should be targeting?

Since we live in a physical world, with computers made of matter and run on energy there will be limits. We should make the system as _efficient_ as possible because, in a physical world, one of the concerns is that there is some inherent trade-off between blockchain load and decentralization. (Note that blockchain load != Bitcoin transactional load, because there are _many_ ways to transact that have reduced blockchain impact.) ... regardless of the capacity level we're at, more efficiency means a better point on that trade-off.



Not screwing up decentralization early in the system's life is a high priority for me: it is utterly integral to the system's value proposition in a way that few other properties are, and every loss of decentralization we've suffered over the system's life has been used to argue that the next loss isn't a big deal. Making progress backwards is hard: the system actually works _better_ in a conventional sense, under typical usage, as it becomes more centralized: upgrades are faster and easier, total resource costs are lower, behavior is more regular and consistent, some kinds of attacks are less commonly attempted, properties which are socially "desirable" but not rational for participants don't get broken as much, etc. Decentralization is also central to other elements that we must improve to preserve Bitcoin's competitiveness as a worldwide, open and politically neutral money-- such as fungibility.



But even if we imagine that we got near infinite efficiency, would we want near infinite capacity?



Bitcoin's creator proposed that security would be paid for in the future by users bidding for transaction fees. If there were no limit to the supply of capacity, then even a single miner could always make more money by undercutting the market and clearing it*. Difficulty can adapt down, so low fee income can result in security melting away. This could potentially be avoided by cartel behavior by miners, but having miners collude to censor transactions would be quite harmful... and in the physical world with non-infinite efficiency, preventing miners from driving the nodes off the network is already needed. Does that have to work as a static limit? No-- and I think some of the flexcap proposals have promise for the future for addressing this point.



Some have proposed that instead of fees, security in the future would be provided by altruistic companies donating to miners. The specific mechanisms proposed didn't make much sense-- e.g. they'd pay attackers just as much as honest miners, but that could be fixed... but the whole model of using altruism to pay for a commons has significant limitations, especially in a global anonymous network. We haven't shown altruism of this type to be successful at funding development or helping to preserve the decentralization of mining (e.g. p2pool). I currently see little reason to believe that this could be a workable alternative to the idea initially laid out of using fees... of course, if you switch it around from altruism to a mandatory tax, you end up with the inflationary model of some altcoins-- and that probably does "work", but it's not the economic policy we desire (and, in particular, without a trusted external input to control the inflation rate, it's not clear that it would really work for anyone).



So in the long term part of my concern is avoiding the system drifting into a state where we're all forced to choose between inflation or failure (in which case, a bitcoin that works is better than one that doesn't...).



As far as when, I think we should exercise the most extreme caution with incompatible changes in general. If a change is really safe and needed, we can expect to see broad support... it becomes easier to get there when efforts are made to address the risks; e.g. segwit was a much easier sell because it improved scalability while at the same time increasing capacity. Likewise, I expect successful future increases to come with or after other risk mitigations.



(* We can ignore orphaning effects for four reasons: orphaning increases as a function of transaction load can be ~completely eliminated with relay technology improvements, and if not that, by miners centralizing... and if all a miner's income is going to pay for orphaning losses there will be no excess paying for competition, thus security, and-- finally-- if transaction fees are mostly uniform, the only disincentive from orphaning comes from the loss of subsidy, which will quickly become inconsequential unless Bitcoin is changed to be inflationary.)



Quote
It is my understanding that Core's implementation of segwit will also include an overall "blocksize" increase (between both the witness and transaction blocks), though with a few scalability improvements that should make the increase less demanding on the system (linear verification scaling with segwit and Compactblocks come to mind). Do you personally support this particular instance of increasing the overall "blocksize"?

I think the capacity increase is risky. The risks are compensated for by improvements (both recent ones already done and some immediately coming, e.g. libsecp256k1, compactblocks, tens-of-fold validation speed improvements, smarter relay, better pruning support) along with those in segwit.



I worry a lot that there is a widespread misunderstanding that blocks being "full" is bad-- block access is a priority queue based on feerate-- and at a feerate of ~0 there is effectively infinite demand (for highly replicated perpetual storage). I believe that (absent radical new tech that we don't have yet) the system cannot survive as a usefully decentralized system if the response to "full" is to continually increase capacity (such a system would have almost no nodes, and also potentially no way to pay for security). One of the biggest problems with hardfork proposals was that they directly fed this path-to-failure, and I worry that the segwit capacity increase may contribute to that too... e.g. that we'll temporarily not be "full" and then we'll be hit with piles of constructed "urgent! crash landing!" pressure to increase again to prevent "full" regardless of the costs. E.g. a constant cycle of short term panic about an artificial condition pushing the system away from long term survivability.
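The "priority queue based on feerate" point can be sketched directly: a miner filling a size-limited block greedily takes the highest-feerate transactions first, so "full" just means admission has a nonzero price. A minimal sketch with made-up numbers (real selection also weighs ancestor packages, which this ignores):

```python
# Sketch: block space as a priority queue over feerate. A miner greedily
# takes the highest-feerate transactions until the size limit is hit.
import heapq

def select_txs(mempool: list[tuple[float, int]], limit: int) -> list[tuple[float, int]]:
    """mempool: (feerate, size) pairs; returns chosen txs, best feerate first."""
    heap = [(-feerate, size) for feerate, size in mempool]  # max-heap via negation
    heapq.heapify(heap)
    chosen, used = [], 0
    while heap:
        neg_rate, size = heapq.heappop(heap)
        if used + size <= limit:      # skip anything that no longer fits
            chosen.append((-neg_rate, size))
            used += size
    return chosen

mempool = [(5.0, 400), (50.0, 300), (1.0, 500), (20.0, 250)]
print(select_txs(mempool, limit=600))  # [(50.0, 300), (20.0, 250)]
```

Everything below the cutoff waits or bids higher; that admission price, not "fullness" itself, is the market at work.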



But on the balance, I think the risks can be handled and the capacity increase will be useful, and the rest of segwit is a fantastic improvement that will set the stage for more improvements to come. Taking some increase now will also allow us to experience the effects and observe the impacts which might help improve confidence (or direct remediation) needed for future increases.



Quote

To be clear, I'm just asking as someone who'd like to hear your informed opinion. In the short term I'm not exactly worried about transaction capacity--certainly not hardfork-worried--as with segwit on the way and the potential for LN or similar mechanisms to follow, short-term capacity issues could well be on their way out (really all I want short-term is an easier way to flag and use RBF so I can fix my fees during unexpected volume spikes). What I'm more curious about at this point are your views on the long term--the "133MB infinite transactions" stage.



Some of the early resource estimates from lightning have already been potentially made obsolete by new inventions. For example, the lightning paper originally needed the ability to have a high peak blocksize in order to get all the world's transactions into it (though such blocks could be 'costly' for miners to build, like flexcap systems) in order to handle the corner case where huge numbers of channels were uncooperatively closed all at once and all had to get their closures in before their timeouts expired. In response to this, I proposed the concept of a sequence lock that stops when the network is beyond capacity ("timestop"); it looks like this should greatly reduce the need for big blocks at a cost of potentially delaying closure when the network is unusually loaded; though further design and analysis is needed. I think we've only started exploring the potential design space with channels.



Besides capacity, payment channels (and other tools) provide other features that will be important in bringing our currency to more places-- in particular, "instant payment".



As much as I personally dislike it, other services like credit are very common and highly desired by some markets-- and that is a service that can be offered by other layers as well.



I'm sorry to say that an easy-to-use fee-bump replacement feature just missed the merge window for Bitcoin Core 0.13. I'm pretty confident that it'll make it into 0.14. I believe Green Address has a feebump feature available already (but I haven't tried it). 0.13 will have ancestor feerate mining ("child pays for parent") so that is another mechanism that should help unwedge low fee transactions, though not as useful as replacement.
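The "child pays for parent" mechanic mentioned here boils down to miners scoring a stuck transaction by its package feerate rather than its own. A hypothetical helper, not Bitcoin Core's actual code:

```python
def package_feerate(parent_fee, parent_vsize, child_fee, child_vsize):
    """Ancestor feerate in sat/vB: the rate a miner effectively earns
    by including the child together with its unconfirmed parent."""
    return (parent_fee + child_fee) / (parent_vsize + child_vsize)

# A 4 sat/vB parent is stuck; a high-fee child lifts the package to 27.5 sat/vB.
print(package_feerate(parent_fee=1000, parent_vsize=250,
                      child_fee=10_000, child_vsize=150))  # -> 27.5
```

Unlike replacement (RBF), this works even when the stuck transaction can't be modified, because the recipient can attach the fee-paying child themselves.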



Quote

Of course, you're free to keep your opinions to yourself or only answer as much as you're comfortable divulging; it's not my intent to have you say anything that would encourage people to sling even more shit at you than they already have been lately

I'd be glad to, and have many times in the past (also on the more general subject of hardforks), though I don't know that my particular views matter that much. I hope they do not.

Since we live in a physical world, with computers made of matter and run on energy, there will be limits. We should make the system as _efficient_ as possible because, in a physical world, one of the concerns is that there is some inherent trade-off between blockchain load and decentralization. (Note that blockchain load != Bitcoin transactional load, because there are _many_ ways to transact that have reduced blockchain impact.) Regardless of the capacity level we're at, more efficiency means a better point on that trade-off.

Not screwing up decentralization early in the system's life is a high priority for me: it is utterly integral to the system's value proposition in a way that few other properties are, and every loss of decentralization we've suffered over the system's life has been used to argue that the next loss isn't a big deal. Making progress backwards is hard: the system actually works _better_ in a conventional sense, under typical usage, as it becomes more centralized: upgrades are faster and easier, total resource costs are lower, behavior is more regular and consistent, some kinds of attacks are less commonly attempted, properties which are socially "desirable" but not rational for participants don't get broken as much, etc. Decentralization is also central to other elements that we must improve to preserve Bitcoin's competitiveness as a worldwide, open and politically neutral money-- such as fungibility.

But even if we imagine that we got near-infinite efficiency, would we want near-infinite capacity?

Bitcoin's creator proposed that security would be paid for in the future by users bidding for transaction fees. If there were no limit to the supply of capacity, then even a single miner could always make more money by undercutting the market and clearing it*. Difficulty can adapt down, so low fee income can result in security melting away. This could potentially be avoided by cartel behavior by miners, but having miners collude to censor transactions would be quite harmful... and in a physical world with non-infinite efficiency, keeping the miners from driving the nodes off the network is already needed. Does that have to work as a static limit? No-- and I think some of the flexcap proposals have promise for the future in addressing this point.

Some have proposed that instead of fees, security in the future would be provided by altruistic companies donating to miners. The specific mechanisms proposed didn't make much sense-- e.g. they'd pay attackers just as much as honest miners, but that could be fixed... but the whole model of using altruism to pay for a commons has significant limitations, especially in a global anonymous network. We haven't shown altruism of this type to be successful at funding development or at helping to preserve the decentralization of mining (e.g. p2pool). I currently see little reason to believe that this could be a workable alternative to the idea initially laid out of using fees... Of course, if you switch it around from altruism to a mandatory tax, you end up with the inflationary model of some altcoins-- and that probably does "work", but it's not the economic policy we desire (and, in particular, without a trusted external input to control the inflation rate, it's not clear that it would really work for anyone).

So in the long term, part of my concern is avoiding the system drifting into a state where we're all forced to choose between inflation or failure (in which case, a bitcoin that works is better than one that doesn't...).

As for when, I think we should exercise the most extreme caution with incompatible changes in general. If a change is really safe and needed we can expect to see broad support... and it becomes easier to get there when efforts are made to address the risks: e.g. segwit was a much easier sell because it improved scalability while at the same time increasing capacity. Likewise, I expect successful future increases to come with or after other risk mitigations.

I think we don't know exactly how lightning usage patterns will play out, so the resource gains are hard to estimate with any real precision. Right now bidi channels are the only way we know of to get to really high total capacity without custodial entities (which I also think should have a place in the Bitcoin future).

Ha, that's unavoidable. I just do what I can, and try to remind myself that if no one at all is mad then what I'm doing probably doesn't matter.
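The "difficulty can adapt down" point in the post above rests on Bitcoin's retarget rule: every 2016 blocks, difficulty is rescaled by how far the actual timespan missed the two-week target. A sketch under simplifying assumptions (it omits the real rule's 4x adjustment clamp and integer target encoding):

```python
def retarget(old_difficulty, actual_timespan_s,
             target_timespan_s=2016 * 600):
    """If hashpower leaves (blocks arrive slowly), the actual timespan
    exceeds the two-week target and difficulty scales down."""
    return old_difficulty * target_timespan_s / actual_timespan_s

# Half the hashpower quits: 2016 blocks take ~4 weeks, difficulty halves.
print(retarget(100.0, actual_timespan_s=2 * 2016 * 600))  # -> 50.0
```

This is why low fee income translates into lower security rather than a stalled chain: the chain keeps producing blocks, just at a lower hashrate and difficulty.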

franky1



Offline



Activity: 2884

Merit: 1751









LegendaryActivity: 2884Merit: 1751 Re: The Blocksize Debate & Concerns June 21, 2016, 12:25:26 PM #9



there is just one sticking point

Quote I worry a lot that there is a widespread misunderstanding that blocks being "full" is bad-- block access is a priority queue based on feerate-- and at a feerate of ~0 there effectively infinite demand (for highly replicated perpetual storage). I believe that (absent radical new tech that we don't have yet) the system cannot survive as a usefully decentralized system if the response to "full" is to continually increase capacity (such as system would have almost no nodes, and also potentially have no way to pay for security). One of the biggest problems with hardfork proposals was that they directly fed this path-to-failure, and I worry that the segwit capacity increase may contribute to that too... e.g. that we'll temporarily not be "full" and then we'll be hit with piles of constructed "urgent! crash landing!" pressure to increase again to prevent "full" regardless of the costs. E.g. a constant cycle of short term panic about an artificial condition pushing the system away from long term survivability.



at this current early era of bitcoin, the fees are treated as a bonus (subsidy) and the reward is treated as the salary (income). whereby the deflationary nature of the reward causes the fiat price (speculatively) to offset it, keeping pools comfortable long term for a couple of decades (ignoring the short-term panic-price emotional drama once every 4 years).

and as such it is not necessary at this time to be forced to wait for blocks to be full before adding some bufferspace/wiggle room to allow natural growth right now, purely on the basis of fees



i say this because the paragraph i quoted is concerning, in that it seems that bottlenecking blocks appears to some devs to be a good thing, because it launches the fee war.

but fees will only become essential in a couple decades.
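The "couple decades" timeline here follows directly from the halving schedule (a toy float model of the subsidy; real nodes compute it in integer satoshis):

```python
def block_subsidy(height, interval=210_000):
    """Block subsidy in BTC after height // interval halvings
    (toy float model; real nodes use integer satoshis)."""
    return 50.0 / (2 ** (height // interval))

# Roughly one halving every four years, starting from 2009:
for n in range(6):
    year = 2012 + 4 * n  # approximate calendar year of halving n+1
    print(year, block_subsidy(210_000 * (n + 1)))
```

The 2032 halving (the sixth) takes the subsidy from 1.5625 to 0.78125 BTC, matching the "reward under 1btc" worry in the post.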



so here is my question: imagine that segwit was in production in 2013, tested and independently vetted by 2014. would it have been included in v0.10, or would developers have dragged their feet until the fee war started this year and waited until 0.13, because of the worry about mining in 2032+ (reward under 1btc)?

isn't it the case that pushing a fee war early is simply a taxation of bitcoin, which will hurt bitcoin's utility and desirability, which can literally tank the speculative price and force fees to become essential sooner to offset miners' income, raising the tax higher in an endless spiral?



however, within those couple of decades other things can be put in place to help achieve the ultimate goal of switching salary <--> bonus (income <--> subsidy)

whereby the 'full blocks are good, fee war is good' mindset should not be front of mind for devs; a slow progression should be the mindset, rather than a manic push to switch sooner



i hope to see an honest and respectful opinion

i truly hope your reply is that segwit would have been implemented when independently vetted and tested, rather than dragging feet in favour of an early fee war.

to Gmaxwell, i can honestly say that your last post was a genuine open minded opinion taking everything into account. it was a good read.


RealBitcoin



Offline



Activity: 854

Merit: 1000





JAYCE DESIGNS - http://bit.ly/1tmgIwK







Hero MemberActivity: 854Merit: 1000JAYCE DESIGNS - http://bit.ly/1tmgIwK Re: The Blocksize Debate & Concerns June 21, 2016, 04:53:56 PM #12 Quote from: franky1 on June 21, 2016, 12:28:07 AM i actually think the OP's post is actually quite unbiased and reasonable



here is my insight to address three points (sacred protocol)(corruptible devs)(corruptible pools)

though things should change to adapt to the changing reality of users' needs, we need to accept that even soft forks need consensus just to be activated, and need independent testing and a reading of every line of code.

this is because a softfork, due to its nature of NOT needing close to 100% adoption, allows the risk of bad code being added to change things 'on a whim'.

im glad mining pools are not going to run any new versions until they have done their own checks and tests... even if greenlit by core.



I don't support hardforks, but I can accept arguments from pro-hardforkers if they can come up with good ones. It can be debated.





Yes, soft forks need a lot of reviewing and testing as well, absolutely true. But this is already happening; every bitcoin client can potentially have some bug in it, which is why people should never upgrade all at once.



Therefore you cannot force a hardfork, because then everyone has to upgrade before the grace period runs out; a hardfork is therefore 1000% more risky than a soft one.



Currently only 37% run the latest bitcoin version, so the other 63% acts as a backbone if there is a 0-day bug in the current version.



The same cannot be said about a hardfork scenario....



Just added this point to point 10), because it's really a very important point.

MeteoImpact



Offline



Activity: 97

Merit: 10







MemberActivity: 97Merit: 10 Re: The Blocksize Debate & Concerns June 21, 2016, 07:14:50 PM #13 Quote from: gmaxwell on June 21, 2016, 09:50:47 AM snip

Damn, thanks for that lengthy and comprehensive response--big confidence builder as it (more-or-less) confirms a lot of my assumptions regarding the concerns in the system. A decentralised backbone upon which other systems can be built seems like the right choice if decentralisation is something which we value. Second-layer solutions that make tradeoffs of decentralisation/whatnot to meet the demands of higher capacity can happen (and already do--just look at all the off-chain trades on exchanges) on top of a decentralised ledger, but second-layer solutions cannot return decentralisation to Bitcoin if it is lost.



Honestly, I'm going long on Bitcoin anyhow (most of my transactions involve combining small outputs from altcoin mining), so current usability/capacity concerns aren't especially pressing for me, especially with huge improvements in these areas already in development. It will be an experience to see how this plays out over the next few years--just what will Bitcoin look like by the 2020 halving?

enet



Offline



Activity: 81

Merit: 10







MemberActivity: 81Merit: 10 Re: The Blocksize Debate & Concerns June 26, 2016, 09:02:44 AM #14 Quote from: RealBitcoin on June 20, 2016, 11:08:48 PM 3) Sidechains + Lightning Network

This is the path of scaling the Core team choose. And I think it's a very elegant, decentralized approach to scaling.



Both are already failures - if you're realistic about it. I don't see any connection between the bad ideas of Gregory Maxwell and a few others, and Bitcoin as a system. Sidechains are one of the dumbest ideas I've seen in Bitcoin ever. The idea was proposed 3 years ago, the whitepaper written 2 years ago, and there is no functioning system, and I predict there never will be. Lightning is quite interesting, but will be a failure too. I will act on my beliefs by selling Bitcoin for a better alternative when it arrives. There are deep economic reasons why these ideas are bad, but you can't expect any understanding of it from core developers. They don't even see the incentive and scaling problems as they are. "Scaling Bitcoin" is a misnomer to begin with.



Quote from: RealBitcoin on June 20, 2016, 11:08:48 PM 7) Corruptible devs

Also the bitcoin developers are decentralized; even if the Core team has control over the github repository, everyone can host bitcoin's source code on their own server and work on the code.



No. Bitcoin dev is centralised around a small group of people, for good and bad reasons. Nobody can just add any patch to the system. Forking blockstream/core is basically starting a new chain and project.

tomywomy



Offline



Activity: 57

Merit: 27







Jr. MemberActivity: 57Merit: 27 Re: The Blocksize Debate & Concerns June 26, 2016, 10:27:55 AM #15 Quote from: gmaxwell on June 21, 2016, 07:05:29 AM

Already Ethereum users make the incorrect argument that bitcoin was hardforked in the past to fix the value overflow bug (it wasn't) and thus it's okay for them to manually tamper with the ledger to claw back the funds the DAO lost and hand them over to other users. You're seeing a first-hand demonstration of how quickly people cling to arguments of convenience.





The fallacies used by Ethereum users to justify tampering with the ledger are extreme and are certainly a slippery slope to the end of reliability in the truth and security of the transactions on their chain. They are all the fallacies that support socialism, big government, bailouts, wealth redistribution, etc...



As and when bitcoin becomes more mainstream, it too will have taken on a majority of individuals who do not understand that the whole thing was created precisely to prevent majorities or powerful individuals from violating the property rights of other individuals. This is why making such interventions as hard as possible, if not downright impossible, must remain the top priority of the system.

xDan



Offline



Activity: 688

Merit: 500



ヽ( ㅇㅅㅇ)ﾉ ~!!







Hero MemberActivity: 688Merit: 500ヽ( ㅇㅅㅇ)ﾉ ~!! Re: The Blocksize Debate & Concerns June 26, 2016, 12:13:39 PM #17



Quote from: https://bitcoin.org/bitcoin.pdf The incentive can also be funded with transaction fees. If the output value of a transaction is less than its input value, the difference is a transaction fee that is added to the incentive value of the block containing the transaction. Once a predetermined number of coins have entered circulation, the incentive can transition entirely to transaction fees and be completely inflation free.

Quote from: https://bitcointalk.org/index.php?topic=48.msg329#msg329 In a few decades when the reward gets too small, the transaction fee will become the main compensation for nodes. I'm sure that in 20 years there will either be very large transaction volume or no volume.

Quote from: http://satoshi.nakamotoinstitute.org/emails/cryptography/2/ At first, most users would run network nodes, but as the network grows beyond a certain point, it would be left more and more to specialists with server farms of specialized hardware. A server farm would only need to have one node on the network and the rest of the LAN connects with that one node.

The bandwidth might not be as prohibitive as you think. A typical transaction would be about 400 bytes (ECC is nicely compact). Each transaction has to be broadcast twice, so lets say 1KB per transaction. Visa processed 37 billion transactions in FY2008, or an average of 100 million transactions per day. That many transactions would take 100GB of bandwidth, or the size of 12 DVD or 2 HD quality movies, or about $18 worth of bandwidth at current prices.

If the network were to get that big, it would take several years, and by then, sending 2 HD movies over the Internet would probably not seem like a big deal.

Quote from: https://bitcointalk.org/index.php?topic=994.msg12168#msg12168 We should always allow at least some free transactions.

Quote from: https://bitcointalk.org/index.php?topic=287.msg7687#msg7687 Forgot to add the good part about micropayments. While I don't think Bitcoin is practical for smaller micropayments right now, it will eventually be as storage and bandwidth costs continue to fall. If Bitcoin catches on on a big scale, it may already be the case by that time. Another way they can become more practical is if I implement client-only mode and the number of network nodes consolidates into a smaller number of professional server farms. Whatever size micropayments you need will eventually be practical. I think in 5 or 10 years, the bandwidth and storage will seem trivial.

Highlights added by me.
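The bandwidth figures in the quoted email can be re-run directly (his 1KB per transaction is 400 bytes broadcast twice, rounded up):

```python
# Re-running the arithmetic from the quoted 2008 cryptography-list email.
per_tx_bytes = 1000                  # 400 B broadcast twice (~800 B), rounded to 1 KB
visa_fy2008 = 37_000_000_000         # transactions per year
tx_per_day = visa_fy2008 / 365       # ~101 million/day (his "100 million")
daily_gb = tx_per_day * per_tx_bytes / 1e9
print(f"{tx_per_day / 1e6:.0f}M tx/day -> {daily_gb:.0f} GB/day")
# -> 101M tx/day -> 101 GB/day
```

So the quoted "100 million transactions per day" and "100GB of bandwidth" are self-consistent under his assumptions.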



After reading the whitepaper, and Satoshi's other writings, I bought Bitcoin.



Anything other than what Satoshi specified is a deviation from what I bought into, and should be treated with great suspicion.



If you read and understand what he wrote, he intended Bitcoin to support what was possible according to network limits. He did not intend hard limits to force any specific level of "decentralisation" - only those necessary to prevent "flooding" and the network completely falling to its knees. As a hodler for 5 years, since 2011, this is what I bought into.

Lauda



Offline



Activity: 2660

Merit: 2643





Exchange Bitcoin quickly-https://blockchain.com.do







LegendaryActivity: 2660Merit: 2643Exchange Bitcoin quickly-https://blockchain.com.do Re: The Blocksize Debate & Concerns June 26, 2016, 12:19:39 PM #18 Quote from: enet on June 26, 2016, 09:02:44 AM Both are already failures - if you're realistic about it. I don't see any connection between the bad ideas of Gregory Maxwell and a few others, and Bitcoin as a system. Sidechains are one of the dumbest ideas I've seen in Bitcoin ever. The ideas was proposed 3 years ago, whitepaper written 2 years ago, and there is no functioning system and I predict there will never be.

No. Sidechains are a good idea and you're spewing out false information. If there "is no functioning system", then what is Liquid? What is Rootstock supposed to be?



Quote from: enet on June 26, 2016, 09:02:44 AM Lightning is quite interesting, but will be a failure, too. I will be on my beliefs by selling Bitcoin for a better alternative when it arrives. There are deep economic reasons for these ideas to be bad, but you can't expect from core developers any understanding of it. They don't even see the incentive and scaling problems as they are. "Scaling Bitcoin" is a misnomer to begin with.

It seems that everything that comes from the people who are trying to improve Bitcoin is a failure, according to people who have no skills and/or no contributions to the ecosystem.



Quote from: enet on June 26, 2016, 09:02:44 AM No. Bitcoin dev is centralised around a small group of people, for good and bad reasons. Nobody can add just add any patch to the system. Forking blockstream/core is basically starting a new chain and project.

No, it is not centralized. If it were centralized, the main person could push any changes they wanted. Additionally, "blockstream/core" is a very ignorant statement. Are you not capable of comprehending the difference between the two?

Carlton Banks



Offline



Activity: 2842

Merit: 2279









Re: The Blocksize Debate & Concerns June 26, 2016, 12:46:18 PM #19

Quote from: xDan on June 26, 2016, 12:13:39 PM
Highlights added by me.



After reading the whitepaper, and Satoshi's other writings, I bought Bitcoin.



Anything other than what Satoshi specified is a deviation from what I bought into, and should be treated with great suspicion.



If you read and understand what he wrote, he intended Bitcoin to support what was possible according to network limits. He did not intend hard limits to force any specific level of "decentralisation" - only those necessary to prevent "flooding" and the network completely falling to its knees.



You're misrepresenting Satoshi if you think cherry-picking things he's said to support a bigblocks perspective is the sum total of his reasoning on the matter: it isn't.



And Satoshi didn't get every single detail right anyway; he's also not known to be developing Bitcoin any more. When you bought Bitcoin, you had no reasonable expectation that Satoshi would always be right, or that he would always be around. I got in right when Satoshi left; maybe that gives me a slightly different perspective.









Just because central bankers believe they're "magical people" doesn't mean other monetary geniuses should be examined with the same faulty measuring technique.

Vires in numeris