dEBRUYNE



Offline



Activity: 2086

Merit: 1131







Legendary

Re: Elastic block cap with rollover penalties
June 05, 2015, 06:15:33 PM  #81

Quote from: DumbFruit on June 04, 2015, 07:52:01 PM
Quote from: Meni Rosenfeld
This is correct, and I hadn't given enough thought to this problem prior to posting.



Now that I've given it more thought, I think it can be significantly alleviated by making collection from the pool span a longer period, on the time scale of years. Relative hashrate is likely to change over these periods, so it may not be the best plan to publish excessively large blocks hoping to reclaim the fee (and publishing typical-sized blocks does not give big miners an advantage). Also, with a different function you can change the balance between the marginal and total penalty so that the actual penalty is small, nullifying the effect (it will require the cap to be a bit harder, but still more elastic than what we have now).



I agree that this calls for more analysis. A longer time period over which the reward is given doesn't help, as the larger nodes or entities will still get a larger ratio of the rolling over fees, by definition.

Actually, making the rollover fees only extend over a couple of blocks would more likely mitigate the problem, but if you roll over the fee for about 3 blocks or so, then it might be worth it for a miner to hold blocks and release 2 at a time, depending on fee-over-time. This, in turn, might exacerbate the selfish miner exploit1. The natural monopoly condition that already exists in Bitcoin2 seems to be exacerbated either way.



Getting around this would be tricky, if it's possible.



1http://fc14.ifca.ai/papers/fc14_submission_82.pdf

2https://bitcointalk.org/index.php?topic=176684.msg9375042#msg9375042



Quote from: NewLiberty on June 04, 2015, 05:41:22 PM
An examination of the prior art is warranted.

Pointing to Monero as an examination of prior art is asking a bit much. Are you expecting us to dig through the Monero source code? How do they get around the problem?



This is not very helpful:



Quote The Basics

A special type of transaction included in each block, which contains a small amount of monero sent to the miner as a reward for their mining work.

https://getmonero.org/knowledge-base/moneropedia/coinbase




Did you miss this link? -> https://github.com/monero-project/bitmonero/blob/c41d14b2aa3fc883d45299add1cbb8ebbe6c9ed8/src/cryptonote_core/blockchain.cpp#L2230-L2244


Meni Rosenfeld

Legendary





Offline



Activity: 2058

Merit: 1022









Donator

Re: Elastic block cap with rollover penalties
June 07, 2015, 12:45:09 PM

Last edit: June 07, 2015, 07:42:00 PM by Meni Rosenfeld  #82

Quote from: Meni Rosenfeld on June 05, 2015, 03:23:53 PM
I'll try to repeat the calculations with a different demand curve, to demonstrate my point. But this will take some time and Shabbat is in soon, so that will have to wait.

Let's assume the demand curve - the number of transactions demanded as a function of the fee, per 10 minutes - is d(p) = 27/(8000p^2). It's safe to have d(p)->infinity as p->0 because supply is bounded (if there was no bound on supply, we'd need a more realistic bound on demand to have meaningful results). The behavior below is the same for other reasonable demand curves, as long as demand diminishes superlinearly with p (sublinear decay is less reasonable economically, and results in very different dynamics).



We'll assume 4000 transactions go in a MB, and that T=1MB. So the penalty, as a function of the number n of transactions, is f(n) = max(n-4000,0)^2 / (4000*(8000-n)).



We'll also assume that transactions are in no particular rush - users will pay the minimal fee that gives them a good guarantee to have the tx accepted in reasonable time (where this time is long enough to include blocks from the different miner groups). So there is a specific fee p for which the tx demand clears with the average number of txs per block (the number of txs can change between blocks). It would have been more interesting to analyze what happens when probabilistic urgency premiums enter the scene, but that's not relevant to the issue of mining centralization.



Scenario 1: 100 1% miners.



Each miner reclaims 1% of the penalty. If the optimal strategy is to have n txs per block, resulting in a fee of p, then n=d(p) and the marginal penalty (derivative of f) at n, corrected for the reclaiming, must equal p (so that adding another transaction generates no net profit). In other words, 0.99f'(d(p)) = p. Solving this gives p = 0.7496 mBTC, n = 6007.



Penalty is 0.5053 BTC, so pool size is 50.53.

Miners get 4.5027 BTC per block (6007 * 0.0007496 from txs + 0.5053 collection - 0.5053 penalty).

6007 txs are included per block.
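Scenario 1 can be checked numerically. Here is a minimal Python sketch, assuming the demand curve, penalty function, and clearing condition exactly as stated above; the bisection solver and function names are mine:

```python
# Demand, penalty and marginal penalty as defined in the post
# (T = 1 MB = 4000 txs; fees p are in BTC).
def d(p):
    return 27 / (8000 * p**2)

def f(n):
    return max(n - 4000, 0)**2 / (4000 * (8000 - n))

def f_prime(n):
    if n <= 4000:
        return 0.0  # no marginal penalty below T
    return (n - 4000) * (12000 - n) / (4000 * (8000 - n)**2)

# Scenario 1: each of 100 1% miners reclaims 1% of its own penalty,
# so the clearing condition is 0.99 * f'(d(p)) = p. Solve by bisection.
def solve_p(lo=6.6e-4, hi=1e-2, iters=100):
    g = lambda p: 0.99 * f_prime(d(p)) - p
    for _ in range(iters):
        mid = (lo + hi) / 2
        if g(mid) > 0:   # fee too low, marginal penalty still exceeds it
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

p = solve_p()
n = d(p)
print(round(p * 1000, 4), round(n))  # ~0.7496 mBTC, ~6007 txs
print(round(f(n), 3))                # penalty ~0.505 BTC
```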



Scenario 2: One 90% miner and 10 1% miners.



The market clears with a tx fee of p, with the 90% miner including n0 txs per block and the 1% miners including n1 txs per block.

The average #txs/block must equal the demand, so 0.9n0 + 0.1n1 = d(p).

Every miner must have 0 marginal profit per additional transaction, correcting for reclaiming. So

0.1 f'(n0) = p

0.99 f'(n1) = p



Solving all of this results in:

n0 = 7251

n1 = 5943

p = 0.6885 mBTC (lower than in scenario 1)



Penalty paid by 1% miners: f(5943) = 0.4589 BTC

Penalty paid by 90% miner: f(7251) = 3.5294 BTC

Average penalty: 0.9*3.5294 + 0.1*0.4589 = 3.2223 BTC

Pool size: 322.23 BTC



Reward per block for 1% miner: 5943 * 0.0006885 + 3.2223 - 0.4589 = 6.8552 BTC (more than in scenario 1)

Reward per block for 90% miner: 7251 * 0.0006885 + 3.2223 - 3.5294 = 4.68521 BTC (less than 1% miners in this scenario; more than the miners in scenario 1).



Average number of txs per block: 0.9 * 7251 + 0.1 * 5943 = 7120, more than in scenario 1.
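The three-equation system for Scenario 2 can be solved the same way. A sketch under the same assumptions; the numerical inversion of f' and the bracketing interval are mine:

```python
# Same primitives as in the post (T = 1 MB = 4000 txs, fees in BTC).
def d(p):
    return 27 / (8000 * p**2)

def f(n):
    return max(n - 4000, 0)**2 / (4000 * (8000 - n))

def f_prime(n):
    if n <= 4000:
        return 0.0
    return (n - 4000) * (12000 - n) / (4000 * (8000 - n)**2)

def f_prime_inv(y):
    # invert the increasing marginal penalty on (4000, 8000) by bisection
    lo, hi = 4000.0, 7999.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if f_prime(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def excess_supply(p):
    n0 = f_prime_inv(p / 0.10)   # 90% miner: 0.1 * f'(n0) = p
    n1 = f_prime_inv(p / 0.99)   # 1% miners: 0.99 * f'(n1) = p
    return 0.9 * n0 + 0.1 * n1 - d(p)

# market-clearing fee: supply equals demand
lo, hi = 6.6e-4, 8.0e-4
for _ in range(100):
    mid = (lo + hi) / 2
    if excess_supply(mid) < 0:
        lo = mid
    else:
        hi = mid
p = (lo + hi) / 2
n0, n1 = f_prime_inv(p / 0.10), f_prime_inv(p / 0.99)
avg_penalty = 0.9 * f(n0) + 0.1 * f(n1)

print(round(p * 1000, 4), round(n0), round(n1))  # ~0.6885 mBTC, ~7251, ~5943
print(round(n1 * p + avg_penalty - f(n1), 2))    # 1% miner:  ~6.86 BTC
print(round(n0 * p + avg_penalty - f(n0), 2))    # 90% miner: ~4.69 BTC
```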



Miners are happy - big or small, they gain more rewards.

Users are happy - more of their transactions are included, at a lower fee.

Nodes are not happy - they have to deal with bigger blocks.

Exactly as with the previously discussed demand curve.



Over time, difficulty will go up, nullifying the extra mining reward; and whoever is in charge of placing the checks and balances, will make the function tighter (or hold on with making it looser), to keep the block sizes at the desired level.





There is another issue at play here - the ones who benefit the most from the big miner's supersized blocks, are the small miners. The big miner could threaten to stop creating supersized blocks if the small miners don't join and create supersized blocks themselves. Forming such a cartel is advantageous over not having supersized blocks at all - however, I think the big miner's bargaining position is weak, and small miners will prefer to call the bluff and mine small blocks, avoiding the penalty and enjoying the big miner's supersized blocks. This is classic tragedy of the commons, but in a sort of reverse way - usually, TotC is discussed in this context when the mining cartel wants to exclude txs, not include them.

molecular

Legendary



Offline



Activity: 2730

Merit: 1016









Donator

Re: Elastic block cap with rollover penalties
June 07, 2015, 07:00:47 PM  #84

Quote from: MayDee on June 07, 2015, 06:12:09 PM

I really like this idea! Keep up the great work Meni Rosenfeld

I like it, too.



Thinking about the next steps I re-skimmed the OP (pretending to be someone just being introduced to the idea) and I think the introduction of (and reference to) the 'rollover fee pool' is very misleading. I know it is explained right after that it's really a 'rollover size penalty', but I fear it might lead people onto the wrong track and make it harder than necessary for them to grasp the idea. Maybe it'd be less confusing and easier to understand the concept if that part was removed?



I have a feeling this idea is a hard sell, mainly because it isn't what many might expect: it's neither...



a way to dynamically set the block size cap nor

a solution for scaling nor

a rollover fee pool

It concerns itself with a different (although related) issue, namely the way the system behaves when approaching the transaction throughput limit.



I personally think this is a very important issue, and my expectation of the current behavior and the ramifications thereof regarding user experience and media coverage is one of the reasons I'm for Gavin's simple 20MB kicking-the-can-down-the-road proposal. With the rollover penalty in place I might be willing to wait longer and let some pressure build on developing scaling solutions. I'm not opposed to seeing how a fee market would develop, I'm also not opposed to seeing business opportunities for entities working on scaling solutions. I just don't want to hit a brick wall, as Meni so aptly put it... it would do much damage and can potentially set us back years, I fear.



So what are people's ideas of what a roadmap could look like, what kind of funds we might need, and how we could organize enough (monetary and political) support?


klondike_bar



Offline



Activity: 1918

Merit: 1001



ASIC Wannabe







Legendary

Re: Elastic block cap with rollover penalties
June 08, 2015, 12:58:54 AM  #88

Quote from: Meni Rosenfeld on June 07, 2015, 08:21:15 PM
Quote from: vane91 on June 07, 2015, 08:02:49 PM
Quote from: ArticMine on June 02, 2015, 09:12:50 PM
The key here is how T is set. If T is fixed then 2T becomes the hard limit and the problem remains. If T is set based on some average of previously mined blocks then this may address the problem.

this



actually, just use twice the average blocksize of the last two weeks.



And you don't really need any of this complicated system.




1. Floating block limits have their own set of problems, and may result in a scenario where there is no effective limit at all.

2. Even if you do place a floating block limit, it doesn't really relieve the need for an elastic system. Whatever the block limit is, tx demand can approach it and then we have a crash landing scenario. We need a system that gracefully degrades when approaching the limit, whatever it is.

1) I beg to differ, so long as the timespan is sufficient that only a LONG-lasting spam attack or other sustained growth could cause massive block caps. Personally, I think as long as it averages over at least 1-2 weeks, that's sufficient to prevent any sort of rampant spamming.

2) If the cap is set at double the recent volumes, it should provide enough room for fuller blocks so long as we don't see 5x network growth within less than a 1-2 month timespan. Even then, the cap would grow over time and lower-priority transactions may just be pushed back a few blocks. Fees take priority until everything balances out after a few days/weeks.

(I suggest 2.5x the average of 40 days, or 6000 blocks, OR ((2x the last 6000 blocks) + (0.5x the last 400 blocks)). The second allows for slightly faster growth if there's sudden demand for room.)
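The second cap formula suggested in the parenthetical can be sketched in a few lines. The function name and the steady-state example are mine, and block sizes are in bytes:

```python
def floating_cap(block_sizes):
    """Cap = 2x the average of the last 6000 blocks + 0.5x the last 400."""
    long_avg = sum(block_sizes[-6000:]) / min(len(block_sizes), 6000)
    short_avg = sum(block_sizes[-400:]) / min(len(block_sizes), 400)
    return 2 * long_avg + 0.5 * short_avg

# Steady 0.4 MB blocks -> cap of 2*0.4 + 0.5*0.4 = 1.0 MB
sizes = [400_000] * 6000
print(floating_cap(sizes))  # 1000000.0
```

A recent burst of full blocks raises the short-term component first, which is exactly the "slightly faster growth if there's sudden demand" property claimed above.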


onelineproof



Offline



Activity: 100

Merit: 14







Member

Re: Elastic block cap with rollover penalties
June 08, 2015, 11:51:49 AM  #89

I think a better solution would be to require miners to do more work for larger block sizes. Instead of hashing just the header of a block, miners have to hash something more: perhaps something proportional to the block size. So if a header is 80 bytes, it takes up 80/1000000 = 8e-05 of the whole block. So for any block size x > 1 MB, require a miner to hash the first (8e-05)x of the block in order for it to be valid. This will make Bitcoin automatically scale to the power of computers in the future, as big blocks will only be plentiful if computers (ASICs) are fast enough that it is worth taking the extra transaction fees with bigger blocks. Any problems with this?


molecular

Legendary



Offline



Activity: 2730

Merit: 1016









Donator

Re: Elastic block cap with rollover penalties
June 08, 2015, 04:34:37 PM  #94

Quote from: Meni Rosenfeld on June 08, 2015, 02:14:35 PM
Quote from: onelineproof on June 08, 2015, 11:51:49 AM
I think a better solution would be to require miners to do more work for larger block sizes. Instead of hashing just the header of a block, miners have to hash something more: perhaps something proportional to the block size. So if a header is 80 bytes, it takes up 80/1000000 = 8e-05 of the whole block. So for any block size x > 1 MB, require a miner to hash the first (8e-05)x of the block in order for it to be valid. This will make Bitcoin automatically scale to the power of computers in the future, as big blocks will only be plentiful if computers (ASICs) are fast enough that it is worth taking the extra transaction fees with bigger blocks. Any problems with this?

That's the basic idea behind Greg's proposal. I've yet to examine it in detail; I think it was actually what I thought about first, before eschewing it in favor of a deductive penalty.



I think there are errors in your description of how to implement this. It's not about what you hash, it's about what your target hash is. Also, you need to carefully choose the function that maps block size to mining effort.
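To illustrate the correction that the extra work belongs in the target hash rather than in what is hashed, here is a toy sketch. The linear work_factor and all names here are my own assumptions for illustration, not Greg's actual proposal:

```python
BASE_TARGET = 1 << 224       # toy target for blocks at or under the soft limit
LIMIT = 1_000_000            # soft size limit in bytes (1 MB)

def work_factor(size):
    # a 2 MB block needs 2x the expected hashing; as noted above, the real
    # mapping from block size to mining effort must be chosen carefully
    return max(size, LIMIT) / LIMIT

def target_for(size):
    # integer arithmetic: 256-bit targets don't fit in a float
    return BASE_TARGET * LIMIT // max(size, LIMIT)

def block_valid(block_hash, size):
    # validity means the block hash falls below the size-adjusted target
    return block_hash < target_for(size)

# doubling the size halves the target, i.e. doubles the expected work
print(target_for(2_000_000) * 2 == target_for(1_000_000))  # True
```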


Can you link Greg's proposal? I haven't seen it.


tacotime



Offline



Activity: 1484

Merit: 1004









Legendary

Re: Elastic block cap with rollover penalties
June 08, 2015, 10:47:24 PM  #97

Quote
Quote from: NewLiberty on June 04, 2015, 06:24:02 PM
Thanks for that, it was my reading also.

Thus TX fees that are not in the block but paid out of band are not subject to penalty...

It's an additive deduction, not multiplicative.



You seem to be thinking the miner's reward is:

(1 - penalty) * (minted coins + tx fees + collection from pool)

Where it is really:

minted coins + tx fees + collection from pool - penalty



Having more fees in the txs included doesn't increase the penalty. There is no difference between adding 1 mBTC to the fee and paying 1 mBTC out-of-band.
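The two readings quoted above can be checked with a toy calculation showing why the additive form is indifferent to out-of-band payment. The numbers, and the 2% rate used for the (mistaken) multiplicative reading, are illustrative only:

```python
minted, collection, penalty = 25.0, 0.5, 0.5   # illustrative values, in BTC
penalty_rate = 0.02                            # hypothetical multiplicative rate

def additive(in_band_fees, oob_fees):
    # the actual formula: the penalty is a flat deduction
    return minted + in_band_fees + collection - penalty + oob_fees

def multiplicative(in_band_fees, oob_fees):
    # the mistaken reading: the penalty scales with in-band income
    return (1 - penalty_rate) * (minted + in_band_fees + collection) + oob_fees

# shift 1 mBTC of fees from in-band to out-of-band:
print(abs(additive(4.0, 0.0) - additive(3.999, 0.001)) < 1e-9)   # True: indifferent
print(multiplicative(3.999, 0.001) > multiplicative(4.0, 0.0))   # True: OOB pays more
```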

I don't see how this differs from Monero?



In Monero, addition of txs up to the median blocksize is free. As you surpass the median blocksize, a quadratic penalty is applied to the subsidy of the coinbase, but amounts obtained from tx fees are untouched. The subsidy of the coinbase is initially dependent on the number of coins in existence, and so takes into account the previous penalties to coinbases of any previously generated blocks (comparable to your "pool" method). Then, the miner is free to add transactions up to some economic equilibrium that maximizes their overall income when taking into account the blocksize penalty to the coinbase subsidy.



So, it's like this:

(1 - penalty) * (minted coins) + tx fees

where penalty is dependent on the size of the block above the median size according to the formulas found in the CN whitepaper.
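A sketch of that penalty as I read the CN whitepaper: blocks up to the median size M get the full base subsidy, a quadratic penalty applies between M and 2M, blocks above 2M are invalid, and tx fees are untouched, matching the "(1 - penalty) * (minted coins) + tx fees" form above. The function name and sample values are mine:

```python
def coinbase(base_subsidy, tx_fees, block_size, median_size):
    # CryptoNote-style subsidy penalty; fees are never penalized
    if block_size > 2 * median_size:
        raise ValueError("block over 2x the median is invalid")
    if block_size <= median_size:
        return base_subsidy + tx_fees
    over = block_size / median_size - 1          # in (0, 1]
    return base_subsidy * (1 - over**2) + tx_fees

print(coinbase(10.0, 0.4, 100_000, 100_000))  # 10.4: no penalty at the median
print(coinbase(10.0, 0.4, 150_000, 100_000))  # 7.9: 50% over -> 25% subsidy cut
```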



gmaxwell criticizes this as promoting out-of-band transactions, but the fact remains that to permanently and securely transfer money you must use the blockchain and have the transaction included in a block somewhere. Thus, I never thought it was much of an issue.