From: Hector Chu
Date: Tue, 4 Aug 2015 12:04:03 +0100
To: Jorge Timón
Cc: Bitcoin Dev <bitcoin-dev@lists.linuxfoundation.org>
Subject: Re: [bitcoin-dev] Block size following technological growth

Mike's position is that he wants the block size limit to eventually be
removed. That is of course an extreme view. Meanwhile, your view that the
block size should be artificially constrained below the organic growth curve
(in a way that will penalize a majority of existing and future users) lies at
the other extreme. The majority position lies somewhere in between (i.e. a
one-time increase to 8 MB). This is the position that ultimately matters.

If the block size is increased to 8 MB and things get demonstrably a whole
lot worse, then you will have a solid leg to stand on. In that case we can
always do another hard fork later to reduce the block size back to something
smaller, and henceforth the block size will never be touched again.

On 4 August 2015 at 11:35, Jorge Timón <bitcoin-dev@lists.linuxfoundation.org> wrote:
> On Fri, Jul 31, 2015 at 4:58 PM, Mike Hearn wrote:
> >> How can more users or more nodes bring more miners or, more importantly,
> >> improve mining decentralization?
> >
> > Because the bigger the ecosystem is, the more interest there is in taking
> > part?
>
> As explained by Venzen, this is a non sequitur.
>
> > I mean, I guess I don't know how to answer your question.
>
> I don't know the answer either, that's fine.
> It's the opposite
> question that I've been insistently repeating and that you've been
> (consciously or not) consistently evading.
> But that's also fine, because I believe you finally answer it a few lines
> below.
>
> > When Bitcoin was
> > new it had almost no users and almost no miners. Now there are millions of
> > users and factories producing ASICs just for Bitcoin.
>
> The emergence of a btc price enabled the emergence of professional
> miners, which in turn enabled the emergence of sha256d-specialized
> hardware production companies.
> Nothing surprising there.
> By no means does it constitute an example of how a bigger consensus size
> can cause less mining centralization.
>
> > Surely the correlation is obvious?
>
> Correlation does not imply causation. I'd better leave it at that...
>
> >> I'm sorry, but until there's a simulation that I can run with different
> >> sizes' testchains (for example using #6382) to somehow compare them, I will
> >> consider any value arbitrary.
> >
> > Gavin did run simulations. 20 MB isn't arbitrary; the process behind it was
> > well documented here:
> >
> > http://gavinandresen.ninja/does-more-transactions-necessarily-mean-more-centralized
> >
> > I chose 20MB as a reasonable block size to target because 170 gigabytes per
> > month comfortably fits into the typical 250-300 gigabytes per month data
> > cap – so you can run a full node from home on a "pretty good" broadband plan.
> >
> > Did you think 20 MB was picked randomly?
>
> No, I think 20 MB was chosen very optimistically, considering third-party
> service rates (not the same service as self-hosting) in the
> so-called "first world". And then 20 MB goes to 20 GB, again with
> optimistic and by no means scientific expectations.
>
> But where the number comes from is not really what I'm demanding;
> what I want is some criterion that can tell you that a given size
> would be "too centralized" but another one isn't.
> I haven't read any analysis on why 8 GB is a better option than 7 GB or
> 9 GB for a given criterion (nor one declaring 20 GB a winner over 19 GB
> or 21 GB).
> A simulation test passing 20 GB but not 21 GB would make it far less
> arbitrary.
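(A rough sanity check of the bandwidth figure quoted above, as a sketch under
stated assumptions rather than a measurement: it assumes one full block every
ten minutes and counts each block being downloaded exactly once, ignoring
upload to peers, transaction relay and protocol overhead.)

    # Back-of-the-envelope: monthly block download implied by a maximum block
    # size, assuming every block is full and is downloaded exactly once.
    # Upload to peers, transaction relay and protocol overhead are ignored.

    BLOCK_INTERVAL_MINUTES = 10
    BLOCKS_PER_MONTH = 30 * 24 * 60 // BLOCK_INTERVAL_MINUTES  # ~4320 blocks

    def monthly_download_gb(max_block_mb):
        return max_block_mb * BLOCKS_PER_MONTH / 1000  # MB -> GB (decimal)

    for size_mb in (1, 8, 20):
        print(f"{size_mb:>2} MB blocks -> ~{monthly_download_gb(size_mb):.0f} GB/month")

    # 20 MB blocks come to roughly 86 GB/month of raw block data; the
    # ~170 GB/month figure quoted above is about double that, presumably
    # also counting each block being uploaded to at least one peer.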
>
> >> Agreed on the first sentence, I'm just saying that the influence of
> >> the blocksize in that function is monotonic: with bigger sizes, equal
> >> or worse mining centralization.
> >
> > I have a hard time agreeing with this because I've seen Bitcoin go from
> > blocks that were often empty to blocks that are often full, and in this time
> > the number of miners and hash power on the network has gone up a huge amount
> > too.
>
> I'm of course talking about the consensus maximum blocksize, not about
> the actual blocksize.
> Yes, again, when mining becomes profitable, economic actors tend to
> appear and take those profits.
> But don't confuse total hashrate improvements with an "increase in the
> number of miners" or with mining decentralization.
>
> > You can argue that a miner doesn't count if they pool mine. But if a miner
> > mines on a pool that uses exactly the same software and settings as the
> > miner would have done anyway, then it makes no difference. Miners can switch
> > between pools to find one that works the way they like, so whilst less
> > pooling or more decentralised pools would be nice (e.g. getblocktemplate),
> > and I've written about how to push it forward before, I still say there are
> > many more miners than in the past.
> >
> > If I had to pick between two changes to improve mining decentralisation:
> >
> > 1) Lower block size
>
> Finally, I think you answered my repetitive question here.
> If I say "Mike Hearn understands that the consensus block size maximum
> rule is a tool for limiting mining centralization", I'm not putting
> words in your mouth, right?
> I think many users advocating for an increase in the consensus limit
> don't understand this, which is extremely unfortunate for the debate.
>
> > 2) Finishing, documenting, and making the UX really slick for a
> > getblocktemplate based decentralised mining pool
> >
> > then I'd pick (2) in a heartbeat. I think it'd be a lot more effective.
>
> Great! Maybe after (2), mining centralization improves so much that we're
> comfortable not only not lowering the limit but rather increasing it.
>
> >> you should be consequently advocating for full removal of the limit rather
> >> than changes towards bigger arbitrary values.
> >
> > I did toy with that idea a while ago. Of course there can not really be no
> > limit at all because the code assumes blocks fit into RAM/swap, and nodes
> > would just end up ignoring blocks they couldn't download in time anyway.
> > There is obviously a physical limit somewhere.
>
> Did the fact that you "understand that the consensus block size
> maximum rule is a tool for limiting mining centralization" influence
> your rejection of that idea at all?
>
> > But it is easier to find common ground with others by compromising. Is 8mb
> > better than no limit? I don't know and I don't care much: I think Bitcoin
> > adoption is a slow, hard process and we'll be lucky to increase average
> > usage 8x over the next couple of years. So if 8mb+ is better for others,
> > that's OK by me.
>
> The only way one can both "not care much whether we have a consensus limit
> or not" and "understand that the consensus block size maximum rule is a
> tool for limiting mining centralization" at the same time is by not
> caring about mining centralization at all.
> Is that your position?
>
> If you don't care about having a limit but you don't want to limit
> transaction volume, then ++current_size will ALWAYS be your
> "compromise position" and no blocksize increase will ever be enough
> until the limit is completely removed.
> Is that your position?
>
> > Re: exchange profit. You can pick some other useful service provider if you
> > like. Payment processors or cold storage providers or the TREZOR
> > manufacturers or whoever.
>
> Yes, and I believe the same points stand.
>
> > My point is you can't have a tiny high-value-transactions-only currency AND
> > all the useful infrastructure that the Bitcoin community is making. It's a
> > contradiction. And without the infrastructure bitcoin ceases to be
> > interesting even to people who are willing to pay huge sums to use it.
>
> You keep talking about "high-value-transactions-only" as if
> non-urgent transaction fees rising from zero to, say, 1 satoshi would
> automatically result in that "high-value-transactions-only" Bitcoin.
> Please, stop talking as if someone were proposing a
> "high-value-transactions-only" Bitcoin. That may happen, but nobody
> really knows. If it happens it may not necessarily be a bad thing (i.e.
> bitcoin microtransactions can still happen using trustless payment
> channels, and a flat fee of x is still cheaper than a fee of x% for any
> transacted value higher than 100), but that's really not what we're
> talking about here, so it seems a distraction that can only help further
> polarize this discussion.
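(The parenthetical fee comparison above can be made concrete. The reading
assumed here: "x" is a flat fee and "x%" is a proportional fee on the
transacted value. The two are equal at a value of 100 units of whatever x is
denominated in, and the flat fee is cheaper above that, independently of x.)

    # Flat fee of x vs a proportional fee of x% of the transacted value.
    #   flat = x
    #   pct  = value * x / 100
    # flat < pct  <=>  value > 100 (in the same units as x), whatever x is.

    def flat_fee(x):
        return x

    def percentage_fee(value, x):
        return value * x / 100

    for x in (1, 5):                   # flat amount and percentage rate
        for value in (50, 100, 1000):  # transacted value, same units as x
            cheaper = "flat" if flat_fee(x) < percentage_fee(value, x) else "percentage (or tie)"
            print(f"x={x}, value={value}: flat={flat_fee(x)}, "
                  f"{x}% fee={percentage_fee(value, x)} -> cheaper: {cheaper}")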
>
> What we're talking about here is that hitting the limit would
> (hopefully) make miners start caring about fees. Enough that they stop
> being irrational about free transactions. If both things happen,
> non-urgent transaction fees will likely rise (as said, above zero).
>
> You think that would be a catastrophe for adoption and I disagree.
> But (as Pieter has repeatedly explained) for any size there will be
> use cases that will eventually be priced out.
> So when raising this consensus limit, not increasing centralization
> should be the priority, and the potential impact on market fees a much
> more secondary concern.
> Do you agree with this?
>
> I'm sure there are many intermediate positions between "caring more
> about mining centralization than about market fees when deciding about a
> consensus rule that limits mining centralization" and "not caring
> about mining centralization at all".
> I really don't want to put words in your mouth, but I honestly don't
> know what your position is.
> I don't know how else I can ask the same question: you don't
> care whether the consensus maximum blocksize rule is here at all or not
> (you just said that).
> Is it because you don't think it limits mining centralization, or
> because you don't care about limiting mining centralization with
> consensus rules at all?