From: Mariusz Nowostawski
To: bitcoin-dev@lists.linuxfoundation.org
Message-ID: <56475C72.10803@gmail.com>
Date: Sat, 14 Nov 2015 17:08:18 +0100
Subject: Re: [bitcoin-dev] How to evaluate block size increase suggestions.

> Date: Fri, 13 Nov 2015 11:37:51 -0500
> From: Emin Gün Sirer
> To: bitcoin-dev@lists.linuxfoundation.org
> Subject: [bitcoin-dev] How to evaluate block size increase suggestions.
>
> By now, we have seen quite a few proposals for the block size increase.
> It's hard not to notice that there are potentially infinitely many
> functions for future block size increases. One could, for instance,
> double every N years for any rational number N, one could increase
> linearly, one could double initially then increase linearly, one could
> ask the miners to vote on the size, one could couple the block size
> increase to halvings, etc. Without judging any of these proposals on the
> table, one can see that there are countless alternative functions one
> could imagine creating.
>
> I'd like to ask a question that is one notch higher: Can we enunciate
> what grand goals a truly perfect function would achieve?
> That is, if we could look into the future and know all the improvements
> to come in network access technologies, see the expansion of the Bitcoin
> network across the globe, and precisely know the placement and
> provisioning of all future nodes, what metrics would we care about as we
> craft a function to fit what is to come?
>
> To be clear, I'd like to avoid discussing any specific block size
> increase function. That's very much the tangible (non-meta) block size
> debate, and everyone has their opinion and best good-faith attempt at
> what that function should look like. I've purposefully stayed out of
> that issue, because there are too many options and no metrics for
> evaluating proposals.
>
> Instead, I'm asking to see if there is some agreement on how to evaluate
> a good proposal. So, the meta-question: if we were looking at the best
> possible function, how would we know? If we have N BIPs to choose from,
> what criteria do we look for?
>
> To illustrate, a possible meta goal might be: "increase the block size,
> while ensuring that large miners never have an advantage over small
> miners that [they did not have in the preceding 6 months, in 2012, pick
> your time frame, or else specify the advantage in an absolute fashion]."
> Or "increase block size as much as possible, subject to the constraint
> that 90% of the nodes on the network are no more than 1 minute behind
> one of the tails of the blockchain 99% of the time." Or "do not increase
> the blocksize until at least date X." Or "the increase function should
> be monotonic." And it's quite OK (and probably likely) to have a
> combination of these kinds of metrics and constraints.
>
> For disclosure, I personally do not have a horse in the block size
> debate, besides wanting to see Bitcoin evolve and get more widely
> adopted. I ask because as an academic, I'd like to understand if we can
> use various simulation and analytic techniques to examine the proposals.
> A second reason is that it is very easy to have a proliferation of block
> size increase proposals, and good engineering would ask that we define
> the meta-criteria first and then pick. To do that, we need some criteria
> for judging proposals other than gut feeling.
>
> Of course, even with meta-criteria in hand, there will be room for lots
> of disagreement because we do not actually know the future and
> reasonable people can disagree on how things will evolve. I think this
> is good because it makes it easier to agree on meta-criteria than on an
> actual, specific function for increasing the block size.
>
> It looks like some specific meta-level criteria would help more at this
> point than new proposals all exploring different variants of block size
> increase schedules.
>
> Best,
>
> - egs

Hi,

Good point. I also find this to be important. Enumerating first exactly
WHAT is being solved and WHY, and then formulating HOW the proposed
solutions can be evaluated, resonates with me, probably because I'm an
academic. We need to be able to predict whether a given proposal
addresses the actual problem at hand, and we need to know how to
evaluate the proposals.

But first, we need to re-evaluate the problem. I think the original
problem, the increasing demand for transaction throughput on the Bitcoin
blockchain, has been derailed into a rather constrained discussion about
the block size increase. This is fundamentally broken.
The debate should be brought back to the actual problem: how do we deal
with the increasing demand for transaction throughput on the Bitcoin
blockchain, what exactly is the problem, and what possible ways exist of
dealing with it? In other words, how do we deal with the scalability of
the Bitcoin blockchain?

Somewhat surprisingly, all the solution proposals so far have focused on
a "centralized" model in which the Bitcoin blockchain plays the central
role in any future economy. Bitcoin has been compared to VISA, and
questions have been asked about how to handle thousands of transactions
per second - via the Bitcoin blockchain. Should any form of electronic
value transfer, such as assets, currencies, etc., be handled by THE
SINGLE CHAIN? A single chain that does it ALL? Is that the objective for
the Bitcoin blockchain?

There are some contradictions here: on the one hand, the Bitcoin
blockchain offers a transparent, distributed ledger that can be used for
verification of transactions, which is a good thing. On the other hand,
it relies on p2p technology, consensus, and limited transaction
throughput (regardless of where the actual block size limit sits), which
are a poor fit for processing and validating large volumes of financial
transactions such as those currently handled by VISA. Planning for the
Bitcoin blockchain to handle everything will simply never scale; it may
turn out to be impossible, it will hinder innovation, and it can
ultimately work against the Bitcoin blockchain (e.g. through inadvertent
and irreversible centralization of verification nodes, or simply a
reduction in the number of fully validating nodes).

(Devil's advocate): suppose the limit stays at 1 MB and more
transactions are generated every 10 minutes than can fit into a block.
Then:

1. There will be a strong incentive for fees to grow. Is that a bad
thing? Are growing fees unavoidable anyway? Miners need to recover the
costs of mining once the crossing point between block rewards and fees
is reached, and that means growing fees, right? Can this be modeled? Can
we predict how much the fees will actually grow to weed out all the
"noise" transactions that currently get through because it is cheap to
do so? (A rough, illustrative sketch of such a model is attached at the
end of this mail.)

2. Some transactions will simply never make it onto the chain. Is that a
bad thing? Perhaps it is just a self-adaptation that weeds out low-value
transactions from the system? Do we need to process and cater for ALL
transactions, including all the NOISE that doesn't really contribute to
anything? Can we model this?

Personally, I think transaction fees for electronic transfers should be
close to ZERO, but, at the same time, I think the transaction costs of
the Bitcoin blockchain should be high enough to keep this chain special,
noise-free, and compact, with a large number of validating nodes working
in p2p fashion. So are points 1 and 2 really fundamentally BAD, or are
they just a self-regulating anti-noise mechanism? I can pack thousands
of transactions into auxiliary systems and validate them via a SINGLE
Bitcoin blockchain transaction, and this would scale.

3. Companies generating a large number of transactions will have to look
into ways of saving costs by combining multiple individual transactions
into larger ones to save on FEES. Is that a bad thing? No - it is a very
good thing, and the sooner it happens the better ;)

4. Companies, governments and institutions would have to look into ways
of improving their own performance and scalability OUTSIDE of the
Bitcoin blockchain.
For example, through client-server architectures, extremely efficient
centralised blockchains, Open Chain-like models, and using the Bitcoin
blockchain only for validation/verification in a transparent and
verifiable manner. How about running thousands of financial transactions
per second on Open Chain and only recording Merkle root hashes for
batches of transactions every 10 minutes in the Bitcoin blockchain, for
transparency and verification? (A minimal sketch of this batching idea
is also attached at the end of this mail.)

So, coming back to the original question about CRITERIA:

*1* The ability to set up and run a low-cost fully validating node
should remain the same or improve; upon deployment of a given proposal
there should be an increase in the number of fully validating nodes, or
at worst no decrease.

*2* The ability to include additional data in the Bitcoin blockchain
(e.g. via OP_RETURN) should improve or, at worst, remain the same as it
is now.

*3* Upon deployment of a given proposal, the noise in the ledger should
be reduced or, at worst, remain the same.

*4* A proposal should address long-term scalability and offer the
possibility of combining the Bitcoin blockchain with well-defined
auxiliary protocols to achieve high transaction throughput.

cheers,
Mariusz
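
P.S. To make the fee question in point 1 a bit more concrete, below is a
rough, purely illustrative sketch in Python. It assumes, as a
simplification, that miners fill each block greedily by fee rate, and
every transaction size and fee rate in it is made up; it is a toy model
for exploring how the marginal fee could behave once demand outgrows a
fixed 1 MB limit, not a claim about real-world miner or wallet
behaviour.

import random

BLOCK_LIMIT = 1_000_000  # bytes, i.e. the current 1 MB block size limit


def clearing_fee_rate(mempool, limit=BLOCK_LIMIT):
    # Fill the block greedily from the highest fee rate down and return
    # the lowest fee rate (satoshi/byte) that still made it in.
    used, lowest = 0, None
    for size, fee_rate in sorted(mempool, key=lambda tx: tx[1], reverse=True):
        if used + size > limit:
            break
        used += size
        lowest = fee_rate
    return lowest


def random_mempool(n_tx):
    # n_tx pending transactions with hypothetical sizes and fee rates.
    return [(random.randint(200, 600), random.uniform(1, 60))
            for _ in range(n_tx)]


if __name__ == "__main__":
    # Demand ranging from well below to well above one block's worth of space.
    for demand in (1_000, 2_000, 4_000, 8_000, 10_000):
        rate = clearing_fee_rate(random_mempool(demand))
        print(f"{demand:6d} tx pending -> marginal fee rate ~{rate:5.1f} sat/byte")

Once the pending volume exceeds roughly one block's worth of space, the
marginal fee rate in this toy model starts climbing - the self-regulating
"anti-noise" pressure described in points 1 and 2.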
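
And here is a minimal sketch of the batching idea from point 4 and
criterion *4*. The off-chain records are entirely hypothetical, and the
code only shows the commitment math (a Bitcoin-style double-SHA256
Merkle root) plus how the 32-byte root would fit into a standard
OP_RETURN output; it does not build or broadcast a real transaction.

import hashlib


def sha256d(data):
    # Bitcoin-style double SHA-256.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()


def merkle_root(leaves):
    # Hash every record, then fold the list pairwise until one root
    # remains, duplicating the last hash on odd-length levels (as
    # Bitcoin's block Merkle tree does).
    level = [sha256d(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [sha256d(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]


def op_return_script(payload):
    # OP_RETURN followed by a single direct push; a 32-byte root fits
    # easily within the standard 80-byte relay limit.
    assert len(payload) <= 75  # direct-push opcodes cover 1-75 bytes
    return bytes([0x6a, len(payload)]) + payload  # 0x6a = OP_RETURN


if __name__ == "__main__":
    # Ten thousand hypothetical transfers settled in an auxiliary system...
    batch = [f"transfer {i}: alice->bob 0.01".encode() for i in range(10_000)]
    root = merkle_root(batch)
    # ...anchored on-chain by one ~34-byte scriptPubKey in a single tx.
    print("merkle root :", root.hex())
    print("scriptPubKey:", op_return_script(root).hex())

With this pattern, thousands of auxiliary transfers settle off-chain and
occupy only a few dozen bytes of block space every 10 minutes, which is
exactly the kind of well-defined auxiliary-protocol combination that
criterion *4* asks for.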