Bitcoin Forum
  Show Posts
61  Bitcoin / Bitcoin Discussion / Re: Gavin Andresen Proposes Bitcoin Hard Fork to Address Network Scalability on: October 17, 2014, 04:15:33 PM
The real question is do you think it is feasible to do this before the "hard-fork"?

It is already being done, so yes. Optimizations to how transactions or blocks are communicated between peers don't require any sort of fork.
62  Bitcoin / Bitcoin Discussion / Re: Why blockchains might want to consider using AT "Turing complete" txs on: October 17, 2014, 04:14:11 PM
I'm confused. In the lottery example:
Quote
get timestamp for @txid and store in @timestamp
What is the timestamp for a transaction? The time when a node first receives it? The timestamp of the block in which the transaction is confirmed?
63  Bitcoin / Bitcoin Discussion / Re: Gavin Andresen Proposes Bitcoin Hard Fork to Address Network Scalability on: October 17, 2014, 04:02:17 PM
...basically the blocks just contain the txids - which can be matched with those in each node's memory pool (assuming they are present - nodes may need to "request" txs they don't already know about).

I think you're reinventing Matt's fast block relay code.  See:
  https://bitcoinfoundation.org/2014/08/a-bitcoin-backbone/
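
(For the curious, here is a minimal sketch of the general idea being discussed -- relay just the txids and rebuild the block from the local memory pool, requesting anything missing. This is illustrative Python, not Matt's actual relay protocol; the mempool mapping and request_tx helper are hypothetical names.)
Code:
# Illustrative only -- not the actual fast-relay protocol.
# 'mempool' is a dict mapping txid -> transaction; 'request_tx' is a
# hypothetical helper that fetches a transaction from the announcing peer.
def reconstruct_block(header, txids, mempool, request_tx, peer):
    """Rebuild a block from its txids, fetching only transactions we don't already have."""
    txs = []
    for txid in txids:
        tx = mempool.get(txid)
        if tx is None:
            tx = request_tx(peer, txid)  # extra round trip only for unknown transactions
        txs.append(tx)
    return {"header": header, "transactions": txs}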
64  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 16, 2014, 09:50:58 PM
Designing something to work and designing to not fail are entirely different endeavors and someone qualified for one is not necessarily qualified to even evaluate the other.
Pure coincidence, but I had lunch today with a local developer who will be putting up a building in downtown Amherst. They are planning on running fiber to the building, because they want to build for the future and the people they want to sell to (like me in a few years, when we downsize after my kids are in college) want fast Internet.

If I gaze into my crystal ball...  I see nothing but more and more demand for bandwidth.

We've got streaming Netflix now, at "pretty good" quality.  We'll want enough bandwidth to stream retina-display-quality to every family member in the house simultaneously.

Then we'll want to stream HD 3D surround video to our Oculus Rift gizmos, which is probably another order of magnitude in bandwidth. To every member of the family, simultaneously. While our home security cameras stream to some security center off-site that is storing it as potential evidence in case of burglary or vandalism....

Then... who knows? Every prediction of "this will surely be enough technology" has turned out to be wrong so far.
65  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 15, 2014, 06:34:45 PM
Of course one can say, let's put it at 50% per year until bandwidth stops growing that fast, and then we fork again. But this only postpones the problem. Trying to predict now exactly when this will happen, and to program for it now, seems futile.

Okey dokey.  My latest straw-man proposal is 40% per year growth for 20 years. That seems like a reasonable compromise based on current conditions and trends.

You seem to be looking hard for reasons not to grow the block size-- for example, yes, CPU clock speed growth has stopped. But the number of cores per chip continues to grow, so Moore's Law continues. (And the reference implementation already uses as many cores as you have to validate transactions.)

PS: I got positive feedback from a couple of full-time, professional economists on my "block size economics" post; it should be up tomorrow or Friday.
66  Bitcoin / Development & Technical Discussion / Re: A Scalability Roadmap on: October 14, 2014, 10:12:07 PM
I propose the following rule to determine the block size limit once the block reward is low:
The block size limit would increase (or decrease) by X% if total transaction fees in the last N blocks are Y bitcoin or more (or less).

......

I am aware miners could also manipulate fees by including transactions with large fees and not broadcasting them to the network. However, why would miners in this scenario want to manipulate the limit upwards?

The fear is that a cartel of big, centralized miners with huge data pipes would force the block size high enough that smaller miners have to drop out.
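
(For reference, here is the quoted rule as a sketch in Python; X, N and Y are left as hypothetical placeholder values since the proposal doesn't fix them.)
Code:
# Sketch of the quoted fee-based adjustment rule. X_PERCENT, N_BLOCKS and
# Y_THRESHOLD_BTC are placeholder values, not numbers from the proposal.
X_PERCENT = 10
N_BLOCKS = 2016
Y_THRESHOLD_BTC = 50.0

def next_block_size_limit(current_limit, recent_block_fees):
    """Raise the limit by X% if total fees over the last N blocks reach Y BTC, otherwise lower it."""
    total_fees = sum(recent_block_fees[-N_BLOCKS:])
    if total_fees >= Y_THRESHOLD_BTC:
        return current_limit * (1 + X_PERCENT / 100.0)
    return current_limit * (1 - X_PERCENT / 100.0)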



67  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 14, 2014, 10:07:23 PM
why not linear growth, like  +n MB per block halving, or quadratic like +n MB per n'th block halving?

Because network bandwidth, CPU, main memory, and disk storage (the potential bottlenecks) are all growing exponentially right now, and are projected to continue growing exponentially for the next couple decades.

Why would we choose linear growth when the trend is exponential growth?

Unless you think we should artificially limit Bitcoin itself to linear growth for some reason. Exponential growth in number of users and usage is what we want, yes?
68  Bitcoin / Development & Technical Discussion / Re: Message to devs from merchant on: October 14, 2014, 03:47:59 PM
You guys say that hard drives are cheap, but you still have to scan the block chain at first. It takes a very long time to do, but it is the only trustless solution, isn't it?

Yes, today. But it won't have to be that way at some point in the future. Please read about "UTXO commitments" in https://bitcoinfoundation.org/2014/10/a-scalability-roadmap/
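
(Roughly: each block would commit to a hash of the current UTXO set, so a new node could download the set from anyone and check it against the commitment instead of replaying all history. A toy illustration in Python -- a flat hash over a dict of UTXOs, whereas a real scheme would use an authenticated tree so individual entries can be proven:)
Code:
import hashlib

# Toy commitment: hash the serialized, sorted UTXO set.
# utxos maps (txid_bytes, output_index) -> (value_in_satoshis, script_bytes).
def toy_utxo_commitment(utxos):
    h = hashlib.sha256()
    for (txid, vout), (value, script) in sorted(utxos.items()):
        h.update(txid + vout.to_bytes(4, "little"))
        h.update(value.to_bytes(8, "little") + script)
    return h.hexdigest()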

69  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 14, 2014, 03:11:15 PM
No comment on this?
Quote
One example of a better way would be to use a sliding window of x blocks, 100+ deep, and base the max allowed size on some percentage over the average, dropping anomalous outliers from that calculation. Using some method that is sensitive to reality as it may exist in the unpredictable future gives some assurance that we won't just be changing this whenever circumstances change.
Do it right, do it once.

That does not address the core of people's fears, which is that big, centralized mining concerns will collaborate to push smaller competitors off the network by driving up the median block size.

There isn't a way to predict what networks will look like in the future, other than to use the data of the future to do just that. Where we are guessing, we ought to acknowledge that.

Yes, that is a good point, made by other people in the other thread about this. A more conservative rule would be fine with me, e.g.

Fact: average "good" home Internet connection is 250GB/month bandwidth.
Fact: Internet bandwidth has been growing at 50% per year for the last 20 years.
  (If you can find better data than mine on these, please post links.)

So I propose the maximum block size be increased to 20MB as soon as we can be sure the reference implementation code can handle blocks that large (that works out to about 40% of 250GB per month).
Increase the maximum by 40% every year (really, roughly double every two years-- thanks to whoever pointed out that 40% per year compounds to 96% over two years)
Since nothing can grow forever, stop doubling after 20 years.
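
(For concreteness, that schedule as a quick sketch -- 40% per year from a 20MB start, frozen after 20 years, which tops out around 16,700MB:)
Code:
# Sketch of the proposed schedule: start at 20MB, grow 40% per year
# (roughly doubling every two years), and stop growing after 20 years.
def max_block_size_mb(years_since_activation, start_mb=20.0, growth=1.4, stop_after_years=20):
    years = min(years_since_activation, stop_after_years)
    return start_mb * growth ** years

# max_block_size_mb(0)  -> 20MB
# max_block_size_mb(2)  -> ~39MB (96% growth over two years)
# max_block_size_mb(20) -> ~16,700MB, the permanent cap under this straw man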

70  Bitcoin / Development & Technical Discussion / Re: Increasing the block size is a good idea; 50%/year is probably too aggressive on: October 13, 2014, 07:05:00 PM
It also may be contrary to the eventual goal of usage-driven mining, where transaction fees ultimately overtake the block reward in value. This proposal may drive TX fees to zero forever. Block chain space is a somewhat scarce resource, just as the total # of coins is. Adding an arbitrary 50% yearly inflation changes things detrimentally.

I'm sending a follow-up blog post to a couple of economists to review, to make sure my economic reasoning is correct, but I don't believe that even an infinite blocksize would drive fees to zero forever.

Commodity prices never drop to zero, no matter how abundant they are (assuming a reasonably free market-- government can, of course, supply "free" goods, but the results are never pretty). The suppliers of the commodities have to make a profit, or they'll find something else to do.

That has very little to do with whether or not transaction fees will be enough to secure the network in the future. I think both the "DON'T RAISE BLOCKSIZE OR THE WORLD WILL END!" and "MUST RAISE THE BLOCKSIZE OR THE WORLD WILL END!" factions confuse those two issues. I don't think adjusting the block size up or down or keeping it the same will have any effect on whether or not transaction fees will be enough to secure the network as the block subsidy goes to zero (and, as I said, I'll ask professional economists what they think).

If this forks as currently proposed, I'll be selling all my BTC on Gavin's fork and mining on the other.  I suspect I will not be the only one.

Okey dokey. You can join the people still mining on the we-prefer-50-BTC-per-block fork (if you can find them... I think they gave up really quickly after the 50-to-25-BTC subsidy decrease).
71  Bitcoin / Development & Technical Discussion / Re: A Scalability Roadmap on: October 09, 2014, 10:14:22 PM
Yeah, 40% of a 250 GB connection works out to about 23 MB depending on how you define month.  May I ask what would happen regarding TOR?

Thanks for checking my math!  I used 31-day months, since I assume that is how ISPs do the bandwidth cap.

RE: what happens with Tor:

Run a full node (or better, several full nodes) that is connected to the network directly-- not via Tor.

But to keep your transactions private, you broadcast them through a Tor-connected SPV (not full) node. If you are mining, broadcast new blocks the same way.

That gives you fully-validating-node security plus transaction/block privacy. You could run both the full node and the SPV-Tor-connected node on a machine at home; to the rest of the network your home IP address would look like a relay node that never generated any transactions or blocks.

If you live in a country where even just connecting to the Bitcoin network is illegal (or would draw unwelcome attention to yourself), then you'd need to pay for a server somewhere else and administer it via Tor.
72  Bitcoin / Development & Technical Discussion / Re: A Scalability Roadmap on: October 09, 2014, 07:51:30 PM
An extremely large block size would mess up the economics of mining eventually.

I'm working on a follow-up blog post that talks about economics of the block size, but want to get it reviewed by some real economists to make sure my thinking is reasonably correct. But I'm curious: why do you think an extremely large block size will mess up the economics of mining?  What do you think would happen?

RE: geometric growth cannot go on forever:  true, but Moore's law has been going steady for 40 years now. The most pessimistic prediction I could find said it would last at least another 10-20 years; the most optimistic, 600 years.

I'd be happy with "increase block size 40% per year (double every two years) for 20 years, then stop."

Because if Bitcoin is going gangbusters 15 years from now, and CPU and bandwidth growth is still going strong, then either the "X%" or the "then stop date" can be changed to continue growing.

I did some research, and the average "good" broadband Internet connection in the US is 10Mbps speed. But ISPs are putting caps on home users' total bandwidth usage per month, typically 250 or 300GB/month. If I recall correctly, 300GB per month was the limit for my ISP in Australia, too.

Do the math, and 40% of a 250GB connection works out to 21MB dedicated to Bitcoin every ten minutes. Leave a generous megabyte for overhead, and that works out to a starting point of maximum-size-20MB blocks.

(somebody check my math, I'm really good at dropping zeroes)
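
(To make the check easy, the arithmetic spelled out: 40% of a 250GB/month cap, spread over ten-minute intervals in a 31-day month.)
Code:
# 40% of a 250GB/month cap, spread over ten-minute block intervals in a 31-day month.
bitcoin_budget_bytes = 250e9 * 0.40     # 100GB/month dedicated to Bitcoin
intervals_per_month = 31 * 24 * 6       # ten-minute intervals in a 31-day month = 4464
per_block = bitcoin_budget_bytes / intervals_per_month
print(per_block / 1e6)     # ~22.4 million bytes
print(per_block / 2**20)   # ~21.4 MiB -- the "21MB" figure above, counting in binary megabytes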

73  Bitcoin / Development & Technical Discussion / Re: A Scalability Roadmap on: October 08, 2014, 05:36:07 PM
Lowering the limit afterward wouldn't be a soft-forking change if the majority of mining power was creating too-large blocks, which seems possible.

When I say "soft-fork" I mean "a majority of miners upgrade and force all the rest of the miners to go along (but merchants and other fully-validating, non-mining nodes do not have to upgrade)."

Note that individual miners (or sub-majority cartels) can unilaterally create smaller blocks containing just higher-fee transactions, if they think it is in their long-term interest to put upward pressure on transaction fees.

I think that a really conservative automatic increase would be OK, but 50% yearly sounds too high to me. If this happens to exceed some residential ISP's actual bandwidth growth, then eventually that ISP's customers will be unable to be full nodes unless they pay for a much more expensive Internet connection. The idea of this sort of situation really concerns me, especially since the loss of full nodes would likely be gradual and easy to ignore until after it becomes very difficult to correct.

As I mentioned on Reddit, I'm also not 100% sure that I agree with your proposed starting point of 50% of a hobbyist-level Internet connection. This seems somewhat burdensome for individuals. It's entirely possible that Bitcoin can be secure without a lot of individuals running full nodes, but I'm not sure about this.

Would 40% initial size and growth make you support the proposal?


Determining the best/safest way to choose the max block size isn't really a technical problem; it has more to do with economics and game theory. I'd really like to see some research/opinions on this issue from economists and other people who specialize in this sort of problem.

Anybody know economists who specialize in this sort of problem? Judging by what I know about economics and economists, I suspect if we ask eleven of them we'll get seven different opinions for the best thing to do. Five of which will miss the point of Bitcoin entirely. ("...elect a Board of Blocksize Governors that decides on an Optimal Size based on market supply and demand conditions as measured by an independent Bureau of Blocksize Research....")
74  Bitcoin / Development & Technical Discussion / Re: A Scalability Roadmap on: October 06, 2014, 06:25:02 PM
Is Gavin saying this should grow at 50% per year because bandwidth has been increasing at this rate in the past?  Might it not be safer to choose a rate lower than historic bandwidth growth?  Also how do we know this high growth in bandwidth will continue?

Yes, that is what I am saying.

"Safer" : there are two competing threats here: raise the block size too slowly and you discourage transactions and increase their price. The danger is Bitcoin becomes irrelevant for anything besides huge transactions, and is used only by big corporations and is too expensive for individuals. Hurray, we just reinvented the SWIFT or ACH systems.

Raise it too quickly and it gets too expensive for ordinary people to run full nodes.

So I'm saying: the future is uncertain, but there is a clear trend. Let's follow that trend, because it is the best predictor we have of what will happen.

If the experts are wrong, and bandwidth growth (or CPU growth or memory growth or whatever) slows or stops in ten years, then fine: change the largest-block-I'll-accept formula. Lowering the maximum is easier than raising it (lowering is a soft-forking change that would only affect stubborn miners who insisted on creating larger-than-what-the-majority-wants blocks).


RE: a quick fix like doubling the size:

Why doubling? Please don't be lazy; at least do some back-of-the-envelope calculations to justify your numbers (to save you some work: the average Bitcoin transaction is about 250 bytes). The typical broadband home Internet connection can support much larger blocks today.
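
(Back-of-the-envelope with that 250-byte average, for anyone who wants a starting point:)
Code:
# Rough throughput at a 250-byte average transaction size.
AVG_TX_BYTES = 250
for block_mb in (1, 2, 20):
    txs_per_block = block_mb * 1000000 // AVG_TX_BYTES
    print(block_mb, "MB blocks:", txs_per_block, "tx/block, about",
          round(txs_per_block / 600.0, 1), "tx/sec")
# 1MB -> 4,000 tx/block (~6.7 tx/sec); 20MB -> 80,000 tx/block (~133 tx/sec)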
75  Bitcoin / Development & Technical Discussion / Re: Adding a feature to decline dust transactions on: October 04, 2014, 05:00:07 PM
That could be a feature of the wallet: do not display any unconfirmed (or even confirmed) transaction of less than x

That is a feature of Bitcoin-Qt. Unconfirmed dust transactions don't enter the memory pool, so they are not relayed, not included in blocks being mined, and not displayed by the wallet.

If I recall correctly, if they DO get mined into a block by somebody then they are displayed. Ignoring them and not adding them to the wallet in that case might be a nice feature, although today's dust might be tomorrow's treasure if prices rise another couple orders of magnitude.
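
(For reference, the flavor of test involved, sketched in Python. Bitcoin-Qt's actual rule compares an output's value to the cost of spending it at the minimum relay fee; the constants below are illustrative placeholders, not the exact ones in the code.)
Code:
# Illustrative dust test: an output is "dust" if the fee needed to spend it
# later would eat more than about a third of its value. Placeholder constants.
def is_dust(output_value_satoshis, min_relay_fee_per_kb=1000, spend_size_bytes=182):
    cost_to_spend = min_relay_fee_per_kb * spend_size_bytes / 1000.0
    return output_value_satoshis < 3 * cost_to_spend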
76  Bitcoin / Bitcoin Discussion / Re: It's about time to turn off PoW mining on: September 17, 2014, 03:56:49 PM
I am not the best person to discuss the technical details here, but how do you explain that PoW altcoins are easily 51%-attacked to death, while PoS altcoins have all avoided this fate, and most of them (the non-scammy ones) work and work well? Clearly, when put in equal competition (altcoins), the PoS system came out on top in an equal competitive environment (without an early-start advantage, etc...).

I think we'll see non-clone coins being broken after two things happen:

1. They become valuable enough for attackers to bother, and there is some way for them to cash out.
2. The attackers have some time to do what they need to do to mount an attack-- write code, deploy botnets, hack into some big exchange(s), get their hands on some early-adopter's wallet backups, or whatever.

Once the tools and techniques are developed, then I think we'll see what we see in PoW 51% attacks: attacks against even mostly-worthless clonecoins, because if they've already got the tools then they might just attack for the lulz.

I'm surprised you count Peercoin as a PoS success-- they're still running with centralized checkpoints, aren't they?
77  Bitcoin / Bitcoin Discussion / Re: It's about time to turn off PoW mining on: September 17, 2014, 03:23:42 PM
Is there a rebuttal from the PoS crowd to this:
  https://download.wpsoftware.net/bitcoin/pos.pdf

... other than "sure, the original PoS ideas were flawed, but the latest MegaUberPoS system gets it right and nobody has figured out exactly how to break it!"
78  Bitcoin / Bitcoin Discussion / Re: DDoS attack on Bitcoin.org on: September 17, 2014, 03:11:26 PM
Probably just some anti-Foundation skiddie who saw this:
  https://bitcoinfoundation.org/2014/09/bitcoin-org-walk-down-memory-lane/
79  Bitcoin / Development & Technical Discussion / Re: Instructing a node to disconnect from a specific peer via RPC on: September 15, 2014, 12:29:24 PM
I needed that, so I hacked together a disconnectpeer RPC call:
  https://github.com/gavinandresen/bitcoin-git/commit/499ae0b3d77e1c41d79f34329d555980676d1f3a

Needs more thorough testing-- I'm not sure if calling CloseSocketDisconnect directly from the RPC thread is the cleanest way of disconnecting a peer.
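
(If you want to poke at it, here is a minimal way to hit the new call over bitcoind's JSON-RPC interface from Python. The method name and single peer-address argument follow the intent of the patch, but check the linked commit for the actual signature; the URL and credentials are placeholders.)
Code:
import requests  # plain JSON-RPC over HTTP to a local bitcoind

def disconnect_peer(address, url="http://127.0.0.1:8332", user="rpcuser", password="rpcpass"):
    """Call the patched-in disconnectpeer RPC; see the linked commit for the real signature."""
    payload = {"jsonrpc": "1.0", "id": "edit", "method": "disconnectpeer", "params": [address]}
    resp = requests.post(url, json=payload, auth=(user, password))
    resp.raise_for_status()
    return resp.json()["result"]

# Example (placeholder address): disconnect_peer("203.0.113.5:8333")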
80  Bitcoin / Development & Technical Discussion / Re: Energy efficiency. on: September 10, 2014, 04:51:42 PM
Thaddeus Dryja's "proof of idle" idea hasn't been getting enough attention. See https://www.youtube.com/watch?v=QN2TPeQ9mnA

The idea is to get paid NOT to mine, because it is economically rational for everybody to keep the difficulty lower rather than higher (everybody saves money on electricity if everybody can somehow agree to keep their equipment idle). Thaddeus figured out a way of solving the coordination problem so nobody can lie about how much mining power they have or profit from cheating and running miners that they promised to keep idle.

Having lots of idle mining capacity is appealing for at least two reasons -- it is more energy efficient, and if an attack of some kind is detected it could be brought online instead of kept idle to help fight the attack.



However... I suspect that taking that idle mining power, pointing it to a big mining pool, and then performing a winning-share-withholding-attack (if you find a share that satisfies full network difficulty, don't submit it to the pool -- just throw it away and pretend it never happened) could be a way of doubling your profits, because you drive down difficulty, get paid for "idle" capacity, AND get a share of the profits from the mining pool you're attacking.
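
(To make the withholding idea concrete, a toy sketch of the behaviour described -- hand in ordinary shares, silently drop anything that would also solve a full-difficulty block. The names are hypothetical, and targets are compared as "lower hash = harder".)
Code:
# Toy illustration of the share-withholding behaviour described above.
# Lower hash values are "better"; network_target < share_target.
def handle_solution(hash_value, share_target, network_target, submit_share):
    if hash_value <= network_target:
        return None                      # withhold: pretend the block-level solution never happened
    if hash_value <= share_target:
        return submit_share(hash_value)  # ordinary share, still earns pool credit
    return None                          # not even a share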