BitcoinTalk
Bitcoin Blogger: Is It Better To Buy Or Generate Bitcoins?


Nice calculation. Certainly it is not worth building a machine just for bitcoin creation. However, if you already have this computing power sitting idle, then you should count only the cost of the extra electricity that you need to consume to generate a block. Has anybody done this?
If you want to build a bitcoin generation machine, you are probably better off buying an FPGA board and designing a dedicated hash-calculating hardware monster.

Interesting note: if one puts 4,100,000 khash/sec into the calculator, then you get a 10-minute average.
10 minutes is the actual target average block production rate.
This means that the total computing power currently working on solving hashes is about 4.1 Ghash/s,
or about 900 quad cores, or 3-4 thousand laptops. Pretty impressive!

This means that there are some people who think it is worthwhile to produce bitcoins.
Who knows why; I stopped several difficulty increases ago.
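A quick sketch of that estimate, assuming the calculator takes khash/s and borrowing the ~4,300 khash/s quad-core figure mentioned later in this thread:

```python
# Back-of-the-envelope check of the post above. Whatever hash rate makes
# the calculator report a 10-minute average must be roughly the whole
# network's rate, since 10 minutes is the protocol's target block interval.
network_rate = 4_100_000   # khash/s, as entered in the calculator
quad_core_rate = 4_300     # khash/s for one quad-core CPU (figure from this thread)
machines = network_rate / quad_core_rate
print(f"network is roughly {machines:.0f} quad-core machines")
```

That lands near the "about 900 quad cores" figure in the post.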
Interesting calculations.  Currently I'm down to one computer myself (a laptop), so it's not really worth generating for me.  However, I'm wondering whether this is going to be a problem for the network's continued growth and stability.  I know there are many threads discussing how there is in fact an incentive to continue to generate, even when all 21M are found.  My concern is that those 21M coins are going to end up majority-owned by a minority of individuals.  Will bitcoin end up with the same Pareto properties as other economic and social systems, where 1% of the people end up owning just over 50% of the resource?  Is there any way for us to get an accurate gauge on this?
I can't wait for someone to write an application that allows group mining of bitcoins.  I just can't wait one month to see 50 bitcoins.
  Is there any way for us to get an accurate gauge on this?

No, but if you're still alive in 120 years to see it, count yourself lucky.

I'd be willing to bet that whoever holds the majority of bitcoins in 120 years...

1) will have earned them, and

2) is not even born yet.

I can't wait for someone to write an application that allows group mining of bitcoins.  I just can't wait one month to see 50 bitcoins.

I think that you missed the point of the blog article.
I can't wait for someone to write an application that allows group mining of bitcoins.  I just can't wait one month to see 50 bitcoins.

I think that you missed the point of the blog article.

Heh.  I did write the article.  But what do I know?  Wink
Quote
Is It Better To Buy Or Generate Bitcoins?
I'd say: it depends.  Grin

if you need coins to do stuff right now, there's no guaranteed way to generate them anyway, so it can't be "better" that way.
if you don't need coins and just want to support the network, there's no need to buy any (or to generate; just trying would be enough).

can't even say which is less expensive, that again depends on where you live and what you have to invest (in power and/or hardware).

for me:
i just went down from ~8000khs to ~1500khs at the latest difficulty step, so i'm not actually expecting to gain lots of coins in the near future, but i still generate to help find new blocks.
running a machine (that sucks up >20W) only to generate coins is already useless to me (or would need to be bought/built/invested in first) and (watching current difficulty and market prices) will probably result in a loss.
i'm wondering anyway why the market didn't change at all over the last 2 difficulty steps. i watched and waited a bit, but this is it for me, i'm done generating coins.   Grin
sad thing is that i have to wait about a month to get coins from MtGox too, but i'll at least get more than 50btc at once; we'll need some other payment options there quick.


nice article!
I can't wait for someone to write an application that allows group mining of bitcoins.  I just can't wait one month to see 50 bitcoins.

I think that you missed the point of the blog article.

Heh.  I did write the article.  But what do I know?  Wink

Bi-polar?
The article is all about the cost of the hardware, neglecting the more significant cost: electricity.

Once you're above the baseline power of 11 kWh/day (as any geek is), Southern California utilities charge about $0.13/kWh marginal, with taxes, distribution, etc.
The 24-core beast built in the article probably draws some serious current.  Hard to guess how much, but I'd guess about 500W?  Anyone know?

This will add 360 kWh a month to your electric bill, which will easily drive you into the next pricing tier, or maybe two tiers higher.  Now your marginal power can be $0.18/kWh.
Yikes.
That bitcoin miner would cost about $2/day to run, or $788 a year, which means you'll have spent the hardware cost of the system over again in electricity within two or three years.

If you have to actively cool the room with the computer, at least during daytime, double it again.
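The electricity math above, spelled out; the 500 W draw and $0.18/kWh higher-tier rate are the poster's assumptions, not measurements:

```python
# Rough electricity cost for a machine running flat out, 24/7,
# using the assumed figures from the post above.
watts = 500
rate_usd_per_kwh = 0.18
kwh_per_day = watts / 1000 * 24      # 12 kWh/day
kwh_per_month = kwh_per_day * 30     # 360 kWh/month, enough to jump a pricing tier
cost_per_day = kwh_per_day * rate_usd_per_kwh   # about $2.16/day
cost_per_year = cost_per_day * 365              # about $788/year
print(f"${cost_per_day:.2f}/day, ${cost_per_year:.0f}/year")
```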

i'm wondering anyway why the market didn't change at all over the last 2 difficulty steps. i watched and waited a bit, but this is it for me, i'm done generating coins.   Grin
sad thing is that i have to wait about a month to get coins from MtGox too, but i'll at least get more than 50btc at once; we'll need some other payment options there quick.


nice article!

I don't know if this is the case here, but markets do anticipate things; I can't say that the difficulty increases were really surprising in any way. Profitable speculators also tend to even out prices over time. If you actually figure out how to "buy low, sell high" you are providing the service of evening out prices over time, or in the case of arbitrage, evening out prices in different locations.
The article is all about the cost of the hardware, neglecting the more significant cost: electricity.

Once you're above the baseline power of 11 kWh/day (as any geek is), Southern California utilities charge about $0.13/kWh marginal, with taxes, distribution, etc.

This is a calculation that depends highly on who you are and where you live.  I live in an area that recently had a 10%+ residential electric rate hike, to about 8 cents per kWh.  This is only slightly more expensive per BTU than using natural gas with a 90% efficient gas heater versus a 100% efficient electric heater.  So the price difference for me to run any computer full tilt during the heating season, which is most certainly longer than Southern California's, is about half a penny per kilowatt-hour or less.  I don't even know anyone who bothers to shut down their computers from September to May to save money.  There's also something to be said for the soothing white noise of a (good condition) CPU fan, as the beast in the corner crunching numbers keeps your bedroom a couple degrees warmer so that you can turn the house thermostat down to 69 degrees at night.  I can't prove it, but I would bet that I actually save energy doing this, because otherwise my wife would insist on turning up the heat.
The article is all about the cost of the hardware, neglecting the more significant cost: electricity.

Once you're above the baseline power of 11 kWh/day (as any geek is), Southern California utilities charge about $0.13/kWh marginal, with taxes, distribution, etc.
The 24-core beast built in the article probably draws some serious current.  Hard to guess how much, but I'd guess about 500W?  Anyone know?

The 24-core beast will draw a minimum of 380W (using the power supply calculator: http://educations.newegg.com/tool/psucalc/index.html)
I can't say that the difficulty increases were really surprising in any way.
difficulty increases of course weren't surprising; we can expect some more sooner or later.  Cheesy
what surprised me was the non-responding market.
if sellers have to invest more time/energy into mining, why do they still sell that cheap?
i'm ok with it though, i wanna buy anyway.  Grin
The article is all about the cost of the hardware, neglecting the more significant cost: electricity.

Once you're above the baseline power of 11 kWh/day (as any geek is), Southern California utilities charge about $0.13/kWh marginal, with taxes, distribution, etc.

This is a calculation that depends highly on who you are and where you live.  I live in an area that recently had a 10%+ residential electric rate hike, to about 8 cents per kWh.  This is only slightly more expensive per BTU than using natural gas with a 90% efficient gas heater versus a 100% efficient electric heater.  So the price difference for me to run any computer full tilt during the heating season, which is most certainly longer than Southern California's, is about half a penny per kilowatt-hour or less.  I don't even know anyone who bothers to shut down their computers from September to May to save money.  There's also something to be said for the soothing white noise of a (good condition) CPU fan, as the beast in the corner crunching numbers keeps your bedroom a couple degrees warmer so that you can turn the house thermostat down to 69 degrees at night.  I can't prove it, but I would bet that I actually save energy doing this, because otherwise my wife would insist on turning up the heat.

That's interesting. If you have no screen on then all of the energy spent is converted to heat right? So if you are heating the place anyway it is costless?

Heh, so bitcoins will be a cold climate manufactured good, lol.
That's interesting. If you have no screen on then all of the energy spent is converted to heat right? So if you are heating the place anyway it is costless?

The power to the screen gets converted to heat too, even the light produced would get converted to heat.
That's interesting. If you have no screen on then all of the energy spent is converted to heat right? So if you are heating the place anyway it is costless?


To a point, yes.  The trick is knowing where that point of diminishing returns begins, and how it varies.

Quote

Heh, so bitcoins will be a cold climate manufactured good, lol.

It already is.  I know from private conversations that a statistically significant number of long-time forum members are Canadians, at least one of whom has been selling off last winter's heat bill.  If there are more people who generate bitcoins in the northern hemisphere (which is almost certainly the case), one would expect to see a seasonal component to the difficulty, at least once the difficulty reaches a mature balance in the future.  This is an effect that several of those distributed.net-type people have noticed before: their overall computational power trends slightly up during the winter months in the northern hemisphere.
The article is all about the cost of the hardware, neglecting the more significant cost: electricity.

Once you're above the baseline power of 11 kWh/day (as any geek is), Southern California utilities charge about $0.13/kWh marginal, with taxes, distribution, etc.
The 24-core beast built in the article probably draws some serious current.  Hard to guess how much, but I'd guess about 500W?  Anyone know?

The 24-core beast will draw a minimum of 380W (using the power supply calculator: http://educations.newegg.com/tool/psucalc/index.html)

What you have calculated is the power requirement of a system with a single Phenom II X4 processor; the article talks about 6 such processors in one system.

A single Phenom II X4 consumes approximately 125W at full load (see TDP: http://en.wikipedia.org/wiki/AMD_Phenom), so 6 such CPUs would consume about 750W, and a full system including memory, video, and motherboard will draw about 900W.  Assuming a 1000W power supply unit (a somewhat pricey device which is oddly forgotten in the article) with an efficiency of around 80%, you can expect to draw over 1 kW 24/7 running such a system.  Assuming 1 kW and a price of $0.15 per kWh, that comes to about $3.60 every day in power consumption.  That's a little bit more than the value of generating one block (which gives you 50 bitcoins).

If you're expected to generate one block each day, you would actually lose money generating bitcoins, even if you got that $2000 system for free. Of course this assumes you pay a more or less standard price for electricity, don't have any excess self-generated electricity, don't recycle waste heat, ...
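The wattage estimate above can be sketched out; the TDP, overhead, and efficiency numbers are the poster's assumptions rather than measurements:

```python
# Wall draw of the hypothetical 6 x Phenom II X4 box from the post above.
cpu_tdp_w = 125        # W per Phenom II X4 at full load (TDP)
num_cpus = 6
overhead_w = 150       # board, memory, video: rough guess to reach ~900 W
psu_efficiency = 0.80
wall_draw_w = (cpu_tdp_w * num_cpus + overhead_w) / psu_efficiency  # over 1 kW
daily_cost = 1.0 * 24 * 0.15   # rounding to a flat 1 kW at $0.15/kWh: ~$3.60/day
print(f"{wall_draw_w:.0f} W at the wall, about ${daily_cost:.2f}/day")
```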

What you have calculated is the power requirement of a system with a single Phenom II X4 processor; the article talks about 6 such processors in one system.

A single Phenom II X4 consumes approximately 125W at full load (see TDP: http://en.wikipedia.org/wiki/AMD_Phenom), so 6 such CPUs would consume about 750W, and a full system including memory, video, and motherboard will draw about 900W.  Assuming a 1000W power supply unit (a somewhat pricey device which is oddly forgotten in the article) with an efficiency of around 80%, you can expect to draw over 1 kW 24/7 running such a system.  Assuming 1 kW and a price of $0.15 per kWh, that comes to about $3.60 every day in power consumption.  That's a little bit more than the value of generating one block (which gives you 50 bitcoins).

If you're expected to generate one block each day, you would actually lose money generating bitcoins, even if you got that $2000 system for free. Of course this assumes you pay a more or less standard price for electricity, don't have any excess self-generated electricity, don't recycle waste heat, ...

Most of those power calculators will show you only the peak power needed, which generally occurs at startup rather than at full running load. That's good for selecting the power supply you need, but not so good for estimating power costs over the long run, unless this 6-socket beast is seriously different from smaller systems in that way.
I hooked a Kill-a-Watt up to my computer and measured: at 2,200 khash/s I was using 140 watts (monitors off). According to the calculator, I should produce a block every 14 days, 2 hours, 3 minutes, or every 338.05 hours. It would take 47.327 kWh to produce a block; at 12 cents per kWh, that's $5.68 per block, a net loser.

What is interesting are the reports of people producing 25,000 khash/sec with their video cards in the CUDA thread. If they are using less than 1000 watts (and they should be), it would be profitable to produce again, electricity-wise.

Difficulty level 1000 by year end.
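The cost-per-block arithmetic from that Kill-a-Watt measurement, written out:

```python
# Cost per block from the measured figures in the post above:
# 140 W at the wall, one block every 338.05 hours, $0.12/kWh.
watts = 140
hours_per_block = 338.05
rate_usd_per_kwh = 0.12
kwh_per_block = watts / 1000 * hours_per_block      # about 47.33 kWh
cost_per_block = kwh_per_block * rate_usd_per_kwh   # about $5.68
print(f"{kwh_per_block:.3f} kWh -> ${cost_per_block:.2f} per 50 BTC block")
```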
I hooked a Kill-a-Watt up to my computer and measured: at 2,200 khash/s I was using 140 watts (monitors off). According to the calculator, I should produce a block every 14 days, 2 hours, 3 minutes, or every 338.05 hours. It would take 47.327 kWh to produce a block; at 12 cents per kWh, that's $5.68 per block, a net loser.

What is interesting are the reports of people producing 25,000 khash/sec with their video cards in the CUDA thread. If they are using less than 1000 watts (and they should be), it would be profitable to produce again, electricity-wise.

Difficulty level 1000 by year end.

2000 at least imo, but my guess is 8500.
What is interesting are the reports of people producing 25,000 khash/sec with their video cards in the CUDA thread. If they are using less than 1000 watts (and they should be), it would be profitable to produce again, electricity-wise.

Difficulty level 1000 by year end.
i just did a short test on the recently released cuda client, here are the results

AMD X3 @ 2.8GHz
-> stock client
~3800 khash/s, ~150 W

GTX260
-> puddinpop's cuda client
~33000 khash/s, ~200 W

cpu work isn't profitable for me anymore, i already shut down most generators at the latest difficulty step.
gfx crunching would indeed be profitable again,
but i don't really want to trust a client that fiddles with my balance on its own, so i don't use it.
hopefully we'll see an open source client some day.


but you're joking about that 1000, aren't ya?
end of the year?
maybe end of the week.  Grin
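The two test results above, compared as khash/s per watt, show the size of the gap:

```python
# Efficiency comparison using the figures reported in the post above.
cpu_khs, cpu_w = 3800, 150     # AMD X3, stock client
gpu_khs, gpu_w = 33000, 200    # GTX260, puddinpop's cuda client
cpu_eff = cpu_khs / cpu_w      # ~25 khash/s per watt
gpu_eff = gpu_khs / gpu_w      # 165 khash/s per watt
print(f"GPU is {gpu_eff / cpu_eff:.1f}x more efficient per watt")
```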
using seconds since 12/18/2009 3:56:01 as base 0 (the last time difficulty was 1.0000), a y = mx + b linear regression of the log of the difficulty levels gives:

m = 2.87698e-07
b = -0.611057

use b = -363.4351 if you're working in Unix epoch seconds instead.

example: block 46368 was generated at 1269212064. exp(1269212064 * 2.87698e-07 - 363.4351) = exp(1.71409) = predicted level of 5.551602 vs actual level of 4.565162.



The correlation coefficient between the date and the log of the difficulty level was 0.957.

This predicts future difficulty levels of:

9/30/2010 : 677.7963
10/31/2010 : 1,464.7148
11/30/2010 : 3,087.5334
12/31/2010 : 6,672.1461
1/31/2011 : 14,418.4781
2/28/2011 : 28,919.2776
3/31/2011 : 62,494.4307
4/30/2011 : 131,734.6149
5/31/2011 : 284,677.9188
6/30/2011 : 600,084.4486
12/31/2011 : 58,149,375.6523

If the trend continues, it will be quite difficult to generate coins in the near future  Tongue
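The fitted model above can be written as a small function, with the constants taken directly from the post:

```python
import math

# log(difficulty) = m*t + b, with t in Unix epoch seconds,
# using the regression constants from the post above.
m = 2.87698e-07
b = -363.4351

def predicted_difficulty(epoch_seconds):
    """Exponential-trend prediction of the difficulty level at a given time."""
    return math.exp(m * epoch_seconds + b)

# Worked example from the post: block 46368, generated at epoch 1269212064,
# gives a predicted level of about 5.55 vs an actual level of about 4.57.
print(predicted_difficulty(1269212064))
```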
I don't like the total cost of the computer. You could do without the keyboard, mouse, and monitor. In addition, get the cheapest RAM stick you can find and a cheap 8GB SSD (energy-saving), then install Debian 5, access it through SSH, and turn on bitcoin. You could easily do it for $300-$500 less, and cut power costs by getting a high-efficiency PSU.
Someone should set up a newegg affiliate program and recommend the best bitcoin generation gear. Smiley
Someone should set up a newegg affiliate program and recommend the best bitcoin generation gear. Smiley
Once the GPU client gets perfected, I'm going to get a GTX 465, throw in a cheap dual-core CPU, 512MB RAM, and a case/cpu combo; then I am going to place it in a closet and let it print Bitcoins for a year or two. Tongue It's about a $500 investment, but with patience and careful energy management it might turn a profit.
AMD X3 @2.8ghz
->stock client
~3800khs ~150Watt
Did you try -4way?

Quote
How many hashes can I expect with a 24-core machine? I have a quad-core generating 4,300 khash per second, so I am estimating a 24-core machine could mine bitcoins at 25,000 khash per second.
AMD Phenom (I think 4-core) CPUs are doing about 11,000 khash/s with -4way, about a 100% speedup.  24 cores should get 66,000 khash/s.  AMD is the best choice because it has the best SSE2 implementation (or maybe because tcatm had an AMD and optimised his code for that).

There's been so much else to do that I haven't had time to make -4way automatic.  For now you still have to do it manually.
http://bitcointalk.org/index.php?topic=820.0
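The per-core arithmetic behind that 24-core estimate:

```python
# Scaling estimate from the post above: a 4-core AMD Phenom does about
# 11,000 khash/s with -4way, i.e. roughly 2,750 khash/s per core,
# so 24 cores scale linearly to about 66,000 khash/s.
khash_per_core = 11000 / 4
print(khash_per_core * 24)
```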