
I did the math a few years back on how long you would have to run an old machine to (roughly) offset the carbon emissions of purchasing new hardware. This included all mining, refinement, manufacturing, and shipping, as well as the electricity savings from more efficient processors.

A big part of this is the very large amount of energy needed to produce the silicon wafers, which are sliced from monocrystalline silicon ingots refined from quartz. While the wafers account for only a few grams of the machine's total weight, they have an outsized impact on its total embodied energy.

Funnily enough, for most desktop computers it would take about 15 years of non-stop usage to break even, and that is assuming the electricity comes purely from lignite (brown coal). With anything cleaner, so almost any other energy source, you have to run them far longer. If powered purely by solar, accounting for the panels' own manufacturing carbon, the payback period moves into the centuries.
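The break-even arithmetic described above can be sketched as a simple formula: embodied emissions of the new machine divided by the annual emissions saved through its lower power draw. All of the numbers below (embodied CO2e, power draws, grid intensities) are placeholder assumptions for illustration, not the figures from the original calculation.

```python
# Sketch of the payback calculation. Every constant here is an
# illustrative assumption, not a measured value.

EMBODIED_KG_CO2 = 300.0   # assumed embodied emissions of a new desktop (kg CO2e)
OLD_POWER_W = 100.0       # assumed average draw of the old machine (watts)
NEW_POWER_W = 60.0        # assumed average draw of the replacement (watts)
HOURS_PER_YEAR = 24 * 365

def breakeven_years(grid_kg_co2_per_kwh: float) -> float:
    """Years of non-stop use before the efficiency savings offset
    the embodied emissions of the new machine."""
    saved_kwh_per_year = (OLD_POWER_W - NEW_POWER_W) / 1000 * HOURS_PER_YEAR
    saved_kg_per_year = saved_kwh_per_year * grid_kg_co2_per_kwh
    return EMBODIED_KG_CO2 / saved_kg_per_year

# Dirtier grids pay back faster; lignite is among the dirtiest
# (roughly 1.1 kg CO2e/kWh), while solar's lifecycle intensity
# is on the order of 0.04 kg CO2e/kWh.
print(f"lignite: {breakeven_years(1.1):.1f} years")
print(f"solar:   {breakeven_years(0.04):.1f} years")
```

Plugging in your own embodied-emissions and power figures reproduces the shape of the estimate: the cleaner the grid, the longer a replacement takes to pay for itself.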



The solar panels required energy to create, too. Still, I don't think it would take centuries for replacing a Cray-1 with a Raspberry Pi 5 to pay for itself in carbon terms, even if both are powered by solar. The Cray example is admittedly uncharitable, but the principle is the same: if the power source were the only relevant factor, it should take centuries in that case too, right?
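The Cray-1 thought experiment can be sanity-checked with the same kind of arithmetic. The Cray-1 drew on the order of 115 kW; the Pi's embodied emissions and the solar lifecycle intensity below are rough order-of-magnitude assumptions, not sourced figures.

```python
# Order-of-magnitude check of the Cray-1 vs. Raspberry Pi 5 example.
# All constants are assumptions for illustration.

CRAY1_POWER_KW = 115.0   # Cray-1 drew roughly 115 kW
PI5_POWER_KW = 0.01      # Pi 5 under load, ~10 W
PI5_EMBODIED_KG = 10.0   # assumed embodied CO2e of a Pi 5 (generous)
SOLAR_KG_PER_KWH = 0.04  # assumed lifecycle intensity of solar power

saved_kg_per_hour = (CRAY1_POWER_KW - PI5_POWER_KW) * SOLAR_KG_PER_KWH
hours_to_break_even = PI5_EMBODIED_KG / saved_kg_per_hour
print(f"break-even after ~{hours_to_break_even:.1f} hours")
```

Under these assumptions the payback lands in hours rather than centuries, which is the point of the reply: when the power-draw gap is enormous, the grid's carbon intensity cancels out of the comparison far faster than the panels' embodied energy would suggest.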


Any chance you could write this up and publish?



