
See my other reply: when people count energy costs they fail to take into account the energy already sunk into producing the existing hardware, plus the energy spent building out new infrastructure to create these “more efficient” datacenters.

It’s like when people replace their fridge with a “more efficient” one and wipe out any energy savings with the cost of the new fridge. The difference in energy use won’t pay for the new fridge for many years, and by then you’ve already replaced it with yet another newer “better” one.
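
Rough back-of-envelope version of that in Python (every number here is made up, just to show the shape of the argument):

    # hypothetical fridge numbers, not from any real model
    old_kwh_per_year = 500      # old fridge
    new_kwh_per_year = 300      # "efficient" replacement
    price_per_kwh = 0.15        # USD, assumed
    new_fridge_cost = 900       # USD, assumed

    annual_savings = (old_kwh_per_year - new_kwh_per_year) * price_per_kwh
    payback_years = new_fridge_cost / annual_savings
    print(f"saves ${annual_savings:.0f}/yr, pays back in {payback_years:.0f} years")
    # ~$30/yr saved -> ~30 years to break even, likely longer than the fridge lasts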



The only energy cost that matters is the one paid by the operator. Old hardware costs more to run, so why would I run it? The energy that went into producing the device and its replacement is literally not a factor in that calculation.


No.

You have to look at TCO to justify upgrades; energy alone usually doesn't justify replacing old hardware.

Factor in space (= rent), the age-related increase in failure rate (= servicing), and compute needs (= opportunity cost); together with the energy bill, those tell you when an upgrade is actually justified.

Energy is the least relevant of those.
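
To make that concrete, here's a toy TCO comparison in Python (every figure is invented purely for illustration):

    # Toy annual TCO: keep old servers vs. replace with denser new ones.
    # All numbers are hypothetical.
    PRICE_PER_KWH = 0.12  # USD, assumed

    def annual_tco(energy_kwh, rent_per_u, rack_units, servicing, capex_amortized):
        return (energy_kwh * PRICE_PER_KWH + rent_per_u * rack_units
                + servicing + capex_amortized)

    keep = annual_tco(energy_kwh=40_000, rent_per_u=300, rack_units=20,
                      servicing=6_000, capex_amortized=0)
    replace = annual_tco(energy_kwh=25_000, rent_per_u=300, rack_units=4,
                         servicing=1_000, capex_amortized=12_000)

    print(f"keep: ${keep:,.0f}/yr  replace: ${replace:,.0f}/yr")
    # energy gap is ~$1.8k/yr; rent (~$4.8k) and servicing (~$5k) move the
    # needle far more than the power bill does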


[flagged]


Hah, smooth brain! (A little harsh, tho.)


My bad, I get in a mood on HN sometimes.


It's literally an economic pressure.

Even if they were the same efficiency, the older hardware takes up way more space.

Why would you pay for 5x the data center space? Surely building that out isn't energy cheap either.



