See my other reply: when people count energy costs, they fail to account for the energy already sunk into producing the existing hardware, and the energy needed to build out the new infrastructure for these “more efficient” datacenters.
It’s like when people replace their fridge with a “more efficient” one and the cost of the new fridge wipes out any energy savings. The difference in energy use won’t pay for the new fridge for many years, and by then you’ll have replaced it with yet another, newer “better” one.
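For illustration, here is a rough payback-period sketch. Every number (fridge consumption, electricity price, purchase cost) is a made-up placeholder, not a real figure:

```python
# Toy payback-period calculation with hypothetical numbers.
OLD_FRIDGE_KWH_PER_YEAR = 600   # assumed annual consumption of the old fridge
NEW_FRIDGE_KWH_PER_YEAR = 350   # assumed annual consumption of the "efficient" one
ELECTRICITY_PRICE = 0.30        # $/kWh, assumed
NEW_FRIDGE_COST = 1200          # purchase price of the replacement, assumed

annual_savings = (OLD_FRIDGE_KWH_PER_YEAR - NEW_FRIDGE_KWH_PER_YEAR) * ELECTRICITY_PRICE
payback_years = NEW_FRIDGE_COST / annual_savings
print(f"Annual savings: ${annual_savings:.2f}, payback: {payback_years:.1f} years")
# With these numbers: ~$75/year saved, so roughly 16 years to pay off the fridge,
# likely longer than it will stay in service.
```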
The only energy cost that matters is the operator’s. Old hardware costs more to run, so why would I keep running it? The energy used to produce the device and its replacement is literally not a factor in that calculation.
You have to go for TCO (total cost of ownership) to justify upgrades; energy alone usually doesn't justify replacing old hardware.
Factor in space (= rent), the age-related increase in failure rate (= servicing), and compute capacity needs (= opportunity cost); together with the energy use, those tell you when an upgrade is actually justified, as the rough sketch below illustrates.
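A minimal sketch of that kind of TCO comparison, with every cost figure invented purely for illustration:

```python
# Toy annual TCO comparison: keep the old server vs. replace it.
# All inputs are hypothetical placeholders, not measurements.
def annual_tco(power_kw, energy_price, rack_units, rent_per_u,
               failure_rate, service_cost, capex_per_year=0.0):
    energy = power_kw * 24 * 365 * energy_price   # electricity over a year
    space = rack_units * rent_per_u               # rack space / rent
    servicing = failure_rate * service_cost       # expected repair cost
    return energy + space + servicing + capex_per_year

old = annual_tco(power_kw=0.5, energy_price=0.15, rack_units=2, rent_per_u=300,
                 failure_rate=0.20, service_cost=1500)
# New box: purchase price amortized over 5 years of expected service life.
new = annual_tco(power_kw=0.3, energy_price=0.15, rack_units=1, rent_per_u=300,
                 failure_rate=0.05, service_cost=1500, capex_per_year=8000 / 5)
print(f"old: ${old:.0f}/yr, new: ${new:.0f}/yr")
# With these made-up numbers the old box still wins on yearly cost;
# only extra compute capacity (opportunity cost) would tip the decision.
```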