Technically it is inefficiency. The electricity should be doing computer things; heat is wasted electricity. It's just that there's not much the data centre can do about it.
There's a minimum amount of energy (and thus heat) that any computation has to dissipate, just because of physics (the Landauer limit). However, modern computers dissipate billions of times more than this minimum.
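For a sense of scale, here's a back-of-the-envelope Python sketch of that minimum. The ~10 pJ per operation figure for a modern chip is a rough assumption for illustration, not a measured value:

```python
import math

# Landauer limit: minimum energy dissipated to erase one bit at temperature T.
k_B = 1.380649e-23               # Boltzmann constant, J/K
T = 300.0                        # room temperature, K
landauer = k_B * T * math.log(2) # ~2.9e-21 J per bit

# Assumed figure: a modern chip spends on the order of 10 pJ per simple
# operation (this varies enormously by hardware and workload).
per_op = 10e-12

print(f"Landauer limit: {landauer:.2e} J/bit")
print(f"Gap above the limit: {per_op / landauer:.1e}x")  # ~3.5e9, i.e. billions
```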
Heat is not by itself waste. It's what electricity turns into after it's done doing computer things. Efficiency is a separate question: how many computer things you got done per unit of electricity turned into heat.
I think what they mean is that there is no Carnot engine hooked up between the heat source and sink, which is theoretically something the data center could do something about.
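A quick sketch of why that wouldn't recover much: the Carnot limit between typical server exhaust and ambient air is small. The temperatures below are illustrative assumptions:

```python
# Carnot efficiency: the theoretical maximum fraction of heat flow that can
# be converted back into work between a hot source and a cold sink.
t_hot = 273.15 + 60    # assumed server exhaust temperature, K
t_cold = 273.15 + 20   # assumed ambient temperature, K

eta = 1 - t_cold / t_hot
print(f"Carnot limit: {eta:.1%}")  # ~12% at these temperatures
```

That small gradient is a big part of why data centres that do reuse waste heat tend to pipe it somewhere directly (e.g., district heating) rather than trying to turn it back into electricity.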
The electricity is doing computer things, building bitcoin blocks.
No it's not. It would be waste only if there were a high temperature gradient, which is minimized in mining operations through proper cooling.
It's that computation requires electricity, and almost all of the heat in bitcoin mining comes from computation, technically from changing transistor states.
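A toy sketch of where that heat comes from, using the standard CMOS dynamic-power formula P = alpha * C * V^2 * f. Every number below is a made-up illustrative value:

```python
# Each transistor switch charges/discharges gate capacitance, and that
# energy is dissipated as heat. Dynamic power: P = alpha * C * V^2 * f.
alpha = 0.1   # activity factor: fraction of capacitance switched per cycle
C = 1e-9      # total switched capacitance, farads (illustrative)
V = 1.0       # supply voltage, volts
f = 3e9       # clock frequency, Hz

print(f"Dynamic power: {alpha * C * V**2 * f:.1f} W")  # 0.3 W in this toy case
```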
They could make a second floor with eggs and newborn chicken. /s
Even if the computer does perfectly-efficient computer things with every Joule, every single one of those Joules ends up as one Joule of waste heat.
If you pull 100W of power out of an electric socket, you are heating your environment at 100W, completely independently of what you use that electricity for.
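To put numbers on it: a PC drawing 100 W warms the room exactly like a 100 W resistive heater would.

```python
# 100 W drawn from the socket for a day, all of it ending up as heat.
power_w = 100.0
seconds = 24 * 3600

heat_joules = power_w * seconds
print(f"{heat_joules / 1e6:.2f} MJ of heat ({heat_joules / 3.6e6:.1f} kWh)")
# 8.64 MJ, i.e. 2.4 kWh -- same as running a 100 W space heater all day.
```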