The building and electrical infrastructure are far cheaper than the hardware. So much so that electricity is a small cost of the data center build-out, but a major cost for the grid.
If the most valuable part quickly depreciates and goes unused within the first few years, it won't have a chance at long-term value the way fiber did. If data centers become, I don't know, battery grid storage, that will be very, very expensive grid storage.
Which is to say that while the early salivation over fiber eventually proved useful, overallocation of capital to GPUs is pure waste.
>The building and electrical infrastructure are far cheaper than the hardware.
Maybe it's cheaper if we measure in dollars or something, but at the same time we lack the political will to actually build it out without something like AI on the horizon.
For example, many data center operators are pushing for nuclear power: https://www.ehn.org/why-microsoft-s-move-to-reopen-three-mil...
That's one example among many.
So I'm hesitant to believe that "electricity is a small cost" of the whole thing when operators are pushing for something as controversial as nuclear.
Also, the two are not mutually exclusive: chip fabs are energy intensive. https://www.tomshardware.com/tech-industry/semiconductors/ts...