The article says, "Krishna said that it takes about $80 billion to fill up a one-gigawatt data center."
But thanks for your insight -- I used your basic idea to make an estimate, and for 1GW it comes to about $30b just for enough GPUs to draw 1GW. And of course that doesn't account for any other costs.
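For anyone who wants to poke at it, here's the back-of-the-envelope in code. The per-GPU price and power draw are my own rough assumptions (ballpark H100-class numbers), not figures from the article:

    # Back-of-the-envelope: GPU capex needed to draw 1GW.
    # Assumptions (mine, roughly H100-class): ~1 kW per accelerator
    # all-in (GPU plus its share of server/cooling overhead), ~$30k per GPU.
    facility_power_w = 1e9          # 1GW data center
    power_per_gpu_w = 1_000         # ~1 kW per GPU, all-in (assumption)
    cost_per_gpu_usd = 30_000       # ~$30k per GPU (assumption)

    num_gpus = facility_power_w / power_per_gpu_w
    gpu_capex_usd = num_gpus * cost_per_gpu_usd
    print(f"{num_gpus:,.0f} GPUs -> ${gpu_capex_usd / 1e9:.0f}b in GPUs alone")
    # -> 1,000,000 GPUs -> $30b

Tweak the two assumptions however you like; the answer only moves by a small factor.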
So $80b for a 1GW data center seems high, but it's within a small constant factor of my estimate.
That said, power seems like a weird metric to use, though I don't know what metric would make more sense for AI (say, a FLOPS counterpart for AI workloads). I'd also expect efficiency to improve and GPU prices to fall over time (???).
UPDATE: Below someone posted an article breaking down the costs. It notes that GPUs are about 39% of the total. Taking the $30b I independently computed as 39% of the total gives an estimate of about $77b per GW -- remarkably close to the IBM CEO's figure. I guess he may know what he's talking about. :-)
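The scaling step made explicit (the 39% share is from the article linked below; the $30b is my own estimate above):

    # Scale GPU-only capex up to total cost using the article's 39% GPU share.
    gpu_capex_usd = 30e9        # my independent estimate: ~$30b of GPUs per GW
    gpu_share = 0.39            # GPUs as a fraction of total cost (linked article)
    total_capex_usd = gpu_capex_usd / gpu_share
    print(f"${total_capex_usd / 1e9:.0f}b per GW")   # -> $77b, vs. Krishna's $80b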
> power seems like a weird metric to use
Because this technology changes so fast, power is the only metric that stays comparable across data centers. It's also directly tied to a data center's overall capacity, which is limited by the energy available to operate it.