
jsnell · yesterday at 9:02 PM

I don't see how that number could possibly be realistic.

An H100 cost ~$30k when new, and uses 500W of power.

500W for a year is about 4,400 kWh, which at $0.10/kWh is roughly $440/year, and that's assuming full utilization (unrealistic).

The TCO of an AI data center should be entirely dominated by capex depreciation.
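
A rough sketch of that comparison in Python; the 4-year straight-line depreciation schedule is an assumption for illustration, not a stated figure:

    # Back-of-envelope: H100 power opex vs. capex depreciation.
    # Price, wattage, and electricity rate as stated above;
    # the depreciation schedule is assumed.
    GPU_PRICE_USD = 30_000          # H100 list price when new
    GPU_POWER_KW = 0.5              # 500 W
    USD_PER_KWH = 0.10
    HOURS_PER_YEAR = 8_760

    power_usd_per_year = GPU_POWER_KW * HOURS_PER_YEAR * USD_PER_KWH
    # 0.5 * 8760 * 0.10 ~= $438/year at 100% utilization

    DEPRECIATION_YEARS = 4          # assumed schedule
    capex_usd_per_year = GPU_PRICE_USD / DEPRECIATION_YEARS  # $7,500/year

    print(f"power: ${power_usd_per_year:,.0f}/yr, "
          f"capex: ${capex_usd_per_year:,.0f}/yr, "
          f"ratio: {capex_usd_per_year / power_usd_per_year:.0f}x")

Under these assumptions, even at full utilization, power is roughly a 17x smaller line item than depreciation.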


Replies

creddit · yesterday at 11:43 PM

In fairness, your calculation looks at the most expensive element of the DC but ignores all of the associated parts required to utilize the H100: CPU, memory, cooling, etc. Not to say that flips the calculation (I don't have the answer), but it does leave a lot of power out.
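
One way to put rough numbers on that objection; the overhead factors below are illustrative assumptions, not figures from either comment:

    # Extend the per-GPU power estimate to the whole server plus facility.
    # HOST_OVERHEAD and PUE are assumed, illustrative values.
    GPU_POWER_KW = 0.5
    HOST_OVERHEAD = 1.3   # CPU, memory, NICs, fans (assumed)
    PUE = 1.3             # cooling / power-delivery losses (assumed)
    USD_PER_KWH = 0.10
    HOURS_PER_YEAR = 8_760

    all_in_kw = GPU_POWER_KW * HOST_OVERHEAD * PUE        # ~0.85 kW per GPU
    power_usd_per_year = all_in_kw * HOURS_PER_YEAR * USD_PER_KWH
    print(f"all-in power: ${power_usd_per_year:,.0f}/yr")  # ~$740/yr

Even with those factors included, power stays well below the ~$7,500/year of depreciation in the parent's sketch, so it shifts rather than flips the comparison.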
