They don't have to spend $500B to compete. Their costs should be much lower.
That said, I don't think they have the courage to invest even the lower amount that it would take to compete with this. But it's not clear if it's truly necessary either, as DeepSeek is proving that you don't need a billion to get to the frontier. For all we know we might all be running AGI locally on our gaming PCs in a few years' time. I'm glad I'm not the one writing the checks here.
They’re a big company. You could tell a story that they’re less efficient than OpenAI and Nvidia and therefore need more than $500b to compete! Who knows?
This seems to be getting lost in the noise amid the stampede for infrastructure funding
DeepSeek V3 trained for roughly $5.5M in compute, and now R1 a few weeks later hitting o1 benchmark scores with a fraction of the engineers etc. involved ... and open source
Pre-training compute may have peaked for now ... with some smaller models starting to perform very well as inference-time techniques improve by the week
Unless some new RL approach soon requires vastly more compute for a run at AGI ... it's possible the capacity being built on an extrapolation of 2024 numbers will exceed the 2025 actuals
Also, I can see many enterprises wanting to run on-prem -- at least initially