$19B -> $30B annualized revenue in a month?
Feels like the lede is buried here!
I’m surprised Anthropic wanted to partner with Broadcom, given its negative reputation after antics like the VMware acquisition.
Interesting to see Anthropic investing in compute infrastructure. The bottleneck I keep hitting is not raw compute but where that compute lives — EU customers increasingly need guarantees their data stays in-region. More sovereign compute options in Europe would unlock a lot of enterprise AI adoption.
There's no limit to the algorithms. People don't understand yet: with a big enough compute cluster, they can learn the whole universe. We built a generalizable learning machine.
Can someone explain why everything is being marketed in terms of power consumption?
I don’t understand Claude Code’s moat here. What can it do that opencode can’t or couldn’t fairly easily implement?
I guess gigawatts is how we roughly measure computing capacity at the datacenter scale? Also saw something similar here:
> Costs and pricing are expressed per “token”, but the published data immediately seems to admit that this is a bad choice of unit because it costs a lot more to output a token than input one. It seems to me that the actual marginal quantity being produced and consumed is “processing power”, which is apparently measured in gigawatt hours these days. In any case, I think more than anything this vindicates my original decision not to get too precise. [...]
https://backofmind.substack.com/p/new-new-rules-for-the-new-...
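To get a feel for why people quote datacenter capacity in gigawatts, here's a back-of-envelope sketch. All the figures (per-accelerator draw, PUE, electricity rate) are illustrative assumptions I picked for round numbers, not anything from the announcement:

```python
# Back-of-envelope: what "1 GW of datacenter capacity" might mean in hardware.
# Every constant below is an assumption chosen for illustration.

DATACENTER_POWER_GW = 1.0        # assumed facility capacity
WATTS_PER_ACCELERATOR = 1_000    # assumed ~1 kW per accelerator, incl. host overhead
PUE = 1.2                        # assumed power usage effectiveness (cooling etc.)

usable_watts = DATACENTER_POWER_GW * 1e9 / PUE
accelerators = usable_watts / WATTS_PER_ACCELERATOR
print(f"~{accelerators:,.0f} accelerators per GW")  # roughly 800k under these assumptions

# And energy cost per gigawatt-hour at an assumed industrial rate:
USD_PER_KWH = 0.05               # assumed rate; 1 GWh = 1,000,000 kWh
cost_per_gwh = 1e6 * USD_PER_KWH
print(f"~${cost_per_gwh:,.0f} per GWh")
```

So a gigawatt is really a proxy for "hundreds of thousands of accelerators running flat out," which is why capacity and cost discussions keep landing on power units.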
Is it priced that way, though? I assume next-gen TPUs will be more efficient?