> The report estimates that training the latest frontier large language models, such as xAI’s Grok 4, can generate over 72,000 tons of carbon-equivalent emissions.
That seems pretty trivial relative to the ~38bn tons emitted globally per year?
Also nobody will ever have a moat, so the graph of investor stupidity is going through the roof.
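To put numbers on that 72,000 t vs ~38bn t comparison, here's a quick back-of-envelope (both figures are just the ones quoted above):

```python
# One frontier training run vs. one year of global CO2 emissions,
# using the figures quoted in this thread.
training_run_tons = 72_000      # estimated CO2e for one frontier training run
global_annual_tons = 38e9       # rough global annual CO2 emissions

fraction = training_run_tons / global_annual_tons
print(f"{fraction:.2e} of a year's global emissions")   # ~1.89e-06
print(f"{fraction * 100:.5f}%")                          # ~0.00019%
```

So a single run is on the order of two millionths of annual global emissions, which is the "trivial" being claimed.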
The "China leads in robotics" seems to be unaffected by AI. The China line is basically on the same trajectory since 2012. The chart does no belong in the article.
> Training AI models can generate enormous carbon emissions
Sure, but what I'd really like to see is a graph for how much carbon is generated serving these models globally.
Stating "Software engineers are all-in on AI" because of an increase in github projects being created is hilarious. I didn't realise creating a github repo made someone a software engineer. If only I had known this I wouldn't have bothered learning all the other stuff!
I still don't understand the State of AI in 2026.
Besides China's lead in robotics, those Grok emissions charts are the thing that most leaps off the page.
> The report estimates that carbon emissions from models with the least efficient inference are over 10 times as high as those with the most efficient inference. DeepSeek’s V3 models were estimated to consume around 23 watts when responding to a “medium-length” prompt, while Claude 4 Opus was estimated to consume about 5 watts.
This makes absolutely no sense. Watts measure power, not energy; I suppose they meant watt-hours, and even then that's a weird way to explain carbon emissions...
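The watts/watt-hours distinction matters because emissions come from energy, not power. A minimal sketch of the missing conversion, where the response time and grid carbon intensity are purely illustrative assumptions (not from the report):

```python
# Power (W) alone can't give you carbon emissions; you need
# energy (Wh) and a grid carbon intensity. All inputs here are
# illustrative assumptions, not figures from the report.
power_watts = 23           # quoted average draw for one prompt
response_seconds = 10      # assumed time to generate a response
grid_g_co2_per_kwh = 400   # assumed grid carbon intensity

energy_wh = power_watts * response_seconds / 3600    # W x hours = Wh
emissions_g = (energy_wh / 1000) * grid_g_co2_per_kwh  # kWh x gCO2/kWh

print(f"{energy_wh:.4f} Wh per response")
print(f"{emissions_g:.4f} g CO2 per response")
```

The point being: without a duration and a grid mix, "23 watts" says nothing about carbon.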
Profits generated by AI: <not graphed>
The absence speaks volumes.
Worth calling out that AI sentiment among young people is not nearly so rosy: https://news.gallup.com/poll/708224/gen-adoption-steady-skep...