Hacker News

tqi · yesterday at 10:11 PM

> The report estimates that training the latest frontier large language models, such as xAI’s Grok 4, can generate over 72,000 tons of carbon-equivalent emissions.

That seems pretty trivial relative to ~38 billion tons per year globally?


Replies

azakai · yesterday at 11:51 PM

Another way to put it: if training a model cost 72,000 tons of carbon, and it then gets used by 100 million people (typical of major models), the cost per person is 0.00072 tons.

Per the article, the average human emits over 5 tons per year (Americans: 18). Adding 0.00072 to 5 is not really noticeable.

(There is also the cost of inference, of course.)
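The amortization argument above can be sketched in a few lines, using only the figures cited in the thread (72,000 tons CO2e for training, 100 million users, ~5 tons CO2e per person per year):

```python
# Amortizing one-time training emissions across a model's user base.
# All figures come from the thread itself; they are rough estimates.
training_tons = 72_000          # tons CO2e to train the model (per the report)
users = 100_000_000             # typical user base of a major model
baseline_per_person = 5.0       # tons CO2e/year, global average per person

per_user = training_tons / users
print(per_user)                         # 0.00072 tons (0.72 kg) per user
print(per_user / baseline_per_person)   # fraction of one person's annual footprint
```

The second ratio works out to about 0.014% of the average person's yearly emissions, which is the sense in which the training cost is "not really noticeable" per user.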

jeffbee · yesterday at 11:46 PM

Yeah, it's basically nothing, even though xAI seemed to intentionally crank up the carbon intensity for no reason.

Also, it's hilarious to select two major models from 2025 and have both be Grok, almost certainly the least useful, least used, and least interesting models of that year.