Hacker News

biodiversity · yesterday at 6:49 AM · 1 reply

That the global AI footprint is reportedly already at 8% of aviation's footprint [1] is indeed rather alarming and surprising.

Research on this (is it mainly due to training? inefficient implementations? vibe coders, as you say? other industrial applications? can we verify this from the number of GPUs made or the money spent? etc.) is truly necessary, and the top companies must not be allowed to stay non-transparent about it.
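
A very rough sketch of what the GPU-count cross-check could look like, just to show the shape of the estimate. Every constant below is an assumed placeholder that would need to be replaced with sourced numbers, and the comparison baseline is likewise hypothetical:

    # Back-of-envelope cross-check: estimate AI electricity use from the number
    # of deployed accelerators, convert to CO2, and compare against an aviation
    # figure. All constants are assumed placeholders, not sourced data.

    ACCELERATORS_DEPLOYED = 4_000_000   # placeholder: AI GPUs/TPUs in operation
    AVG_POWER_KW = 1.0                  # placeholder: average draw per accelerator, incl. cooling
    UTILISATION = 0.6                   # placeholder: average utilisation over the year
    HOURS_PER_YEAR = 24 * 365

    GRID_INTENSITY_KG_PER_KWH = 0.4     # placeholder: average grid carbon intensity
    AVIATION_CO2_MT = 1_000             # placeholder: annual aviation CO2, in megatonnes

    energy_kwh = ACCELERATORS_DEPLOYED * AVG_POWER_KW * UTILISATION * HOURS_PER_YEAR
    co2_mt = energy_kwh * GRID_INTENSITY_KG_PER_KWH / 1e9   # kg -> megatonnes

    print(f"Estimated AI electricity use: {energy_kwh / 1e9:.1f} TWh/year")
    print(f"Estimated AI emissions: {co2_mt:.1f} Mt CO2/year")
    print(f"Share of assumed aviation figure: {co2_mt / AVIATION_CO2_MT:.1%}")

The point is that the whole estimate hinges on accelerator counts, utilisation, and grid intensity, which is exactly the kind of data the companies would need to disclose.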

[1] https://www.theguardian.com/technology/2025/dec/18/2025-ai-b...


Replies

NohatCoder · yesterday at 5:16 PM

The nature of these AIs is generally such that you can always throw more computation at the problem. Bigger models are the obvious route, but as I hinted earlier, a lot of current research goes more towards making various subqueries than towards making the models even bigger. In any case, for now the predominant factor determining how much compute a given prompt costs is how much compute someone decided to spend. So obviously, if you pay for the "good" models, there will be a lot more compute behind them than if you prompt a free model.
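
As a rough illustration of that knob (the 2 × parameters FLOPs-per-token figure is the usual rule of thumb for dense transformer inference; all parameter counts, token counts, and subquery counts below are made-up placeholders):

    # Sketch: per-prompt compute scales linearly with model size, output length,
    # and the number of internal subqueries/samples the provider chooses to run.
    # Concrete values are illustrative placeholders only.

    def prompt_flops(params: float, tokens: int, subqueries: int = 1) -> float:
        """Approximate inference FLOPs for one prompt (~2 * params per token)."""
        return 2 * params * tokens * subqueries

    free_tier = prompt_flops(params=8e9, tokens=500)                    # small model, single pass
    paid_tier = prompt_flops(params=400e9, tokens=2000, subqueries=8)   # big model, many internal samples

    print(f"Free-tier-style prompt: {free_tier:.2e} FLOPs")
    print(f"Paid-tier-style prompt: {paid_tier:.2e} FLOPs")
    print(f"Ratio: {paid_tier / free_tier:.0f}x")

With these placeholder choices the "good" model path costs on the order of a thousand times more compute per prompt, which is the sense in which the cost is mostly a decision rather than an inherent property of the query.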