Hacker News

arnaudsm · yesterday at 10:22 PM

Running an LLM by default when I open your site is the most energy-consuming thing a computer can do, and the thing consumers hate the most in 2025.


Replies

xzjis · yesterday at 11:12 PM

I love Kagi's implementation: by default it's disabled, you either have to add a question mark to the search, or click in the interface after searching to generate the summary.

observationist · today at 12:33 AM

This is absurd. Training an AI is energy intensive but highly efficient. Running inference for a few hundred tokens, doing a search, stuff like that is a triviality.

Each generated token takes energy equivalent to the heat from burning ~0.06 µL of gasoline — roughly 2 joules per token, including datacenter and hosting overhead. With massive million-token prompts, it can climb to 8–10 joules per token of output. Training runs around 17–20 J per token.

A liter of gasoline gets you roughly 16,800,000 tokens for normal use cases. Caching and the various scaled-up efficiency hacks and improvements get you into the thousands of tokens per joule for some use cases.

For contrast, your desktop PC running idle uses around 350k joules per day. Your fridge uses 3 million joules per day.
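The arithmetic above is easy to sanity-check. A minimal sketch, using the rough figures from this comment (~2 J per token, gasoline at ~34 MJ per liter) rather than measured values:

```python
# Back-of-the-envelope check of the energy figures above.
# Assumptions: ~2 J per generated token (the comment's estimate,
# including datacenter overhead) and ~34 MJ per liter of gasoline.

J_PER_TOKEN = 2.0            # inference energy per token, rough
GASOLINE_J_PER_LITER = 34e6  # heating value of gasoline, rough

tokens_per_liter = GASOLINE_J_PER_LITER / J_PER_TOKEN
print(f"{tokens_per_liter:,.0f} tokens per liter of gasoline")
# -> 17,000,000 tokens per liter of gasoline

# The household comparisons from the comment, in tokens/day
idle_pc_j_per_day = 350_000    # idle desktop PC, ~4 W average
fridge_j_per_day = 3_000_000   # fridge, ~35 W average

print(f"idle PC ~= {idle_pc_j_per_day / J_PER_TOKEN:,.0f} tokens/day")
# -> idle PC ~= 175,000 tokens/day
print(f"fridge  ~= {fridge_j_per_day / J_PER_TOKEN:,.0f} tokens/day")
# -> fridge  ~= 1,500,000 tokens/day
```

In other words, at these assumed rates an idle PC's daily draw equals the inference cost of a decent-sized book's worth of tokens, which is the comparison the comment is making.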

AI is such a relatively trivial use of resources that you caring about nearly any other problem, in the entire expanse of all available problems to care about, would be a better use of your time.

AI is making the resources allocated to computation and data processing much more efficient, and year over year the intelligence delivered per generated token rises while the absolute energy cost per token falls.

Find something meaningful to be upset at. AI is a dumb thing to be angry at.

conradev · today at 12:03 AM

Your computer might use more energy displaying the results to you than the server does generating them. Especially in Chrome :)

The server shares resources!