Hacker News

usefulcat · yesterday at 3:10 PM

> it often answers in 20 seconds what it would take me an entire afternoon to figure out with traditional research.

In that case I think it would be only fair to also count the energy required for training the LLM.

LLMs are far ahead of humans in terms of the sheer amount of knowledge they can remember, but nowhere close in terms of general intelligence.


Replies

crazygringo · yesterday at 4:20 PM

Training energy is amortized across the lifespan of a model. For any given query to one of the most popular commercial models, your share of the energy used to train it is a small fraction of the energy used for inference (e.g. 10%).
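The amortization argument can be made concrete with a back-of-the-envelope calculation. All numbers below are made-up assumptions purely for illustration, not measurements of any real model:

```python
# Hypothetical amortization of one-time training energy over lifetime queries.
# Every figure here is an assumption for illustration, not a real measurement.

training_energy_kwh = 10_000_000      # assumed one-time training cost (kWh)
lifetime_queries = 1_000_000_000_000  # assumed queries served over the model's life
inference_energy_wh = 0.3             # assumed energy per query at inference (Wh)

# Convert training cost to Wh and spread it across every query served.
amortized_training_wh = training_energy_kwh * 1000 / lifetime_queries

total_wh = amortized_training_wh + inference_energy_wh
training_share = amortized_training_wh / total_wh

print(f"amortized training energy per query: {amortized_training_wh:.3f} Wh")
print(f"fraction of per-query energy from training: {training_share:.1%}")
```

Under these made-up numbers, training contributes about 0.01 Wh per query against 0.3 Wh of inference, i.e. only a few percent of the per-query total; the conclusion depends entirely on the assumed lifetime query volume.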