Hacker News

botusaurus · yesterday at 9:30 PM

> However, LLM would also require >75% of our galaxy energy output to reach 1 human level intelligence error rates in general.

citation needed


Replies

Joel_Mckay · yesterday at 9:43 PM

The activation capping effect on LLM behavior is available in this paper:

https://www.anthropic.com/research/assistant-axis

The estimated energy consumption versus error rate is likely projected from agent tests and hidden-agent coverage.

You are correct that such a big number likely includes large errors itself, given that models change daily. =3
