
operatingthetan · yesterday at 7:11 PM

I don't think LLMs will produce AGI, just based on how context windows, the prompt cycle, etc. work. LLMs aren't out there thinking about stuff in their spare time. The way they appear to have thoughts and a psyche is purely an illusion.


Replies

fooqux · yesterday at 7:17 PM

Something I often think about is how we can barely define what AGI, consciousness, etc. are. We may be pretty sure that what we have currently is an illusion, but at what point is the illusion good enough that it no longer matters?

It's hard to say something isn't X when we can't really define X.

andsoitis · yesterday at 7:25 PM

> LLMs aren't out there thinking about stuff in their spare time.

Agentic use changes the calculus.

booleandilemma · yesterday at 7:14 PM

It doesn't have to produce AGI to ruin the lives of millions of people. Our society isn't ready for that kind of shock. We can't all be Instagram influencers.
