I don't think LLMs will produce AGI, just based on how context windows and the prompt cycle work. LLMs aren't out there thinking about stuff in their spare time. The way they appear to have thoughts and a psyche is purely an illusion.
> LLMs aren't out there thinking about stuff in their spare time.
Agentic setups change the calculus: an agent running in a loop keeps working between user prompts.
It doesn't have to produce AGI to still ruin the lives of millions of people. Our society isn't ready for that kind of shock. We can't all be Instagram influencers.
Something I often think about is how we can barely define what AGI, consciousness, etc. even are. We may be pretty sure that what we have currently is an illusion, but at what point is the illusion good enough that it no longer matters? Especially with regard to my first question.
It's hard to say it's not X when we can't really define X.