Something I often think about is how we can barely define what AGI, consciousness, etc. are. We may be pretty sure that what we have currently is an illusion, but at what point does the illusion get good enough that it no longer matters? Especially with regard to my first question.
It's hard to say it's not X when we can't really define X.
I'm not saying we can't build it, but what we have right now certainly is not it. Right now, context is just a flat buffer of text. Surely the human mind's context resembles something more like a graph database. What if we could use a database for context?
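To make the idea concrete, here's a toy sketch of what graph-structured context could look like versus a flat text buffer: facts stored as (subject, relation, object) edges, with associative retrieval walking outward from a topic. All the names (`ContextGraph`, `related`, etc.) are hypothetical; this is an illustration of the shape of the idea, not a real system.

```python
# Toy sketch: context as a graph of facts instead of a flat text buffer.
# Everything here is hypothetical illustration, not an existing library.
from collections import defaultdict

class ContextGraph:
    """Store facts as (subject, relation, object) edges."""
    def __init__(self):
        self.edges = defaultdict(list)  # subject -> [(relation, object)]

    def add(self, subject, relation, obj):
        self.edges[subject].append((relation, obj))

    def related(self, subject, depth=2):
        """Walk outward from a subject, pulling in transitively linked
        facts -- loosely, the way associative recall surfaces memories."""
        seen, frontier, facts = {subject}, [subject], []
        for _ in range(depth):
            next_frontier = []
            for node in frontier:
                for rel, obj in self.edges.get(node, []):
                    facts.append((node, rel, obj))
                    if obj not in seen:
                        seen.add(obj)
                        next_frontier.append(obj)
            frontier = next_frontier
        return facts

g = ContextGraph()
g.add("Alice", "works_at", "Acme")
g.add("Acme", "located_in", "Denver")
g.add("Alice", "likes", "chess")

# Asking about Alice also pulls in Denver, via the Acme edge.
print(g.related("Alice"))
```

The point of the sketch is that a question about "Alice" retrieves facts two hops away, something a flat transcript only gets by luck of what happens to still fit in the window.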
I would personally argue that it's a lot easier to say something definitely isn't X, with confidence, than to say it definitely is. I definitely don't know what the surface of Jupiter looks like, but I can pretty confidently say it doesn't look like Kansas. I think the better it gets, the easier it will be to spot the shortcomings, because the gap between what it can do well and what it can't will widen. Anything the technology is fundamentally incapable of ever achieving will be made obvious by the fact that it will simply continue to not achieve it. We may not be able to easily define the totality of what exactly it needs to have to count as AGI, but the further it progresses, the easier it will be to point out individual things it's definitely missing.