> As long as there is a gap between AI and human learning, we do not have AGI.
Don't read the statement as a human dunk on LLMs, or even as philosophy.
The gap is important because of its special and devastating economic consequences. When the gap becomes truly zero, all human knowledge work is replaceable. From there, with robots, it's a short step to all work being replaceable.
What's worse, the condition is sufficient but not necessary. Just as planes can fly without flapping their wings, the economy can be destroyed without full AGI.
> The gap is important because of its special and devastating economic consequences. When the gap becomes truly zero, all human knowledge work is replaceable. From there, with robots, it's a short step to all work being replaceable.
I don't know why statements like this are taken as gospel. There are plenty of economic activities that do not disappear even if an AI can perform them.
Here’s one: I support certain artists because I care about their particular life story and have seen them perform live. I don’t care if an AI can replicate their music because the AI didn’t experience life.
Here's another: roles that depend on deep experience in certain industries and on valuable networks, or that derive power from the position itself. You could build a model that incorporates every single thing the US president, any president, ever said, and it still wouldn't put you in the position of being president. Many roles are contextual, not knowledge-based.
The idea that AGI replaces all work only makes sense in a world with completely open, free information access. I don't just mean in the obvious sense; I mean also "inside your head." AI can only use data it has access to, and it's never going to have access to everyone's individual brain everywhere at all times.
So here’s a better prediction: markets will gradually shift to adjust to this, information will become more secretive, and attention-based entertainment economics will become a larger and larger share of the overall economy.
If you’re concerned about the economic impact, then whether a model is AGI or not doesn’t matter. It really is more of a philosophical thing.
There's no "gap that becomes truly zero" at which point special consequences kick in. By the time we achieve AGI, lesser forms of AI will likely have already replaced a lot of human knowledge labor through the exact "brute-force" methods Chollet is trying to factor out (which is why many people say that factoring them out is unproductive).
AGI is like an event horizon: it does mean something, and it is a real point in space, but you don't notice yourself crossing it; the curvature increases smoothly all the way through.