I believe his argument is that now that you've defined the limitation, it's a ceiling that will likely be cracked in the relatively near future.
Well, hallucinations have been identified as an issue since the inception of LLMs, yet they remain unsolved, so that prediction doesn't appear to hold.