What's more alarming isn't that AI is limited to existing domain data; it's that when people push it to deviate outside those known data points, it confidently hallucinates nonsense.
And many of today's publicly accessible platforms are designed to steer results away from nonsense back to... too much sense. "Hypernormal" is a good word for it. I've spent a lot of time prompt-yelling, "Please make something as weird as I'm imagining; stop veering back to what normal people want to see/read."
So much nonsense.
If I had a nickel for every AI-poisoned "researcher" I'd seen with a preprint full of buzzword salad like "quantum fractal holographic resonance matrix"... well, I wouldn't be rich, but I'd probably at least have enough to buy a coffee.