Unfortunately it's easier to train an AI to be convincing than to be correct, so it can look insightful before it's true.
Like horoscopes, except they're not actually that bad: roll a D20, and on a set of numbers known only to the DM (one that varies with the domain and the length of the task) you get a textbook answer; on the rest you get convincing nonsense.
> Unfortunately it's easier to train an AI to be convincing than to be correct, so it can look insightful before it's true.
This nails it. This is the fundamental problem with relying on AI-generated material: you are outsourcing your thinking, and the response is likely to look very correct without any actual logic or connection to truth.