>his whole job
Possibly akin to a roofer taking a shortcut up on the roof and then taking a spill? You knew better, but unfortunately let the fact that you could probably get away with it with zero consequences decide for you.
IIRC the hallucinations were essentially kicked off by user error in the first place. Or rather, let's say at least this much: a journalist using the best available technology should have been able to reduce the chance of an issue this big to near zero, even with language models in the loop and without human review.
(e.g. imagine Karpathy’s llm-council with extra harnessing/scripting, so even MORE expensive, but still. Or some RegEx!)
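To make the regex point concrete: a minimal sketch (hypothetical helper, not anything the reporter actually ran) of a sanity check that pulls quoted spans out of a draft and flags any that don't appear verbatim in the source transcript:

```python
import re


def find_unverified_quotes(draft: str, source: str) -> list[str]:
    """Return quoted spans from the draft that do not appear verbatim in the source.

    Hypothetical sketch: extract double-quoted passages from the draft,
    then check each against the source with whitespace/case normalized.
    A non-empty result means at least one quote needs human review.
    """
    def normalize(text: str) -> str:
        # Collapse whitespace and lowercase so trivial formatting
        # differences don't cause false alarms.
        return re.sub(r"\s+", " ", text).strip().lower()

    normalized_source = normalize(source)
    quotes = re.findall(r'"([^"]+)"', draft)
    return [q for q in quotes if normalize(q) not in normalized_source]


# Example: one real quote, one fabricated quote.
source = 'The mayor said the budget "will be balanced by next year".'
draft = ('She promised the budget "will be balanced by next year" '
         'and called critics "enemies of progress".')
print(find_unverified_quotes(draft, source))
```

It's crude (no paraphrase handling, no smart quotes), but even this level of checking would catch fully invented quotes before publication.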
Alternatively… there was no AI error at all: the reporter made up the quotes and then lied when challenged.