
wpietri · today at 1:38 PM

I think it's better to say that LLMs only hallucinate. All the text they produce is entirely unverified; humans are the ones reading the text and constructing meaning from it.