Hacker News

anon_anon12 · today at 12:55 PM · 2 replies

People's trust in LLMs imo stems from a lack of awareness that AI hallucinates. Hallucination benchmarks are often hidden or mentioned only hastily in marketing videos.


Replies

wpietri · today at 1:38 PM

I think it's better to say that LLMs only hallucinate. All the text they produce is entirely unverified. Humans are the ones reading the text and constructing meaning.

cess11 · today at 1:00 PM

[flagged]
