Hacker News

Humorist2290 · yesterday at 7:48 PM

> Again, we have moved past hallucinations and errors to more subtle, and often human-like, concerns.

In my experience we just get both: the constant risk of a catastrophic hallucination buried somewhere in the output, on top of the more subtle, pervasive concerns. I haven't tried with Gemini 3, but when I prompted Claude to write a 20-page short story it couldn't even keep basic chronology and characters straight. I wonder whether the 14-page research paper would stand up to scrutiny.


Replies

acters · yesterday at 8:03 PM

I feel like hallucinations have changed over time: from factual errors randomly shoehorned into the middle of sentences, to LLMs confidently telling you they are right and even providing their own reasoning to back up their claims — reasoning that, most of the time, rests on references that don't exist.
