Hacker News

wing-_-nuts · today at 4:08 PM

I dunno about banning them; humans without LLMs make mistakes all the time. But I would definitely place them under much harder scrutiny in the future.


Replies

pessimizer · today at 4:17 PM

Hallucinations aren't mistakes, they're fabrications. The two are probably referred to by the same word in some languages.

Institutions can take whatever approach to mistakes they like; maybe they tolerate a lot of them because they want to take risks and stay on the bleeding edge. But any flexible attitude toward fabrications is simply corruption. The connected in-crowd will get mercy and the outgroup will get the hammer, and anybody criticizing the differential treatment will be accused of supporting the outgroup fraudsters.
