Hacker News

empath75 today at 3:58 PM | 3 replies

I think a _single_ instance of an LLM hallucination should be enough to retract the whole paper and ban further submissions.


Replies

gcr today at 4:02 PM

Going through a retraction and blacklisting process is also a lot of work -- collecting evidence, giving authors a chance to respond, mediating discussion, etc.

Labor is the bottleneck. There aren't enough academics who volunteer to help organize conferences.

(If a reader of this comment is qualified to review papers and wants to step up to the plate and help do some work in this area, please email the program chairs of your favorite conference and let them know. They'll eagerly put you to work.)

andy99 today at 4:11 PM

   For example, authors may have given an LLM a partial description of a citation and asked the LLM to produce bibtex
This is equivalent to a typo. I’d like to know which “hallucinations” are completely made up, and which have a corresponding paper but contain some error in how it’s cited. I don’t think the latter matters much.
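
One rough way to make that distinction concrete is to check each suspect reference against a bibliographic index and see whether any real paper has a closely matching title. The sketch below is only illustrative, not anything described in the thread or the paper: it assumes the public Crossref REST API and the `requests` library, and the function name and the 0.85 similarity threshold are made up for the example.

```python
# Hypothetical sketch: classify a suspect citation as "possibly fabricated"
# vs. "real paper, details possibly mangled" by checking its title against
# the Crossref works API. Names and threshold are illustrative only.
import difflib
import requests

def classify_citation(title: str, threshold: float = 0.85) -> str:
    """Return a rough verdict based on the closest title match on Crossref."""
    resp = requests.get(
        "https://api.crossref.org/works",
        params={"query.bibliographic": title, "rows": 3},
        timeout=10,
    )
    resp.raise_for_status()
    items = resp.json()["message"]["items"]
    for item in items:
        for candidate in item.get("title", []):
            # Fuzzy-compare the cited title to each candidate title.
            ratio = difflib.SequenceMatcher(
                None, title.lower(), candidate.lower()
            ).ratio()
            if ratio >= threshold:
                return "likely real (check venue/year/authors for errors)"
    return "possibly fabricated (no close title match found)"

if __name__ == "__main__":
    print(classify_citation("Attention Is All You Need"))
```

A check like this would only separate the two cases andy99 describes; a "likely real" hit could still have wrong authors, venue, or year, which is the typo-level error being discussed.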
wing-_-nuts today at 4:08 PM

I dunno about banning them; humans without LLMs make mistakes all the time. But I would definitely place them under much closer scrutiny in the future.
