Hacker News

kittikitti · today at 3:14 PM

"LlMs HAlLuCinATE"

Stop this. This is such a stupid way of describing mistakes from AI. Please try to use the confusion matrix or any other framework. If you're going to make arguments, it's hard to take them seriously if you keep regurgitating that LLMs hallucinate. It's not a well-defined term, so if you continually make this your core argument, it becomes disingenuous.
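
A minimal sketch of what the suggested confusion-matrix framing could look like, in Python. The evaluation data, labels, and thresholds are entirely made up for illustration; the point is just that errors get split into distinct, countable categories instead of one vague "hallucination" bucket:

    # Illustrative only: binning an LLM's factual claims into a 2x2 confusion matrix.
    # Each item: (model_asserted_claim, claim_actually_true) -- hypothetical data.
    evaluations = [
        (True, True),    # asserts a fact, fact is correct   -> true positive
        (True, False),   # asserts a fact, fact is wrong     -> false positive (the usual "hallucination")
        (False, True),   # omits or refuses a true fact      -> false negative
        (False, False),  # correctly declines to assert      -> true negative
        (True, True),
        (True, False),
    ]

    tp = sum(1 for asserted, true in evaluations if asserted and true)
    fp = sum(1 for asserted, true in evaluations if asserted and not true)
    fn = sum(1 for asserted, true in evaluations if not asserted and true)
    tn = sum(1 for asserted, true in evaluations if not asserted and not true)

    print(f"TP={tp} FP={fp} FN={fn} TN={tn}")
    print(f"precision={tp / (tp + fp):.2f}  recall={tp / (tp + fn):.2f}")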


Replies

dgxyz · today at 3:17 PM

How about "expected poor ratio of corn to shit"?