
gertop | today at 6:08 AM | 0 replies

I've not heard many people claim that LLMs don't hallucinate; however, I have seen people (whom I previously believed to be smart):

1. Believe LLMs outright, even knowing they are frequently wrong

2. Claim that LLMs making shit up is caused by the user not prompting them correctly. I suppose in the same way that C is memory safe and only bad programmers make it not so.