
hhh · last Thursday at 9:34 AM

> ChatGPT has this issue where, when it doesn't know the explanation for something, it often won't hallucinate outright, but will create some long-winded, confusing word salad that sounds like it could be right, but you can't quite tell.

This is just hallucinating.