
madeofpalk · yesterday at 5:19 PM

It’s apt, because the only thing LLMs do is hallucinate; they have no grounding in reality. They take your input and hallucinate something “useful” from it.