Hacker News

simonh · today at 2:33 PM · 0 replies

Humans understand what mistakes are and can reason about what constitutes a mistake and what doesn’t. LLMs can’t do that.

It’s for the same reason that they will invent bullshit instead of saying “I don’t know” when they don’t know: they have no concept of factual accuracy.