Hacker News

dkdbejwi383 · 05/15/2025 · 3 replies

How would an LLM “know” when it isn’t sure? Their baseline for truth is competent-sounding text; they don’t have a baseline for truth grounded in observed reality. That’s why they can be “tricked” into asserting things like “Mr Bean is the president of the USA”
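
One rough way to see the “competent text” point in code (a sketch only, assuming GPT-2 via Hugging Face transformers because it is small and public): score sentences by their average token log-probability. The score measures fluency, not facts, so a fluent falsehood will typically outscore a garbled truth.

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2").eval()

    def avg_logprob(text: str) -> float:
        # Mean log P(token | prefix): a crude "how competent is this text" score.
        ids = tokenizer(text, return_tensors="pt").input_ids
        with torch.no_grad():
            logits = model(ids).logits
        logp = torch.log_softmax(logits[:, :-1], dim=-1)
        return logp.gather(2, ids[:, 1:].unsqueeze(-1)).mean().item()

    print(avg_logprob("Mr Bean is the president of the USA."))   # fluent falsehood
    print(avg_logprob("USA of president the is Joe Biden now."))  # garbled truth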


Replies

JustFinishedBSG · 05/15/2025

It would "know" the same way it "knows" anything else: The probability of the sequence "I don't know" would be higher than the probability of any other sequence.

ben_w · 05/15/2025

The answer is the same as how the messy bag of chemistry that is the human brain "knows" when it isn't sure:

Badly, and with great difficulty, so while it can just about be done, even then only kinda.

saberience · 05/15/2025

Humans can just as easily be tricked. Something like 25% of the American electorate believed Obama was the antichrist.

So saying LLMs have no “baseline for truth” doesn’t really mean much one way or the other; they are much smarter and more accurate than 99% of humans.