Hacker News

AndrewKemendo · yesterday at 1:19 PM · 3 replies

> I've seen an interesting behavior in India. If I ask someone on the street for directions, they will always give me an answer, even if they don't know. If they don't know, they'll make something up.

Isn’t this the precise failure pattern that everybody shits on LLMs for?


Replies

chrisjj · yesterday at 4:43 PM

Only on the surface. The difference is that the LLM doesn't know it doesn't know. An LLM provides the best solution it has, regardless of whether that solution is in any way fit for purpose.

koliber · yesterday at 2:06 PM

Yes.

melvinmelih · yesterday at 3:15 PM

--
