Hacker News

coldtea · today at 10:10 AM · 2 replies

What part of "Specifically, we define a formal world where hallucination is defined as inconsistencies between a computable LLM and a computable ground truth function. By employing results from learning theory, we show that LLMs cannot learn all the computable functions and will therefore inevitably hallucinate if used as general problem solvers." doesn't carry the title, to ask mildly?
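
For context, a minimal formal restatement of the quoted definition and argument; the symbols h, f, and s are my own shorthand, not notation taken from the paper, and the claim is a paraphrase of the abstract rather than the paper's exact theorem.

% An LLM is modeled as a computable function h : S -> S on strings,
% and the ground truth as another computable function f : S -> S.

\newtheorem{definition}{Definition}
\newtheorem{claim}{Claim}

\begin{definition}[Hallucination]
A computable LLM $h$ hallucinates with respect to a computable ground truth
$f$ if there exists an input $s \in S$ with $h(s) \neq f(s)$.
\end{definition}

\begin{claim}[Paraphrase of the quoted argument]
For any computably enumerable family of LLMs $h_1, h_2, \dots$ there is a
computable ground truth $f$, e.g.\ a diagonal function chosen so that
$f(s_i) \neq h_i(s_i)$ for suitable inputs $s_i$, on which every $h_i$
hallucinates; hence no LLM in the family can learn all computable functions.
\end{claim}
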


Replies

p-e-w · today at 11:50 AM

I don’t agree with that definition of “hallucination”, for starters.

gus_massa · today at 12:59 PM

[dead]