Hacker News

p-e-w · today at 8:32 AM · 1 reply

That paper’s abstract doesn’t carry its title, to put it mildly.


Replies

coldtea · today at 10:10 AM

What part of "Specifically, we define a formal world where hallucination is defined as inconsistencies between a computable LLM and a computable ground truth function. By employing results from learning theory, we show that LLMs cannot learn all the computable functions and will therefore inevitably hallucinate if used as general problem solvers." doesn't carry the title, to ask mildly?
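
For what it's worth, the claim the quoted abstract gestures at is a standard diagonalization. A rough sketch in my own notation (not the paper's formal setup): treat the LLM as a total computable function h from inputs to outputs, and take hallucination to mean h(x) ≠ f(x) for some ground-truth function f. Then define

f(x) = \begin{cases} 1 & \text{if } h(x) = 0 \\ 0 & \text{otherwise} \end{cases}

Since h is computable, so is f, yet h(x) ≠ f(x) on every input, so h hallucinates with respect to the computable ground truth f. In other words, no single computable model is consistent with every computable ground-truth function, which is the "cannot learn all the computable functions" step; the paper's actual definitions are more careful than this toy version.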
