Hacker News

alexwebb2 · today at 6:06 PM

> The idea that hallucinations are somehow less likely because you're asking meta-questions about LLM output is completely without basis

Not sure who you're replying to here – this is not a claim I made.


Replies

mpalmer · today at 6:28 PM

That's fair, but I'm not sure why you chose to address only the one part of my comment that wasn't responsive to your points.