Hacker News

baq · last Sunday at 10:23 AM

It should have been called confabulation; hallucination is not the correct analog. Tech bros simply used the first word they thought of, and it unfortunately stuck.


Replies

K0balt · last Sunday at 12:28 PM

"Undesirable output" might be more accurate, since there is absolutely no difference in the process that creates a useful output versus a "hallucination", other than the utility of the resulting data.
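A minimal sketch of the point, with a toy vocabulary and made-up probabilities (everything here is assumed, not from any real model): the sampling loop that yields a correct answer is exactly the same one that yields a wrong answer; "useful" versus "hallucinated" is a judgment applied to the result afterwards, not a branch inside the process.

```python
import random

# Toy next-token sampler: the model only ever sees a probability
# distribution over tokens -- there is no separate "factual" vs
# "hallucinated" code path inside the sampling step.
def sample_next(distribution, rng):
    tokens = list(distribution)
    weights = [distribution[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

# Hypothetical distribution for completing "The capital of Australia
# is ..." -- the same loop can emit the true answer or the wrong one.
dist = {"Canberra": 0.6, "Sydney": 0.4}  # assumed toy values

rng = random.Random(0)
samples = [sample_next(dist, rng) for _ in range(1000)]
print(samples.count("Canberra"), samples.count("Sydney"))
```

Both continuations come out of the identical code path; only an external check against the world can label one of them a "hallucination".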

I had a partially formed insight along these lines: LLMs exist in a latent space of information that has very little external grounding. A sort of dreamspace. I wonder if embodying them in robots will anchor them to some kind of ground-truth source?