Hacker News

glenstein · yesterday at 3:13 PM

That raises a fascinating point: whether search results that default to general topics are ever the basis for LLM training or information retrieval, as a general phenomenon.


Replies

slightwinder · yesterday at 4:21 PM

Yes, any human will most likely recognize the result as random noise, as they will know whom they are searching for and see that this is not a video from or about Benn. But AI, taking all results as valid, will obviously struggle with this, condensing it into bullshit.

Thinking about it, it's probably not even a real hallucination in the usual AI sense, but simply poor evaluation and handling of data. Gemini is likely evaluating the new data on the spot, trusting it blindly; and without any humans preselecting and curating the results, it fails hard. Which shows that there is no real thinking happening, only rearrangement of the given words.
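
A minimal sketch of that failure mode, assuming a naive retrieval pipeline (all names and functions here are hypothetical illustrations, not Gemini's actual implementation): if every search hit is concatenated into the prompt, the model summarizes noise as confidently as signal, whereas even a crude relevance gate, which is roughly what a human does instinctively, changes the outcome.

```python
# Hypothetical sketch of naive vs. filtered retrieval-augmented answering.
# This does not reflect any real product's pipeline; it only illustrates
# the failure mode of treating every search hit as valid evidence.

from dataclasses import dataclass

@dataclass
class SearchHit:
    title: str
    snippet: str

def naive_context(hits: list[SearchHit]) -> str:
    # Naive approach: feed every hit into the prompt, so the model
    # "condenses" irrelevant results right alongside relevant ones.
    return "\n".join(f"{h.title}: {h.snippet}" for h in hits)

def filtered_context(query: str, hits: list[SearchHit]) -> str:
    # Crude relevance gate: keep only hits that actually mention a
    # query term. Real systems would use embeddings or rerankers,
    # but even this keyword check discards the obvious noise.
    terms = {t.lower() for t in query.split()}
    kept = [h for h in hits
            if terms & set((h.title + " " + h.snippet).lower().split())]
    return "\n".join(f"{h.title}: {h.snippet}" for h in kept) \
        or "NO RELEVANT RESULTS"

hits = [
    SearchHit("Benn Jordan studio tour", "Benn walks through his synth rack."),
    SearchHit("General music topics", "A list of popular genres."),  # noise
]

print(naive_context(hits))                   # includes the noise
print(filtered_context("Benn Jordan", hits)) # drops it
```

The point is not the filter itself: the naive path produces fluent prose either way, which then reads like hallucination even though it is really just unfiltered input.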

reactordev · yesterday at 3:44 PM

I think the answer is clear