Hacker News

slightwinder · yesterday at 3:09 PM

Searching for "benn jordan isreal", the first result for me is a video[0] from a different creator, with the exact same title and date. There is no mention of "Benn" in the video, but there are a few mentions of Jordan (the country). So maybe this was enough for Google to hallucinate a connection. Highly concerning!

[0] https://www.youtube.com/watch?v=qgUzVZiint0


Replies

trjordan · yesterday at 3:53 PM

This is almost certainly what happened. Google's AI answers aren't magic -- they're just summarizing across searches. In this case, "Israel" + "Jordan" pulled back a video with views opposite to the author's.

It's also harder to debug, because it pulls in more context than Google shows in the UI. You can see this happening in AI mode, where it'll fire half a dozen searches and aggregate snippets from 100+ sites before writing its summary.
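A minimal sketch of the fan-out-and-aggregate pattern described above, using an invented in-memory index (all names and data here are hypothetical, not Google's actual pipeline). The point it illustrates: once snippets from separate searches are pooled, the summarizer has no signal about which source actually relates to the original query.

```python
# Hypothetical sketch of a multi-query "AI mode" pipeline: fan out one
# search per term, pool every snippet, then "summarize" the pool.
# INDEX is a stub corpus with invented entries, not real search data.

INDEX = {
    "israel": ["Commentary video mentioning Israel"],
    "jordan": ["Travel vlog about Jordan (the country)"],
}

def fan_out_search(terms):
    """Run one search per term and pool all snippets, keeping no record
    of which term produced which snippet."""
    snippets = []
    for term in terms:
        snippets.extend(INDEX.get(term.lower(), []))
    return snippets

def summarize(snippets):
    """Toy summary: blindly joins whatever the pool contains. Nothing
    here checks that the snippets concern the same person or topic --
    the failure mode in the comment above."""
    return " | ".join(snippets)

# "Benn" matches nothing, yet a summary is still confidently assembled
# from what the other two terms pulled back.
pooled = fan_out_search(["Benn", "Israel", "Jordan"])
print(summarize(pooled))
```

The design point: because provenance is dropped at the pooling step, a snippet about Jordan the country is indistinguishable from one about Benn Jordan by the time the summarizer runs.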

glenstein · yesterday at 3:13 PM

That raises a fascinating point: whether search results that default to general topics are ever the basis for LLM training or information retrieval as a general phenomenon.

bdhcuidbebe · yesterday at 7:45 PM

Just wait until you realize how AI translation "works".

It's literally bending languages into American English with different words.