Searching for "benn jordan isreal", the first result for me is a video[0] from a different creator with the exact same title and date. There is no mention of "benn" in the video, but there are some mentions of Jordan (the country). So maybe that was enough for Google to hallucinate a connection. Highly concerning!
That raises a fascinating question: whether search results that fall back to a more general topic ever end up as the basis for LLM training or retrieval, as a general phenomenon.
Just wait until you realize how AI translation "works".
It's literally bending other languages into American English, just with different words.
This is almost certainly what happened. Google's AI answers aren't magic -- they're just summarizing across searches. In this case, "Israel" + "Jordan" pulled back a video with views opposite to the author's.
It's somewhat less obvious to debug, because it pulls in more context than Google shows in the UI. You can see this happening in AI Mode, where it fires half a dozen searches and aggregates snippets from 100+ sites before writing its summary.
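To make that concrete, here's a toy Python sketch of the fan-out-and-summarize pattern described above. Every function in it (related_queries, web_search, llm_summarize) is a made-up stub, not Google's actual internals; the point is just the shape of the pipeline: several searches, one pooled set of snippets, one summary.

```python
from typing import Dict, List

# Toy sketch of "fan out searches, aggregate snippets, summarize".
# All functions are hypothetical stand-ins, not any real search API.

def related_queries(query: str) -> List[str]:
    # Stub for query expansion: real systems rewrite the user's query
    # into several related searches.
    return [" ".join(sorted(set(query.split())))]

def web_search(query: str, top_k: int) -> List[Dict[str, str]]:
    # Stub for a search backend returning title/snippet pairs.
    return [{"title": f"result {i} for {query!r}", "snippet": "..."}
            for i in range(top_k)]

def llm_summarize(question: str, snippets: List[str]) -> str:
    # Stub for the single LLM call that writes one answer over
    # everything the searches pulled back.
    return f"Summary of {len(snippets)} snippets for {question!r}"

def answer_query(user_query: str) -> str:
    # 1. Fan out: the original query plus several rewrites.
    queries = [user_query] + related_queries(user_query)

    # 2. Aggregate: each search contributes snippets from many pages,
    #    far more context than the UI ever shows.
    snippets: List[str] = []
    for q in queries[:6]:  # "half a dozen searches"
        snippets += [r["snippet"] for r in web_search(q, top_k=20)]

    # 3. Summarize once over the whole pool. An off-topic page that
    #    matched only part of the query (e.g. Jordan the country)
    #    gets blended into the final answer.
    return llm_summarize(user_query, snippets)

print(answer_query("benn jordan israel"))
```

Nothing in the pooled snippets is labeled by which sub-query fetched it, which is exactly why a partial match can contaminate the summary without any visible trace in the UI.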