Yes, any human will most likely recognize the result as random noise, since they know whom they're searching for and can see this is not a video from or about Benn. But an AI, taking all results as valid, will obviously struggle with this, condensing it into bullshit.
Thinking about it, it's probably not even a real hallucination in the usual AI sense, but simply poor evaluation and handling of data. Gemini is likely evaluating the new data on the spot, trusting it blindly; and without any humans preselecting and vetting the results, it fails hard. Which shows that no real thinking is happening, only rearrangement of the given words.
The fundamental problem is that AI has no ability to recognize data quality. You'll get something like the best answer to the question, but with no regard for the quality of that answer. Humans generally recognize when they're looking at red herrings; AIs don't.