
trjordan · yesterday at 3:53 PM · 3 replies

This is almost certainly what happened. Google's AI answers aren't magic -- they're just summarizing across searches. In this case, "Israel" + "Jordan" pulled back a video expressing views opposite to the author's.

It's somewhat less obvious to debug, because it pulls in more context than Google wants to show in the UI. You can see this happening in AI mode, where it'll fire half a dozen searches and aggregate snippets from 100+ sites before writing its summary.
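For the curious, the failure mode is easy to reproduce in miniature. A toy sketch (pure illustration, not Google's actual pipeline -- the snippet data and the majority-vote "summarizer" are invented stand-ins):

    # Toy fan-out-and-summarize: pool snippets from several searches,
    # then let a stand-in "summarizer" pick the majority stance.
    from collections import Counter

    # Invented snippets, as if returned by searches for "Benn Jordan" + "Israel".
    SNIPPETS = [
        ("fan-upload",   "I was wrong about Israel"),             # mismatched video
        ("artist-blog",  "Benn Jordan criticizes Israel policy"),
        ("fake-account", "Benn Jordan posts pro-Israel content"), # impersonator
    ]

    def summarize(snippets):
        # Nothing here checks that a source actually belongs to the person
        # being asked about -- one impersonator or one misattributed video
        # is enough to flip the aggregate.
        stances = Counter(
            "pro" if ("wrong about" in text or "pro-Israel" in text) else "anti"
            for _source, text in snippets
        )
        return stances.most_common(1)[0][0]

    print(summarize(SNIPPETS))  # -> "pro", despite the subject's stated views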


Replies

sigmoid10 · yesterday at 6:29 PM

There is actually a musician called Benn Jordan who was impersonated on Twitter by an account posting pro-Israel content [1]. That content is no longer available, but it might have snuck into the training data, i.e. Benn Jordan = pro-Israel. That association might also have been set against the other Jordan's previous pro-Palestine comments, eventually misattributing the "I was wrong about Israel" video. It's still a clear fuckup -- but I could see humans doing something similar when sloppily accruing information.

[1] https://www.webpronews.com/musician-benn-jordan-exposes-fake...

ludicrousdispla · yesterday at 4:44 PM

Interesting, I wonder what Google AI has to say about Stove Top Stuffing, given its association with Turkey.

underdeserver · yesterday at 4:40 PM

Ironic that Google enshittifying its search results is hurting what it hopes is its next cash cow, AI.
