Hacker News

billypilgrim · yesterday at 9:52 PM

I must say I expected an actual poisoning of the data used to train the LLM and was excited, but the examples indicate that the LLM just searched the web and reported what it found? When you create a website with fake information and search Google for that information, it will of course bring up your site, not because it’s factually correct but because it’s related to what you searched for. What am I missing?


Replies

rincebrain · yesterday at 10:34 PM

The part where lots of people have historically trusted LLM responses without verification, rather than sorting through the dross in Google or Bing search results, is, I think, the point.