
simmerup last Monday at 3:47 PM

In my mind the Google result page is like a public space.

You wouldn't punish the person who owns the park if someone inside it breaks the law, as long as the owner makes a reasonable effort to help the law be followed. And Google does facilitate that: it lets you put in a request to take down slanderous material, and you can still go after the original slanderer if you like.

But in this case Google itself is putting out slanderous information it has created itself. So Google in my mind is left holding the bag.


Replies

gruez last Monday at 4:17 PM

>But in this case Google itself is putting out slanderous information it has created itself. So Google in my mind is left holding the bag.

Wouldn't this basically make any sort of AI-as-a-service untenable? Moreover, how would this apply to open-weights models? If I asked Llama whether someone was a pedophile, and it wrongly answered in the affirmative, could that person sue Meta? What if it's run through a third party like Cerebras? Are they on the hook? If not, can AI companies dodge responsibility simply by laundering their models through a third party?
