Hacker News

deepvibrations · yesterday at 2:58 PM

The law needs to stand up and make an example here, otherwise this will just continue and at some point a real disaster will occur due to AI.


Replies

koolba · yesterday at 3:04 PM

> The law needs to stand up and make an example here, otherwise this will just continue and at some point a real disaster will occur due to AI.

What does it mean to “make an example”?

I’m for cleaning up AI slop as much as the next natural-born meat bag, but I also detest a litigious society. The types of legal action that would stop this in the future would immediately be weaponized.

GuB-42 · yesterday at 4:30 PM

On what grounds?

Being wrong is usually not a punishable offence. It could be considered defamation, but defamation usually requires intent, which is clearly absent here. And most AIs carry disclaimers saying that they may be wrong, and hallucinations are pretty common knowledge at this point.

What could be asked is that the person in question be able to publish a correction; a right of reply is actually a legal requirement in France, and probably elsewhere too. But from the article, it looks like Gemini has already picked up the story and corrected itself.

If hallucinations were made illegal, you might as well make LLMs illegal, which some may see as a good thing, but it is not going to happen. Maybe legislators could instead mandate an official way to report wrongful information about oneself and have it filtered out, as I think is already the case for search engines. I think it is technically feasible.
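The report-and-filter idea above can be sketched minimally: keep a registry of reported-false claims about named individuals and screen model output against it before returning an answer. Everything here (the `TakedownEntry` structure, the registry, substring matching) is a hypothetical illustration of the mechanism, not any real API or deployed system.

```python
from dataclasses import dataclass

@dataclass
class TakedownEntry:
    person: str       # name the report concerns
    false_claim: str  # claim that was reported as false

# Hypothetical registry populated via an official reporting channel.
REGISTRY = [
    TakedownEntry(person="Jane Doe", false_claim="convicted of fraud"),
]

def filter_output(text: str) -> str:
    """Withhold output that repeats a reported-false claim about a person."""
    lowered = text.lower()
    for entry in REGISTRY:
        if (entry.person.lower() in lowered
                and entry.false_claim.lower() in lowered):
            return (f"[Withheld: a statement about {entry.person} "
                    f"matched a claim reported as false.]")
    return text

print(filter_output("Jane Doe was convicted of fraud in 2020."))  # withheld
print(filter_output("Jane Doe won an award."))                    # passes through
```

A production version would need fuzzier matching than substrings (paraphrases, pronouns), which is where the real engineering difficulty lies; the basic post-generation gate, though, is straightforward.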
