Hacker News

GuB-42 · yesterday at 4:30 PM

On what grounds?

Being wrong is usually not a punishable offence. It could be considered defamation, but defamation usually has to be intentional, which is clearly not the case here. And most AIs carry disclaimers saying that their output may be wrong, and hallucinations are pretty common knowledge at this point.

What could be asked is for the person in question to be able to get a correction; this is actually a legal requirement in France, and probably elsewhere too. But from the article, it looks like Gemini already picked up the story and corrected itself.

If hallucinations were made illegal, you might as well make LLMs illegal, which some may see as a good thing, but it is not going to happen. Maybe legislators could mandate an official way to report wrongful information about oneself and have it filtered out, as is already the case for search engines. I think it is technically feasible.
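The report-and-filter idea is at least mechanically simple. A minimal sketch, assuming a registry of reported claims that a provider checks responses against before returning them (all names, the registry format, and the refusal wording here are invented for illustration):

```python
# Hypothetical post-generation filter: suppress responses that mention a
# subject who has filed a correction request, analogous to search-engine
# delisting. The registry entries below are illustrative, not real.

REPORTED_SUBJECTS = {
    # subject name -> reason the claim was reported
    "jane doe": "fabricated criminal allegation, reported 2024-11",
}

def filter_response(text: str) -> str:
    """Return a refusal instead of the response if it names a reported subject."""
    lowered = text.lower()
    for name in REPORTED_SUBJECTS:
        if name in lowered:
            return ("I can't repeat claims about this person; "
                    "a correction request is on file.")
    return text
```

A real system would need fuzzy name matching and abuse prevention (or anyone could delist themselves from all coverage), which is where the hard policy questions live, but the plumbing itself is ordinary.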


Replies

delecti · yesterday at 5:36 PM

Defamation does not have to be intentional, it can also be a statement made with reckless disregard for whether it's true or not. That's a pretty solid description of LLM hallucinations.

Sophira · yesterday at 5:22 PM

> it looks like Gemini already picked up the story and corrected itself.

Not completely. According to later posts, the AI is now saying that he denied the fabricated story in November 2024[0], when in reality, we're seeing it as it happens.

[0] https://bsky.app/profile/bennjordan.bsky.social/post/3lxprqq...

Retr0id · yesterday at 4:56 PM

Google's disclaimers clearly aren't cutting it, and "correcting" it isn't really possible if it's a dynamic response to each query.

I don't think you can make yourself immune to slander by prefixing all statements with "this might not be true, but".

jedimastertyesterday at 5:39 PM

> It could be considered defamation, but defamation is usually required to be intentional

That's not true in the US; it's only required that the statements harm the individual in question and are provably false, both of which are pretty clear here.

jedimastertyesterday at 5:41 PM

> If hallucinations were made illegal, you might as well make LLMs illegal

No, the ask here is that companies be liable for the harm that their services cause.

eth0up · yesterday at 5:07 PM

"if hallucinations were made illegal..."

I was just yesterday brooding over the many layers of plausible deniability, clerical error, etc. that protect the company that recently flagged me as a fraud threat despite no such precedent on my part. The black box of bullshit metrics, coupled undoubtedly with AI, is pretty well immune. I can demand a review from the analysis company, complain to the State Attorney General, and maybe the FTC and CCPA equivalents, but I'm unsure what else.

As for outlawing, I'll present an (admittedly suboptimal) Taser analogy: Tasers are legal weapons in many jurisdictions, or at least not outlawed; however, it is illegal to use them indiscriminately against anyone attempting a transaction or a job application.

AI seems pretty easily far more dangerous than a battery with projectile talons. Abusing it should be outlawed. Threatening or bullying people with it should be too. Pointing a Taser at the seat of a job application booth connected to an automated firing system should probably be discouraged. And most people would much rather take a brief jolt, piss themselves and be on with life than be indefinitely haunted by a reckless automated social credit steamroller.

pessimizer · yesterday at 9:22 PM

> defamation is usually required to be intentional

Is it? Or can it be just reckless, without any regard for the truth?

Can I create a slander AI that simply makes up stories about random individuals and publicizes them, not because I'm trying to hurt those people (I don't know them), but because I think it's funny and I don't care about people?

Is the only thing that determines my guilt or innocence when I hurt someone my private, unverifiable mental state? If so, doesn't that give carte blanche to selective enforcement?

I know for a fact this is true in some places, especially the UK (at least as of the last time I checked), where truth is not a defense. If you intend to hurt a quack doctor in the UK by publicizing the evidence that he is a quack, you can be convicted for consciously intending to destroy his fraudulent career, and owe him compensation.
