Hacker News

troupo · today at 1:55 PM

It's the "guns don't kill people" equivalent for AIs.

---

Before the pitchforks and downvotes:

- yes, it's a deliberate simplification

- yes, the issue is complex because you can also argue that you can't blame authors of encyclopedias and chemistry books for bombs and poisons, so why would we blame providers of LLMs

- and no, this bill is only being introduced to cover everyone's asses when, not if, LLM use results in large-scale issues.


Replies

pjc50 · today at 2:23 PM

Quite an appropriate analogy: gun manufacturers were sued for their responsibility in US mass shootings. They won, so the mass shootings continue.

Topfi · today at 2:06 PM

In fairness, a well-designed and tested weapon can at least be expected to perform reliably and consistently every time. We also understand deeply how weapons work and, when something happens, can easily investigate whether it was user error, a defect, or a design issue. LLMs, not so much.

thegrimmest · today at 2:15 PM

This dodges the moral argument behind "guns don't kill people", which is worth confronting directly. I think people can reasonably disagree about whether second/third/fourth/etc. order effects carry moral/legal responsibility.

In light of such disagreement, and given the lack of any higher authority among free, equal people to arbitrate it, the only reasonable way to coexist peacefully is to avoid imposing your ideas on others. This is the foundation of a liberal society.
