Hacker News

garyfirestorm · yesterday at 2:37 PM

It’s a tool. Used incorrectly, it will lead to errors. Just like a hammer, which, used incorrectly, could hit the user's finger.


Replies

happytoexplain · yesterday at 2:42 PM

There is enormous variability in how hard a tool is to use correctly, how likely it is to go wrong, and how severe the consequences are. AI has a wide range on all those variables because its use cases vary so widely compared to a hammer.

The use case here is police facial recognition. Not hitting nails. The parent wasn't saying "AI is a liability" with no context.

tgv · yesterday at 3:08 PM

This tool, however, is specifically built for mass surveillance. It serves no other purpose. The tool is broken, and everybody knows it. The tool makers are at least as guilty as those who use it.

jqpabc123 · yesterday at 3:02 PM

"Used incorrectly will lead to errors."

Only one small problem --- there is no way to tell if you are using it "correctly".

The only way to be sure is to not use it.

Using it basically boils down to, "Do you feel lucky?".

The Fargo police didn't get lucky in this case. And now the liability kicks in.

MattDaEskimo · yesterday at 3:20 PM

What kind of outcome results from misuse? Clearly a hammer's misuse has very little in common with a global, hivemind network deployed in high-stakes campaigns.

Now, if I misused a hammer and it hurt everyone's thumb in my country, then maybe what you said would have some merit.

Otherwise, I'd say it's an extremely lazy argument.

skeeter2020 · yesterday at 3:00 PM

AI feels closer to a firearm than a hammer when assessing law enforcement's ability to quickly do massive, unrecoverable harm.

suzzer99 · yesterday at 2:54 PM

Dynamite is a tool. But we don't hand it out to anyone who wants to play with it.

hrimfaxi · yesterday at 5:43 PM

Unlike with hammers, people preface things with "Claude says", etc. I never see that kind of distancing with tools that aren't AI.