Hacker News

zinodaur · today at 3:20 AM

Sibling comment already said it, but yes, I was specifically alluding to Altman's decision to allow the US government to use their AI to choose bombing targets without a human in the loop. Perhaps this is why the US government double-tapped[1] a school, killing 160 girls, all younger than 12, when the school was clearly marked on Google Maps.

I also vigorously dislike the industry, but your stance that you're "on the skeptic side of 'AI'" is something you need to reexamine. Saying this in the friendliest way possible: you are wrong.

AI needs to be opposed, because the billionaires are going to use it to turn the world into shit. But if the best the AI opposition can muster is "AI isn't useful", we are fucked. It's extremely powerful and can do bizarro things when you rig it up with tools, the kinds of things we need to prevent companies like Google from doing with it, and no one is paying attention to that.

[1] double-tapped: a phrase referring to the practice of firing a second missile after the first in order to kill any rescuers or surviving schoolgirls


Replies

imiric · today at 6:59 AM

Regardless, "AI" is not doing the killing in that case. Rather, humans have deployed it to control weapons that kill people. There are several layers of indirection there before you can claim "AI kills people". It's the same indirection as when a human chooses to press a button that fires a missile, or to stab someone, just with more steps involved.

So you can also be outraged at weapon manufacturers, which is one step closer. Or you can skip the indirection entirely and be outraged specifically at the people in charge of using this technology, which is my point.

I'm disgusted by this industry as much as you are, believe me. But blaming the companies that produce "AI" for people dying is misplaced. They're certainly part of the problem, but not the root cause.

> AI needs to be opposed

AI doesn't exist. It is a marketing term used by grifters to sell their snake oil.

But even if it did, it's silly to claim that any technology needs to be opposed. This one is potentially more problematic than others because it raises some difficult existential and social questions which we might not be ready to answer, but it's still ultimately on us to control how it's used. We've somehow managed this for nuclear weapons, which can literally obliterate civilization at the press of a button, so a probabilistic pattern generator seems trivial in comparison. It's going to be bumpy, but I think we'll manage.