Hacker News

gwbas1c · today at 6:14 PM · 2 replies

I look at this as a case of "pick your battles."

In war, civilians can't audit every move of the military. (It's impractical, both for reacting in a timely manner and for keeping secrets from the enemy.)

If the military doesn't work with Google, they will work with someone else who might not put the same amount of pressure on the military about the practical limits of AI. Or, even worse, our enemy might use a significantly better AI than we do.

My hope is that "war" shifts to AI vs AI, machine vs machine. Calling people who work on AI for wartime purposes immoral is itself immoral when AI in war replaces the need for human casualties.


Replies

ajam1507 · today at 8:41 PM

It shouldn't be the role of a company to hold its nose and work with the government; it should be the government's role to inspire confidence that what it is doing with the technology is ethical.

> Calling people who work on AI for wartime purposes immoral is itself immoral when AI in war replaces the need for human casualties.

This is naive. It will only reduce casualties for the side with the AI, and it will very likely embolden countries to fight more wars.

mitthrowaway2 · today at 6:23 PM

As a private contractor, you can sign a contract to deliver pizza or bandages to US soldiers while also stipulating in the contract that you won't deliver lethal weapons, if that's your ethical stance. You don't need to audit every move of the military, just the things you're doing at their request.

And sure, maybe that just means the military takes its business elsewhere. But if you have confidence that your service is the best, then you sell on that basis.
