Around 10 years ago, in a college Calculus class, I had a very ambitious classmate who wanted to go to DARPA and work on robotics. I asked if he was thinking it through solely from a technical perspective or considering the ethics side as well. He clearly didn't understand the question, so I asked directly: what if the code you write or the autonomous machine you contribute to is used for killing? His response: that's not my problem.
After spending a couple of years studying in the US, I came to the conclusion that executives and board members in industry don't care about society or humans, and that even universities don't push students towards critical thinking and ethics. It has all turned into vocational training, turning humans into crafting tools.
Around the same time, at Harvard, I attended a VR innovation week. The last panel discussion of the day was Ethics and Law, featuring a law professor, a journalist, and a moderator, and attended by a handful of people. I asked why founders, CEOs, or developers weren't part of the discussion or in attendance. The moderator responded that they couldn't find any qualified enough to take part. The discussion was basically: how do the products companies build affect society? Laws aren't a founder's problem, that's what lawyers are for, and ethics? Who cares, right?
This frenzy, this rat race towards the next billion-dollar company at any cost, has torn down the fabric of society down to the level of individual thinking; or rather non-thinking, just wanting and needing.
My pet theory is that this has been accelerated due to the cultural rejection of the humanities as worthy of study.
Orwell wrote about this: https://orwell.ru/library/articles/science/english/e_scien
> "The fact is that a mere training in one or more of the exact sciences, even combined with very high gifts, is no guarantee of a humane or sceptical outlook."
> "I inquired why founders, CEOs or developers weren't part of the discussion or in attendance? Moderator responded that they couldn't find them qualified enough to take part in the discussion."
This seems more like credentialist arrogance than a well-reasoned judgment.
"Once the rockets are up, who cares where they come down?
That's not my department!" says Wernher von Braun
That's just patently false. Tons of executives and board members in industry absolutely care. Some are in it purely for philanthropic purposes.
Nothing has destroyed my faith in humanity more than the frantic race to the bottom that is the AI insanity of the last couple of years. You can feel the frenzied greed in the air, masses of investors piling over each other to get a piece of the golden pie at any cost. It’s fucking disgusting.
> Moderator responded that they couldn't find them qualified enough to take part in the discussion.
With a gatekeeping attitude like that, are you really surprised engineers don’t want to participate?
Which is why, on a human level, I have zero respect for many CEOs. The world would be a better place without them, and they are actively working on making it worse. In fact, I believe the rest of the tribe should punish them for this anti-social behavior to disincentivize it in the future.
> what if the code you write or the autonomous machine you contribute to is used for killing?
This line of thinking, that creating machines that kill is unethical, will destroy the West. If the US hadn't been so good at producing killing machines in WW2, you wouldn't be here to complain about DARPA ethics.
Instead of having engineers develop the most advanced machines for killing (i.e., protecting the West), such people go into producing the most addictive content-delivery systems, destroying the brains of minors.
As Tom Lehrer sang:
"Once the rockets are up, who cares where they come down? That's not my department!" says Wernher von Braun.
The one industry that people dislike that I haven’t been in is war. I hope to be in weapons one day. The ethics are pretty straightforward to me: kill as few as possible to protect your interests. That may be many people, but it is not really that many people.
Anyway, I won’t guess at your friend’s motivation but if you gave me the ability to make America’s industry better at prosecuting war you’d better believe I’d do it with great enthusiasm.
Besides, I’ve been around long enough to know that when the rubber hits the road, the ethical people will rapidly find their way to the Paradox of Tolerance and suddenly discover that violence is highly desirable. I find this kind of high-variance behaviour undesirable; it leads to unhappiness all around.
See, in your case with the military, you can directly say: hey, my code will possibly be used to bomb other people. But these days (and I'm sure back then too) it isn't so cut and dried. I worked in the AdTech industry (like 60% of Bay Area techies). The ad tech I write serves ads to millions, even billions. What about ads influencing elections, and then politicians waging wars? Anti-vax ads that influence people and then kill them? Scam ads? Insurance ads, and then people not getting cancer meds from that same insurance? Am I responsible for those deaths? I would say yes.
But what is the alternative? I feel each of us wants to draw a line based on our own morality, but the circumstances don't allow us to stick to it (still gotta pay rent).
We are all on the Titanic, the way I see it. It's just that the DARPA guy is going to sink first. The rest of us are just pretending to be Jack, trying to be the last ones to go.