Some of the most reassuring and most frightening things you can read are about the incidents that have already occurred where computers said "launch all the nukes" and the humans refused. On the one hand, good news! We have prior art that says humans don't just launch all the nukes because the computers or procedures say to. The bad news: it's been skin-of-our-teeth multiple times already.
https://www.warhistoryonline.com/cold-war/refused-to-launch-... - This isn't even the incident I was searching for to reference! This one was news to me.
https://en.wikipedia.org/wiki/Stanislav_Petrov#Incident - This is the one I was looking for.
> We have prior art that says humans don't just launch all the nukes just because the computers or procedures say to.
This relies on processes being in place to ensure that a human will always make the final decision. What about when that gets taken away?
I briefly went down a rabbit hole of watching videos about attempts to intercept ballistic missiles and hypersonic glide vehicles. Pretty interesting stuff, including decoys deployed in space... but the takeaway seemed to be grim: you can't guarantee 100% interception.
We shouldn't be the least bit surprised that no human has complied so far.
If one had, we wouldn't be having this conversation. For all we know, there may be a vast multiverse of universes, some with humans, and we would only find ourselves having this conversation in one of the universes where no human pressed the button.
I hope humans in charge are as wise now as they were then.
> We have prior art that says humans don't just launch all the nukes just because the computers or procedures say to.
Previously, no one had spent trillions of dollars trying to convince the world that those computers were "Artificial Intelligence".