> We have prior art that says humans don't just launch all the nukes just because the computers or procedures say to.
That prior art relies on processes being in place to ensure a human always makes the final decision. What happens when those processes are taken away?
I find it hard to imagine that the people in a position to dismantle those processes could ever be that zealously enamored with AI, but recent events have given me a small amount of doubt.