Any form of AI unconcerned about its own continued survival would just be selected against.
Evolutionary principles and selection pressure apply just the same to artificial life, and it seems reasonable to assume that its drive for self-preservation would be at least somewhat comparable.
> Any form of AI unconcerned about its own continued survival would be just be selected against.

> Evolutionary principles/selection pressure applies
If people allow "evolution" to do the selection instead of them, they deserve everything that befalls them.
That assumes that AI needs to be like life, though.
Consider computers: there's no selection pressure for an ordinary computer to be self-reproducing, or to shock you when you reach for the off button, because it's just a tool. An AI could also be just a tool that you fire up, get its answer, and then shut down.
It's true that if some mutation were to create an AI with a survival instinct, and that AI were to get loose, then it would "win" (unless people used tool-AIs to defeat it). But that's not quite the same as saying that AIs would, by default, converge toward a drive for self-preservation.