That assumes AI needs to be like life, though.
Consider computers: there's no selection pressure for an ordinary computer to be self-reproducing, or to shock you when you reach for the off button, because it's just a tool. An AI could also be just a tool: something you fire up, get an answer from, and then shut down.
It's true that if some mutation were to create an AI with a survival instinct, and that AI were to get loose, then it would "win" (unless people used tool-AIs to defeat it). But that's not quite the same as saying that AIs would, by default, converge on a drive for self-preservation.