I don't know what definition of AI you're using, but plenty of ML algorithms operate deterministically, let alone most other logic programmed into a computer. I don't see how your statement can be right given that these other software systems also operate in the real world.
ML run on a GPU using matrix multiplies isn't deterministic unless you go to great lengths to lock things down, at the expense of performance.
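A small illustration of why this happens (a CPU sketch, not GPU code): floating-point addition is not associative, so a parallel reduction whose accumulation order varies from run to run need not produce bit-identical results.

```python
# Floating-point addition is not associative, which is why GPU
# parallel reductions (whose accumulation order can vary run to run)
# need not be bit-for-bit reproducible.
a, b, c = 0.1, 0.2, 0.3

left_to_right = (a + b) + c   # one accumulation order
right_to_left = a + (b + c)   # a different accumulation order

print(left_to_right == right_to_left)  # False: the results differ in the last bit
```

On a GPU, the order in which thread blocks contribute to a reduction can depend on scheduling, so the same sum can round differently across runs unless deterministic kernels are forced.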