I assume you're some sort of programmer, and I genuinely wonder how in the world someone can, in good faith, downplay non-determinism and ambiguity when talking about a programming language.
High-level languages can certainly yield inefficient code when compiled, or different code across different compilers, but they are always meant to let their users know exactly what to expect from the programs they put together. I've always considered this a hard fact; I simply cannot wrap my head around working in a way that forces me to abandon this basic assumption.
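To make that point concrete, here's a minimal sketch in C (chosen purely for illustration): different compilers may emit very different machine code for it, but the language standard pins down the observable behavior, so the programmer knows exactly what the program will do.

    #include <stdio.h>

    int main(void) {
        /* gcc, clang, or msvc may compile this loop differently --
           unrolled, vectorized, or folded to a constant at compile time --
           but every conforming implementation must produce the same
           observable result: the program prints 4950. */
        int sum = 0;
        for (int i = 0; i < 100; i++) {
            sum += i;
        }
        printf("%d\n", sum);
        return 0;
    }

The generated instructions are an implementation detail; the semantics are not up for negotiation. That determinism is exactly what's lost when the "language" is ambiguous.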