Binary bits are also a language: a structured one that transistor-based computers execute to produce results we humans find valuable. Why wouldn't a model be able to write those binary instructions directly? Why do we need all these layers in between? We don't.
Because the learned function to generate binary code is likely more complex than that for Python.
I admit I can't say for sure until we try it. If someone trained a model at the same scale, on the same volume of raw binary code that we currently train these models on with natural language and source code, would it do better at generating working programs? The thing is, that model would now fail to understand human-language prompts.
From what I know and understand, though, binary generation seems like the harder function to learn.
My meta point is: don't frame it as what a computer would most readily understand, because we're not talking about a CPU or GPU. The question is what a transformer-based deep neural net would better learn and infer: Python or binary code? From that lens, it seems more likely to be Python.
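To make the contrast concrete, here's a toy sketch (my own example, not from the thread; it assumes the x86-64 System V calling convention) showing the same trivial function in the two "languages" a model could be asked to emit:

```python
# High-level form: short, named, compositional, and statistically close
# to the natural language and source code the model is trained on.
def add(a, b):
    return a + b

# Low-level equivalent: raw x86-64 machine code for the same operation.
#   89 f8   mov eax, edi   ; copy the first argument
#   01 f0   add eax, esi   ; add the second argument
#   c3      ret            ; result is returned in eax
ADD_MACHINE_CODE = bytes.fromhex("89f801f0c5"[:-2] + "c3")  # five bytes total

print(ADD_MACHINE_CODE.hex(" "))  # 89 f8 01 f0 c3
```

The high-level form shares vocabulary and redundancy with the training distribution; the low-level form is five opaque bytes with no identifiers or whitespace structure, where a single flipped bit usually yields a different (often invalid) program. That brittleness is roughly why the learned function for binary would be harder.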