Honestly, I feel like Julia might well beat Mojo or sommat to the punch, sooner or later. It has solid facilities and supporting infrastructure for a lot of the scientific and data-handling work surrounding ML, even if not for compiling and dispatching kernels (where XLA reigns supreme over anything in the CUDA ecosystem!). For example, Bayesian programming like Turing.jl is virtually unmatched in Python. It's been a while since I looked at Lux.jl for XLA integration, but I reckon it could be incredibly useful. And as long as LLMs, and RLVR training of them, keep improving, we may eventually be able to translate loads of existing PyTorch code.
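To give a flavour of what I mean, here's a minimal Turing.jl sketch (a toy coin-flip model; the data and priors are made up, purely illustrative):

```julia
using Turing

# Toy coin-flip model: Beta(1, 1) prior on the bias p,
# Bernoulli likelihood for each observed flip.
@model function coinflip(y)
    p ~ Beta(1, 1)
    for i in eachindex(y)
        y[i] ~ Bernoulli(p)
    end
end

# Made-up data; draw posterior samples with NUTS.
flips = [1, 1, 0, 1, 0, 1, 1, 1]
chain = sample(coinflip(flips), NUTS(), 1_000)
```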
> For example, Bayesian programming like Turing.jl is virtually unmatched in Python.
What about numpyro?
Disclaimer: I contribute to numpyro occasionally.
I dunno. This sort of thing gives me pause:
https://danluu.com/julialang/
But the first thing that gave me pause about Julia? They sort of pivoted to saying "we're general purpose", but the whole index-starting-at-one thing belies that -- these days, 1-based indexing is pretty much the province of specialty languages.