Amazon (+ Microsoft) already released an ML library called Gluon 8 years ago: https://aws.amazon.com/blogs/aws/introducing-gluon-a-new-lib...
autogluon is popular as well: https://github.com/autogluon/autogluon
Interesting, I can see this being very similar to NVIDIA's CuTe DSL. This hints that we are converging on a (local) optimal design for Python-based DSL kernel programming.
The fact that the "language" is still Python code which has to be traced in some way is a bit off-putting. It feels a bit hacky. I'd rather have a separate compiler, honestly.
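To make the tracing point concrete, this is roughly what kernels in this style look like today, using plain Triton as the example (a minimal vector-add sketch based on Triton's public @triton.jit / tl.* API; details vary by version). The "kernel" is ordinary Python that the decorator captures and compiles at launch time, rather than source fed to a standalone compiler:

    import torch
    import triton
    import triton.language as tl

    @triton.jit
    def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
        # Each program instance handles one BLOCK_SIZE-wide slice of the vectors.
        pid = tl.program_id(axis=0)
        offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
        mask = offsets < n_elements  # guard against out-of-bounds accesses
        x = tl.load(x_ptr + offsets, mask=mask)
        y = tl.load(y_ptr + offsets, mask=mask)
        tl.store(out_ptr + offsets, x + y, mask=mask)

    def add(x, y):
        out = torch.empty_like(x)
        n = out.numel()
        # The grid is computed from the (compile-time) block size at launch.
        grid = lambda meta: (triton.cdiv(n, meta["BLOCK_SIZE"]),)
        add_kernel[grid](x, y, out, n, BLOCK_SIZE=1024)
        return out

The Python function body never runs as normal Python on the GPU path; it is traced/JIT-compiled into an IR, which is exactly the "embedded in Python" design being discussed.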
Not to be confused with the Gluon UI toolkit for Java: https://gluonhq.com/products/javafx/
Not to be confused with gluon, the embeddable language in Rust: https://github.com/gluon-lang/gluon
Why is zog so popular these days? Seems really cool but I have yet to get the buzz / learn it.
Is there a big reason why Triton is considered a "failure"?
Is this Triton's reply to NVIDIA's Tilus[1]? Tilus is supposed to be lower level (e.g. you have control over registers). NVIDIA really does not want the CUDA ecosystem to move to Triton, since Triton also supports AMD and other accelerators. So with Gluon you get access to lower-level features while staying within the Triton ecosystem.
[1] https://github.com/NVIDIA/tilus