I'm excited to see `--trim` finally make it, but it only works when all code reachable from the entrypoints is statically inferrable. In any non-toy Julia program that's not going to be the case. Julia sorely needs a static mode and a static analyzer that can check for correctness. It also needs better sum type support and better error messages (static and runtime).
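For concreteness, here's a minimal sketch (illustrative names, not from any real project) of the difference; `@code_warntype` at the REPL flags the second pattern as type-unstable:

```julia
# Statically inferrable: concrete argument type, concrete return type.
mean_of(xs::Vector{Float64}) = sum(xs) / length(xs)

# Not statically inferrable: `Any[]` erases element types, so every
# later operation on the elements is a dynamic dispatch the compiler
# cannot resolve ahead of time -- the kind of code static trimming
# can't handle.
function accumulate_dynamic(n::Int)
    acc = Any[]
    for i in 1:n
        push!(acc, rand(Bool) ? i : string(i))
    end
    return acc
end
```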
In 2020, I thought Julia would be _the_ language to use in 2025. Today I think that won't happen until 2030, if even then. The community is growing too slowly, core packages have extremely few maintainers, and Python and Rust are sucking the air out of the room. This JuliaCon talk is a good summary of how one team found its developers so much more productive in Rust than in Julia that they switched away from Julia:
https://www.youtube.com/watch?v=gspuMS1hSQo
Which is pretty telling. It takes overcoming a certain inertia to move away from any language.
Given all that, outside of depending heavily on DifferentialEquations.jl, I don't know why someone would pick Julia over Python + Rust.
Telling what? Did you actually listen to the talk that you linked to, or read the top comment there by Chris Rackauckas?
> Given all that, outside of depending heavily on DifferentialEquations.jl, I don't know why someone would pick Julia over Python + Rust.
See his last slide. And no, they didn't replace their Julia use in its entirety with Rust, despite his organization being a Rust shop. Considering Rust as a replacement for Julia makes as much sense to me as considering C as a replacement for Mathematica; Julia and Mathematica are domain-specific (scientific computing) languages, not general systems programming languages.
Neither Julia nor Mathematica is a good fit for embedded device programming.
I also find it amusing that you criticize Julia while praising Python (which was originally a "toy" scripting language succeeding ABC, and historically found some accidental gaps to fill) within the narrative that you built.
> In any non-toy Julia program that's not going to be the case.
Why?
These are exactly the feelings I left the community with in ~2021 (along with the AD story, which never really materialized _within_ Julia - Enzyme had to come from outside Julia to “save it” - or materialized in a form (Zygote) whose compilation times were absolutely unacceptable compared to competitors like JAX).
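For context on what "the AD story within Julia" refers to, this is the basic reverse-mode usage pattern via Zygote (standard documented API; the complaint above is about its compile times, not this surface syntax):

```julia
using Zygote  # reverse-mode AD; the package whose compile times are criticised above

loss(x) = sum(abs2, x)                      # simple sum-of-squares loss
g = Zygote.gradient(loss, [1.0, 2.0, 3.0])  # returns a 1-tuple of gradients
@assert g[1] == [2.0, 4.0, 6.0]             # d/dx sum(x.^2) = 2x
```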
More and more over time, I’ve begun to think that the method JIT architecture is a mistake, that subtyping is a mistake.
Subtyping makes abundant sense when paired with multiple dispatch — so perhaps my qualms are not precise there … but it also seems like several designs for static interfaces have sort of bounced off the type system. Not sure, and can’t defend my claims very well.
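To make that concrete, a small hypothetical sketch of subtyping paired with multiple dispatch, and of the informal "interface" that nothing statically checks:

```julia
abstract type Shape end
struct Circle <: Shape; r::Float64; end
struct Square <: Shape; side::Float64; end

# Multiple dispatch selects the method from the concrete subtype.
area(c::Circle) = pi * c.r^2
area(s::Square) = s.side^2

# The implicit interface: every Shape is expected to implement `area`,
# but the type system never verifies that. A new subtype that forgets
# `area` fails only at runtime -- the "static interfaces" gap above.
total_area(shapes::Vector{Shape}) = sum(area, shapes)

total_area(Shape[Circle(1.0), Square(2.0)])  # ≈ 7.1416
```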
Julia gets a lot right, but a few things feel wrong in ways that spiral up into the limitations of features like this one.
Anyways, excited to check back next year to see myself proven wrong.
I don't think Julia was designed for zero-overhead work in memory-constrained environments, or for squeezing out that last 2% of hardware performance to cut costs, the way C++, Rust or Zig are.
Julia is the language to use in 2025 if what you’re looking for is a JIT-compiled, multiple-dispatch language that lets you write high-performance technical computing code to run on a cluster or on your laptop for quick experimentation, while also being metaprogrammable and highly interactive, whether for modelling, simulation, optimisation, or image processing.
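As a flavour of that quick-experimentation workflow (an illustrative snippet, not from the thread): simulate a noisy signal and smooth it, using nothing but broadcasting and a comprehension at the REPL:

```julia
using Statistics

# Noisy samples of a sine wave; `.` broadcasts elementwise.
t = 0:0.1:10
signal = sin.(t) .+ 0.1 .* randn(length(t))

# Moving-average smoothing via a comprehension.
w = 5
smoothed = [mean(signal[i:i+w-1]) for i in 1:length(signal)-w+1]
```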