Hacker News

kyrra today at 2:41 PM (11 replies)

First-request latency can also really suck in Java before hot-path code gets through the C2 compiler. You can warm up hot paths by running that code during startup, but it's really annoying to have to do that. Using C++, Go, or Rust gets you around the problem without the hoops of code-path warmup.
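A minimal sketch of the warmup trick described above: call the hot path enough times at startup that the JIT's invocation counters trip and the method gets compiled before real traffic arrives. All names here (`Warmup`, `parseRequest`, `ITERATIONS`) are illustrative, not from any real framework, and the iteration count is a guess at crossing the compilation threshold, not a tuned value.

```java
// Hypothetical startup warmup: exercise the hot path repeatedly so the
// JIT compiles it before the first real request hits it.
public class Warmup {
    // Illustrative count; real thresholds depend on JVM flags and tiering.
    private static final int ITERATIONS = 20_000;

    // Stand-in for real request-handling logic (same algorithm as
    // String.hashCode, just to give the loop real work to do).
    static int parseRequest(String payload) {
        int checksum = 0;
        for (int i = 0; i < payload.length(); i++) {
            checksum = 31 * checksum + payload.charAt(i);
        }
        return checksum;
    }

    public static void main(String[] args) {
        int sink = 0;
        for (int i = 0; i < ITERATIONS; i++) {
            sink += parseRequest("GET /health HTTP/1.1");
        }
        // Consume the result so the loop can't be dead-code-eliminated.
        System.out.println("warmup done (" + sink + ")");
        // ... start accepting real traffic here ...
    }
}
```

The annoyance the comment describes is exactly this: the warmup payloads have to resemble production inputs closely enough that the JIT's profile matches the real workload, or C2 may deoptimize and recompile anyway.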

I wish Java had a proper compiler.


Replies

hrmtst93837 today at 7:20 PM

Gaming the JIT just to get startup times in line is a decent sign that Java's "fast" comes with invisible asterisks all over prod graphs. At some point you're managing the runtime, not the app.

AOT options like GraalVM Native Image can help cold starts a lot, but then half your favorite frameworks break and you trade one set of hoops for another. Pick which pain you want.

pron today at 3:13 PM

You mostly need a recent JDK. Leyden has already cut down warmup by a lot and is expected to continue driving it down.

https://foojay.io/today/how-is-leyden-improving-java-perform...

https://quarkus.io/blog/leyden-1/
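For context on what "a recent JDK" gets you: the first wave of Leyden work shipped in JDK 24 as JEP 483 (Ahead-of-Time Class Loading & Linking). The sketch below shows its two-step training/cache workflow using the flag names from the JEP; `app.jar` and `App` are placeholder names, and the exact behavior is still evolving across releases.

```shell
# JDK 24+ (JEP 483, part of Project Leyden).
# Step 1: training run -- record which classes get loaded and linked.
java -XX:AOTMode=record -XX:AOTConfiguration=app.aotconf -cp app.jar App

# Step 2: assemble the AOT cache from the recorded configuration.
java -XX:AOTMode=create -XX:AOTConfiguration=app.aotconf \
     -XX:AOTCache=app.aot -cp app.jar

# Step 3: production runs reuse the cache for faster startup/warmup.
java -XX:AOTCache=app.aot -cp app.jar App
```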

looperhacks today at 2:57 PM

You can create a native executable with GraalVM. Alternatively, if you want to keep the JVM: with the ongoing Project Leyden, you can already "pre-train" some parts of the JVM warm-up, with full AOT code compilation coming sometime in the future.

titzer today at 3:36 PM

I worked on JVMs long ago (almost twenty years now). At that time most Java usage was for long-running servers. The runtime team staunchly refused to implement AOT caching for as long as possible. This was a huge missed opportunity for Java, as client startup time has always, always, always sucked. Only in the past 3-5 years does it seem like things have started to shift, in part due to the push for Graal native image.

I long ago concluded that Java was not a client or systems programming language because of the implementation priorities of the JVM maintainers. Note that I say priorities--they are extremely bright and capable engineers that focus on different use cases, and there isn't much money to be made from a client ecosystem.

bob1029 today at 3:21 PM

AOT is nice for startup time, but there are tradeoffs in the other direction for long tail performance issues in production.

There are JITs that use dynamic profile guided optimization which can adjust the emitted binary at runtime to adapt to the real world workload. You do not need to have a profile ahead of time like with ordinary PGO. Java doesn't have this yet (afaik), but .NET does and it's a huge deal for things like large scale web applications.

https://devblogs.microsoft.com/dotnet/bing-on-dotnet-8-the-i...

taeric today at 3:19 PM

I challenge the idea that first-request latency is bottlenecked by language choice. I can see how that's plausible, mind. But is it a concern for the vast majority of developers?

pjmlp today at 3:09 PM

Excelsior JET, now gone, but only because GraalVM and OpenJ9 exist now.

The folks on embedded get to play with PTC and Aicas.

Android, even if not proper Java, has dex2oat.

a-dub today at 2:51 PM

i'd be curious about a head to head comparison of how much the c2 actually buys over a static aot compilation with something serious like llvm.

if it is valuable, i'd be surprised you can't freeze/resume the state and use it for instantaneous workload optimized startup.

bombcar today at 2:44 PM

Do none of the JVMs do that? GraalVM?

dionian today at 2:59 PM

This is why I use Java for long-running processes. If I care about a small binary that launches fast, I just use something slower at runtime but faster at startup, like Python.

belfthrow today at 3:08 PM

I really hate how completely clueless people on HN are about Java. This is not, and has not been, an issue in Java for many, many years, and even the most junior of developers know how to avoid it. But oh no, Go and Rust are alwaayssss the solution, sure.
