Java is usable now, but in 2013 it was the worst debugging experience one could have. I would rather work with PHP5 than with Java (unless I started a project from scratch). Auto-refactoring was also clearly worse because, well, Java. It was around that time that I tried Scala and then Clojure, and even if debugging on the JVM was still an experience (one to avoid as much as possible), at least the limited side effects reduced the issues.
If programming peaked, it certainly wasn't in 2010.
> it was the worst debugging experience one could have.
Hard disagree. I'm not going to argue that Java debugging was the best, however:
1. You could remote-debug your code as it ran on the server (see the sketch after this list).
2. You could debug code which wouldn't even compile, as long as your execution path stayed within the clean code.
3. You could then fix a section of the broken code and continue, and the debugger would backtrack and execute the code you just patched in during your debugging session.†
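For anyone who never saw point 1 in practice, a minimal sketch: the server JVM is started with the JDWP debug agent and the IDE attaches to it over a socket. The port and jar name below are placeholders; the agent options are the standard JDWP ones (on Java 9+ you would typically write address=*:5005 to listen on all interfaces).

    # Launch the server JVM with the JDWP agent so a remote debugger can attach.
    # server=y   -> the JVM listens for an incoming debugger connection
    # suspend=n  -> the application starts without waiting for the debugger
    # address=5005 -> port the agent listens on (placeholder)
    java -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005 -jar my-server.jar

From the IDE you would then create a remote debug configuration pointing at that host and port, set breakpoints, and step through the live server process.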
This is what I remember as someone who spent decades (since Java 1.0) working as a contract consultant, mainly on server-side Java.
Of course this will not convince anyone who is determined to remain skeptical, but I think those are compelling capabilities.
† Now I code in Rust a lot, and I really enjoy it, but its long compile times and the inability to run broken code are two things that make me miss those Java days. And the modern 2025 debugger for it is often unable to inspect some of the variables for some reason, a bug I never encountered with Java.
Debugging Java is a pleasure compared to debugging assembly and C, using only a terminal.