> After the user downloads 2MB of JavaScript, waits for it to parse, waits for it to execute, waits for it to hydrate, waits for it to fetch data, waits for it to render... yes, then subsequent navigations feel snappy. Congratulations.
In my experience, a lot of SPAs transfer far more data than the front end actually needs. One team I worked on was sending 4 MB over the wire to render 14 KB of actual HTML. (No, there wasn't some processing happening on the front end that needed the extra data.) And that was with GraphQL: some dev had just plunked every field into the query instead of only the ones actually needed. I've seen that pattern a lot, although in some cases it's been to my benefit, like finding more details on a tracking website than its UI presented.
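For anyone who hasn't run into this, here's roughly what the pattern looks like. The field names and endpoint are made up for illustration; the point is that both queries go over the wire the same way, and only the payload size differs.

```typescript
// Hypothetical sketch of GraphQL over-fetching; field names and URL are invented.

// What the dev wrote: every field on the type, "just in case".
const overFetchingQuery = `
  query Order($id: ID!) {
    order(id: $id) {
      id status createdAt updatedAt internalNotes warehouseCode
      customer { id name email phone address billingAddress }
      items { id sku name description price weight dimensions }
    }
  }
`;

// What the UI actually renders: a handful of fields.
const trimmedQuery = `
  query Order($id: ID!) {
    order(id: $id) {
      id
      status
      items { name price }
    }
  }
`;

// Same request path either way; only the response size changes.
async function fetchOrder(query: string, id: string) {
  const res = await fetch("https://api.example.com/graphql", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query, variables: { id } }),
  });
  return res.json();
}
```

The fix is just asking for the fields the component renders; GraphQL already makes that trivial, it just doesn't get done.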
Even the "downloads 2MB of JavaScript" part is often simply because the site is badly written (e.g. nobody is careful about managing dependencies), not necessarily because "JAVASCRIPT BAD".
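A concrete (hypothetical) example of the dependency-management point: dragging a whole utility library into the bundle when one function would do. Lodash here is just a stand-in, and the actual savings depend on the bundler and its tree-shaking, but the pattern is common in bloated SPAs.

```typescript
// Sketch only: the "heavy" import can pull most of lodash into the bundle,
// while the targeted import pulls in one function.
import _ from "lodash";
import groupBy from "lodash/groupBy";

type Order = { id: string; status: string };
const orders: Order[] = [
  { id: "a1", status: "shipped" },
  { id: "a2", status: "pending" },
];

// Both lines do the same thing at runtime; only the bundle cost differs.
const heavy = _.groupBy(orders, "status");
const lean = groupBy(orders, "status");
console.log(heavy, lean);
```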
Just look at the source code of amazon.com. It's a mess. But I bet it's more of an organizational problem than a tech-stack problem: a website worked on by literally hundreds of teams (if not more), where everyone crams their little feature into the home page.