I almost entirely agree with the author's assessment of new technology. Yet that statement rubbed me the wrong way.
Sometimes it's better to get into things early, because a technology grows more complex as time goes on and is easiest to pick up early in its development. Consider the Web: in the early days it was just HTML, which was easy to learn. From there it was simply a matter of picking up new skills as the environment changed. I'm not sure how I'd cope with picking up web development if I started today.
I think this applies a bit less to the AI sphere, whose purported goal is to make things easier and more automated over time. 90% of the time, if you have an AI question, you can just... ask the LLM itself.
Remember all the hoopla a couple of years back about how everyone needed to be a "prompt engineer"? Most of that alchemy is now obsolete.
Think about the hoops you had to jump through with early GenAI diffusion models: tons of positive prompt suffixes (“4K, OCTANE RENDER, HYPERREALISTIC TURBO HD FINAL CHALLENGERS SPECIAL EDITION”) bordering on magical incantations, samplers (Euler vs. DPM), latent upscalers, CFG scales, denoising strengths for img2img, masking workflows, etc.
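For anyone who never touched those knobs: the "CFG scale" refers to classifier-free guidance, which is just a linear extrapolation between two noise predictions at each denoising step. A minimal arithmetic sketch (made-up numbers; real pipelines do this per element on latent tensors):

```python
# Classifier-free guidance (CFG): at each denoising step the model predicts
# noise twice -- once conditioned on the prompt, once unconditioned -- and
# the CFG scale extrapolates from the unconditional toward the conditional.
def cfg_combine(uncond_pred: float, cond_pred: float, cfg_scale: float) -> float:
    """Blend unconditional and conditional noise predictions."""
    return uncond_pred + cfg_scale * (cond_pred - uncond_pred)

# A scale of 1.0 is just the plain conditional prediction; higher values
# push the sample harder toward the prompt, at the cost of artifacts --
# which is why tuning it used to be part of the ritual.
print(cfg_combine(0.0, 0.5, 1.0))  # 0.5
print(cfg_combine(0.0, 0.5, 7.5))  # 3.75
```

Modern hosted models still do something like this under the hood; they've just stopped asking the user to pick the number.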
And now? Most people can simply describe the desired image in natural language, and any decent SOTA model (gpt-image-1.5, Seedream 4, Nano-banana) handles the overwhelming majority of use cases.
Even running things locally is significantly easier than it was a few years ago: models like Flux and Qwen handle natural language, and there are nice, intuitive frontends such as InvokeAI instead of the heavily node-based ComfyUI (which I still love, but understand isn't for everybody).
The web/HTML comparison is a great analogy. I too am in no rush to be hyper-effective with LLMs. In fact, I want to deliberately slow down, because AI-native coding is so exhausting.
That said, your point about the leverage of learning HTML and the web in the early days compared to now rings true. Pre-compiled isomorphic TypeScript apps are completely unrecognizable compared to the early days of index.html.
"It will grow more complex" is never a good reason to get into things early. It's just your mind playing FOMO tricks on you.
Many developers who picked up the web in its early years struggle with (front-end) web development today. It doesn't matter that they fetched jQuery or MooTools from some CDN, as was done in the mid-'00s. Once the tooling became too complicated and ever-changing, they couldn't keep up as front-end dilettantes; it required committing as a professional.
If you started today, you'd simply learn the hard way, as it's always been done: get a few books or register for a course. Carve out some time every day for theory and practice, all the while prioritizing what matters most to get stuff done quickly right now, with little fluff. You won't learn Grunt, Bower, or a large array of historic tech; you'll go straight for what's relevant today. That applies to abstractions, frameworks, and tooling, but also to the fundamentals: you'll probably learn ES6+ and TS, not the old JS "WAT" quirks. A lot of the early stuff looks like an utter waste of time in retrospect.
This is true for all tech. If you knew nothing about LLMs by the end of this year, you could find a course that teaches you all the latest relevant tricks in 5 to 10 hours for 10 bucks.
And yet, at some point most web developers will have picked it up after the "raw HTML" era -- indeed, that point has probably already come.
This isn't a good example - people were completing 6-month bootcamps and getting $100k offers to do web development not long ago, decades after the web and HTML took off. Within a few years, they were making as much as anyone who learned HTML and Web 1.0 back in the '90s.
Are the bootcampers better developers? Probably not. But they were still employable and paid about the same.