The irony is that the vast deskilling that's happening because of this means that most "software engineers" will become incapable of understanding, let alone fixing or even building new versions of the systems that they are utterly dependent on.
There should be thousands, or tens of thousands, of people worldwide who can build the operating systems, virtual machines, libraries, containers, and applications that AI is built on. But the number will dwindle and we'll ironically be unable to build what our ancestors did, utterly dependent on the AI artifacts to do it for us.
God I hope it doesn't all crash at once.
Trust me. All those people do it for the love of doing it, so I don't think they will outsource their jobs to some automation...
I have been coding since long before the internet, before there was huge demand for software devs... and I'll still be coding after that demand is gone.
How many kernel devs does the world need? A dozen or two?
It will be the same with software. AI will be writing and consuming most software. We will be utilizing experiences built on top of that, probably generated in real time for hyper personalization. Every app on your phone will be replaced by one app. (Except maybe games, at least for a short while longer).
Everyone's treating writing code as this reverent thing. No one wrote code 100 years ago. Very few today write assembly. It will become a lost skill because the economic necessity is gone.
It's the end of an era, but also the beginning of a new one. Building agentic systems is really hard, a hard enough problem that we need a ton of people building those systems. AI hardware devices have barely registered yet; we need engineers who can build and integrate all sorts of systems.
Engineering as a discipline will be the last job to be automated, since who do you think is going to build all the world's automation?
I feel I've upskilled in so many directions (not just "ability to prompt LLMs") since going all in on LLM coding. So many tools, techniques, systems, and new areas of research I'd never have had the time to fully learn in the past.
I have a hard time believing any tenured developer is not actually learning things when using LLMs to build. They make interesting choices that are repeatable (new CLIs I didn't even know existed, writing scripts to churn through tricky data, using specific languages for specific tasks like Go for concurrently working through large numbers of tasks, etc.)
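The Go-for-concurrent-work pattern the comment alludes to is usually a bounded worker pool over channels. A minimal sketch of that idea (the function name, worker count, and the squaring workload are all illustrative, not from the thread):

```go
package main

import (
	"fmt"
	"sync"
)

// sumSquares fans n jobs out to a fixed pool of workers and
// sums the squared results as they come back.
func sumSquares(n, workers int) int {
	jobs := make(chan int)
	results := make(chan int)
	var wg sync.WaitGroup

	// Start a bounded pool: each worker drains the jobs channel.
	for w := 0; w < workers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := range jobs {
				results <- j * j
			}
		}()
	}

	// Feed the jobs, then close so workers can exit their loops.
	go func() {
		for i := 1; i <= n; i++ {
			jobs <- i
		}
		close(jobs)
	}()

	// Close results once every worker has finished.
	go func() {
		wg.Wait()
		close(results)
	}()

	total := 0
	for r := range results {
		total += r
	}
	return total
}

func main() {
	fmt.Println(sumSquares(100, 4)) // sum of squares 1..100 = 338350
}
```

Bounding the pool at a fixed worker count is what keeps this pattern safe for "large numbers of tasks": you get concurrency without spawning one goroutine per job.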
Anyone not learning things via LLM coding right now either doesn't care at all about the underlying code/systems, or they had no foundational knowledge or interest in programming to begin with (which is also a valid way to use these tools, but they don't work very well without guidance for very long [yet]).
>But the number will dwindle and we'll ironically be unable to build what our ancestors did, utterly dependent on the AI artifacts to do it for us.
That's only a brief moment in time. We learned it once, we can learn it again if we have to. People will tinker with those things as hobbies and they'll broadcast that out too. Worst case we hobble along until we get better at it. And if we have to hobble along and it's important, someone's going to be paying well for learning all of that stuff from zero, so the motivation will be there.
Why do people worry about a potential, temporary loss of skill?
If a catastrophic failure occurs we will have to return to first principles and re-derive the solutions. Not so bad, probably enlivening even to get to spin up the mind again after a break.
I mean, there should be. But there's not. Despite the millions of CS grads produced, most people could not reasonably be expected to build many 'standard' parts of a software stack.
There is a deadly game of chicken going on. Junior recruiting has already stopped for the most part. The only way this doesn't end in a catastrophe is if AI becomes genuinely as good as the most skilled developers before we run out of them. Which I doubt very much, but don't find completely impossible.