> The problem is that the average person doesn't know how to explain & solve a problem in sufficient detail to get a working solution.
I intuit this is also an intrinsic limit of LLM-based approaches to "you don't need them expensive programmers no more":
with LLMs magically "generating the solution," you move the responsibility for precise expression of the problem up the ladder,
and then you "program" in prompts, reviewing the LLM-proposed formalization ("code").
In other words, the nature of "programming" changes to prompt engineering. Alas, you still have to understand formal languages (code)...
So there'll always be plenty to do for humans who can "math" :-)
Not to mention that someone would need to evaluate and test the proposed solution... and with today's LLMs I would not bet heavily on its correctness.
This was some years ago, but a friend of mine, trained in a 4GL that was still a procedural programming language, went somewhere that was using higher-level, model-based code generation built on that language. It turned out they still needed a few people who understood how things worked under the hood.
I am deeply skeptical that human-language level specifications will ever capture all the things that really need to be said for programming, any more than they do for mathematics. There are reasons for formalisms. English is slippery.
This is only true to an extent. We have a lot of digitally inclined workers who are developing programs or scripts to handle many things for them. It's imperfect and often wildly insecure and inefficient, but unlike any previous no-code or "standard" solution, it actually works. Often in conjunction with "standard" solutions.
On one hand you're correct in that there will always be a need for programmers. I really doubt there will be a great need for generalist programmers, though. The one area that may survive is the people capable of transforming business needs and rules into code, which requires a social and analytical skill set for cooperating with non-technical people. You'll also see demand for skilled programmers at scale and for embedded programming, but the giant workforce of generalist developers (and probably web developers, once Figma and similar tools let designers generate better code) is likely going to become much smaller in the coming decades.
That is basically what the entire office workforce is facing. For years, AI believers have been saying AI would do to the office what robots did to the assembly line, but now it actually seems like they're going to be correct.
A lot of business people want to get something functional that they can sell, and hire a programmer if/when they can afford one. That niche is seeing a lot of uptake of LLM-based approaches.
This works for them because the MVP they need typically isn't a lot of code, and that falls within the limited scope in which LLMs can generate something that works.
In fact, we've been using "programming LLMs" for a long time; we call them compilers.
There is a disconnect somewhere. When I read online, I hear about how GenAI/LLMs are replacing programmers and office workers. When I go to work, apart from discussion of the general buzz, I mostly hear the question of how we could apply GenAI/LLMs in the first place.
Maybe this is a reflection of local conditions, I'm not sure, but truly revolutionary changes don't seem to require a solution that goes looking for a problem. It was immediately clear what you could do with assembly-line automation, or the motor car, or the printing press.