The pattern that gets missed in these discussions: every "no-code will replace developers" wave actually creates more developer jobs, not fewer.
COBOL was supposed to let managers write programs. VB was going to let business users build apps. Squarespace was going to kill the need for web developers. And now AI.
What actually happens: the tooling lowers the barrier to entry, way more people try to build things, and then those same people need actual developers when they hit the edges of what the tool can do. The total surface area of "stuff that needs building" keeps expanding.
The developers who get displaced are the ones doing purely mechanical work that was already well-specified. But the job of understanding what to build in the first place, or debugging why the automated thing isn't doing what you expected - that's still there. Usually there's more of it.
Classic Jevons Paradox - when something gets cheaper, the market for it grows. The unit cost shrinks, but the number of units bought grows by more than the cost shrinks, so total spending on building things goes up.
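Rough toy arithmetic, with made-up numbers (treat the specific figures as an assumption, not data):

```python
# Toy Jevons arithmetic: tooling halves the cost per "unit of software built",
# but demand grows 2.5x because cheaper software makes more projects worth doing.
old_cost, new_cost = 100, 50            # cost per unit, before and after the tooling
old_demand, new_demand = 1_000, 2_500   # units demanded (assumed elasticity > 1)

print(old_cost * old_demand)  # 100000 -> total spend before
print(new_cost * new_demand)  # 125000 -> total spend after: cheaper units, bigger pie
```

The whole bet is on that elasticity staying above 1, which is exactly what the replies below push back on.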
>every "no-code will replace developers" wave actually creates more developer jobs, not fewer
you mean "created", past tense. You're basically arguing it's impossible for technical improvements to reduce the number of programmers in the world, ever. The idea that only humans will ever be able to debug code or interpret non-technical user needs seems questionable to me.
This suggests that the latent demand was large, but it still doesn't prove it's unbounded.
At some point the low-hanging automation fruit gets tapped out. What can be put online that isn't there already? Which business processes are obviously going to be made an order of magnitude more efficient?
Moreover, we've never had more developers, and we've just exited an anomalous period of extraordinarily low interest rates.
The party might be over.
Machinery made farmers more efficient and now there are more farmers than ever.