People should finally understand that LLMs are a lossy database of PAST knowledge. Yes, if you throw a task at one that has been done tons of times before, it works. Which is not a surprise, because it takes minutes to Google and index multiple full implementations of "tool that lets you right-click on an image to convert it". Without an LLM you could do the same: just copy and paste the implementation of that from Microsoft PowerToys, for example.
What LLMs will NOT do, however, is write or invent SOMETHING NEW.
And parts of our industry are still about exactly that: writing software that has NOT been written before.
If you hire junior developers to reinvent the wheel: sure, you do not need them anymore.
But sooner or later you will run out of people who know how to invent NEW things.
So: this is one more of those posts that completely misses the point. "Oh wow, if I look up how to make pancakes on Wikipedia, I can suddenly make pancakes!!!1" That was always possible. Yes, you can now even get an LLM to build you a pancake machine. Great.
Most of the artists and designers I am friends with have lost their jobs by now. In a couple of years you will notice the LLMs no longer have new styles to copy from.
I am all for "remix culture". But don't claim to be an original artist if you are just doing a remix. And LLM source code outputs are remixes, not original art.
> What LLMs will NOT do, however, is write or invent SOMETHING NEW.
Counterpoint: ChatGPT came up with the new expression "The confetti has left the cannon" a few years ago.
So your claim is not obviously true. Can you give us an example of a programming problem that LLMs fail to solve?