Strange article. The problem isn't that no one knows how everything works; it's that AI coding could mean there is no one who knows how a given system works.
Including the AI, which generated it once and forgot.
This is going to be a big problem. How do people using Claude-like code generation systems handle this? What artifacts other than the generated code are left behind for reuse when modifications are needed? Comments in the code? The entire history of the inputs to and outputs from the LLM? Is there any record of the design?
Just because there is someone who could understand a given system, that doesn’t mean there is anyone who actually does. I take the point to be that existing software systems are not understood by anyone most of the time.
I read it more as:
We already don't know how everything works; AI is steering us toward a destination where there is more of the everything.
I would also add that it's possible it will reduce the number of people who are _capable_ of understanding the parts it is responsible for.
If the average tenure of a developer is 2.5 years, how likely is it in 5 years that any of the team that started the project is still working on it?
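A rough back-of-the-envelope sketch: assuming exponential attrition with a mean tenure of 2.5 years and a hypothetical five-person founding team (both assumptions for illustration, not figures from the thread), the odds that any founder is still around after 5 years come out to roughly a coin flip:

```python
import math

# Assumptions (illustrative only): attrition is exponential with a
# mean tenure of 2.5 years; the founding team has 5 people.
mean_tenure = 2.5   # years
horizon = 5.0       # years
team_size = 5

# Chance a single individual is still on the project after `horizon` years
# under an exponential survival model: exp(-t / mean_tenure).
p_individual = math.exp(-horizon / mean_tenure)   # ~13.5%

# Chance that at least one of the original team remains.
p_any_remain = 1 - (1 - p_individual) ** team_size  # ~52%

print(f"P(one person stays {horizon:.0f}y): {p_individual:.1%}")
print(f"P(any of {team_size} founders remain): {p_any_remain:.1%}")
```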
It's that no one knows if a system works.
This happens even today. If a knowledgeable person leaves a company and no knowledge transfer (KT) takes place (or, more likely, a poor one), there will be no one left who understands how certain systems work. The company will then have to have a new developer go in, study the code, and deduce how it works. In our new LLM world, that developer could even have an LLM construct an overview to help them come up to speed more quickly.
No, I think the problem is that AI coding removes intentionality. And that introduces artifacts, connections, and dependencies that wouldn't be there if one had designed the system with intent. And that makes it eventually harder to reason about.
There is a difference in qualia between "it happens to work" and "it was made for a purpose."
Business logic will increasingly settle for "it happens to work" as good enough.