This is often quoted, but I wonder whether it's actually strictly true, at least if you keep to a reasonable definition of "works". It's certainly not true in mechanical engineering.
IMHO, the key is where you add complexity. In software you have different abstraction layers. If you make a layer too fat, it becomes unwieldy. A simple system evolves well if you add the complexity in the right layer and avoid making a layer responsible for tasks outside its scope. It still "works" if you don't, but it becomes increasingly difficult to maintain.
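To make the layering point concrete, here's a minimal sketch (hypothetical names, Python just for illustration, not from any particular codebase): a persistence layer that only stores and loads things, with validation and side effects kept in the layer above it rather than leaking downward.

```python
class OrderStore:
    """Persistence layer: knows how to save and load orders, nothing else."""

    def __init__(self):
        self._orders = {}

    def save(self, order_id: str, order: dict) -> None:
        self._orders[order_id] = order

    def load(self, order_id: str) -> dict:
        return self._orders[order_id]


class OrderService:
    """Domain layer: validation and side effects live here, not in the store."""

    def __init__(self, store: OrderStore):
        self._store = store

    def place_order(self, order_id: str, order: dict) -> None:
        if not order.get("items"):
            raise ValueError("order must contain at least one item")
        self._store.save(order_id, order)
        # Notifying the customer, updating analytics, etc. would also belong
        # here (or in layers above), never inside OrderStore.
```

The moment OrderStore starts validating orders or sending emails, it "still works", but every later change to business rules now has to be made in the storage layer too.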
The law is maybe a little too simplistic in its formulation, but it's fundamentally true.
You built this gear using the knowledge from your last gear. You didn't start with no knowledge, read a manual on operating a lathe, grab a hunk of metal and make a perfect gear the first time.
> It's certainly not true in mechanical engineering.
Care to give an example?
The definition of a complex system is the qualifier for the quote. Many systems that are designed, implemented and found working are not complex systems; they may be complicated systems. To paraphrase Dr. Richard I. Cook's "How Complex Systems Fail": complex systems are inherently hazardous, operate near the edge of failure, and cannot be understood by analyzing individual components. Such systems are not just complicated (like a machine with fixed parts) but dynamic, constantly evolving, and prone to multiple, coincidental failures.
A system of services that interact, where many of them depend on each other in informal ways, may be a complex system. Especially if humans are also involved.
Such a system is not something you design. You just happen to find yourself in it. Like the road to hell, the road to a complex system is paved with good intentions.