Everyone claims they are x% more productive with LLMs, but once a greenfield project turns into a brownfield one, the second law of thermodynamics kicks in and these disposable components start becoming a liability.
On top of that, Hyrum’s law doesn’t go away just because your software has explicit contracts. In my experience, as more people start losing their agency over the code they generate, the system accumulates implicit cruft over time and other code starts depending on it.
Also, reliability is a reasoning problem, so operational excellence gets scrappy with this way of working. It works for some software, but I don’t know whether this push to YOLO it is actually a good thing. C-levels at many big companies are putting immense pressure on everyone to adopt these tools and magically increase productivity while decreasing headcount to please investors. It’s not going too well so far.
A good interface doesn’t magically make vibe-coded implementations with little oversight usable. Rewriting it over and over again in the same manner and expecting improvement is not the kind of engineering I want to do.
The contract stability bit rings true from my experience. I've built a few B2B integrations that follow this pattern naturally - the data layer and API contracts are rock solid because changing them means coordinating with external systems, but the application logic gets rewritten fairly often.
Where it gets messy is when your "disposable" layer accumulates implicit contracts. A dashboard that stakeholders rely on, an export format someone's built a process around, a webhook payload shape that downstream systems expect. These aren't in your documented interfaces but they become load-bearing walls.
The discipline required is treating your documented contracts like the actual boundary - version them properly, deprecate formally, keep them minimal. Most teams don't have that discipline and end up with giant surface areas where everything feels permanent.
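To make "version them properly, deprecate formally, keep them minimal" concrete, here is roughly the shape I mean, as a sketch with made-up names rather than anything prescriptive:

```typescript
// Sketch only: the documented contract stays tiny, versioned, and boring,
// so everything behind it can be regenerated without breaking anyone.

/** v1: frozen. Only additive, backwards-compatible changes allowed. */
export interface ExportRowV1 {
  id: string;
  createdAt: string; // ISO 8601
  total: number;     // minor currency units
}

/** v2 adds a field; v1 consumers keep working through an explicit adapter. */
export interface ExportRowV2 extends ExportRowV1 {
  currency: string; // ISO 4217, e.g. "EUR"
}

/** @deprecated Remove only after the announced sunset date. */
export function toV1(row: ExportRowV2): ExportRowV1 {
  return { id: row.id, createdAt: row.createdAt, total: row.total };
}
```

Everything that isn't in that documented surface is the part you're allowed to throw away.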
I don't buy the idea of immutable contracts becoming more common with AI-assisted coding. One of the areas where AI shines is coding against a spec. One of the big problems in old systems is that the cost of upgrading consumers to a new API version gets too expensive. Using AI agents, you can just do it for them. Refactoring getting cheaper isn't going to make API contracts immutable; it's going to make them more mutable.
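To make that concrete, this is the kind of mechanical consumer upgrade I mean (endpoints and response shapes are purely hypothetical), the sort of change an agent can grind through at every call site:

```typescript
// Hypothetical v1 -> v2 migration of one API consumer. The contract changed
// (new path, new filter syntax, cursor pagination); the agent's job is to
// apply this rewrite mechanically across the whole codebase.

interface ApiClient {
  get(path: string, params: Record<string, string>): Promise<any>;
}

// Before (v1): const { orders } = await api.get("/orders", { status: "open" });

// After (v2):
export async function fetchOpenOrders(api: ApiClient): Promise<unknown[]> {
  const orders: unknown[] = [];
  let cursor: string | undefined;
  do {
    const page = await api.get("/v2/orders", {
      "filter[status]": "open",
      ...(cursor ? { cursor } : {}),
    });
    orders.push(...page.data); // v2 wraps results in `data`
    cursor = page.nextCursor;  // and paginates with a cursor
  } while (cursor);
  return orders;
}
```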
> This created a culture of careful engineering: clean code, thoughtful architecture, and refactoring to reduce technical debt.
I wish!
I think we have enough anecdata that users don’t like a changing interface. They like keeping things the same, mostly.
So how can you keep generating disposable software on this layer?
And what you mostly want to change in software is adding new features or handling more usage. Doing that, in most cases, requires changes to the data store and the “hand crafted core”.
So which part, in practice, will be disposable, and how often will it be "generated again"?
Maybe for simple small stuff, like how fast Excel sheets are being made, changed and discarded? Maybe for embedded software?
I think that Disposable Systems can combine very well with Malleable Software[1]. Imagine a program, Photoshop for example, with a plugin system and a built-in code agent: if the program doesn’t have the tool you want by default, you could ask the agent to create it in five minutes. Each user would end up with a unique set of tools and a program molded to them.
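A rough sketch of the boundary I'm imagining (all names invented): the host app commits only to a tiny plugin contract, and each agent-generated tool just has to satisfy it.

```typescript
// Hypothetical plugin contract: the host app only promises this interface;
// the tools behind it can be generated, regenerated, or thrown away per user.

export interface ImageSelection {
  width: number;
  height: number;
  pixels: Uint8ClampedArray; // RGBA
}

export interface ToolPlugin {
  name: string;
  description: string; // shown in the tool palette
  run(selection: ImageSelection): Promise<ImageSelection>;
}

// Example of a one-off, agent-written tool that satisfies the contract:
export const desaturate: ToolPlugin = {
  name: "desaturate",
  description: "Convert the current selection to grayscale",
  async run(sel) {
    const px = sel.pixels;
    for (let i = 0; i < px.length; i += 4) {
      const g = Math.round(0.299 * px[i] + 0.587 * px[i + 1] + 0.114 * px[i + 2]);
      px[i] = px[i + 1] = px[i + 2] = g;
    }
    return sel;
  },
};
```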
Too much emphasis on contracts could lead to rigid integrations that are never allowed to evolve or flex, and that rigidity might itself become a source of brittleness in the system. What if AI were allowed to discover changes to API contracts and adjust the interactions accordingly, decoupling components and giving them some room to evolve? That could provide more reliability.
I like the perspective and phrasing: build the foundation carefully, then vibe code the colors on the walls, the decoration in the rooms, and the design of the wallpaper/carpets.
Want a dashboard for an API with OpenAPI docs or for a SQL database with a known schema, or a quick interactive GUI that highlights something in `perf stat` data? Unleash Claude.
Exactly where I figured it was going. LLMs generate too much "magic" unmaintainable code, so when it breaks, just hope the next model is out and start all over.
> As software gets cheaper to produce (thanks to coding agents) and quality expectations shift
Shifting quality expectations are a result of the load of crappy software we already experience, not a change in what we actually want from software. Shipping crap because people "expect it" isn't a good thing; it just means most software is crap. It's something we should work against by producing less slop, not more.
In other words, focus on the interface and not on the module implementation. In the control theory domain we call this black-box modelling.
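In code terms it looks something like this (illustrative names only): the caller depends on the box's inputs and outputs, never on which implementation, hand-written or generated, sits behind them.

```typescript
// The caller only ever sees this interface (the "black box" boundary).
export interface RateLimiter {
  /** Returns true if the call identified by `key` is allowed right now. */
  allow(key: string): boolean;
}

// One possible implementation; it can be rewritten or regenerated freely
// as long as it keeps honoring the interface above.
export class FixedWindowLimiter implements RateLimiter {
  private counts = new Map<string, { windowStart: number; n: number }>();

  constructor(private limit: number, private windowMs: number) {}

  allow(key: string): boolean {
    const now = Date.now();
    const entry = this.counts.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      this.counts.set(key, { windowStart: now, n: 1 });
      return true;
    }
    entry.n += 1;
    return entry.n <= this.limit;
  }
}
```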
Validating the correctness of AI output seems like one of the biggest problems we are going to face. AI can generate code far faster than humans can adequately review it.
My work is in formal verification, and we’re looking at how to apply what we do to putting guard rails on AI output.
It’s a promising space, but there’s a long way to go, and in the meantime, I think we’re about to enter a new era of exploitable bugs becoming extremely common due to vibe coding.
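To give a sense of the spectrum: full formal verification is the long-term goal, but even something far weaker, like executable contracts wrapped around generated code, already catches a lot. A toy sketch (hypothetical, not a description of any real tooling):

```typescript
// Not formal verification, just a lightweight runtime guard rail: wrap a
// generated function in postcondition checks so violations fail loudly
// instead of shipping silently.

type Sort = (xs: number[]) => number[];

export function withSortContract(generatedSort: Sort): Sort {
  return (xs) => {
    const out = generatedSort([...xs]); // don't let it mutate the input
    // Postcondition 1: output is ordered.
    for (let i = 1; i < out.length; i++) {
      if (out[i - 1] > out[i]) throw new Error("contract violated: not sorted");
    }
    // Postcondition 2: output has the same elements as the input.
    const a = [...xs].sort((p, q) => p - q).join(",");
    const b = [...out].sort((p, q) => p - q).join(",");
    if (a !== b) throw new Error("contract violated: elements changed");
    return out;
  };
}
```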
I vibe coded an entire LSP server — in a day — for an oddball verification language I’m stuck working in. It’s fantastic to have it, and an enormous productivity boost, but it would’ve literally taken months of work to write the same thing myself.
Moreover, because it ties deeply into unstable upstream compiler implementation details, I would struggle to actually maintain it.
The AI took care of all of that — but I have almost no idea what’s in there. It would be foolish to assume the code is correct or safe.
The key flaw in the argument is that high-quality interfaces do not spring to life by themselves. They are produced as an artifact of numerous iterations over both interfacing components. This does not invalidate the entire article, but the interface-discovery phase has to become an integral part of disposable-systems engineering.