Hacker News

mehagar · yesterday at 5:45 PM

So the question is: is the complexity these AI tools create primarily inherent, or is a significant amount of it accidental complexity?

And if AI tools are writing all of the code, does it even matter anymore?


Replies

orderone_ai · yesterday at 6:48 PM

It absolutely does matter. LLMs still have to consume context and process complexity. The more LoC, the more complexity, the more errors, and the higher your LLM bills. That holds even in the AI-maximalist, vibe-code-only use case. The reality is that AI will have an easier time working in a well-designed, human-written codebase than in one generated by AI. And the problem of AI code output becoming AI coding input, with the model choking on its own earlier output and making more errors, tends to get worse over time; human oversight is the key tool for preventing it.
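
As a rough back-of-envelope sketch of the bills point (the tokens-per-line rate and the price below are made-up assumptions, not any provider's real numbers), the per-edit input cost scales roughly linearly with how much code the model has to re-read:

    # Rough sketch: how per-edit LLM input cost scales with codebase size.
    # All constants are illustrative assumptions, not real pricing.
    TOKENS_PER_LOC = 10          # assumed average tokens per line of code
    PRICE_PER_1K_INPUT = 0.003   # assumed dollars per 1K input tokens

    def cost_per_edit(loc_in_context: int) -> float:
        """Estimated input-token cost of one edit that re-reads loc_in_context lines."""
        tokens = loc_in_context * TOKENS_PER_LOC
        return tokens / 1000 * PRICE_PER_1K_INPUT

    for loc in (1_000, 10_000, 100_000):
        print(f"{loc:>7} LoC in context -> ~${cost_per_edit(loc):.2f} per edit")

Under those assumed numbers, 100k LoC in context is already a few dollars of input tokens per edit, before counting any retries caused by the model tripping over its own earlier output.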