No. AI is a must for software development. It's non-negotiable. The productivity gains are too great. The era of 100% human-written code is over. People will still write code by hand as an idle curiosity, for personal projects that only they intend to use. But even those open source projects with significant user bases that forbid the use of AI (like, afaik, NetBSD) will be eclipsed in features, capability, and security by those that embrace it. And the commercial world? Forget it. You cannot keep pace with your employer's expectations unless you learn to use these tools well. This is not up for debate. It's reality.
Plenty of accomplished devs are getting good results and finishing tasks with unheard-of speed using AI, so if you're still not, that's PEBKAC. You are not using the tools correctly. Figure it out before you complain.
LLMs may be a must for programming, but not for engineering. Writing code is the easy part once you figure out what actually needs to be built in the first place.
> No. AI is a must for software development. It's non-negotiable.
???
https://smsk.dev/2026/04/26/ai-cannot-self-improve-and-math-...
>You are not using the tools correctly.
Stop being deluded, man.
When this crap collapses in on itself, you'll be in tears, begging for the knowledge you failed to build without the fancy Clippys.
Now put down that fancy MegaHAL chatbot and learn to do things by hand.
> "No. AI is a must for software development. It's non-negotiable."
Absolutist rubbish.
> "But even those open source projects with significant user bases that forbid the use of AI [...] will be eclipsed by those that support it in terms of features, capability, and security."
As is this. Whether a language model is relevant to a project, open source or otherwise, of course depends heavily on the project's nature (ethics, use case, deployment, working environment/culture, et cetera).