“people like me are still needed” is just a desperate form of self-persuasion.
No, no it's not. I've seen what a "PM armed with an LLM" will do. Trust me, if you're a decent enough full-stack software engineer who can take an idea and run with it to implement it, you'll have a leg up over the PM who has the idea but no idea how to "do computers".
Most of what these PMs can produce nowadays turns boardroom heads, sure. But it's just that: visuals and just enough prototype functionality to fool the people you're demoing to. I've seen enough of these in the recent past.
Will there be some PMs that can become "software developers" while armed with an LLM? Sure!
But that's not the majority. On the other hand, yes, there are going to be "software developers" out of a job because of LLMs: the devs who were full-stack and could take an idea from 0 to 1 with very little overhead even before can now go much faster and further without handing off to intermediates and juniors. They mentor their LLM intern rather than their intermediates and juniors. The perpetual intermediate devs with 20 years of experience are the ones who are going to have a larger and larger problem, I'd say.
The Staff engineer who was able to run circles around others all along? They'll coach their LLM intern up to intermediate level rather than having to "10x" a bunch of perpetual intermediates with 20 years of experience.
What I'd love to see is videos of nontechnical folks using language models to create software.
When I use them myself, I just see them crushing it and think: this thing is now doing my job for basically $0; I am no longer economically relevant. But I've spent a lifetime learning to program, so it's possible I only get good results because of how I know to prompt it.
I really can't get the outside view so I can't decide whether AI is going to make me homeless or not. I think we need the videos.
Oddly, devops seems to be the "last bastion" of our trade, as they seem to be the only ones pushing back against PM vibe-coded stuff. While those projects usually look aesthetically pleasing, they start to fall apart when met with devops requirements for environment variables, cybersecurity, etc.
I agree with you. So far, what I see is that AI amplifies an individual's output in many domains, but the value of that is 100% contingent on their judgment. It changes the economics of many tasks, but fundamentally it can't really help you if you don't actually know what you want, which describes a shocking number of people in the corporate world, where most are there for a paycheck and perhaps to pursue some social marker of "success".
I'm under no illusions about the goals of AI company execs to justify their valuations (and expenses!) by capturing a huge chunk of global employment value, or about the CEOs of many big companies whose financials are getting squeezed for all sorts of reasons and who are all too happy to jump on the efficiency narrative of AI to justify layoffs that would have been necessary anyway. Also, AI will keep getting better, and it will certainly move up the food chain: it's already replaced a lot of what I did, and I assume capabilities will continue improving for a while even after model capabilities plateau, as we improve harnessing, tooling, and practice.
So yeah, it can replace a lot of what we do, but I'm not running scared, because every step of the way I've seen that software people are the ones who actually get the most out of LLMs. Sure, it can write all the code, so the job changes; but even as our workflows completely change, it gives us more of an edge (if we're open to it) than it gives anyone non-technical. At this stage it still feels empowering on an individual level.
Now I do worry about the consolidation of power and wealth in a tech oligarchy, but that's an issue we need to deal with at the level of society and government policy. Essentially, I can see AI having radically different outcome potential depending on how it's governed. On one hand, it can be very empowering to small teams, reduce coordination costs, and increase competition by allowing smaller groups of people to build more scalable companies. But it could also lead to unprecedented concentration of wealth and power if a small set of AI companies are allowed to capture all the economic gains. I don't think there are any easy answers, but I do feel hopeful that we can figure something out as a society; it certainly seems to be creating some unified sentiment across political lines that have been so polarized and divisive over the last decade.
I think this is exactly correct.
I agree with you overall, yet there's one workflow that works for me: instead of speccing out a feature, I let PMs vibe-code it. I then have the exact reference I need to build from.
Maybe the LLM one-shotted it the right way, maybe it needs fixes, maybe some fundamentals are misunderstood; in any case, it's easier for me to know what I need to build, for the PM to become aware of some limitations (the LLM does the job of pushing back and explaining), and overall for us to have to-the-point conversations.
This is somewhat orthogonal to what you said about dev seniority, so that part still stands.
But I think “PMs armed with an LLM” can, when properly used, add a lot of value to the dev process.