As it becomes more common for executives to think we can replace software engineering with agents, I wonder if they're basing their decisions on unrealistic perceptions of the average software engineer. I'm mulling two somewhat contradictory senses:
1. You get out of it what you put into it. A savvy CTO might be incredibly excited by everything they can do with agents and wrongly assume that all their software engineers can do the same, when in reality your org's average software engineer might not have the creativity to even think of the many cases where agents could save them work. So by mandating agent usage, you might find that productivity hasn't improved while AI costs have increased.
2. When using AI, two gaps become more obvious. First: who tells the agent what to do? In many orgs, product isn't technically savvy enough to come up with a detailed spec/plan that an LLM can use, and many cog-in-the-machine developers aren't positioned to come up with the spec either; they just want to implement it. By expecting work to be implemented by agent-using developers, you might instead find a lot of idle workers waiting for work to show up. Second is the QA/review cycle. You've introduced a big change to the org, but are you really saving cost or just shifting it?
I'm all for introducing LLMs as an optional tool to help existing developers increase velocity and quality, but I think the "let's restructure the org" movement is really dicey, especially for mid-size or smaller employers.
Related to 2: my company is strongly pushing for developers to have a product mentality and be less of a cog in the machine.
I am biased because I have more of a product mentality than other developers, but I think these are the people best positioned to be productive with agents: they know enough tech to implement things with agents, and enough product to know what should be implemented.
I expect other companies to follow.
You're basically arguing for massive headcount reductions.
> You get out of it what you put into it.
Beyond that, it's a force multiplier, and it doesn't care whether the force is positive or negative. Someone with poor software engineering principles can use AI to make an absolute mess quickly.