In my experience, it's been the complete opposite. The very experienced engineers who are actually willing to use top-of-the-line tooling are much better than they were before, including those over 40 and over 50.
Part of the practical degradation of traditional programmers over time has always been concentration and deep calculation, just like in chess. The old chess player knows chess much better than a 19-year-old phenom, but they cannot calculate for as many hours at the same speed as before, so their experience eventually loses to the raw calculation. Maybe at 35, or at 45, but you are just not as good. Claude Code and Codex save you the computation, while every single instinct and two-second "intuition", which is what you build with experience, is still online.
It's not just that it's a fairer competition: it's now unfair in the opposite direction. The senior who before could lead a team of 6 is now leading a team of agents, and reviewing their code just as before. Hell, it's easier to get the agent to change direction than most juniors around me, who won't course-correct from plain, low-judgement feedback.
Anecdotally, it feels to me like something materially changed in the US software hiring market at the start of this year. It feels like more and more businesses are taking a wait-and-see approach to avoid over-investing in human capital over the next few years.
It also feels like the hiring "signal", which was always weak before, is just completely gone now, when every job you do advertise receives over 500 LLM-written applications and cover letters that all look and feel the same.
The pro-athlete comparison in this article is a bit silly IMO: there are obvious physical issues that come with aging if you rely on your muscles to make money. If you compare to other fields of knowledge work, such as law or medicine, there are loads of examples of very experienced, very sharp operators in their 40s and 50s.
> AI-users thus become less effective engineers over time, as their technical skills atrophy
Based on my experience, I think this will prove more true than not in the long run, unfortunately.
Professionally, I see people largely falling into two camps: those that augment their reasoning with AI, and those that replace their reasoning with AI. I’m not too worried about the former; it’s the latter I’m worried about.
My mom is a (US public school) high school teacher, and she vents to me about the number of students who just take “Google AI overview” as an absolute source of truth. Maybe it’s just the new “you can’t cite Wikipedia”, but she feels that since the pandemic, there’s been a notable decline in the critical thinking skills of the children coming through her classes.
We have a whole generation (or two) of kids that have grown up being told what to like, hate, believe, etc. by influencers and anonymous people on the internet. They’d already outsourced their reasoning before LLMs were a thing. Most of them don’t appear to be ready to constructively engage with a system that is designed to make them believe they are getting what they want with dubious quality.
> If you work in construction, you need to lift and carry a series of heavy objects in order to be effective. But lifting heavy objects puts long-term wear on your back and joints, making you less effective over time. Construction workers don’t say that being a good construction worker means not lifting heavy objects. They say “too bad, that’s the job”.
My parents were both construction workers. There is an understanding that you cannot lift heavy objects forever. You stop lifting objects and move to being a foreman, a supervisor... and if you are uncomfortable learning to get others to do work that you used to do yourself, you burn out your body entirely and the consequences are horrible.
This is factual reality, but it is also a parable that has been important for me to internalize about delegation in my own career. It is not irrelevant to AI use, but I don't think it maps onto it quite as neatly.
I keep reading about how AI will be fine because people can just retrain for different careers. However, I never read what those careers are or who is going to pay for retraining.
I certainly don't have the money or time to go back to college and start a new career at the bottom.
I really wish seemingly intelligent people would stop using the abstraction analogy (like the article does). The key word is: determinism. Every level of abstraction (incl. power tools, C, etc.) added a deterministic layer you can rely on to more effectively do whatever it is that you're doing: same result, every time. LLMs use natural language to describe programming, and the result is varied at the very best (hence agents, so we can brute-force the result instead). I think the real moat is becoming the person who can actually still program.
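To make the distinction concrete, here's a toy illustration. Nothing below is a real compiler or model API; `compile_like` and `llm_like` are made-up names just to contrast the two behaviors:

```python
import hashlib
import random

def compile_like(source: str) -> str:
    # A deterministic abstraction layer: same input, same output, every time.
    return hashlib.sha256(source.encode()).hexdigest()[:12]

def llm_like(prompt: str) -> str:
    # A sampled generator: same input, varied output (think temperature > 0).
    return random.choice(["for x in xs:", "for item in xs:", "while i < n:"])

src = "print('hello')"
assert compile_like(src) == compile_like(src)          # reproducible, by construction
print({llm_like("loop over xs") for _ in range(10)})   # usually several variants
```

You can build on the first kind of layer without re-checking it; the second kind you have to verify every single time, which is exactly the reviewing work that stays human.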
> The career of a pro athlete has a maximum lifespan of around fifteen years. You have the opportunity to make a lot of money until around your mid-thirties,
This sounds ageist. I'm around 40 and feel I am at my mental peak, compared to even my mid-20s. This isn't a good analogy at all; the brain doesn't "wear out" like a professional athlete's body does, it just changes its structure. The brain is a remarkable organ.
From Reddit:
> After being laid off, a programmer becomes a welder. One day while working, he suddenly muttered to himself, "It's been so long, I've even forgotten how to solve three sum". A coworker next to him quietly replied, "Two pointers".
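For anyone who has repressed the interview grind: 3Sum asks for all triplets in an array that sum to zero, and the coworker's hint is the canonical answer: sort, fix one element, then sweep two pointers inward. A minimal sketch (function name and details are mine):

```python
def three_sum(nums):
    """Sort, anchor one element, then move two pointers inward
    to find pairs that complete a zero-sum triplet."""
    nums.sort()
    triplets = []
    for i in range(len(nums) - 2):
        if i > 0 and nums[i] == nums[i - 1]:
            continue  # skip duplicate anchors
        lo, hi = i + 1, len(nums) - 1
        while lo < hi:
            s = nums[i] + nums[lo] + nums[hi]
            if s < 0:
                lo += 1   # sum too small: move the left pointer right
            elif s > 0:
                hi -= 1   # sum too big: move the right pointer left
            else:
                triplets.append([nums[i], nums[lo], nums[hi]])
                lo += 1
                while lo < hi and nums[lo] == nums[lo - 1]:
                    lo += 1  # skip duplicate second elements
    return triplets

print(three_sum([-1, 0, 1, 2, -1, -4]))  # [[-1, -1, 2], [-1, 0, 1]]
```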
If by software engineering one means typing code character by character into a text editor, then sure, it's going to be difficult to find someone to pay you for it.
If you mean creating software, well we are creating more software than ever before and the definition of what software is has never been so diverse. I can see many different careers branching off from here.
Unless I'm missing something, there's an obvious logic issue here.
If we truly need to sacrifice our skill to be productive by using LLMs that atrophy us, then the only devs that have a limited lifespan are us. The next ones won't have a skillset to atrophy since they won't have built it through manual work.
Also, I hereby propose a public ban on the "LLMs generating code are like compilers generating machine code" analogy; rearguing the same idea time after time is getting old.
We're forgetting one thing: we (mere engineers) have control over nothing. The vast majority of us are at the mercy of executives and investors. Before AI we had some sort of grip, because our skills weren't so much of a commodity, and yeah, dealing with code and systems architecture and data and distributed systems wasn't that easy. Now AI is a tool not for us but for the higher-ups: they can finally commoditize software engineering and need only a small fraction of us. I see engineers around here fighting and discussing who'll be left behind (the 80%) and who'll remain because they're "more than mere coders" (the 20%)... What we don't discuss here is that we're all now at the mercy of Anthropic et al., and that's bad. The irony is that the vast majority of us use Anthropic, so we are just loading the guns for them to use on us. It's sad, but we call it progress. Nuts.
Was it ever a lifetime career? Haven't most people looked around and asked themselves where all the 50+ engineers are? They basically don't exist in large numbers. Ageism is real in this industry. You either save up enough money to retire early, switch into management, or get forced out of the industry eventually. AI is just accelerating the trend. I see very few junior engineers resisting AI. I see a LOT of staff+ engineers resisting it. Just look at the comments on HN. Anti-AI sentiment is real.
I'm repeating what others have essentially said, but ask yourself what's on your resume. If it says "Software Engineer" and that's all it talks about, then yeah, you might not find it's a lifetime career.
But if it's a diversity of things (that use or leverage software development) then you probably have a lifetime career ahead of you.
I've been writing software for over 40 years, but I've never seen myself as having a software engineering career. I've been a research assistant in geophysics, a marine technician on research ships, a game developer, an advisor to the UN, and a lot more. Yes, all through that I used software, but I did a lot of other things in the process of using it.
Argument A: AI means you don't learn as much, so even though you are more effective, it inhibits your growth and you shouldn't use it. However, on a pragmatic level, it's effective to hire a bajillion people, fire them at will, and get AI to do everything. You will get so many JIRA tickets closed and so many lines of code written.
Argument B: AI means you don't learn as much, and the single most useful work product of a software engineer is knowing how the code functions, so it's depriving your company of the main benefit of your work. Also, layoffs are a terrible business strategy, because every lost employee is years of knowledge walking out the door, every new hire is a risk, and red PRs de-risk the business.
Institutional and personal knowledge seem similar, but the implications of each are radically different.
I don't understand it. The time-limited career would work if we were born with innate ability for software engineering and would lose it over time by using AI. Most people are not born with that ability though, it needs to be developed first.
And read "Programming as Theory Building" already; it's not that long.
Comparing software development to carrying heavy things at a construction site feels like a real stretch to me.
'If you work in construction, you need to lift and carry a series of heavy objects in order to be effective. But lifting heavy objects puts long-term wear on your back and joints, making you less effective over time. Construction workers don’t say that being a good construction worker means not lifting heavy objects. They say “too bad, that’s the job”.'
On another note, for sure software developers are saying things like "this is the part of the job that I like" or "if you aren't doing the work, you won't be good at the work." But other people are saying this, too. I just saw an episode of "Hacks" ("QuickScribbl") where the writers say pretty much this exact thing when confronted with AI tooling designed to "make their job easier". Is writing comedy also like lifting heavy objects at a construction site?
Seems the solution here is the same it's actually always been if you want career progression: be more than just a code jockey. The true value of an engineer is to be plugged into overall roadmaps, broader thinking around product, how to achieve company goals, etc etc.
Yes, LLMs might dramatically reduce the amount of code we write by hand. But I'm a lot less convinced they'll solve all of the amorphous, human-interacting aspects of the job.
It'll move, sure.
I'm looking at proper engineering in building local LLM networks, with proper firewalls, capability access, and guards around the LLM systems that allow and enable advanced use without the "lol, delete everything" failure mode.
When there's a land grab, move to selling tools and the know-how for maintaining them and keeping them properly operated.
I also see upsells like local LLMs as a reason to do this in-house, so that companies aren't exposed to rug pulls, to vendors consuming their trade secrets, or to breaches of confidential discussions.
And LLMs aren't good at recommending tech stacks for running them; the field is moving faster than most training datasets have.
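To be concrete about the "guards" part: here's a minimal sketch of a capability allowlist around an agent's shell-tool calls. Everything in it (names, the policy choices) is hypothetical, illustrating the pattern rather than any real framework:

```python
# Hypothetical sketch: a capability allowlist wrapped around an
# LLM agent's shell-tool calls, so "lol delete everything" can't run.
import shlex

ALLOWED_COMMANDS = {"ls", "cat", "grep", "git"}        # read-mostly tools
BLOCKED_SUBCOMMANDS = {("git", "push"), ("git", "reset")}

def guard_tool_call(command_line: str) -> bool:
    """Return True if the agent's proposed command is allowed to run."""
    parts = shlex.split(command_line)
    if not parts:
        return False
    program = parts[0]
    if program not in ALLOWED_COMMANDS:
        return False  # e.g. rm, curl, pip are denied by default
    if len(parts) > 1 and (program, parts[1]) in BLOCKED_SUBCOMMANDS:
        return False  # allow `git log`, deny `git push`
    return True

# The agent loop would check every proposal before executing it:
assert guard_tool_call("grep -r TODO src")
assert not guard_tool_call("rm -rf /")
```

Deny-by-default is the point: the model can propose anything, but only a short, audited list of capabilities can actually touch the system.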
There’s a hierarchy amongst knowledge work and AI hasn’t yet been able to do the work that is rare and valuable.
Over the past two decades there have been a lot of solved problems, like building boring scalable web apps, UX design, etc., and AI is fairly good at these, enough so that good prompting can get you very far. This shouldn’t be a surprise; there’s a lot of publicly available data for this (GitHub repos etc.).
On the other hand, there are rarer computer science problems, like designing efficient datacenters, GPUs, and DL models. Think about the problems that someone of Jeff Dean’s or James Hamilton’s (AWS SVP) ability, or a skilled computer architecture researcher like David Patterson, would solve. These are incredibly hard and rare problems, and AI hasn’t been able to make much progress in these areas. That’s true for other sciences as well.
If you’re a regular Joe like me who builds boring CRUD apps, AI is coming for you.
What I mean is: if you are working on incredibly hard and rare problems that require rare skills, and those problems don’t have publicly available data that LLMs can be trained on, you’re safe from being “automated” away. If not, you must plan accordingly. Also, if you’re a skilled manager (in any field), AI cannot replace you; highly skilled managers who can get the best out of their teams have rare skills that aren’t easily replicable even amongst humans, much less AI. Although, if going forward we need fewer developers, we will need fewer managers too.
The differentiator is augmenting reasoning with AI versus replacing reasoning with AI. But those who choose to replace their reasoning with AI probably weren't good at it to begin with, because if they were, they'd choose not to replace it. The exception is if AI can actually replace reasoning (which it can't, yet); then it's game over for a career in software engineering anyway.
80% of my day-to-day job has never been pumping out lots of code. It is a complicated career, isn't it? We do a lot of alignment, design, and thinking. I can't even agree with the idea of outsourcing thinking; I think AI is very good at helping us to think clearly, but it doesn't really "think" for us.
If you do outsource it, then... you're likely very replaceable.
Totally agreed, and on point. Calculator operators aren't around much anymore.
> If the models are good enough, you will simply get outcompeted by engineers willing to trade their long-term cognitive ability for a short-term lucrative career
> (2) AI-users thus become less effective engineers over time, as their technical skills atrophy
Wouldn't (2) imply that if everyone just used AI there eventually would come a time when there aren't engineers who will outcompete you (because their skills are so atrophied)?
I take issue with the premise that "using AI means you don’t learn as much from your work." With AI assistance, I tackle far more tasks than I would without it. Learning per task goes down, but cumulative learning does not.
Software engineering today is almost nothing like the role it was 30 years ago.
Maybe if you somehow stick with the same company for your entire career, it could feel somewhat similar... but I doubt it, as 'best practices' and many other things cause it to change.
The days of 'lifetime career' had already gone for most people, way before AI arrived.
> I don’t think there’s compelling evidence that using AI makes you less intelligent overall.
That statement is evidence enough.
Is anything today a lifetime career? I’ve had at least five or six job descriptions over my time, and at least a few of them pretty much don’t exist anymore, or are changed beyond recognition.
Software is a tool to solve problems; as long as you keep finding problems that you can solve with it, you're likely to get paid to do it.
If your crowning achievement is: "I can 100% all leetcode hards" I have bad news for you.
Software engineering stopped being a career a long time ago. Companies have no respect for software engineers and treat them as a commodity that can be replaced at any time. The traditional career "progression" also doesn't exist: you can only get a pay rise so many times before becoming the most senior of seniors, unless you want to fulfil the Peter principle.
While most developers were busy grinding, the corporations did their utmost to ensure that the only sensible pathway to wealth and development, running your own business, is closed. In many countries, due to regulatory capture enacted by corrupt governments, making a profit is next to impossible, and that's if you manage to jump bureaucratic hurdles that are not present for larger corporations.
AI is just a tool. Asking whether AI will replace the software engineer is like asking whether the hammer will replace the carpenter.
So the permanent underclass those billionaires keep talking about is actually just juniors who never get a chance to become seniors.
I'm a software engineer and architect. I love my job, I love diving into the small details, I love the grand overview... I love identifying concepts and applying them to achieve elegant, high-performing systems.
I love thinking about what kind of assembler the compiler may generate (though honestly, I rarely get the chance), and I love thinking about how languages could be more dynamic. (Who's got actually-first-class functions? Like, ones that you can build, compose, combine and manipulate to the same degree you can a string or a JSON object. No, LISP, you're cheating; close, but no points.)
And yet... I don't care that much. Not because I'm late in my career (I'm 40, there are still some years left in me), but because I want to make computers do things, and what I enjoy is thinking up ways the things can happen, and sometimes the particulars that matter when making a lot of different things happen in a coherent system. And yeah, LLMs are trained on people's output, and from what I'm seeing everywhere, people are overall fairly terrible at that, and most of the plumbing-type glue being written is not worth anyone's time.
And I'm not saying I don't care because LLMs can't do my job. (Heck, even after hours of back-and-forth spec building and refining every little nook and cranny, the stupid coding agent still cheats or gets it wrong; as soon as the plan is put into motion, it'll mess things up on some scale so fundamental I should just have done it myself, even when the plan was beautifully explained, proven even, by reasoning and example alone, and on the first try.) And I hope that changes. I hope I won't have to go into such detail, that I'll become a steward of taste rather than a code reviewer, and that eventually I won't be needed even for that. I want it to replace me, so I can move to telling it what I want and having it made that way.
I hope I won't need to steward good taste, and that nobody will. I hope the applications I use in 5 years will be a collection of one-offs and gradually improving tools written _just_ for me, for my way of working and my way of thinking. I want to prompt the damn program to change itself as I discover new ways to do things, until it can eventually figure out how to automate the last bit of my task away. And then I'll go do something else exciting.
Imagine a situation where AI creates thousands of lines of code across a few repos, and then there is a production issue that AI fails to resolve. How can humans jump in and fix the bug without knowing anything about the code?
Why are we upvoting this?
Virtually the entire blog is about AI, with a ridiculous publishing rate (https://www.seangoedecke.com/page/5). Funny how I can look at this site's HTML and know right away it was done with AI.
Can we stop upvoting vibe-published articles? The arguments are flawed and don't even make sense to anyone who actually does software.
Nah, it is.
> Construction workers don’t say that being a good construction worker means not lifting heavy objects. They say “too bad, that’s the job”
I don't know, maybe in your part of the world, but where I'm from we have a series of robust worker-protection laws that try to limit the damage the work does to you. We generally consider it a bad thing for workers to damage their bodies, and if we could build houses without that, we'd prefer it.
In this specific case, we do have techniques to build software without causing damage, so why change that?
This post is arguing that maybe software engineering should start being harmful, even though we know it doesn't have to be. It's a post by a guy begging to be fed into the capitalist meat grinder. Meaningless self-sacrifice.
It never was a lifetime career; if you don't get the dough by 35, you just failed.
> I hope that this isn’t true. It would be really unfortunate for software engineers. But it would be even more unfortunate if it were true and we refused to acknowledge it.
More AI Soothsaying. Not so hard on the Inevitabilism this time.
On the contrary, in an efficient economy, every business operations manager (MBA) would be a skilled software engineer, able to comfortably manage data flows and design custom automated processes. There's so much potential energy there in unlocking this technical literacy.
Less "pure" programming, but lots more programming in general.
Was it ever? It's always seemed weird to me that people even think 'software engineering' is a career.
It's a tool for knowledge work.
No carpenter is a specialist in drills.
It seems to me that the best way to navigate a long term career is to have another specialty and use software engineering as a tool within that specialty.
Are people seriously thinking that you can make yourself dumber by using a chat UI?
If talking to an AI makes me dumber and limits my career, then all the customer-support people who ever existed were in the same or a worse position, talking to dumb humans on chat all day, answering tickets about the same topics and linking the same docs over and over. This makes no sense.
> The career of a pro athlete has a maximum lifespan of around fifteen years. You have the opportunity to make a lot of money until around your mid-thirties, at which point your body just can’t keep up with it.
If you believe this about your software career, how do you think you're going to switch into another career as a junior and keep up?
Most careers evolve as technology does.
Other professions do too, whether it's healthcare or something else.
Software, being a newer field, never really became a standardized profession in the way engineering did.
The goalposts are moving because the standards are moving, because the capabilities are moving.
Remaining a self-directed learner will remain critical.
Terribly written article that fails to make any point. Anyone who's read AI-generated code from the best models and who understands how LLMs work knows this statement is complete BS.
It will be for those fixing AI slop software. (In fact, they might need several lifetimes.)
Multiple times per week I have the same conversation. It goes something like this:
The developers who still think their job is about writing code will perhaps not have a job in the future. Brutal as it may sound, I'm fine with that; I'm getting old and I value my remaining time on the planet. Business owners who think they can do without developers because they believe LLMs replace developers are fine by me too. Natural selection will take care of them in due course.