Started coding when I was 14 and sold my first bit of code at 17, which was written in 6502 assembler.
40+ years later, I've been through many BASICs, C, C++ (CFront onwards) and now NodeJS, and I still love writing code.
Tinkering with RPi, getting used to having a coding assistant, looking forward to having some time to work on other fun projects and getting back into C++ sooooon.
What's not to love?
Yeah, I could use Cursor or whatever, but I don't; I like writing code. I guess that makes me a luddite or something, although I still develop agents. I enjoy architecting things (I don't consider myself an architect); I'm talking about my hobby hardware projects.
> I wrote my first line of code in 1983. I was seven years old, typing BASIC into a machine that had less processing power than the chip in your washing machine
I think there may be a counterpoint hiding in plain sight here: back in 1983 the washing machine didn't have a chip in it. Now there are more low-level embedded CPUs and microcontrollers to develop for than before, but maybe it's all the same now. Unfathomable levels of abstraction, uniformly applied by language models?
I'm 55 and I started at age 13 on a TI-99/4A, then progressed through Commodore 64, Amiga 2000, an Amiga XT Sidecar, then a real XT, and on and on. DOS, Windows, Unix, the first Linux. I ran a tiny BBS and felt so excited when I heard the modem singing from someone dialing in. The first time I "logged into the Internet" was to a Linux prompt. Gopher was still a bigger thing than the nascent World-Wide Web.
The author is right. The magic has faded. It's sad. I'm still excited about what's possible, but it'll never create that same sense of awe, that knowledge that you can own the entire system from the power coming from the wall to the pixels on your screen.
I know exactly how you feel. I don't know how many hours I sat in front of this debugger (https://www.jasik.com) poking around and trying to learn everything at a lower level. Now it's so different.
Cool, at 7? I started at 9 and I'm 53 now. And Claude does all the things. Need to get adjusted to that though. Still not there.
Last year I found out that I always was a creator, not a coder.
Same, but it changed when I was 17 and again when I was 27 and then 37 and so on. It has always been changing dramatically, but this latest leap is just so incredibly different that it seems unique.
Are you me?
I'm 49.... Started at 12... In the same boat
My first 286 machine had a loose CMOS battery, so I had to figure that out to make it boot into MS-DOS.
This time it does feel different, and while I'm using AI more than ever, it feels soulless and empty even when I 'ship' something.
I am younger than the author but damn this somehow hit me hard. I do remember growing up as a kid with a 486...
> the VGA Mode X tricks in Doom
Doom does not use mode-X :P ! It uses mode-Y.
That being said as a 47 years old having given 40 years to this thing as well, I can relate to the feeling.
Is there some magic lost also when using AI to write your blog post?
I don't know what these people from our now-traditional daily lamentation session are coding, where Claude can do all the work for them with just a few prompts and minimal reviews.
Claude is a godsend to me, but fuck, it is sometimes dumb as a door, loves to create regressions, and is a fucking terrible designer. Small, tiny changes? Those are actually the worst: it is easy for Claude, on the first setback, to decide to burn the whole world and start from zero again. Not to mention when it gets stuck in an eternal loop where it increasingly degenerates the code.
If I care about what I deliver, I have to actively participate in coding.
Same as assembly programmers felt when C came along I guess
I'm 43. Took a year or so off from contracting after being flat out for years without taking any breaks, just poked around with some personal projects, did some stuff for my wife's company, petitioned the NHS to fix some stuff. Used Claude Code for much of it. Travelled a bit too.
I feel like I turned around and there seem to be no jobs now (500+ applications deep is a lot when you've always been given the first role you'd applied to) unless you have 2+ years commercial AI experience, which I don't, or perhaps want to sit in a SOC, which I don't. It's like a whole industry just disappeared while I had my back turned.
I looked at Java in Google Trends the other day, it doesn't feel like it was that long ago that people were bemoaning how abstracted that was, but it was everywhere. It doesn't seem to be anymore. I've tried telling myself that maybe it's because people are using LLMs to code, so it's not being searched for, but I think the game's probably up, we're in a different era now.
Not sure what I'm going to do for the next 20 years. I'm looking at getting a motorbike licence just to keep busy, but that won't pay the bills.
I'm 47 and excited to live in a time of the most important innovation since the printing press.
Abstractions can take away but many add tremendous value.
For example, the author has coded for their entire career on silicon-based CPUs but never had to deal with the shittiness of wire-wrapped memory, where a bit-flip might happen in one place because of a manufacturing defect, and good luck tracking that down. Ever since lithography and CPU packaging, the CPU has been protected from the elements. Its thermal limits are well known, computed ahead of time, and baked into thermal management, so it doesn't melt but still goes as fast as we understand to be possible for its size. And we make billions of these every day, and have done so for over 50 years.
Moving up the stack you can move your mouse “just so” and click, no need to bit-twiddle the USB port (and we can talk about USB negotiation or many other things that happen on the way) and your click gets translated into an action and you can do this hundreds of times a day without disturbing your flow.
Or javascript jit compilation, where the js engine watches code run and emits faster versions of it that make assumptions about types of variables - with escape hatches if the code stops behaving predictably so you don’t get confusing bugs that only happen if the browser jitted some code. Python has something similar. Thanks to these jit engines you can write ergonomic code that in the typical scenario is fast enough for your users and gets faster with each new language release, with no code changes.
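To make that concrete, here's a toy sketch of the observable behavior (the function and the loop are made up for illustration; the speculation itself happens invisibly inside the engine, so no real engine API appears):

    // Monomorphic warm-up: the engine observes only numbers flowing
    // through this call site, so it can compile `add` down to a raw
    // machine add guarded by cheap type checks.
    function add(a: any, b: any) {
      return a + b;
    }

    let sum = 0;
    for (let i = 0; i < 100_000; i++) {
      sum = add(sum, i); // stays on the fast, specialized path
    }

    // The assumption breaks: `+` now means string concatenation.
    // The type guard fails, the engine deoptimizes back to generic
    // code, and the result is still correct, just slower at this site.
    console.log(add("a", "b")); // "ab"

V8 even lets you watch this happen with flags like --trace-opt and --trace-deopt.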
Lets talk about the decades of research that went into autoregressive transformer models, instruction tuning, and RLHF, and then chat harnesses. Type to a model and get a response back, because behind the scenes your message is prefixed with “User: “, triggering latent capabilities in the model to hold its end of a conversation. Scale that up and call it a “low key research preview” and you have ChatGPT. Wildly simple idea, massive implications.
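A toy sketch of that harness idea, just to show how little is there (`completeText` is a hypothetical stand-in for whatever raw completion API you have; the rest is plumbing):

    // Frame the conversation as a transcript a completion model can continue.
    type Turn = { role: "User" | "Assistant"; text: string };

    // Hypothetical stand-in for a text-completion API: given a prompt
    // and stop sequences, returns the model's continuation.
    declare function completeText(prompt: string, stop: string[]): Promise<string>;

    function buildPrompt(history: Turn[]): string {
      const transcript = history.map((t) => `${t.role}: ${t.text}`).join("\n");
      // Ending with "Assistant:" cues the model to speak in that role.
      return transcript + "\nAssistant:";
    }

    async function chat(history: Turn[], message: string): Promise<string> {
      history.push({ role: "User", text: message });
      // Stop at the next "User:" so the model doesn't write both sides.
      const reply = (await completeText(buildPrompt(history), ["\nUser:"])).trim();
      history.push({ role: "Assistant", text: reply });
      return reply;
    }

That's essentially the whole trick: prefix, stop sequence, loop. The intelligence is in the weights, not the harness.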
These abstractions take you further from the machine, and yet despite that they were adopted en masse. You have to account for the ruthless competition out there: each one would've been eliminated if it hadn't proven to be worth something.
You’ll never understand the whole machine so just work at the level you’re comfortable with and peer behind the curtain if and when you need (eg. when optimizing or debugging).
Or to take a moment to marvel.
It'd be more strange if the thing you learned 43 years ago was exactly the same today. We should expect change. When that change is positive we call it progress.
I think more than ever programmers need jobs where performance matters and the naive way the AI does things doesn't cut it. When no one cares about anything other than correctness, your job turns into AI slop. The good news right now is that AI tends to produce code that AI itself struggles to work with, so large-scale projects often descend into crap. You can write a C compiler for $20,000 with an explosive stack of agents, but that C compiler isn't anywhere close to efficient or performant.
As model costs come down, that $20,000 will become a viable number for entirely AI-generated coding. So more than ever you don't want to be doing work that the AI is good enough at: either jobs where performance matters, or being able to code the stack of agents needed to produce high-quality code in an application context.
A bit younger, and exact opposite. Probably the most excited I've ever been about the state of development!
'It’s not a “back in my day” piece.'
That's exactly what it is.
"They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of."
and they still call themselves 'full stack developers' :eyeroll:
> …Not burnout…
Then maybe fade away? ;)
"The abstraction tower
Here’s the part that makes me laugh, darkly.
I saw someone on LinkedIn recently — early twenties, a few years into their career — lamenting that with AI they “didn’t really know what was going on anymore.” And I thought: mate, you were already so far up the abstraction chain you didn’t even realise you were teetering on top of a wobbly Jenga tower.
They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of.
But sure. AI is the moment they lost track of what’s happening.
The abstraction ship sailed decades ago. We just didn’t notice because each layer arrived gradually enough that we could pretend we still understood the whole stack.
AI is just the layer that made the pretence impossible to maintain."
Absolutely brilliant writing!
Heck -- absolutely brilliant communicating! (Which is really what great writing is all about!)
You definitely get it!
Some other people here on HN do too, yours truly included in that bunch...
Anyway, stellar writing!
Related:
https://www.joelonsoftware.com/2002/11/11/the-law-of-leaky-a...
https://en.wikipedia.org/wiki/Tower_of_Babel
https://en.wikipedia.org/wiki/Abstraction_(computer_science)
https://en.wikipedia.org/wiki/Abstraction
https://ecommons.cornell.edu/entities/publication/3e2850f6-c...
The irony of these "My craft is dead" posts is that they consistently, heavily leverage AI for their writing. So you're crying about losing one craft to AI while using AI to kill another. It's disingenuous. And yes it is so damn obvious.
As someone who has always enjoyed designing things, but was never really into PUZZLES, I always felt like an outsider in the programming domain. People around me really enjoyed the "fun" of programming, whereas I was more interested in the Engineering of the thing - balancing tradeoffs until within acceptable margins and then actually calling it "DONE". People around me rarely called things "done", they rewrote it and rewrote it so that it kept satisfying their need for puzzle-solving (today, it's Ruby, tomorrow, it's rewritten in Scala, and the day after that, it's Golang or Zig!)
I feel that LLMs have finally put the ball in MY court. I feel sorry for the others, but you can always find puzzles in the toy section of the bookstore.
It's not like it's changing by itself, you can always opt out of the slop race and scratch your itches instead.
>But sure. AI is the moment they lost track of what’s happening. The abstraction ship sailed decades ago.
Bullshit. While abstraction has increased over time, AI is no mere incremental change. And the almost natural-language interaction with an agent is not the same as TypeScript over assembly (not to mention you could very well write C or Rust and the like, and know most of the details of the machine by heart; and no, microcode and low-level abstractions are not a real counter-argument to that). Even less so if agents turn autonomous and you just herd them to completion.
“... when I was 7. I'm 50 now and the thing I loved has changed”
Welcome to the human condition, my friend. The good news is that a plurality of novels, TV shows, country songs, etc. can provide empathy for and insight into your experience.
I'm 57 and wrote my first line of BASIC in 1980, so while I can still chime in on this specific demographic I feel that I ought to. So I'm like this guy, but like a lot of other people in my specific demographic we aren't writing these long melancholy blog posts about AI, because it's not that big of a deal.
As an OSS maintainer most of my work is a lot of boring slog: adding features to libraries to suit new features in upstream dependencies, nitpicky things people point out, new docs, tons of tedium. Claude helps a ton with all of that. No way is Claude doing the real architectural puzzle stuff, that's still fully on me! I can just use Claude to help implement it. It's like the ultimate junior programmer assistant. It's certainly a new, different and unique experience in one's programming career, but it really feels like another tool, like an autocomplete or code refactoring tool that is just a lot better, with similar caveats.
I mean in my career, I've had to battle the whole time people who don't "get" source code control (starting with me), who don't "get" IDEs (starting with me), people who don't "get" distributed version control (same), people who don't "get" ORMs (oh yes, same for me, though this one I took much more dramatic steps to appreciate), people who don't "get" code formatters. Now we're battling people who don't "get" LLMs used for coding. In that sense the whole thing doesn't feel like that novel of a situation.
It's the LLMs that are spitting out fake photos and videos and generating lots of shitty graphics for local businesses, that's where I'm still wielding a pitchfork...
I have the opposite take. There's nothing stopping you from jumping into any component to polish things up. You can code whatever you wish. And AI takes away nearly all of the drudgery: boilerplate, test cases, inspecting poor documentation, absurd tooling.
It also lets me focus more on improving things since I feel more liberated to scrap low quality components. I’m much braver to take on large refactors now – things that would have taken days now take minutes.
In many ways AI has made up for my growing lack of patience and inability to stay on task until 3am.
There's 3-4 of these posts a day - why don't people spend more time hand-building things for fun in their free time? That's what led a lot of us to this career path to start with. I have a solid mix of hand-code and AI-assisted projects in my free time.
>>The machines I fell in love with became instruments of surveillance and extraction.
Surveillance and Extraction
"We were promised flying cars", and what we got was "investors" running the industry off the cliff into cheap ways to extract money from people instead of real innovation.
> I started programming when I was seven because a machine did exactly what I told it to
What a poetic ending. So beautiful! And true, in my experience.
This isn't new. It's the same feeling the first commercial programmers had working in assembly, or machine code, once compilers became available. Ultimately I think even Mel Kaye forsook being able to handpick memory locations for optimum drum access before his retirement, in favor of being able to build vastly more complex software than before.
AI has just vastly extended your reach. No sense crying about it. It is literally foolish to lament the evolution of our field into something more.
Programming is dead. In the last 4 days I've done 2 months of work. The future is finally here.
Bad times to be a programmer. Start learning business.
I'm 57. I was there when the ZX81 came out.
I had my first paid programming job when I was 11, writing a database for the guy that we rented our pirate VHS tapes from.
AI is great.
I don't program as a career, but I'm also 50 and have been programming since the TRS-80. AI has transformed this era, and I LOVE IT! I can focus on making, and not APIs or syntax or all of the bootstrapping.
Professional development is changing dramatically. Nothing stops anyone from coding "the old way," though. Your hobby project remains yours, exactly the way you want it. Your professional project, on the other hand, was never about you in the first place. It's always about the customer/audience/user, period full stop.
Please stop upvoting these posts. We have gotten to the point where both the front page and new page is polluted with these laments
It’s literally the same argument over and over and it’s the same comments over and over and over
HN will either get back to interesting stuff or simply turn into a support group for aging “coders” that refuse to adapt
I’m going to start flagging these as spam
same bud.
maybe that just means it's a maturing field and we gotta adapt?
yes, the promise has changed, but you still gotta do it for the love of the game. Anything else doesn't work.
11 and now 45. I am still interested in it, but I feel like in my 20s I would get a dopamine rush when a row showed up in a database. In my 30s I would get that only if a message passed through a system and updated on-screen analytics within 10 seconds. Thank god for LLMs, because all of it became extremely boring; I can't stand having to chase these little milestones at each new company or each new product I'm working on. At least with LLMs the dopamine hit comes from being in awe of the generated code: realizing it found every model, every messaging-system interface, every API, figured out how to make it backwards compatible, and updated the UI. Something that would take half a day now takes 5 minutes or less.
I’m 50 too, and I’ve complained and yearned about the “old” days too. A lot of this is nostalgia as we reminisce about periods of time in our youth when we had the exuberance and time to play and build with the technology of our own time.
Working in AI startups strangely enough I see a lot of the same spirit of play and creativity applied to LLM based tools - I mean what is OpenClaw but a fun experiment
Those kids these days are going to reminisce about the early days of AI when prompts would be handwritten and LLMs would hallucinate
I’m not really sure 1983, 1993 or 2003 really was that golden an age, but we look at it with rose-colored glasses.
Great post. Good to see someone posting something positive for a change about the shift in development.