Hacker News

noemit · yesterday at 9:52 AM

Not a day goes by that a fellow engineer doesn't text me a screenshot of something stupid an AI did in their codebase. But no one ever mentions the hundreds of times it quietly wrote code that is better than most engineers can write.

The catch about the "guided" piece is that it requires an already-good engineer. I work with engineers around the world and the skill level varies a lot - AI has not been able to bridge the gap. I am generalizing, but I can see how AI can 10x the work of the typical engineer working at startups in California. Even your comment about curiosity highlights this. It's the beginning of an even more K-shaped engineering workforce.

Even people who were previously not great engineers, if they are curious and always enjoyed the learning part, are now supercharged to learn new ways of building; they can try things out and learn from their mistakes at an accelerated pace.

Unfortunately, this group, the curious ones, IMHO is a minority.


Replies

_dwt · yesterday at 6:38 PM

I am going to try to put this kindly: it is very glib, and people will find it offensive and obnoxious, to implicitly round off all resistance or skepticism to incuriosity. Perhaps alienating AI critics even further is the goal, in which case, carry on.

But if you are genuinely confused by the attitudes of your peers, try asking not "what do I have that they lack" ("curiosity"?) but "what do they see that I don't" or "what do they care about that I don't"? Is it possible that they are not enthusiastic for the change in the nature of the work? Is it possible they are concerned about "automation complacency" setting in, precisely _because_ of the ratio of "hundreds of times" writing decent code to the one time writing "something stupid", and fear that every once in a while that "something stupid" will slip past them in a way that wipes the entire net gain of AI use? Is it possible that they _don't_ feel that the typical code is "better than most engineers can write"? Is it possible they feel that the "learning" is mostly ephemera - how much "prompt engineering" advice from a year ago still holds today?

You have a choice. It's easy to label them (us?) as Luddites clinging to the old ways out of fear, stupidity, or "incuriosity". If you really want to understand, or even change some minds, though, try asking these people what they're really thinking, and listen.

tern · yesterday at 10:26 AM

I am solidly in this "curious" camp. I've read HN for the past 15(?) years. I dropped out of CS and got an art degree instead. My career is elsewhere, but along the way, understanding systems was a hobby.

I always kind of wanted to stop everything else and learn "real engineering," but I didn't. Instead, I just read hundreds (thousands?) of arcane articles about enterprise software architecture, programming language design, compiler optimization, and open source politics in my free time.

There are many bits of tacit knowledge I don't have. I know I don't have them, because I have that knowledge in other domains. I know that I don't know what I don't know about being a "real engineer."

But I also know what taste is. I know what questions to ask. I know the magic words, and where to look for answers.

For people like me, this feels like an insane golden age. I have no shortage of ideas; the only things I'm short of are hands, eyes, and, on a good week, tokens.

kif · yesterday at 1:03 PM

But that's the problem: something that can be so reliable at times can also fail miserably at others. I've seen this in myself and in colleagues, where LLM use leads to faster burnout and higher cognitive load. You're not just coding anymore; you're deciding what needs to be done, and then reviewing it as if someone else wrote the code.

LLMs are great for rapid prototyping, boilerplate, that kind of thing. I use them daily myself. But in my experience the number of mistakes Claude makes is not negligible.

codebolt · yesterday at 11:45 AM

One issue is that developers have been trained for the past few decades to look for solutions to problems online by just dumping a few relevant keywords into Google. But to get the most out of AI you should really be prompting as if you were writing a formal letter to the British throne explaining the background of your request. Basic English writing skills, and the ability to formulate your thoughts in a clear manner, have become essential for engineering (and something many developers simply lack).
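As an illustration (the request below is invented, not from the thread), compare the keyword-dump habit with a fully contextualized prompt:

```
Keyword dump:   "python retry requests timeout"

Contextualized: "I have a Python service that calls a flaky internal HTTP
API using the requests library. Calls occasionally time out under load.
Please add retries with exponential backoff, cap the total wait at 30
seconds, and keep the existing error logging intact."
```

The second form hands the model its constraints instead of making it guess them.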

kdheiwns · yesterday at 10:11 AM

Engineers will go back in and fix it when they notice a problem. Or find someone who can. AI will send happy little emoji while it continues to trash your codebase and bring it to a state of total unmaintainability.

hansmayer · yesterday at 10:45 AM

> But no one ever mentions the hundreds of times it quietly wrote code that is better than most engineers can write.

Because the instances of this happening are a) random and b) rare?

javadhu · yesterday at 10:04 AM

I agree on the curiosity part. I have a non-CS background, but I learned to program purely out of curiosity. That led me to build production applications that companies actually use, and this was before the AI era.

Now, with AI I feel like I have an assistant engineer with me who can help me build exciting things.

godelski · yesterday at 6:54 PM

> But no one ever mentions the hundreds of times it quietly wrote code that is better than most engineers can write.

Your experience is the exact opposite of mine. I have people constantly telling me how LLMs are perfectly one-shotting things. I see it from friend groups, coworkers, and even here on HN. It's also what the big tech companies are often saying.

I'm sorry, but to say that nobody is talking about success and just concentrating on failure is entirely disingenuous. You claim the group is a minority, yet all evidence points otherwise. The LLM companies wouldn't be so successful if people didn't believe it was useful.

sn0wflak3s · yesterday at 3:48 PM

The K-shaped workforce point is sharp and I think you're right. The curious ones are a minority, but they've always been the ones who moved things forward. AI just made the gap more visible :)

Your Codex case study with the content creators is fascinating. A PhD in Biology and a master's in writing building internal tools... that's exactly the kind of thing I meant by "you can learn anything now." I'm surrounded by PhDs and professors at my workplace, and I'm genuinely positive about how things are progressing. These are people with deep domain expertise who can now build the tools they need. It's an interesting time. Please write that up.

Frannky · yesterday at 6:33 PM

This is my experience too. Also, the ones not striving for simplicity and not architecting end up with giant monsters that are very unstable and very difficult to update or make robust. They usually then look for another engineer to solve their mess. Usually, the easy way for the new engineer is just to architect and then turbo-build with Claude Code. But they are stuck in sunk cost prison with their mess and can't let it go :(

gavmor · yesterday at 9:42 PM

When AI screws up, it's "stupid." When AI succeeds, I'm smart.

It's some cousin of the Fundamental Attribution Error.

dboreham · today at 1:31 AM

> something stupid an AI did in their codebase

I have LLMs write code all day almost every day, and these days I really haven't seen this happen. I see the odd thing here and there (e.g. the LLM finds two instances of the same error path in the code and decides to emit a log message in one place but throw an exception in the other), but nothing plain wrong recently.
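A hypothetical sketch of the kind of inconsistency described above (the config-loading functions are invented for illustration): two copies of the same error path, produced in one pass, handled two different ways.

```python
import json

def load_config(path: str) -> dict:
    """First copy of the error path: a missing file is logged
    and silently replaced with defaults."""
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        print(f"config not found at {path}, using defaults")
        return {}

def load_overrides(path: str) -> dict:
    """Second copy of the same error path: the identical
    condition now raises instead of defaulting."""
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        raise RuntimeError(f"overrides not found at {path}")
```

Neither choice is wrong in isolation; what the reviewer has to catch is that the two were made inconsistently within the same change.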

input_sh · yesterday at 10:03 AM

Quite frankly, if AI can write better code than most of your engineers "hundreds of times", then your hiring team is doing something terribly wrong.

pydry · yesterday at 10:06 AM

>But no one ever mentions the hundreds of times it quietly wrote code that is better than most engineers can write.

Are you serious? I've been hearing this constantly since mid-2025.

The gaslighting over AI is really something else.

I've also never seen jobs advertised whose purpose was to lobby skeptical engineers about how to engage in technical work. This is entirely new. There is a priesthood developing around this.
