Hacker News

bambax · yesterday at 10:18 AM · 4 replies

I agree wholeheartedly with all that is said in this article. When guided, AI amplifies the productivity of experts immensely.

There are two problems left, though.

One is that laypersons don't understand the difference between "guided" and "vibe coded". This shouldn't matter, but it does, because in most organizations managers are laypersons who don't know anything about coding whatsoever, aren't interested in the topic at all, and think developers are interchangeable.

The other problem is: how do you develop those instincts when you're starting out, now that AI is a better junior coder than most junior coders? This is something we need to think hard about as a society. We old farts are going to be fine, but we're eventually going to die (retire first, if we're lucky; then die).

What comes after? How do we produce experts in the age of AI?


Replies

jinko-niwashi · yesterday at 7:55 PM

The instincts can absolutely be developed faster with AI — if you set it up right. I work with an AI partner daily and one thing I've noticed is that it's a brutal mirror: it exposes gaps in your thinking immediately because it does exactly what you tell it, not what you meant.

That feedback loop, hundreds of times a day, compresses years of learning into months. The catch is you need guardrails — tests that fail when the AI drifts, review cycles you can't skip, architecture constraints it must respect.

That's what builds the instincts: not the AI doing the work for you, but the AI showing you where your understanding breaks down, fast enough that you actually learn from it. Just-In-Time Learning.
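The "tests that fail when the AI drifts" idea can be sketched as a characterization test: pin down the behavior you rely on so any rewrite that changes it fails loudly. A minimal sketch, assuming pytest-style asserts and a hypothetical `slugify` helper standing in for whatever function the AI might touch:

```python
# Guardrail / characterization test: the asserts encode behavior we
# depend on. If an AI-driven refactor of slugify() drifts, the test
# fails and forces a human review instead of silently shipping.

def slugify(title: str) -> str:
    """Turn a title into a URL-safe slug (hypothetical example)."""
    return "-".join(title.lower().split())

def test_slugify_is_stable():
    # Cases that lock in current behavior, including edge cases
    # like repeated and leading/trailing whitespace.
    assert slugify("Hello World") == "hello-world"
    assert slugify("  Spaces   Everywhere ") == "spaces-everywhere"

if __name__ == "__main__":
    test_slugify_is_stable()
    print("guardrail holds")
```

The point isn't this particular function; it's that the review cycle is enforced by the test suite rather than by remembering to look.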

sn0wflak3s · yesterday at 3:56 PM

This is the question I keep coming back to. I don't have a clean answer yet.

The foundation I built came from years of writing bad code and understanding why it was bad. I look at code I wrote 10 years ago and it's genuinely terrible. But that's the point. It took time, feedback, reading books, reviewing other people's work, failing, and slowly building the instinct for what good looks like. That process can't be skipped.

If AI shortens the path to output, educators have to double down on the fundamentals. Data structures, systems thinking, understanding why things break. Not because everyone needs to hand-write a linked list forever, but because without that foundation you can't tell when the AI is wrong. You can't course-correct what you don't understand.
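For concreteness, the "hand-write a linked list" fundamental looks like this — a minimal sketch, not tied to any particular curriculum. Knowing why `append` here costs O(n) (walking to the tail) is exactly the kind of understanding that lets you notice when an AI picks the wrong structure:

```python
# A hand-written singly linked list -- the classic fundamentals
# exercise. The value isn't the code itself but seeing the costs:
# append walks the whole chain, so it's O(n) per call.

class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

class LinkedList:
    def __init__(self):
        self.head = None

    def append(self, value):
        node = Node(value)
        if self.head is None:
            self.head = node
            return
        cur = self.head
        while cur.next:        # walk to the tail: O(n)
            cur = cur.next
        cur.next = node

    def to_list(self):
        # Collect values in order for easy inspection.
        out, cur = [], self.head
        while cur:
            out.append(cur.value)
            cur = cur.next
        return out

ll = LinkedList()
for v in (1, 2, 3):
    ll.append(v)
```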

Anyone can break into tech. That's a good thing. But if someone becomes a purely vibe-coding engineer with no depth, that's not on them. That's on the companies and institutions that didn't evaluate for the right things. We studied these fundamentals for a reason. That reason didn't go away just because the tools got better.

jstanley · yesterday at 10:20 AM

I think the problem is overstated.

People always learn the things they need to learn.

Were people clutching their pearls about how programmers were going to lack the fundamentals of assembly language after compilers came along? Probably, but it turned out fine.

People who need to program in assembly language still do. People who need to touch low-level things probably understand some of it but not as deeply. Most of us never need to worry about it.
