Hacker News

systemf_omega · yesterday at 11:05 AM · 15 replies

What I don't understand about this whole "get on board the AI train or get left behind" narrative: what advantage does an early adopter of AI tools actually have?

The way I see it, I can just start using AI once they get good enough for my type of work. Until then I'm continuing to learn instead of letting my brain atrophy.


Replies

simonw · yesterday at 11:50 AM

This is a pretty common position: "I don't worry about getting left behind - it will only take a few weeks to catch up again".

I don't think that's true.

I'm really good at getting great results out of coding agents and LLMs. I've also been using LLMs for code on an almost daily basis since ChatGPT's release on November 30th 2022. That's more than three years ago now.

Meanwhile I see a constant flow of complaints from other developers who can't get anything useful out of these machines, or find that the gains they get are minimal at best.

Using this stuff well is a deep topic. These things can be applied in so many different ways, and to so many different projects. The best asset you can develop is an intuition for what works and what doesn't, and getting that intuition requires months if not years of personal experimentation.

I don't think you can just catch up in a few weeks, and I do think that the risk of falling behind isn't being taken seriously enough by much of the developer population.

I'm glad to see people like antirez ringing the alarm bell about this - it's not going to be a popular position but it needs to be said!

quitit · yesterday at 12:16 PM

You're right, it's difficult to get "left behind" when the tools and workflows are being constantly reinvented.

You'd be wise to spend your time just keeping a high-level view until workflows stabilise and aren't being reinvented every few months.

The time to consider mastering a workflow is when a casual user of the "next release" wouldn't trivially supersede your capabilities.

Similarly we're still in the race to produce a "good enough" GenAI, so there isn't value in mastering anything right now unless you've already got a commercial need for it.

This all reminds me of a time when people were putting in serious effort to learn Palm Pilot's Graffiti handwriting recognition, only for the skill to be made redundant even before they were proficient at it.

antirez · yesterday at 11:17 AM

I think that those who say you need to get accustomed to the current "tools" around AI agents are suffering from a horizon-effect issue: this stuff will keep changing for some time, and the more it evolves, the less you need to fiddle with the details. The skill you do need is communication. You need to be able to express yourself, and what matters for your project, quickly and well. Many programmers are not great at communication. In part this is a gift, something you develop at a young age, and I believe this will somewhat change who is good at programming: good communicators / explorers may now have an edge over very strong coders who are bad at explaining themselves. But a lot of it is attitude, IMHO. And practice.

jsight · yesterday at 11:07 PM

I thought this way for a while. I still do to a certain degree, but I'm starting to see the wisdom in hurrying off into the change.

The most advanced tooling today looks nothing like the tooling for writing software 3 years ago. We've got multi-agent orchestration with built-in task and issue tracking, context management, and subagents now. There's a steep learning curve!

I'm not saying that everyone has to do it, as the tools are so nascent, but I think it is worthwhile to at least start understanding what the state of the art will look like in 12-24 months.

epolanski · today at 12:10 AM

What type of work are you doing where you wouldn't benefit from at least one of the following:

- finding information about APIs without needing to open a browser

- writing a plan for your business-logic changes, or having one reviewed

- getting a review of your code to find edge cases, potential security issues, and potential improvements

- finding information and connecting the dots of where, what, and why something works the way it does in your code base?

Even without letting AI author a single line of code (where it can still be super useful) there are still major uses for AI.

oncallthrow · yesterday at 11:16 AM

My take: learning how to do LLM-assisted coding at a basic level gets you 80% of the returns, and takes about 30 minutes. It's a complete no-brainer.

Learning all of the advanced multi-agent workflows etc. etc... Maybe that gets you an extra 20%, but it costs a lot more time and is more likely to change over time anyway. So maybe not very good ROI.

edg5000 · yesterday at 11:11 AM

It took me a few months of working with the agents to get really productive with them. The gains are significant. I write highly detailed specs (equivalent to multiple A4 pages) in markdown and dictate the agent hierarchy (which agent does what, and who reports to whom).
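A spec of the kind described might look something like this (purely a hypothetical sketch; the feature, requirements, and agent roles below are invented for illustration, not edg5000's actual format):

```markdown
# Feature: CSV import for the billing module

## Requirements
- Accept UTF-8 CSV files up to 10 MB.
- Reject rows with a missing `customer_id`; log them to `import_errors.log`.
- Monetary values are integers in cents, never floats.

## Agent hierarchy
- **Lead agent**: owns the plan, reviews all diffs, reports to me.
- **Backend agent**: implements the parser and validation; reports to the lead.
- **Test agent**: writes a unit test for every requirement above; reports to the lead.
```

The point is that the spec doubles as the agents' shared source of truth: each requirement is concrete enough to be turned into a test.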

I've learned a lot of new things this year thanks to AI. It's true that the low-level skills will atrophy. The high-level skills will grow, though; my learning rate is the same, just at a much higher abstraction level, and thus covering more subjects.

The main concern is the centralisation. The value I can get out of this thing currently well exceeds my income. AI companies are buying up all the chips. I worry we'll end up with something like the housing market, where AI costs eat up about 50% of our income.

We have to fight this centralisation at all costs!

nikcub · yesterday at 11:11 AM

I've used both cursor and claude code daily[0] since within a month of their releases - I'm learning something new about how to work with and apply these tools almost every day.

I don't think it's a coincidence that some of the best developers[1] are using these tools, and some openly advocate for them, because it still requires core skills to get the most out of them.

I can honestly say that building end-to-end products with claude code has made me a better developer, product designer, tester, code reviewer, systems architect, project manager, sysadmin etc. I've learned more in the past ~year than I ever have in my career.

[0] abandoned cursor late last year

[1] see Linus using antigravity, antirez in OP, Jared at bun, Charlie at uv/ruff, mitsuhiko, simonw et al

CuriouslyC · yesterday at 11:12 AM

AI development is about planning, orchestration, and high-throughput validation. Those skills won't go away; the quality floor of model output will just rise over time.

zahlman · yesterday at 11:31 AM

The idea, I think, is to gain experience with the loop of communicating ideas in natural language rather than code, and then reading the generated code and taking it as feedback.

It's not that different overall, I suppose, from the loop of thinking of an idea and then implementing it and running tests; but potentially very disorienting for some.

Ekaros · yesterday at 11:10 AM

By their promises it should get so good that basically you do not need to learn it. So it is reasonable to wait until that point.

bsaul · yesterday at 11:13 AM

An ecosystem is being built around AI: best prompting practices, MCPs, skills, IDE integrations, feedback loops so the LLM can test its output on its own, connections to the outside world via browser extensions, etc...
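The "feedback loop so the LLM can test its output" piece is worth making concrete. A minimal sketch of the shape of such a loop follows; the `generate_code` stub stands in for a real LLM API call (an assumption, not any specific vendor's API), and here it returns a canned answer so the loop is runnable end to end:

```python
import os
import subprocess
import sys
import tempfile


def generate_code(prompt: str) -> str:
    """Stand-in for a real LLM call; a real setup would hit an API here.

    Returns a hard-coded candidate so this sketch runs without any service.
    """
    return "def add(a, b):\n    return a + b\n"


def run_tests(code: str, test_code: str) -> bool:
    """Write candidate code plus its tests to a temp file and execute it.

    A zero exit code means every assertion passed.
    """
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code + "\n" + test_code + "\n")
        path = f.name
    try:
        result = subprocess.run([sys.executable, path],
                                capture_output=True, timeout=30)
        return result.returncode == 0
    finally:
        os.unlink(path)


def feedback_loop(prompt: str, test_code: str, max_attempts: int = 3):
    """Generate -> test -> retry until the tests pass or attempts run out."""
    for _ in range(max_attempts):
        code = generate_code(prompt)
        if run_tests(code, test_code):
            return code  # accepted: the model's output passed the checks
        # Feed the failure back so the next attempt can correct itself.
        prompt += "\nThe previous attempt failed its tests; fix it."
    return None


accepted = feedback_loop("Write add(a, b).", "assert add(2, 3) == 5")
```

The tests act as the trust boundary: the model's output is only accepted once an independent process confirms it, which is what lets the loop run unattended.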

For now i think people can still catch up quickly, but at the end of 2026 it's probably going to be a different story.

nicce · yesterday at 4:32 PM

> What I don't understand about this whole "get on board the AI train or get left behind" narrative: what advantage does an early adopter of AI tools actually have?

Replace "AI" with anything and you will notice that the people building startups in that area want to push a narrative like this, because it usually greatly increases the value of their companies. When the narrative gets big enough, big companies must follow - or they look like they're "lagging behind" - whether or not the current thing brings value. It is a fire that feeds itself. In the end, when it gets big enough, we call it a bubble. A bubble that may pop. Or not.

Whether the end user gets actual value is just a side effect. But everyone wants to believe that it brings value - otherwise they were foolish to jump on the train.

rvz · yesterday at 11:15 AM

> What I don't understand about this whole "get on board the AI train or get left behind" narrative, what advantage does an early adopter have for AI tools?

The ones pushing this narrative usually fall into one of the following groups:

* Investors in AI companies (which they will never disclose until those companies IPO or are acquired)

* Employees at AI companies with stock options, who are effectively paid boosters of AGI nonsense

* People in a mid-life crisis, paranoid that their identity as a programmer is being eroded, who feel they have to pivot to AI

It is no different from the crypto/web3 bubble of 2021. This time it is even more obvious, and the grifters from crypto/tech are already "pivoting to AI". [0]

[0] https://pivot-to-ai.com/
