Hacker News

Havoc · 01/16/2026

Certainly there is space for designs an LLM can't come up with, but let's be real: senior developers are not routinely cranking out never-before-seen novel architectures, any more than physicists are coming up with workable new theories weekly.

It's largely the same patterns and principles, applied in a manner tailored to the problem at hand, which LLMs can...with mixed success...do.

>human parity

The impact is felt not when the hardest part of the problem is cracked, but when the easy parts are.

If you have 100 humans making widgets and the AI can do 75% of the task, then you've suddenly got 4 humans competing for every 1 remaining widget job. This is going to be Lord of the Flies in the job market long before human parity.
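The arithmetic behind that claim can be sketched as follows (a simplification that assumes jobs shrink in direct proportion to the share of the task automated; the function name is mine, not from the comment):

```python
# Illustrative arithmetic only: assumes the number of jobs shrinks
# in direct proportion to the fraction of the task automated.
def humans_per_remaining_job(workers: int, automated_fraction: float) -> float:
    remaining_jobs = workers * (1 - automated_fraction)
    return workers / remaining_jobs

# 100 workers, 75% of the task automated -> 25 jobs remain,
# so 4 workers compete for each remaining job.
print(humans_per_remaining_job(100, 0.75))  # 4.0
```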

>I'd predict it's at least 10-20 years out

For AGI probably, but I think by 2030 this will have hit society like a freight train...and hopefully we've figured out what we'll want to do about it by then too. UBI or whatever...because we'll have to.


Replies

HarHarVeryFunny · 01/17/2026

> Certainly there is space for designs an LLM can't come up with, but let's be real: senior developers are not routinely cranking out never-before-seen novel architectures, any more than physicists are coming up with workable new theories weekly.

True, but I didn't mean to focus on creativity, just the nature of what can be learned when all you have to learn from are artifacts of reasoning (code), not the underlying reasoning traces themselves (the reasoning process behind why the code was designed that way). Without reasoning traces you get what we have today, where AI programming in the large comes down to cargo-cult copying of code patterns, with no understanding of whether the (unknown) design process that led to the patterns being copied reasonably applies to the requirements and situation at hand.

So it's not about novelty, but rather about having the reasoning traces (for large structured projects) available, in order to learn when to apply design patterns that are already present in the training data - to select design patterns based on a semblance of principled reasoning (RL on reasoning traces), rather than on cargo-cult code smell.

> This is going to be Lord of the Flies in the job market long before human parity.

Perhaps, and that may already be starting, but I think that until we get much closer to AGI you'll still need a human in the loop (both to interact with the AI, and to interact with the team/boss), with AI as a tool not a human replacement. So, the number of jobs may not decrease much, if at all. It's also possible that Jevons paradox applies and that the number of developer jobs actually increases.

It's also possible that human-replacement AGI is harder to achieve than widely thought. For example, maybe things like emotional intelligence and theory of mind are difficult to get right, and without them AI never quite cuts it as a fully autonomous entity that people want to deal with.

> UBI or whatever...because we'll have to.

Soylent Green?
