Hacker News

yicmoggIrl · yesterday at 11:29 PM

Excellent comment (even "mini essay"). I'm unsure if you've written it with AI-assistance, but even if that's the case, I'll tolerate it.

I have two things to add.

> This is not a moral failing. It is a psychological one.

(1) I disagree: it's not a failing at all. Resisting displacement, resisting having your identity, your existence, and the meaning you find in work taken away from you, is not a failing.

Such resistance might be futile, yes; but that doesn't make it a failing. If said resistance won, then nobody would call it a failing.

The new technology might just win, and not adapting to that reality, refusing that reality, could perhaps be called a failing. But it's also a choice.

For example, if software engineering becomes a role where you review AI slop all day, then for me it simply devolves into just another job: perhaps lucrative, but of zero interest.

(2) You emphasize identity. I propose a different angle: meaning, and intrinsic motivation. You mention:

> economic value of programming is increasingly detached from human identity

I want to rephrase it: what has been meaningful to me thus far remains meaningful, but it no longer lets me make ends meet, because my tribe no longer values it when I perform that activity that means so much to me.

THAT is the real tragedy. Not the loss of identity -- which you seem to derive from the combination of money and prestige (BTW, I don't fully dismiss that idea). Those are extrinsic motivations. It's the sudden unsustainability of a core, defining activity that remains meaningful.

The whole point of all these AI-apologist articles is that "it has happened in the past, time and again; humanity has always adapted, and we're now better off for it". Never mind those generations that got walked over and fell victim to the revolution of the day.

In other words, the AI-apologists say, "don't worry, you'll either starve (which is fine, it has happened time and again), or just lose a large chunk of meaning in your life".

Not resisting that is what would be a failing.


Replies

threethirtytwo · yesterday at 11:52 PM

I think where we actually converge is on the phenomenon itself rather than on any moral judgment about it.

What I was trying to point at is how strange it is to watch this happen in real time. You can see something unfolding directly in front of you. You can observe systems improving, replacing workflows, changing incentives. None of it is abstract. And yet the implications of what is happening are so negative for some people that the mind simply refuses to integrate them. It is not that the facts are unknown. It is that the outcome is psychologically intolerable.

At that point something unusual happens. People do not argue with conclusions, they argue with perception. They insist the thing they are watching is not really happening, or that it does not count, or that it will somehow stop before it matters. It is not a failure of intelligence or ethics. It is a human coping mechanism when reality threatens meaning, livelihood, or future stability.

Meaning and intrinsic motivation absolutely matter here. The tragedy is not that meaningful work suddenly becomes meaningless. It is that it can remain meaningful while becoming economically unsustainable. That combination is brutal. But denying the shift does not preserve meaning. It only delays the moment where a person has to decide how to respond.

What I find unsettling is not the fear or the resistance. It is watching people stand next to you, looking at the same evidence, and then effectively unsee it because accepting it would force a reckoning they are not ready for.

>I'm unsure if you've written it with AI-assistance, but even if that's the case, I'll tolerate it.

Even if it was, the world is changing. You already need to tolerate AI in code; it's inevitable that AI will be part of writing as well.
