Hacker News

_ttg · today at 6:35 AM · 4 replies

I want to sympathize, but enforcing a moral blockade on the "vast majority" of inbound inquiries is a self-inflicted wound, not a business failure. This guy is hardly a victim when the bottleneck is explicitly his own refusal to adapt.


Replies

venturecruelty · today at 6:43 AM

Survival is easy if you just sell out.

voiper1 · today at 6:58 AM

Surely there's AI usage that's not morally reprehensible.

Models trained only on public domain material, for instance, or value-add uses rather than marketing and gamification gimmicks...

nrhrjrjrjtntbt · today at 6:42 AM

I wonder if there is a pivot where they get to keep going but still avoid AI. There must be one for a small consultancy.

sddhrthrt · today at 11:00 AM

> "a self-inflicted wound"

"AI products" that are being built today are amoral, even by capitalism's standards, let alone by good business or environmental standards. Accepting a job to build another LLM-selling product would be soul-crushing to me, and I would consider it as participating in propping up a bubble economy.

Taking a stance against it is a perfectly valid thing to do, and by disclosing it plainly the author is not claiming to be a victim through no doing of their own. By not seeing past that caveat, you've missed the whole point of the article and averted your eyes from something else unfolding right in front of us: a majority of American GDP is AI this or that, and most of it has no real substance behind it.
