I feel like I’m watching group psychosis, with people just following each other off a cliff. I think the promise of AI and the potential money involved override all self-preservation instincts in some people.
It would be fine if I could just ignore it, but they are infecting the entire industry.
You need to take every comment about AI and mentally put a little bracketed note beside it recording the commenter's technical competence.
AI is basically an Eternal September for software development: by definition, it lets a bunch of people who are not competent enough to build software without AI build it anyway. This is, in many ways, a good thing!
The bad thing is that a lot of the comments and hype superficially sound like your experienced peers seeing the light, when they are actually coming from people who were never your peers, now arriving in your spaces full of enthusiasm for how they got there.
Like on the topic of this article[0]: it would be deranged for Apple (or any company with a registered entity that could be sued) to ship an OpenClaw equivalent. It is, and forever will be[1], a massive footgun that you would not want to be legally responsible for people using safely. Apple especially: a company that proudly markets its care for your privacy and data safety? Anyone with the kind of technical knowledge you'd expect around HN would know that moving first on this would be bonkers for them.
But here we are :-)
[0] OP's article is written by someone who wrote code for a few years nearly 20 years ago.
[1] While LLMs are the underlying technology: https://simonwillison.net/tags/lethal-trifecta/