> I worry about the damage caused by these things on distressed people. What can be done?
Why? We are gregarious animals; we need social connections. ChatGPT has guardrails that keep this mostly safe, and it helps with the loneliness epidemic.
It's not like the people doing this are thriving socially in the first place; better with ChatGPT than on some forum à la 4chan that will radicalize them.
I feel like this will be one of the "breaks" between generations: Millennials and Gen Z will be purists who call human-to-human connections real and anything with "AI" inherently fake and unhealthy, whereas Alpha and Beta will treat it as a normal part of their lives.
https://en.wikipedia.org/wiki/Deaths_linked_to_chatbots
If you read through that list and dismiss it as people who were already mentally ill or more susceptible to this... that's what Dr. K (psychiatrist) assumed too until he looked at some recent studies: https://youtu.be/MW6FMgOzklw?si=JgpqLzMeaBLGuAAE
Clickbait title, but well researched and explained.
Using ChatGPT to numb social isolation is akin to using alcohol to numb anxiety.
ChatGPT isn't a social connection: LLMs don't connect with you. There is no relationship growth, just an echo chamber with one occupant.
Maybe it's a little healthier for society overall if people withdraw to the point of suicide by spiralling deeper into loneliness with an AI chat instead of being radicalised to mass murder by forum bots and propagandists, but those are not the only two options out there.
Join a club. It doesn't really matter what it's for, so long as you like the general gist of it (and, you know, it's not "plot terrorism"). Sit in the corner and do the club thing, and social connections will form whether you want them to or not. Be a choir nerd, be a bonsai nut, do macrame, do crossfit, find a niche thing you like that you can do in a group setting, and loneliness will fade.
Numbing it will just make it hurt worse when the feeling returns, and it'll seem like the only answer is more numbing.
This is an interesting point. Personally, I am neutral on it. I'm not sure why it has received so many downvotes.
You raise a good point about a forum with real people that can radicalise someone. I would offer a dark alternative: it is only a matter of time before forums are essentially replaced by an AI-generated product that is finely tuned to each participant. Something a bit like Ready Player One.
Your last paragraph: What is the meaning of "Alpha and Beta"? I only know it from the context of Red Pill dating advice.
load-bearing "mostly"
https://www.cnn.com/2025/11/06/us/openai-chatgpt-suicide-law...
The tech industry's capacity to rationalize anything, including psychosis, as long as it can make money off it is truly incredible. Even the temporarily embarrassed founders that populate this message board do it openly.