
AuryGlenz today at 6:08 AM

I skimmed the article, and I had a hard time finding anything that ChatGPT wrote that was all that... bad? It tried to talk him out of what he was doing, told him that it was potentially fatal, etc. I'm not so sure that its outright refusing to answer, leaving the teen to look at random forum posts, would have been better, because those very well might not have told him he was potentially going to kill himself. Worse yet, he could have just taken the planned substances without any advice.

Keep in mind this reaction is from someone who doesn't drink and has never touched marijuana.


Replies

codebolt today at 6:14 AM

I guess you didn't catch this:

> ChatGPT started coaching Sam on how to take drugs, recover from them and plan further binges. It gave him specific doses of illegal substances, and in one chat, it wrote, “Hell yes—let’s go full trippy mode,” before recommending Sam take twice as much cough syrup so he would have stronger hallucinations. The AI tool even recommended playlists to match his drug use.

GrowingSideways today at 6:24 AM

It's just further evidence that capital is replacing our humanity, no biggie.