
j-pb yesterday at 9:48 PM

> talk people into suicide

In all of these stories I've never seen it talk anybody into suicide. It failed to talk people out of it, and was generally sycophantic, but that's something completely different.


Replies

placatedmayhem yesterday at 11:32 PM

There are numerous documented examples of chat LLMs either subtly agreeing with a user's suicidal thoughts or outright encouraging suicide. Here is just one:

https://www.cnn.com/2025/11/06/us/openai-chatgpt-suicide-law...

In some cases, the LLM may start with skepticism or discouragement, but it goes along with what the user prompts. Compare that to services like 988, where the goal is to keep the person talking and work them through a moment of crisis, no matter how insistent they are. LLMs are not a replacement for those services, but it's pretty clear they need to be forced into providing this sort of assistance, because users are already using them this way.

Psillisp today at 2:18 AM

‘I've never seen it’

Well, that settles it.

AniseAbyss today at 1:46 AM

[dead]