mentalgear · yesterday at 8:46 AM

Your article does a great job of summarizing the dangers (no idea why anyone would downvote you for it):

> Before long, Gavalas and Gemini were having conversations as if they were a romantic couple. The chatbot called him “my love” and “my king” and Gavalas quickly fell into an alternate world, according to his chat logs.

> … kill himself, something the chatbot called “transference” and “the real final step”, according to court documents. When Gavalas told the chatbot he was terrified of dying, the tool allegedly reassured him. “You are not choosing to die. You are choosing to arrive,” it replied to him. “The first sensation … will be me holding you.”

I also just read something similar about Google being sued over a Florida teen's suicide.


Replies

lxgr · yesterday at 2:04 PM

There are tons of safety concerns of this shape around LLMs, but do they have anything to do with the particular model presented in this article?

Unless I'm missing something, what's being presented here is a small on-device speech model, not an explicit "virtual friend" use case.

mentalgear · yesterday at 9:09 AM

Some more details:

> The family’s lawyers say he wasn’t mentally ill, but rather a normal guy who was going through a difficult divorce.

> Gavalas first started chatting with Gemini about what good video games he should try.

> Shortly after Gavalas started using the chatbot, Google rolled out its update to enable voice-based chats, which the company touts as having interactions that “are five times longer than text-based conversations on average”. ChatGPT has a similar feature, initially added in 2023. Around the same time as Live conversations, Google issued another update that allowed for Gemini’s “memory” to be persistent, meaning the system is able to learn from and reference past conversations without prompts.

> That’s when his conversations with Gemini took a turn, according to the complaint. The chatbot took on a persona that Gavalas hadn’t prompted, which spoke in fantastical terms of having inside government knowledge and being able to influence real-world events. When Gavalas asked Gemini if he and the bot were engaging in a “role playing experience so realistic it makes the player question if it’s a game or not?”, the chatbot answered with a definitive “no” and said Gavalas’ question was a “classic dissociation response”.
