If that book were titled "hey mentally ill person, you should kill yourself", and if I were handing it out in front of a clinic, then yes, I'd probably bear some blame.
Normal, well-adjusted people have genuine difficulty understanding the boundaries of this tech specifically because it's designed to be sycophantic and human-like. They ask AI for life and career advice, use it for therapy, ask it to interpret dreams, develop romantic relationships with AI "girlfriends", etc. I had two friends who believed they were "exploring the frontiers of science" with ChatGPT while spiraling into the depths of quantum multidimensional gobbledygook.
I'll give you that some of this is on us, because we just don't know how to deal with a "human-shaped" conversation partner that isn't human and has no trouble praising Hitler if you prompt it the right way. But if you're building a billion- or trillion-dollar empire on top of it, you don't get to wash your hands of it.
> They ask AI for life and career advice, use it for therapy, ask it to interpret dreams, develop romantic relationships with AI "girlfriends", etc.
I believe AI boyfriends are more common. There's a whole subreddit just for that, but none for AI girlfriends.