I dunno--of all the AI-based products coming out, the whole "AI girlfriend / AI boyfriend" thing bothers me the least. If someone can afford it and they want a play relationship with a computer, I don't see the harm. It's probably safer, better, and healthier than many real-human relationships. If they're getting what they need out of the computer, who are we to judge?
I would change my opinion if it could be shown to cause the kind of physical harm your cocaine example implies.
The problem is when things like this happen: https://apnews.com/article/chatbot-ai-lawsuit-suicide-teen-a...
When AI behaves sycophantically toward someone, it can encourage and exacerbate any mental health problems they already have, especially ones related to social isolation.
The issue isn't the individual but the scale: what percentage of our population are we okay with separating from reality? What secondary effects of that inability to live in reality will rear their heads? What will politics look like when anything can be made up and treated as equal to reality?
What will the mental health of society look like if every person who's on the edge has a computer telling them they're totally correct and everyone else is just a hater?