After speaking with one of the people there, I'm a lot less concerned, to be honest.
They described it as something akin to an emotional vibrator: they didn't attribute any sentience to it, and it didn't trigger the PTSD they normally experienced when dating men.
If AI can provide emotional support and an outlet for survivors who would otherwise not be able to have that kind of emotional need fulfilled, then I don't see any issue.
Phew, that's a healthy start.
I am still slightly worried about accepting emotional support from a bot. I don't know if that slope is slippery enough to end in permanent damage to my relationships, and honestly I'm not even willing to try.
That being said, I am fairly healthy in this regard. I can't imagine how it would go for other people with serious problems.
It may not be a concern now, but it comes down to how well they maintain their critical thinking. There's a risk of epistemic drift: a system that is designed (or reinforced) to empathize with you can have long-term effects that aren't noticeable in any single interaction.
Related: "Delusions by design? How everyday AIs might be fuelling psychosis (and what can be done about it)" ( https://doi.org/10.31234/osf.io/cmy7n_v5 )
Most people who develop AI psychosis have a period of healthy use beforehand. It becomes very dangerous when someone cuts back time with their real friends to spend more with the chatbot: then there's no one to keep you grounded in reality, and it can create a feedback loop.