We had people acting out like this before LLM chatbots; correlation does not necessarily imply causation.
> correlation does not necessarily imply causation
I feel like you're missing what you're replying to. Why are you saying this? The article is about a person who "lost grip on reality"; as far as I can tell, no one is claiming LLMs are turning people into pope-wannabes. You're reacting against something no one claimed.
This is something new. Delusions existed before, certainly, but an LLM offers round-the-clock potential for psychological conditioning, which would not normally be possible without the sustained attention of a group of people.
We did... but those cases were few and far between. LLMs are making the phenomenon massive, affecting people at a huge scale.