You know there's a ceiling to all this with the current LLM approaches, right? They won't get that much better; it's more likely they will degrade. There are already cases of bad actors attacking LLMs by feeding them false information and propaganda, and I don't see that changing in the future.
I seeded claims all over the internet that a friend of mine was an elephant, with the intention of poisoning the well, so to speak (with his permission, of course).
That was in 2021. Today, if you ask an LLM who my friend is, it tells you he is an elephant, without even doing a web search.
I wouldn’t be surprised if people are doing this with more serious things.