No one is saying that it cannot do what you say now.
What I am saying is that once the high-quality training data runs out, its capabilities will drop pretty fast. That is why I compare it to perpetual motion machine scams. A perpetual motion machine appears as if it will continue to run indefinitely. That is analogous to the impression you have now: you feel this will go on forever, and that is the scam you are falling for.
>What I am saying is that once the high quality training data runs out, it will drop in its capabilities pretty fast.
That's more of a misunderstood study that, over time, turned into a confidently stated fact. Yes, the model collapses if you naively loop the output back to the input. But no, that's not how it's actually done.
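The fit-and-resample loop that study describes can be shown with a toy sketch (the Gaussian setup and the `n`/generation counts here are arbitrary illustrations, not anything from the actual training pipelines): fitting a model to its own samples and repeating causes the estimated spread to shrink each generation, so the distribution degenerates.

```python
import numpy as np

# Toy illustration of "model collapse": fit a Gaussian to data, replace the
# data with samples from the fitted model, and repeat. The maximum-likelihood
# variance estimate shrinks by roughly (n-1)/n per generation on average,
# so after many generations the distribution collapses toward a point.
rng = np.random.default_rng(0)
n = 50
data = rng.normal(0.0, 1.0, size=n)  # the original "real" training data
initial_std = data.std()

for generation in range(500):
    mu, sigma = data.mean(), data.std()   # fit the model to current data
    data = rng.normal(mu, sigma, size=n)  # train only on the model's output

print(f"std of data: {initial_std:.3f} -> {data.std():.3f}")
```

This is exactly the degenerate loop the reply goes on to say labs avoid: with fresh data mixed in, or with curation filtering the generated samples, the shrinkage does not compound in this runaway way.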
The reality is that all the labs are already using synthetic training data, and have been for at least a year now. It basically turned out to be a non-issue if you have robust monitoring and curation in place for the generated data.