That's an interesting take. When the environmentally unsustainable and so far unprofitable "AI" gold rush collapses, the world will be left with thousands of "models" kicking about, with a widespread ability to run them locally. If LLMs ever do provide any genuine utility, these will be replicated at near-zero cost across the globe. Then we'll see the boot on the other foot, and more of what you predict: a vicious and disgracefully hypocritical attempt to contain and erase that data, to stop anybody obtaining any value from it. At that point the corporations will magically be all for safety and copyright again!
I think to an extent they're already laying the rhetorical groundwork for all of this, for the day they need it: "AI safety", distillation being "unfair". I expect they'll try to avoid making it purely about copyright, so as not to be seen as hypocrites, and frame it in a more convoluted manner instead.