The industry will shift, yes. At some point, remote LLM compute will be like AWS. Everyone can run bare metal at home, or VMs, or containers. Many don't.
However, you'll still want the best model and toolset, so there is something for these companies to pivot to: a model or toolchain to sell or license.
It will be interesting to see where this all lands a decade from now. Who will be left?
If you are using LLMs for local tool use, then in a decade it will no longer make sense to pay for hosted solutions. Your device will have enough compute to run powerful LLMs trivially.
If you need LLMs at scale to serve many customers, hosted solutions still make sense for availability. But by that point, models can be offered by any generic services provider, like AWS or Cloudflare. Pure AI companies that offer hosted models and nothing else will go extinct if they don't expand into other services.