How would that work? What it means for the US AI bubble to burst is that tremendous amounts of inference capacity become open for pennies on the dollar. I don't see how Mistral is or could be sheltered from that.
SLAs that are valuable to their clients, guarantees and mechanisms to protect them from data exfiltration, and generally long-term contracts with cash-stable orgs like they're currently doing.
So long as they're sufficiently liquid at the right time, they don't really need to shelter more. They need to plan for a fire sale on the bulk of their operating expenses.
Isn't running the models for end users the biggest cost at the moment?
> tremendous amounts of inference capacity become open for pennies on the dollar.
They can't be operated for pennies on the dollar, though. The likely current situation is that these products are subsidized to the point of disregarding model cost and part of the operating cost.
If the bubble bursts, inference that can't be made profitable once operating costs are factored in will be scrapped, not sold for pennies.
Btw, this makes a great argument for workers' rights: if a company owns datacenters, it can't fire its GPUs to make its Q2 look better.