AI demand isn't going away. It will just move from the data center to the local machine. On-device AI is much better for the customer than AI in the cloud. Expecting people to stick with a few dozen GB of HBM is going to be the "no one needs more than 640KB" of the 2030s.
> On device AI is much better for the customer than it being in the cloud
Which is exactly how you know it will always be nerfed. The last thing these guys want is to take their claws out of our data.
> AI demand isn't going away.
I'm not sure about that. When was the last time you used the Copilot prompt in the Run dialog or in Notepad?
It's being delayed from running on local consumer-grade machines by the AI companies themselves, specifically by making the cost of entry too expensive. OpenAI buys 40% of wafers to ensure the price of memory stays high.