Hacker News

noosphr · today at 6:18 AM · 3 replies

AI demand isn't going away. It will just move from the data center to the local machine. On-device AI is much better for the customer than AI in the cloud. Expecting people to stick with a few dozen GB of HBM is going to be the 'no one needs more than 640KB' of the 2030s.


Replies

bilekas · today at 8:08 AM

It's being delayed by AI companies, which keep it from running on local consumer-grade machines by making the cost of entry too expensive. OpenAI buys 40% of wafers to ensure the price of memory stays high.

bandrami · today at 6:44 AM

> On device AI is much better for the customer than it being in the cloud

Which is exactly how you know it will always be nerfed. The last thing these guys want is to take their claws out of our data.

egorfine · today at 10:09 AM

> AI demand isn't going away.

I'm not sure about that. When was the last time you used the Copilot prompt in the Run dialog or in Notepad?
