Hacker News

downrightmike, yesterday at 3:10 PM

Just like the previous generation of AI PCs, consumers just need a USB/PCIe NPU.

Mass adoption won't happen until those get cheap, because right now there is no mass of prosumers writing massively popular software for them.


Replies

u8080, yesterday at 3:42 PM

No, AI inference is mainly constrained by RAM capacity and RAM bandwidth; we need more fast RAM to make local AI thrive.
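A rough back-of-envelope sketch of why bandwidth matters: when decoding one token, a transformer streams essentially all of its weights from memory, so memory bandwidth puts a hard ceiling on tokens per second. The model size and bandwidth figures below are illustrative assumptions, not benchmarks.

```python
# Upper bound on decode throughput when inference is memory-bandwidth bound:
# every generated token must read all model weights from RAM once.

def max_tokens_per_sec(model_size_gb: float, bandwidth_gb_s: float) -> float:
    """Bandwidth-limited ceiling on tokens/sec (ignores compute and cache effects)."""
    return bandwidth_gb_s / model_size_gb

# Assumed: a 7B-parameter model at 4-bit quantization is roughly 3.5 GB of weights.
ddr5_desktop = max_tokens_per_sec(3.5, 90)    # ~90 GB/s dual-channel DDR5 (assumed)
unified_mem  = max_tokens_per_sec(3.5, 400)   # ~400 GB/s unified memory (assumed)

print(f"Desktop DDR5 ceiling:    ~{ddr5_desktop:.0f} tok/s")
print(f"Unified-memory ceiling:  ~{unified_mem:.0f} tok/s")
```

The NPU's compute rarely matters here; quadrupling bandwidth quadruples the throughput ceiling, which is the commenter's point about fast RAM.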
