Hacker News

politelemon · yesterday at 4:53 PM

Feasibility on commodity hardware would be the true high-water mark. Running high-end machines is the only way to get decent results at the moment, but if we can run inference on the CPUs, NPUs, and GPUs in everyday hardware, the moat should disappear.


Replies

zozbot234 · yesterday at 5:41 PM

You can already run inference on ordinary hardware, but if you want workable throughput you're limited to small models, and those have very poor world knowledge.
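
A rough sketch of why that tradeoff exists: token-by-token decoding is typically memory-bandwidth bound, since generating each token reads roughly every weight once, so decode speed is capped near bandwidth divided by model size. The numbers below (a laptop-class ~50 GB/s memory bandwidth, 4-bit quantized weights) are illustrative assumptions, not measurements:

```python
# Back-of-envelope estimate: decode speed of a bandwidth-bound LLM.
# Assumption: each generated token streams all weights through memory once,
# so tokens/s ~= memory bandwidth / weight footprint.

def model_bytes(params_billion: float, bits_per_weight: int) -> float:
    """Approximate weight footprint in bytes."""
    return params_billion * 1e9 * bits_per_weight / 8

def tokens_per_second(params_billion: float, bits_per_weight: int,
                      bandwidth_gb_s: float) -> float:
    """Rough upper bound on decode speed for a bandwidth-bound workload."""
    return bandwidth_gb_s * 1e9 / model_bytes(params_billion, bits_per_weight)

# Hypothetical commodity machine: ~50 GB/s memory bandwidth.
for size in (3, 7, 70):
    print(f"{size}B params @ 4-bit: ~{tokens_per_second(size, 4, 50):.1f} tok/s")
```

Under these assumptions a 7B model at 4-bit (~3.5 GB of weights) decodes at roughly 14 tok/s, which is usable, while a 70B model drops to about 1.4 tok/s, which is why small models are the practical ceiling on everyday hardware.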