
crystaln · today at 3:13 AM

Seems much more likely the cost will go down 99%. With open source models and architectural innovations, something like Claude will run on a local machine for free.


Replies

walterbell · today at 5:00 AM

How much RAM and SSD will future local inference need to be competitive with present cloud inference?
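
One rough way to frame that question is to estimate the RAM needed just to hold model weights at different quantization levels. The sketch below is a hypothetical back-of-envelope calculation, not a figure from either comment: the parameter counts and quantization choices are illustrative assumptions, and KV cache plus activations would add to these numbers.

```python
# Back-of-envelope estimate of RAM needed to hold model weights locally.
# Parameter counts and quantization levels below are illustrative
# assumptions, not the sizes of any specific model.

QUANT_BYTES = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}  # bytes per parameter

def weight_memory_gib(num_params: float, bytes_per_param: float) -> float:
    """GiB needed for the weights alone; KV cache and activations add more."""
    return num_params * bytes_per_param / (1024 ** 3)

if __name__ == "__main__":
    for params in (8e9, 70e9, 400e9):          # hypothetical model sizes
        for name, nbytes in QUANT_BYTES.items():
            gib = weight_memory_gib(params, nbytes)
            print(f"{params / 1e9:>4.0f}B params @ {name}: ~{gib:,.0f} GiB")
```

Under these assumptions, a 70B-parameter model quantized to 4 bits needs roughly 33 GiB for weights alone, which hints at why RAM, rather than SSD, tends to be the binding constraint for local inference.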