Hacker News

delecti | yesterday at 11:24 PM

I agree that's what it would take, but compute would need to get very cheap before keeping models running locally is feasible. That's an awful lot of memory just sitting occupied with the model loaded in it.
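A rough back-of-envelope sketch of what "an awful lot of memory" means here: the model sizes and precisions below are assumed examples, not figures from the comment, and the estimate covers only the weights, ignoring KV cache and activations.

```python
def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    """Memory in GB needed just to hold a model's weights resident."""
    return params_billions * 1e9 * bytes_per_param / 1e9

# A hypothetical 70B-parameter model at 16-bit precision:
print(weights_gb(70, 2))    # 140.0 GB resident, even when idle
# Aggressive 4-bit quantization still leaves tens of GB occupied:
print(weights_gb(70, 0.5))  # 35.0 GB
```

Either way, that memory is tied up for as long as the model stays loaded, which is the cost the comment is pointing at.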