Hacker News

singpolyma3 · today at 4:33 PM

Since the article is largely about open weights models, I think the argument is that this is the "last gasp" and soon doing inference at home will be common.


Replies

vjvjvjvjghv · today at 4:36 PM

The trend over the last few decades has been toward more centralization, and I don't see that changing. Unless we radically change our economic system, the rent seekers will always win. There will probably be fewer of them, but they will be even bigger.

philipkglass · today at 4:44 PM

The small models that I can run at home are becoming more capable, and I have replaced some API-based tasks with local inference as they improve, but large open weights models are still a lot stronger. The nice thing about larger open weights models is that competing providers serve them at modest margins and prices. I don't have the hardware to run the largest Qwen models, but I can get API access to them cheaply. Since the barriers to entry for new commercial inference providers serving these models are modest, I'm not worried that API access will become drastically more expensive at some future time.
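One reason swapping between local inference and a hosted provider is low-friction, as described above, is that most local servers (llama.cpp, Ollama, vLLM) and commercial providers expose the same OpenAI-compatible chat API, so only the base URL, model name, and key change. A minimal sketch; the URLs and model names below are illustrative placeholders, not specific providers:

```python
import json
import urllib.request


def chat_request(base_url: str, model: str, prompt: str, api_key: str = "none"):
    """Build an OpenAI-compatible /chat/completions request.

    The same function works against a local server or a hosted
    provider -- only base_url, model, and api_key differ.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )


# Same task, two backends (hypothetical endpoints and model names):
local = chat_request("http://localhost:8080", "qwen2.5-7b-instruct",
                     "Summarize this.")
hosted = chat_request("https://api.example.com", "qwen-max",
                      "Summarize this.", api_key="sk-...")
```

Sending either request with `urllib.request.urlopen` returns the familiar chat-completions JSON, which is why task code rarely needs to care where inference actually runs.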

filleduchaos · today at 4:58 PM

Running on what devices (and additionally, purchased with what money)?