
NitpickLawyer · yesterday at 5:09 PM · 3 replies

> I don't really expect the prices to be this cheap for much longer

Open models are a great proxy (and scare tactic) for what we can expect. Since they're already released and won't change, you'll get basically the same capabilities in the future at current or decreasing cost (following normal hardware improvement trends). The current SotA open models (dsv3, glm, minimax, devstral, etc.) are at or above the mini versions from the top labs (haiku, -mini, etc.), with the exception of gemini 3.0-flash, I would say. So, barring any black swan events in Taiwan, we can expect there to be enough pressure to keep prices at those points, or lower, in the future. And we can expect the trend of open models chasing the top labs to continue. The biggest "gain" from open models is that they can't go backward. We can only stagnate or improve, on all fronts (capabilities, sizes, cost, etc.).


Replies

biophysboy · yesterday at 5:28 PM

Good points. I'm even more optimistic now!

imiric · today at 12:11 AM

I would like to live in a world where open source (not just open weights) models dominate the landscape, but I don't see that happening.

The moat is not the quality of the models but the compute. Any open model that consumers can run locally will pale in performance next to SOTA commercially hosted models; there's just no comparison. The really big open models (e.g. DeepSeek, Kimi K2) come much closer, but they're not accessible to most consumers, who still have to rely on companies to host them.

Maybe this will change one day, but considering how this industry is artificially inflating the cost of hardware, I wouldn't bet on that happening anytime soon. In the meantime, mega-corporations are building out increasingly larger datacenters to meet the demand, and the moat grows.

_DeadFred_ · yesterday at 8:13 PM

I would love to see local libraries offer access to their own AI models (whatever those might be). I think it fits their function and could really serve the local community in the future. Plus it would be cool to have deeper knowledge of AI distributed to someone in every community: the person maintaining the local setup.