
m-hodges today at 2:13 PM

As frontier models get closer and closer to running on consumer hardware, what's the moat for the API-driven $trillion labs?


Replies

OJFord today at 2:28 PM

Assuming 'moat' – they'll push the frontier forward; they don't really have to worry until progress levels off.

At that point, I suppose there's still paid harnesses (people have always paid for IDEs despite FOSS options) partly for mindshare, and they could use expertise & compute capacity to provide application-specific training for enterprises that need it.

stri8ted today at 2:25 PM

48 GB is not consumer hardware. But fundamentally, there are economies of scale (batching, power distribution, better utilization, etc.) that mean data-center tokens will be cheaper. Also, as the cost of training frontier models increases, it's not clear that Chinese companies will continue open-sourcing them. Notice, for example, that Qwen-Max is not open source.
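The batching point can be illustrated with a toy cost model. All numbers here (GPU rental price, step time, batch size) are illustrative assumptions, not real benchmarks: the idea is only that a decode step is roughly memory-bandwidth bound, so serving many requests in one batch amortizes the per-step cost.

```python
# Toy per-token cost model for GPU inference (illustrative numbers only).
# Assumption: one decode step produces one token per request in the batch,
# and step time is roughly constant until the GPU becomes compute-bound.

GPU_COST_PER_HOUR = 2.0   # assumed rental price, USD/hour
STEP_TIME_S = 0.03        # assumed wall-clock time per decode step

def cost_per_million_tokens(batch_size: int) -> float:
    """USD to generate one million tokens at a given batch size."""
    tokens_per_second = batch_size / STEP_TIME_S
    cost_per_second = GPU_COST_PER_HOUR / 3600
    return 1e6 * cost_per_second / tokens_per_second

# A single local user (batch of 1) vs. a data center batching 64 requests:
print(cost_per_million_tokens(1))   # ~16.67 USD per million tokens
print(cost_per_million_tokens(64))  # ~0.26 USD per million tokens
```

Under these assumptions the data center's per-token cost is 64x lower simply because the same decode step serves 64 requests; real serving stacks (continuous batching, paged KV caches) push in the same direction.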

BoredomIsFun today at 3:00 PM

> the API-driven $trillion labs?

here we go: https://huggingface.co/collections/trillionlabs/tri-series