Hacker News

hn_throwaway_99 · today at 2:51 PM

Your comment is responding to an issue that is different from what GP raised. GP was talking about Chinese open source specifically, i.e. their open-source models, which AFAIK have consistently kept up with (albeit a few steps behind) the closed-source OpenAI and Anthropic models.

Hardware capacity is a separate issue entirely.


Replies

CharlieDigital · today at 2:58 PM

    > have consistently been keeping up with (albeit a few steps behind) 
I mean, this sentence is self-contradictory, no?

    > Hardware capacity is a separate issue entirely.
It seems like hardware capability is at the very heart of both training and inference, which is why Nvidia and TSMC are posting record revenue and market capitalization. Divorcing hardware from the equation feels like discounting a big part of winning this race.