> have consistently been keeping up with (albeit a few steps behind)

I mean, this sentence is self contradictory, no?

> Hardware capacity is a separate issue entirely.

It seems like hardware capabilities are at the very heart of both training and inference, which is why Nvidia and TSMC are hitting record income and capitalization. Feels like divorcing hardware from the equation is discounting a big part of winning this race.

> I mean, this sentence is self contradictory, no?
By benchmarks, the Chinese models are ahead of where the proprietary US models were... something like 6 or 12 months ago. And all the benchmarks are a bit fuzzy anyway on whether a small gap is trivial or significant. The Chinese aren't having any problems keeping up on model quality. The gap isn't going to lead to any difference that matters unless the US pulls a rabbit out of its hat.
Plus, dollar-for-performance they might be leading in practice; it is hard to compete with self-hosted.
You can keep up even if you're behind. If someone is running a race and you're consistently two seconds behind their time, you are steps behind but still keeping up.
> I mean, this sentence is self contradictory, no?
As others have pointed out, no, not at all. For specifics, see the chart from this link posted by another commenter: https://hai.stanford.edu/news/inside-the-ai-index-12-takeawa... . If anything, Chinese models are closing the gap with US models, not falling behind.
> Feels like divorcing hardware from the equation is discounting a big part of winning this race.
Depends on what race you're talking about. When it comes to "who has the most powerful models", I'd argue hardware is actually not that significant - China obviously has the capacity to train good models.