This should erase any doubt that AI Labs are making $$$ on API inference.
Kimi 2.5 (which this is based on) is served at $0.44 per million input tokens / $2 per million output tokens by a ton of different providers on OpenRouter; 2.6 will certainly be similar.
That's roughly 1/11th the price of Opus for similar smarts.
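A quick back-of-envelope check of that ~11X figure, assuming Opus is priced around $5 input / $25 output per million tokens (the Opus numbers are an assumption here, not quoted in the thread):

```python
# Assumed prices in USD per million tokens.
kimi_in, kimi_out = 0.44, 2.00   # from the comment above
opus_in, opus_out = 5.00, 25.00  # assumed Opus pricing, for illustration

ratio_in = opus_in / kimi_in     # input-price ratio
ratio_out = opus_out / kimi_out  # output-price ratio
print(f"input: {ratio_in:.1f}x cheaper, output: {ratio_out:.1f}x cheaper")
```

Under those assumed prices the ratios come out to roughly 11.4x on input and 12.5x on output, which is where "about 11X" lands.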
Famously, OpenAI and Anthropic are devoted to increasing efficiency before scaling up resource usage.
How does it erase any doubt? You're implying Chinese things can't actually be cheaper to produce than American ones, which is laughable.
It's worth noting that the US is very behind on energy infra, and that might affect the cost calculations, since data centers are electricity guzzlers. Also, I'm not sure if CN has completely switched away from Nvidia or is still using their chips for training.