Hacker News

dijit · today at 2:06 PM · 14 replies

Frontier AI companies are selling at a loss.

Setting aside everything else that u/bastawhiz said[0], the obvious fact here is that Claude, OpenAI, Gemini et al. are quite literally burning through hundreds of billions of dollars and selling it back to you for pennies on the dollar, in the hope that they get to be the only one left.

If I spend $10 growing oranges and sell them to you for $1, then of course it's more expensive for you to do the growing yourself.

I feel like I'm taking crazy pills. These models will become more expensive over time; it's functionally impossible for them not to. These companies just want to capture the market before they have to stop selling at a huge loss.

[0]: https://news.ycombinator.com/item?id=48168433


Replies

vanviegen · today at 2:15 PM

That seems unlikely. There are many providers for open models on OpenRouter, and it's hard to believe they are all throwing money away on every token they sell.

Also, there are good technical reasons for inference being much more efficient at scale.

brianwawok · today at 2:32 PM

So many more efficiencies are possible at scale, though. I cannot keep a local model 98% utilized 24/7, at least not with my current workload. A big cloud can. I can't power my servers with DC; I have this AC to DC conversion nonsense. The list goes on.
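The utilization argument is simple arithmetic. Here's a minimal sketch, with entirely hypothetical numbers for GPU rental price and throughput, showing how idle time dominates cost per token:

```python
# Back-of-the-envelope: amortized GPU cost per million tokens at
# different utilization levels. All figures are hypothetical.

def cost_per_million_tokens(gpu_cost_per_hour, tokens_per_second, utilization):
    """Amortized cost of 1M tokens when the GPU is busy only
    `utilization` fraction of the time (idle hours are still paid for)."""
    tokens_per_hour = tokens_per_second * 3600 * utilization
    return gpu_cost_per_hour / tokens_per_hour * 1_000_000

# A cloud provider keeping a GPU ~98% busy vs. a local box at ~5%:
cloud = cost_per_million_tokens(gpu_cost_per_hour=2.0,
                                tokens_per_second=1000, utilization=0.98)
local = cost_per_million_tokens(gpu_cost_per_hour=2.0,
                                tokens_per_second=1000, utilization=0.05)
print(f"cloud: ${cloud:.2f}/M tokens, local: ${local:.2f}/M tokens")
```

With these made-up numbers the mostly idle local GPU is about 20x more expensive per token served, purely from utilization.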

NicuCalcea · today at 2:17 PM

The blog compares the cost of running Gemma4 31b, which on OpenRouter is offered by small no-name inference providers, not by frontier AI companies. It seems like a fair comparison to me.

OsrsNeedsf2P · today at 2:31 PM

The models have been dropping 10x in price for completing the same tasks, year over year. Even if you think Anthropic is losing money charging 10x more than everyone else for their 400B model, the prices will continue to go down based on model improvements alone.

ianberdin · today at 2:18 PM

Do you have proof? Anthropic's CEO said they are profitable. Same with OpenAI.

visarga · today at 2:57 PM

> Frontier AI companies are selling at a loss.

There are huge economies to be had by batching requests and using lots of RAM for MoE (sparse models). You can't achieve that efficiency at batch size 1 on a single node.
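The batching point can be sketched with a toy cost model: assume each decode step pays a large fixed cost to stream the model weights from memory, shared by every request in the batch, plus a small per-request marginal cost. All numbers here are hypothetical:

```python
# Illustrative model of why batched inference is cheaper per token.
# Each decode step pays a fixed cost (weight streaming, kernel launches)
# regardless of batch size; only a small marginal cost scales with it.
# The unit costs below are arbitrary, for illustration only.

def cost_per_token(batch_size, fixed_step_cost=100.0, marginal_cost=1.0):
    """Total step cost divided by the tokens produced in that step."""
    step_cost = fixed_step_cost + marginal_cost * batch_size
    return step_cost / batch_size

print(cost_per_token(1))   # 101.0   -> single-user, batch size 1
print(cost_per_token(64))  # 2.5625  -> a provider batching 64 requests
```

At batch size 1 every token carries the full fixed cost; at batch size 64 the same fixed cost is split 64 ways, which is the economy of scale the comment is describing.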

tempest_ · today at 2:37 PM

It is the model training that is dragging them down.

If the arms race stopped tomorrow, the current prices would cover inference.

vlovich123 · today at 2:22 PM

Except that’s not what the analysis is. They’re spending < $1 to get $1 from you, and the other $9 goes to improving the model further and building products on top of it, to turn that $1 of spend into $5 in the future.

In other words, inference is fairly profitable for them, and the rest of the money is spent growing revenue as quickly as possible. Building models is still an expensive line item, but the costs for that are going down with time.

There is also maybe a “capture the market” mentality, but I don’t think that’s necessarily it: the tools and processes are largely fungible, and that’s a huge problem. They need to figure out how to make things sticky before “capture the market” works. But there’s also a very real motive of “grow as big as possible as quickly as possible to take on Google”; Google faces an existential threat here.
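The unit economics this comment describes can be written out as simple arithmetic. The dollar figures below are hypothetical, echoing the comment's own illustrative numbers:

```python
# Sketch of the claimed unit economics: inference itself is
# margin-positive, and the overall "loss" is growth/R&D spend.
# All dollar figures are hypothetical.
revenue_per_user = 1.00   # dollars of inference sold
inference_cost = 0.75     # the "< $1" it costs to serve that dollar
growth_and_rnd = 9.00     # training runs plus product investment

inference_margin = revenue_per_user - inference_cost
total_cash_flow = inference_margin - growth_and_rnd

print(inference_margin)  # 0.25  -> inference alone is margin-positive
print(total_cash_flow)   # -8.75 -> the company still burns cash overall
```

The same top-line loss is consistent with either reading of the situation; the disagreement in this thread is over which line item the red ink comes from.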

poly2it · today at 2:10 PM

Well, I'd be surprised if non-R&D inference providers were selling at a loss. There are a plethora to choose from, and competition is quite healthy. Will they keep providing cheap tokens while the labs raise their prices? Probably, and then I don't see how the labs' prices could be raised in the first place. And what timescale are you talking about? A couple of years? It is reasonable to assume inference will become more efficient over time. If you raise your prices, you are going to be outcompeted before it becomes profitable (if you assume it is unprofitable now), which would be negligent. I don't see how this makes sense.

throwatdem12311 · today at 3:16 PM

The Michael Scott AI Companies.

EGreg · today at 2:23 PM

> These models will become more expensive over time, it's functionally impossible for them not to, they just want to capture the market before they have to stop selling at a huge loss.

They could have said the same about transistors. People keep inventing new ways to keep the costs down. Just look at the latest Qwen, DeepSeek, BitNet. Interesting tidbit: they’re all open, and as a leaked Google memo put it in 2023, they have no moat.

MattRix · today at 2:17 PM

The inference is absolutely not sold at a loss, at least not when paying API prices (the subscriptions are less clear). The reason frontier model companies aren’t profitable is because training the models is so costly, not inference.

MuffinFlavored · today at 2:16 PM

> Frontier AI companies are selling at a loss.

How big/deep of a loss?

For years I feel like I read every day that Uber was running this same "idiotic, money-losing" strategy (that's how it was pitched/discussed), and then one day we woke up and... without much fuss, boom, they were profitable, seemingly overnight.

ajross · today at 2:19 PM

> I feel like I'm taking crazy pills.

Why? It's no less crazy than when Uber and Lyft were doing the same thing. Or when the entire tech industry was doing it in the dot com boom.

Investment-driven market growth at a loss is like the least surprising thing in all of this. The tech is new and fascinating. The bubble is just another trip through the funhouse.