Hacker News

vlovich123 · 01/22/2025 · 3 replies

Not quite. In two years their revenue has grown ~20x, from $200M ARR to $3.7B ARR. The inference costs, I believe, pay for themselves (in fact they're quite profitable). So what they're putting on their investors' credit cards are the costs of employees and model training. Given that it's projected to be a multi-trillion-dollar industry and they're seen as a market leader, investors are more than happy to throw in interest-free cash now in exchange for a variable future interest in the form of stock.
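For scale, the growth implied by those figures (the revenue numbers and the two-year window are taken from the comment above; treating them as exact is an assumption) works out roughly like this:

```python
# Revenue figures as stated in the comment above (approximate).
start_arr = 200e6  # ~$200M ARR
end_arr = 3.7e9    # ~$3.7B ARR
years = 2

multiple = end_arr / start_arr      # ~18.5x, i.e. the "~20x" in the comment
cagr = multiple ** (1 / years) - 1  # implied compound annual growth rate

print(f"growth multiple: {multiple:.1f}x")  # ~18.5x
print(f"implied CAGR: {cagr:.0%}")          # ~330% per year
```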

That's not the same thing at all as putting your revenue stream on a credit card, where you'd be paying ~18%+ annual interest (APR) on it. If you recall, AMZN (and really all startups) had an early mode where they over-spent on R&D to grow faster than their free cash flow would otherwise allow, in order to stay ahead of competition and dominate the market. Indeed, if investors agree and your business is actually strong, this is a strong play, because you're leveraging some future value into today's growth.
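The difference between the two funding sources can be sketched with a back-of-envelope comparison (the $1B burn figure and the ~18% APR are illustrative assumptions, not numbers from the thread):

```python
# Illustrative sketch: cost of funding $1B of spend for one year on
# credit-card-style debt at ~18% APR (compounded monthly), versus
# interest-free equity, where the cost is dilution paid later in stock.
principal = 1e9
apr = 0.18
months = 12

debt_cost = principal * ((1 + apr / 12) ** months - 1)  # interest owed after a year
equity_cost = 0.0  # no interest accrues; investors are repaid via future stock value

print(f"interest on credit-card funding: ${debt_cost / 1e6:.0f}M")  # ~$196M
print(f"interest on equity funding:      ${equity_cost:.0f}")
```

The point is that equity financing defers the cost entirely into future ownership, while debt-style financing compounds against you immediately.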


Replies

lukev · 01/22/2025

All well and good, but how well will that work if the pattern continues where the best open models are less than a year behind what OpenAI is doing?

How long can they maintain their position at the top without the insane cashflow?

show 1 reply
hfcbb · 01/22/2025

Platform economics "works" in theory only up to a point. It's super inefficient if you zoom out and look not at the system level but at the ecosystem level. It just hasn't lasted long enough to hit its failure cases. Wait a few years.

As to OpenAI: given DeepSeek, and the fact that a lot of use cases don't even need real-time inference, it's not obvious this story will end well.

show 1 reply
vFunct01/22/2025

Have they built their own ASICs for inference like Google and Microsoft have? Or are they using NVIDIA chips exclusively for inference as well?

show 1 reply