Hacker News

throwaway2037 · 01/22/2025

    > they (and MS) are recouping inference costs from user subscription and API revenue with a healthy operating margin.
I tried Googling for more information with this search: <<is openai inference profitable?>>

I didn't find any reliable sources about OpenAI specifically. All the sources I could find say the opposite -- that inference costs are far higher than subscription fees.

I hate to ask this on HN... but can you provide a source? Or tell us how you know?


Replies

manquer · 01/22/2025

I don't have a qualified source, and this metric would likely be quite confidential even internally.

It is just an educated guess, factoring in the per-token cost of running models similar/comparable to 4o or 4o-mini, how Azure commitments work with OpenAI models[2], and the knowledge that Plus subscriptions are probably more profitable[1] than API calls.

It would be hard even for OpenAI to know with any certainty, because they are not paying for Azure credits like a normal company. The costs are deeply intertwined with Azure and would be hard to split, given the nature of the MS relationship[3].

----

[1] This is from experience running LibreChat with 4o versus ChatGPT Plus for ~200 users; subscriptions should be more profitable than raw API usage by a factor of roughly 3 to 4x (see the sketch after these notes). Of course, different types of users and adoption levels will vary; my sample, while not small, is likely not representative of their typical user base.

[2] MS has less incentive to subsidize than, say, OpenAI themselves.

[3] Azure is quite profitable in the aggregate; while it may be subsidizing OpenAI APIs, any such subsidy has not shown up meaningfully in Microsoft's financial reports.
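
For concreteness, here is a minimal back-of-envelope sketch of the comparison in [1]. All figures are illustrative assumptions, not numbers from the comment: the per-token prices, the Plus price, and the per-user token volumes are placeholders you would swap for your own usage data (e.g. from LibreChat logs).

    # Rough per-user comparison: raw 4o API spend vs. a flat Plus subscription.
    # All constants below are illustrative assumptions, not actual OpenAI figures.

    PLUS_PRICE_PER_MONTH = 20.00        # assumed ChatGPT Plus price, USD/month
    API_INPUT_PRICE_PER_MTOK = 2.50     # assumed 4o input price, USD per million tokens
    API_OUTPUT_PRICE_PER_MTOK = 10.00   # assumed 4o output price, USD per million tokens

    def monthly_api_cost(input_tokens: int, output_tokens: int) -> float:
        """Raw API spend for one user in a month, given their token usage."""
        return (input_tokens / 1e6) * API_INPUT_PRICE_PER_MTOK \
             + (output_tokens / 1e6) * API_OUTPUT_PRICE_PER_MTOK

    # Hypothetical monthly usage for a typical user in a ~200-user deployment.
    input_tok, output_tok = 1_500_000, 300_000

    api_cost = monthly_api_cost(input_tok, output_tok)
    print(f"API cost per user/month: ${api_cost:.2f}")                         # -> $6.75
    print(f"Plus revenue per user:   ${PLUS_PRICE_PER_MONTH:.2f}")             # -> $20.00
    print(f"Subscription vs API:     {PLUS_PRICE_PER_MONTH / api_cost:.1f}x")  # -> ~3.0x

With these placeholder numbers the ratio lands around 3x, in the 3 to 4x range claimed above; heavier or lighter usage profiles would move it accordingly.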
