Hacker News

toasty228 · today at 2:32 PM · 2 replies

> The demand for AI is currently overwhelming.

Wait until they charge the real price. If I sold a dollar for 10ct, I'd also have a lot of demand.

I'm burning billions of tokens on ChatGPT "deepresearch Pro extended" for things I wouldn't even bother googling. The second I have to pay even 2x the price, I won't use that anymore.


Replies

vanuatu · today at 2:46 PM

I hear this analogy (selling a dollar for 10ct) but it's unclear to me how we can cleanly map intelligence to cents.

If the LLM were GPT-1, most people wouldn't use it even for free. So clearly there's another axis here?

ls612 · today at 2:59 PM

The estimates I've seen are that running inference at scale on a DeepSeek V3 sized model (so ~700B parameters) costs roughly $0.70/mtok given current H100 rental costs. Sonnet charges $15/mtok on the API, so the delta between the true cost and the API price is quite large, to the point where even many subscription users are likely profitable.
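The back-of-envelope math behind that claim can be sketched as follows. The figures are the commenter's rough estimates, not measured costs:

```python
# Rough margin check using the estimates from the comment above.
# Both numbers are assumptions quoted in the thread, not verified costs.
inference_cost_per_mtok = 0.70  # est. serving cost for a ~700B model on rented H100s, $/million tokens
api_price_per_mtok = 15.00      # Sonnet API price cited in the comment, $/million tokens

markup = api_price_per_mtok / inference_cost_per_mtok
gross_margin = (api_price_per_mtok - inference_cost_per_mtok) / api_price_per_mtok

print(f"markup: {markup:.1f}x, gross margin: {gross_margin:.0%}")
# Under these assumptions the API price is about a 21x markup (~95% gross margin),
# which is why heavy subscription usage can still be profitable.
```

Of course this ignores training costs, idle capacity, and free-tier usage, so it's an upper bound on the per-token margin, not a statement about overall profitability.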