Hacker News

XenophileJKO · yesterday at 10:47 PM · 0 replies

I would actually like to see the real math here.

Market adoption has increased a lot, and the cost to serve per token has come down a lot.

Model sizes have not increased exponentially recently (the high point being the aborted GPT-4.5); most recent refinement seems to come from extending training on relatively smaller models.

Taken together, the training-to-inference income/cost ratio has likely shifted dramatically.
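The shift can be sketched with back-of-envelope arithmetic. Every number below is made up purely for illustration (no real lab figures): the point is only that if per-token price falls while token volume grows faster, the training-cost-to-inference-revenue ratio drops sharply.

```python
# Toy model: training cost divided by inference revenue.
# All inputs are hypothetical illustrative numbers, not real figures.

def cost_ratio(training_cost, tokens_served, price_per_mtok):
    """Ratio of one-time training cost to inference revenue,
    where revenue = (tokens served / 1M) * price per million tokens."""
    revenue = (tokens_served / 1e6) * price_per_mtok
    return training_cost / revenue

# Hypothetical "then": $100M training run, 1T tokens served at $10/Mtok.
then = cost_ratio(100e6, 1e12, 10.0)
# Hypothetical "now": same-cost run, 20T tokens served at $2/Mtok.
now = cost_ratio(100e6, 20e12, 2.0)

print(then, now)  # → 10.0 2.5
```

In this toy scenario a 20x volume increase outweighs a 5x price cut, so the ratio falls 4x; the comment's claim is that something like this has happened in practice.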