I always assumed that with inference being so cheap, my subscription fees were paying for training costs, not inference.
Anthropic and OpenAI are both well documented as losing billions of dollars a year because their revenue doesn't cover their R&D and training costs, but that doesn't rule out their revenue covering inference on its own.
Is inference really that cheap? Why can't I do it at home with a reasonable amount of money?
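For a sense of scale on the "at home" part, here's a rough back-of-envelope sketch. Every number in it is an assumption I'm making for illustration (the frontier labs don't publish their model sizes), but it shows why the memory footprint alone is the wall you hit:

    # Back-of-envelope: memory needed to run a large model locally.
    # All numbers below are assumptions for illustration, not published figures.
    params_billions = 405        # assume a big open-weight model in the ~400B class
    bytes_per_param = 2          # fp16/bf16 weights
    kv_cache_gb = 40             # rough allowance for KV cache at a long context
    weights_gb = params_billions * bytes_per_param   # 1B params at 2 bytes ≈ 2 GB
    total_gb = weights_gb + kv_cache_gb

    gpu_vram_gb = 24             # a high-end consumer GPU (e.g. an RTX 4090)
    gpus_needed = -(-total_gb // gpu_vram_gb)        # ceiling division

    print(f"~{total_gb} GB of VRAM -> at least {gpus_needed} consumer GPUs")
    # ~850 GB of VRAM -> at least 36 consumer GPUs

Quantizing to 4 bits cuts the weight footprint to roughly a quarter of that, but it's still far beyond a single consumer machine, which is why "do it at home" really only works for much smaller models.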
Doubtful