In this particular case, inference and training are intertwined. It would be one thing if Anthropic could get away with training a new model every five years and control costs that way. But they can't. Put another way, their inference has no value without continuous, very expensive training, because consumers aren't purchasing based on price but on capability — otherwise the Chinese models on OpenRouter would have buried OpenAI and Anthropic already.