If this claim is true (inference is priced below cost), it makes little sense that there are dozens of small inference providers on OpenRouter. Where is their investor money coming from? Is the bubble really that big?
Incidentally, the hardware they run on is known as well, so the claim should be easy to check.
To be clear, I'm talking about subscription pricing; Anthropic's API pricing is probably at cost.
I dare you to run Claude Code on API pricing and see how much your usage actually costs.
(We did this internally at work; that's where my "few orders of magnitude" comment above comes from.)
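For a sense of the kind of arithmetic involved, here's a rough sketch. The per-token prices and daily token volumes below are placeholder assumptions for illustration, not our measured numbers:

```python
# Back-of-the-envelope: heavy agentic-coding usage priced at API rates.
# All figures below are illustrative assumptions, not measured data.

INPUT_PRICE_PER_M = 3.00    # assumed $ per 1M input tokens
OUTPUT_PRICE_PER_M = 15.00  # assumed $ per 1M output tokens

def monthly_api_cost(input_tokens_per_day, output_tokens_per_day, workdays=22):
    """Estimate a month of API spend for a given daily token volume."""
    daily = ((input_tokens_per_day / 1e6) * INPUT_PRICE_PER_M
             + (output_tokens_per_day / 1e6) * OUTPUT_PRICE_PER_M)
    return daily * workdays

# Hypothetical heavy user: 50M input / 2M output tokens per day
# (agentic tools re-send large contexts, so input dominates)
cost = monthly_api_cost(50e6, 2e6)
print(f"${cost:,.0f} per month at API rates")
```

Plug in your own token counts from the API usage dashboard and compare the result against a flat monthly subscription fee.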