Hacker News

Aurornis · yesterday at 10:44 PM

I'm referring to hosted models, e.g., those available via OpenRouter or through the model providers' own services.

I think everyone claiming that inference is getting more expensive is unaware that there are more LLM providers than just Google, Anthropic, and OpenAI.