
ffsm8 · yesterday at 5:42 PM

But with OpenRouter you can always just use the latest model. If you're committed to e.g. Claude Opus, then you're better off going directly to Anthropic for sure; but if not, various other models may be fine too, depending on the use case, and can be massively cheaper. E.g. the new DeepSeek model with the same million-token context window, or Kimi K2.6 with a 270k context window for subagents that handle implementation.


Replies

gruez · yesterday at 7:00 PM

>but if not, varying other models may be fine too, depending on use case and be massively cheaper

Do inference providers have standardized endpoints, or at least endpoints compatible with Claude Code? Otherwise, why pay 5.5% on all your tokens just to make it slightly easier to swap providers (i.e. changing a few URLs)?
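For what it's worth, most of these providers (OpenRouter included) expose OpenAI-compatible chat-completions endpoints, so the "changing a few URLs" part really is just the base URL and the model ID. A minimal sketch — the model IDs and the direct-provider URL here are illustrative assumptions, not a definitive list:

```python
# Sketch: with OpenAI-compatible providers, only the base URL and the
# model identifier change between a router and a direct provider; the
# request shape stays identical. Model IDs below are illustrative.
def chat_request(base_url: str, model: str, prompt: str) -> dict:
    """Build the request config for a POST to /chat/completions."""
    return {
        "url": f"{base_url}/chat/completions",
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

# Same request, two providers — only url and model differ.
via_router = chat_request("https://openrouter.ai/api/v1",
                          "deepseek/deepseek-chat", "hello")
direct = chat_request("https://api.deepseek.com/v1",
                      "deepseek-chat", "hello")
```

Whether that compatibility extends to Claude Code specifically (which speaks Anthropic's Messages API, not the OpenAI one) is exactly the open question here.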
