
iamjackg | yesterday at 9:35 PM

It's not unsolved, at least not the first part of your question. In fact, it's a feature offered by all the major LLM providers (see the sketch after the links):

- https://platform.openai.com/docs/guides/prompt-caching

- https://platform.claude.com/docs/en/build-with-claude/prompt...

- https://ai.google.dev/gemini-api/docs/caching
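
For what it's worth, here's roughly what explicit caching looks like with Anthropic's Messages API (a minimal sketch using their Python SDK; `big_document` and the user prompt are placeholders, and note that OpenAI's caching is automatic for long prompts rather than opt-in like this):

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Hypothetical long shared context you want reused across calls.
big_document = "..."  # must exceed the minimum cacheable length (~1024 tokens)

response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    system=[
        {
            "type": "text",
            "text": big_document,
            # Mark this prefix as cacheable; follow-up requests that share
            # the same prefix read it from cache at a reduced token rate.
            "cache_control": {"type": "ephemeral"},
        }
    ],
    messages=[{"role": "user", "content": "Summarize the document."}],
)

# The usage object reports cache writes and cache reads separately,
# so you can verify the cache is actually being hit.
print(response.usage.cache_creation_input_tokens,
      response.usage.cache_read_input_tokens)
```

The key point is that caching only kicks in when the prompt prefix is byte-identical across requests, which is why conversation histories that grow from the front (or prepend anything dynamic) don't benefit.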


Replies

imiric | yesterday at 9:43 PM

Ah, that's good to know, thanks.

But then why is there compounding token usage in the article's trivial solution? Is it just a matter of using the cache correctly?

igravious | yesterday at 11:17 PM

Dumb question, but is prompt caching available in Claude Code …?
