Hacker News

moebrowne · yesterday at 12:00 PM

https://platform.openai.com/docs/guides/prompt-caching#requi...

> Caching is available for prompts containing 1024 tokens or more.

No mention of caching being in blocks of 1024 tokens thereafter.


Replies

IanCal · yesterday at 8:45 PM

At launch it was described as being in blocks of 128 tokens.

https://openai.com/index/api-prompt-caching/
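Combining the two comments — a 1024-token minimum from the current docs, plus cache-hit increments of 128 tokens from the launch post — the cacheable prefix length could be sketched like this. This is an interpretation of the thread, not an official formula; the function name and behavior are assumptions:

```python
def cacheable_prefix(n_tokens: int) -> int:
    """Estimate how many prompt tokens can be served from cache.

    Assumes (per the thread): nothing is cached below 1024 tokens,
    and beyond that, cache hits occur in 128-token increments.
    """
    if n_tokens < 1024:
        return 0
    # Round down to the nearest 128-token boundary at or above 1024.
    return (n_tokens // 128) * 128


# Examples under these assumptions:
print(cacheable_prefix(1000))  # 0    (below the 1024-token minimum)
print(cacheable_prefix(1024))  # 1024 (exactly at the minimum)
print(cacheable_prefix(1151))  # 1024 (next 128-token block not yet full)
print(cacheable_prefix(1300))  # 1280 (1024 + two full 128-token blocks)
```

So a 1151-token prompt would only get 1024 tokens of cache hit, which is consistent with the launch post's block-based description even though the current docs only mention the minimum.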