https://platform.openai.com/docs/guides/prompt-caching#requi...
> Caching is available for prompts containing 1024 tokens or more.
There's no mention of caching continuing in blocks of 1024 tokens beyond that. At launch it was described as working in increments of 128 tokens:
https://openai.com/index/api-prompt-caching/