Did a quick cost calc (with the help of gpt5, so might be wrong) when I read their comment about Pages not being suitable for this many files.
They say they're receiving $500/mo in donos and that it's currently just enough to cover their infra costs. With 300 million 70 KB files (about 21 TB), R2 plus a high cache hit ratio would work out to roughly $300/mo in storage plus request costs, or about $600/mo with Cache Reserve, at which point essentially every request would hit cache if I understand the project right. That would mean the costs shouldn't blow up much beyond that, and request count would essentially stop mattering.
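For the curious, here's the back-of-envelope version as a small Python sketch. The per-GB-month rates for R2 storage and the Cache Reserve copy are my assumptions (roughly what I remember Cloudflare's published pricing being), not quoted figures, and I'm ignoring per-request and write costs:

    # Rough sketch of the numbers above; pricing constants are assumptions.
    files = 300_000_000
    file_size_kb = 70
    total_gb = files * file_size_kb / 1_000_000          # ~21,000 GB (21 TB)

    r2_storage_rate = 0.015                              # $/GB-month, assumed
    cache_reserve_rate = 0.015                           # $/GB-month, assumed

    r2_cost = total_gb * r2_storage_rate                 # ~$315/mo
    with_cache_reserve = r2_cost + total_gb * cache_reserve_rate  # ~$630/mo

    print(f"{total_gb:,.0f} GB -> R2 ~${r2_cost:,.0f}/mo, "
          f"with Cache Reserve ~${with_cache_reserve:,.0f}/mo")

So storage alone lands in the ~$300-600/mo range either way, which is why the request side is the part that the cache hit ratio has to absorb.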