It really depends on the size of the active set. If it fits into the RAM of whatever server you are using, then it's not a problem at all, even with completely off-the-shelf hardware and software. Slap two 40 Gbit NICs in it, install Varnish or whatever, and you're good to go. (This is, of course, assuming that you have someone willing to pay for the bandwidth out to your users!)
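To make the "fits in RAM" point concrete, here's a toy sketch of the idea in Python (not actual Varnish, and the cache budget and tile path are made-up examples): a byte-budgeted LRU cache sitting in front of tile reads, so hot tiles come straight out of memory and only misses ever touch the disk.

    # Toy sketch of an in-RAM tile cache. Not Varnish -- just illustrates
    # why serving a hot set that fits in memory is cheap.
    from collections import OrderedDict

    class TileCache:
        def __init__(self, budget_bytes):
            self.budget = budget_bytes
            self.used = 0
            self.entries = OrderedDict()  # path -> bytes, most recent at the end

        def get(self, path):
            if path in self.entries:
                self.entries.move_to_end(path)      # hit: served from RAM, no disk I/O
                return self.entries[path]
            with open(path, "rb") as f:             # miss: one read from the SSD
                data = f.read()
            self.entries[path] = data
            self.used += len(data)
            while self.used > self.budget:          # evict least recently used tiles
                _, old = self.entries.popitem(last=False)
                self.used -= len(old)
            return data

    cache = TileCache(budget_bytes=48 * 1024**3)    # e.g. ~48 GB of a 64 GB box
    # tile = cache.get("/tiles/12/2200/1343.png")   # hypothetical tile path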
If you need to go to disk to serve large parts of it, it's a different beast. But then again, Netflix was already doing 800 Gbit/s three years ago (in large part from disk), and they're handicapping themselves by choosing an OS where they have to do a significant amount of the scaling work themselves.
I'm sure the server hardware is not a problem. The full dataset is 150 GB and the server has 64 GB of RAM, and most of the dataset will never be requested, so I'm sure the tiles that do get used would be served from the OS cache. If not, the data sits on a locally attached RAID 0 of NVMe SSDs.
What I've been referring to is the fact that even unlimited 1 Gbps connections can be quite expensive; now try to find a 2x40 Gbit connection for reasonable money. That one user generated 200 TB in 24 hours! I have no idea about bandwidth pricing, but I bet it ain't cheap to serve that.
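For a rough sense of scale: 200 TB in 24 hours works out to about 18.5 Gbit/s sustained, i.e. more than a single unmetered 10 Gbit port, which is why the 40 Gbit NICs come up. The per-GB price in the sketch below is a made-up placeholder, not a real quote.

    # Back-of-envelope for the "200 TB in 24 hours" figure.
    TB = 10**12  # decimal terabyte, as bandwidth vendors count it

    transferred_bytes = 200 * TB
    seconds = 24 * 3600

    avg_bits_per_s = transferred_bytes * 8 / seconds
    print(f"average rate: {avg_bits_per_s / 1e9:.1f} Gbit/s")   # ~18.5 Gbit/s sustained

    assumed_price_per_gb = 0.01  # USD/GB, purely illustrative assumption
    print(f"egress at that price: ${transferred_bytes / 1e9 * assumed_price_per_gb:,.0f}")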