Hacker News

mattaitken · today at 2:17 PM · 2 replies

This is cool. I think for our use case this wouldn’t work. We’re dealing with billions of rows for some tenants.

We’re about to introduce alerts where users can write their own TRQL queries and then define alerts from them. That requires evaluating the queries regularly, so the data effectively needs to be continuously up to date.
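A minimal sketch of one evaluation pass for that kind of alerting (all names here are hypothetical, and the TRQL engine is stubbed out as a plain callable; this is not the actual implementation):

```python
# Hypothetical alert evaluation pass: run each user-saved query and
# return the names of alerts whose result crosses its threshold.
def evaluate_alerts(alerts, evaluate_query):
    fired = []
    for alert in alerts:
        value = evaluate_query(alert["query"])  # e.g. a saved TRQL query string
        if value > alert["threshold"]:
            fired.append(alert["name"])
    return fired
```

In practice something like this would run on a schedule (a cron job or worker loop), which is why the underlying data has to stay fresh between passes.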


Replies

SOLAR_FIELDS · today at 7:05 PM

Billions still seems crunchable for DDB; the limit is really how much you can stuff into RAM, no? Billions of rows can still fit in consumer-grade machine RAM, depending on the data. At trillions I would start to worry. But you could have a super-fat spot instance where the crunching happens and expose a light client on top of that, no?
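Back-of-envelope, the "billions fit in RAM" claim checks out for narrow rows (the ~40 bytes/row figure below is a made-up assumption for a handful of numeric columns, not a measurement):

```python
# Rough RAM estimate: rows * bytes-per-row, converted to GiB.
def estimate_gib(rows, bytes_per_row):
    return rows * bytes_per_row / 2**30

# 2 billion rows at ~40 bytes each (assumed: a few numeric columns)
print(f"{estimate_gib(2_000_000_000, 40):.1f} GiB")  # ~74.5 GiB
```

That lands comfortably within a single big cloud instance, whereas the same math at trillions of rows pushes into tens of TiB.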

Quadrillions? Yeah, go find yourself a Trino/Spark pipeline.