Really interesting, thanks! I guess my use case would rather require incremental updates.
Ideally it would just sync in real time, buffer new data in the Bemi binary (with some WAL-like storage to make sure data is preserved on a binary crash/reload), and, once it has accumulated enough, push it to S3, etc.
Is this the kind of approach you're going to take?
Yes, we want to take the approach you described! We'll likely wait until enough changes have accumulated, using two configurable thresholds: time (like 30s) and size (like 100MB).
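For anyone curious what that dual-threshold buffering could look like, here's a rough sketch in Go. It is not Bemi's actual implementation: the type names, the newline-delimited WAL format, the stubbed-out upload, and the exact thresholds are all illustrative assumptions. The idea is just to show appending changes to a crash-safe local WAL and flushing whenever either the time or the size threshold is hit first.

```go
package main

import (
	"fmt"
	"os"
	"sync"
	"time"
)

// Illustrative thresholds matching the ones mentioned above.
const (
	flushInterval = 30 * time.Second  // time threshold
	flushSize     = 100 * 1024 * 1024 // size threshold (100MB)
)

// Buffer accumulates change records in an append-only WAL file so they
// survive a binary crash/reload, and flushes them to object storage
// when either threshold is reached.
type Buffer struct {
	mu   sync.Mutex
	wal  *os.File
	size int64
}

func NewBuffer(walPath string) (*Buffer, error) {
	f, err := os.OpenFile(walPath, os.O_APPEND|os.O_CREATE|os.O_WRONLY, 0o644)
	if err != nil {
		return nil, err
	}
	b := &Buffer{wal: f}
	go b.flushLoop() // time-based flushing runs in the background
	return b, nil
}

// Append writes one change record to the WAL and flushes early
// if the size threshold has been reached.
func (b *Buffer) Append(record []byte) error {
	b.mu.Lock()
	defer b.mu.Unlock()
	n, err := b.wal.Write(append(record, '\n'))
	if err != nil {
		return err
	}
	b.size += int64(n)
	if b.size >= flushSize {
		return b.flushLocked()
	}
	return nil
}

// flushLoop triggers a flush every flushInterval regardless of size.
func (b *Buffer) flushLoop() {
	for range time.Tick(flushInterval) {
		b.mu.Lock()
		_ = b.flushLocked()
		b.mu.Unlock()
	}
}

// flushLocked uploads the accumulated batch and resets the WAL.
// The actual S3 upload is stubbed out in this sketch.
func (b *Buffer) flushLocked() error {
	if b.size == 0 {
		return nil // nothing accumulated yet
	}
	if err := b.wal.Sync(); err != nil {
		return err
	}
	// A real implementation would upload the WAL contents to S3 here.
	fmt.Printf("flushing %d bytes to object storage\n", b.size)
	if err := b.wal.Truncate(0); err != nil {
		return err
	}
	b.size = 0
	return nil
}

func main() {
	buf, err := NewBuffer("changes.wal")
	if err != nil {
		panic(err)
	}
	_ = buf.Append([]byte(`{"op":"insert","table":"users"}`))
}
```

The nice property of "whichever threshold fires first" is that busy databases produce reasonably sized objects quickly, while quiet ones still get their changes shipped within a bounded delay.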