Hacker News

jandrewrogers · today at 1:31 AM

Single data sets surpassed 2^64 bytes over a decade ago. This creates fun challenges, since the metadata structures alone for such a data set can't fit in the RAM of the largest machines we build today.
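A rough back-of-envelope sketch of why the metadata alone blows past RAM (the page size and per-page metadata cost here are assumptions for illustration, not figures from the comment):

```python
# Assume a data set spanning a full 64-bit address space, tracked in
# 4 KiB pages with 8 bytes of metadata per page (both assumed values).
DATASET_BYTES = 2**64        # 16 EiB of data
PAGE_BYTES = 4096            # assumed page granularity
ENTRY_BYTES = 8              # assumed metadata per page

pages = DATASET_BYTES // PAGE_BYTES          # 2**52 pages
metadata_bytes = pages * ENTRY_BYTES         # 2**55 bytes

print(f"data:     {DATASET_BYTES / 2**60:.0f} EiB")
print(f"metadata: {metadata_bytes / 2**50:.0f} PiB")
# 32 PiB of metadata -- orders of magnitude beyond the RAM of even
# the largest single machines, which top out around hundreds of TiB.
```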


Replies

jasonwatkinspdx · today at 1:53 AM

Virtualization has pushed back the need for a while, but at some point we are going to have to look at pointers larger than 64 bits. It's not just the raw size of data sets, either: we get a lot of utility out of various memory-mapping tricks, so we consume more address space than the strict minimum the data set requires. And if we move up to 128 bits, a lot more security mitigations become possible.
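One way to read the security point: a 128-bit pointer leaves room to carry bounds and permission metadata alongside the address, in the spirit of capability systems like CHERI. A toy sketch (the field widths and permission bits are invented for illustration, not any real ABI):

```python
# Hypothetical 128-bit "fat pointer": 64-bit address, plus assumed
# 56-bit bounds length and 8-bit permission fields packed above it.
ADDR_BITS, LEN_BITS, PERM_BITS = 64, 56, 8

def make_cap(addr: int, length: int, perms: int) -> int:
    """Pack address, bounds, and permissions into one 128-bit value."""
    assert addr < 2**ADDR_BITS and length < 2**LEN_BITS and perms < 2**PERM_BITS
    return (perms << (ADDR_BITS + LEN_BITS)) | (length << ADDR_BITS) | addr

def check_deref(cap: int, offset: int) -> int:
    """Bounds-check a dereference; hardware would fault where we raise."""
    addr = cap & (2**ADDR_BITS - 1)
    length = (cap >> ADDR_BITS) & (2**LEN_BITS - 1)
    if offset >= length:
        raise MemoryError("out-of-bounds dereference")
    return addr + offset

cap = make_cap(0x7000_0000_0000, 4096, 0b101)
check_deref(cap, 100)      # within bounds: fine
# check_deref(cap, 5000)   # past the 4096-byte bound: raises MemoryError
```

With 64-bit pointers there aren't enough spare bits for this, which is why such mitigations become practical only at 128 bits.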
