
zozbot234 · today at 11:30 AM

> Back in the 1980s, lookup tables were by far the dominant technique because math was slow.

This actually generalizes in a rather clean way: compared to the 1980s, you now want to cheaply compress data in memory and use succinct representations as much as practicable, since the extra compute involved in translating a more succinct representation into real data is practically free compared to even one extra cacheline fetch from RAM (which is now hundreds of cycles latency, and in parallel code often has surprisingly low throughput).
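As a minimal sketch of the trade-off (my own illustration, not from the comment): packing a boolean table at one bit per entry instead of one byte makes it 8x denser, so far more of it stays in cache, and the shift-and-mask needed to decode each access costs only a few cycles versus the hundreds of a cacheline miss.

```python
def pack_bits(flags):
    """Pack a list of booleans into a bytearray, 8 flags per byte."""
    packed = bytearray((len(flags) + 7) // 8)
    for i, f in enumerate(flags):
        if f:
            packed[i >> 3] |= 1 << (i & 7)
    return packed

def get_bit(packed, i):
    """Decode flag i: one byte load plus a shift and a mask."""
    return (packed[i >> 3] >> (i & 7)) & 1

flags = [n % 3 == 0 for n in range(1000)]
packed = pack_bits(flags)
assert all(get_bit(packed, i) == flags[i] for i in range(len(flags)))
# 1000 flags now occupy 125 bytes (two cachelines) instead of ~1000.
```

The same logic scales up to full succinct data structures (rank/select bitvectors, wavelet trees), where the "decompression" is likewise a handful of arithmetic ops per query.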


Replies

QuadmasterXLII · today at 12:10 PM

It’s a mad world where ultimate performance on one problem can require compressing data in RAM, and on another, storing it uncompressed on disk.
