> And then memory expanded so much that all kinds of “optimal” patterns for programming just become nearly irrelevant.
I don't think that ever happened. Using a relatively small amount of memory translates into better cache behavior, which in turn usually improves performance drastically.
And in embedded work, being good with memory management can make the difference between 'works' and 'fails'.
It obviously never became completely irrelevant. But I think programmers spend a lot less time thinking about memory than they used to. People used to do a lot of gymnastics and crazy optimizations to fit stuff into memory. I do quite a bit of embedded programming, and most of the time it seems easier to simply upgrade the MCU and spend 10 cents more (or whatever) than to do any crazy optimizations. But of course there are still cases where it makes sense.
The need to use optimal patterns didn't go away, but the techniques certainly did. Just as a quick example, it's usually a bad idea now to use lookup tables to accelerate small math workloads. The lookup table creates memory pressure on the cache, which ends up degrading performance on modern systems. Back in the 1980s, lookup tables were by far the dominant technique because math was *slow.*