Yeah... late '90s to early '00s was pretty peak builder era. I learned through a relative fluke how much going a bit over the top on memory and faster drives helped for general use, often more than a faster CPU/GPU. My current computer is literally the first I've built in decades where I didn't max out the RAM... I mean, I kind of did, as 2x48GB was the most I could get in DDR5@6000 while only using 2 slots (populating all 4 would run much slower).
4th gen Core was the longest I'd held onto a single PC (close to 5 years on a 4790K); I did a mid-cycle GPU and NVMe upgrade and that was it. Since then I've bumped to a 3950X, then a 5950X, and now a 9950X... AM4 is really the first socket in a long time I'd done in-place CPU upgrades on: my daughter's Ryzen 2400 to a 5000 series, and my own build from a 3600 -> 3950X -> 5950X (the 3600 was a placeholder since I couldn't get a 3950X for a few months).
I couldn't even name half the CPUs I ran from 1998 to 2005 or so... it was such a blur of upgrades every 6-12 months. I'd upgrade my computer, my wife's, my son's... etc. Then things just completely stagnated... I mean, there's been progress, but it's over the course of years, not 2-3x in under a year.
I remember it started to stall out on RAM before CPU, in that it became "reasonable" to have way more RAM than you really needed (Chrome didn't exist yet lol). And the very early move to multi-core was a bit of a downer: most software couldn't use more than 2 or 3 cores, so a "new CPU" with 4 cores instead of 2 but the same single-thread performance was hardly noticeable.
Then of course there was the huge "replace everything with SSDs ASAP" performance bump, but from the later Core generations up until the M1, everything felt incremental. Nothing like the "Wolfenstein 3D to Quake on Glide in 5 years" era.
Holy shit, it was only 5 years... and the M1 was released 6 years ago!