It's probably just me being out of touch, but I don't think the GeForce RTX 4000 or 5000 series really mattered/matters that much.
At the same time I'd add the S3 ViRGE and the Matrox G200. Both mattered a lot at the time, but not long term.
> S3 ViRGE and the Matrox G200
Both were only really famous for how terrible they were, though. I think the S3 ViRGE might even qualify as a 3D decelerator ;)
I remember there was a kernel module for the Matrox/MPlayer combination: it created a new device that MPlayer could use. You got `-vo mga` for the console and `-vo xmga` for X11; you couldn't tell the difference between them, and both produced high-quality hardware YUV output.
Recency bias, probably. IIRC the 3000 and 4000 series made significant improvements in ray-tracing performance, so compared to the 2000 series they're far more useful today.
Matrox G200 GPUs came integrated with servers for absolute ages, well into the 2010s.
This is an ad from a viral marketing company, and everyone here is falling for it.
The G200 mattered to some degree for a long time, because until a few years ago most x86 servers shipped a G200 implementation, or at least something pretending to be one, as part of their BMC for network KVM.
Or the S3 Savage3D, which, while being inferior to the TNT2, pioneered texture compression.
https://en.wikipedia.org/wiki/S3_Texture_Compression
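To make the texture-compression claim concrete, here's a minimal sketch of decoding a single DXT1 block, the original S3TC format the Savage3D introduced. The block layout and color math follow the public S3TC description; the helper names and the example block bytes are mine, for illustration only.

```python
import struct

def rgb565_to_rgb888(c):
    # Expand a 16-bit RGB565 value to 8-bit-per-channel RGB.
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

def decode_dxt1_block(block):
    # A DXT1 block is 8 bytes: two RGB565 endpoint colors, then 32 bits
    # of 2-bit palette indices (one per texel, 4x4 texels, 8:1 vs RGBA8).
    c0_raw, c1_raw, indices = struct.unpack('<HHI', block)
    c0 = rgb565_to_rgb888(c0_raw)
    c1 = rgb565_to_rgb888(c1_raw)
    if c0_raw > c1_raw:
        # Opaque mode: two extra colors interpolated at 1/3 and 2/3.
        palette = [c0, c1,
                   tuple((2 * a + b) // 3 for a, b in zip(c0, c1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(c0, c1))]
    else:
        # Punch-through mode: midpoint color plus transparent black.
        palette = [c0, c1,
                   tuple((a + b) // 2 for a, b in zip(c0, c1)),
                   (0, 0, 0)]
    # Indices are packed 2 bits per texel, least-significant bits first.
    return [palette[(indices >> (2 * i)) & 0b11] for i in range(16)]

# Example: endpoints white and black, every texel index 0 -> all white.
texels = decode_dxt1_block(struct.pack('<HHI', 0xFFFF, 0x0000, 0))
```

The fixed 8 bytes per 16 texels (plus random access to any block) is what made the format friendly to mid-'90s texture caches.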