
mrweasel today at 10:17 AM

It's probably just me being out of touch, but I don't think the GeForce RTX 4000 or 5000 series really mattered/matters that much.

At the same time I'd add the S3 ViRGE and the Matrox G200. Both mattered a lot at the time, but not long term.


Replies

mizzack today at 11:51 AM

Or the S3 Savage3D, which, while being inferior to the TNT2, pioneered texture compression.

https://en.wikipedia.org/wiki/S3_Texture_Compression
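Not from the thread, but a minimal Python sketch of how the S3TC/DXT1 format described in the linked article works: each 4x4 texel block is packed into 64 bits, as two RGB565 endpoint colors plus a 2-bit palette index per texel (function and variable names here are my own).

```python
# Hypothetical sketch of DXT1 block decoding, per the S3TC article:
# an 8-byte block = two 16-bit RGB565 endpoints + 32 bits of 2-bit indices.
import struct

def rgb565_to_rgb888(c):
    """Expand a 16-bit RGB565 color to an (r, g, b) tuple of 8-bit values."""
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    # Replicate the high bits into the low bits to fill the 8-bit range.
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

def decode_dxt1_block(block):
    """Decode one 8-byte DXT1 block into a 4x4 grid of RGB tuples."""
    c0, c1, indices = struct.unpack("<HHI", block)
    p0, p1 = rgb565_to_rgb888(c0), rgb565_to_rgb888(c1)
    if c0 > c1:  # opaque mode: two interpolated in-between colors
        palette = [p0, p1,
                   tuple((2 * a + b) // 3 for a, b in zip(p0, p1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))]
    else:        # 1-bit-alpha mode: midpoint color plus transparent black
        palette = [p0, p1,
                   tuple((a + b) // 2 for a, b in zip(p0, p1)),
                   (0, 0, 0)]
    # Each texel's 2-bit index selects a palette entry, row-major order.
    return [[palette[(indices >> (2 * (4 * y + x))) & 3] for x in range(4)]
            for y in range(4)]
```

Sixteen 24-bit texels shrink to 8 bytes, a fixed 6:1 ratio, which is why the format was attractive enough that the rest of the industry licensed it.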

flohofwoe today at 3:09 PM

> S3 ViRGE and the Matrox G200

Both were only really famous for how terrible they were, though. I think the S3 ViRGE might even qualify as a 3D decelerator ;)

gen2brain today at 2:22 PM

I remember there was a kernel module for the Matrox/MPlayer combination. It exposed a new device that MPlayer could use: `-vo mga` on the console and `-vo xmga` under X11. You couldn't tell the difference, and both produced high-quality hardware YUV output.

whizzter today at 11:35 AM

Recency bias, probably. IIRC the 3000 and 4000 series made significant improvements in ray-tracing performance, so compared to the 2000 series they're far more useful today.

PunchyHamster today at 11:54 AM

Matrox G200 GPUs came integrated with servers for absolute ages, well past the 2010s.

cubefox today at 1:12 PM

This is an ad from a viral marketing company, and everyone here is falling for it.

formerly_proven today at 10:30 AM

The G200 mattered to some degree for a long time, because until a few years ago most x86 servers would ship a G200 implementation, or at least something pretending to be a G200, as part of their BMC for network KVM.
