The high-end GPU has, over the last five years, slowly turned from an enthusiast product into a luxury product.
Five or maybe ten years ago, a high-end GPU was needed to run games at reasonably eye-candy settings. In 2025, $500 mid-range GPUs are more than enough. Most folks can barely tell High from Ultra settings, DLSS from FSR, or DLSS Frame Generation from Lossless Scaling. There's just no point competing at the $500 price point any more; Nvidia has largely given up there, ceding it to the AMD-built consoles and to integrated graphics like AMD's APUs, which offer good value at the low end, the mid-range, and even the high end.
Maybe the rumored Nvidia PC, or the Switch 2, can bring some resurgence.
10 years ago, $650 would buy you a top-of-the-line gaming GPU (GeForce GTX 980 Ti). Nowadays, $650 might get you a mid-range RX 9070 XT if you miraculously find one near MSRP.
I bought a new machine with an RTX 3060 Ti back in 2020 and it's still going strong, no reason to replace it.
Absolutely right; only AAA games get to showcase the true power of these GPUs.
For budget-minded folks like me, I'll just give my son indie and low-graphics games, which he enjoys.
I think this is part of an even broader trend here.
In their never-ending quest to find ways to suck more money out of people, one natural move is to just turn the thing into a luxury good, and that alone seems to justify the markup.
This is why new home construction is expensive - the layout of a home doesn’t change much but it’s trivial to throw on some fancy fixtures and slap the deluxe label on the listing.
Or take a Toyota, slap some leather seats on it, call it a Lexus and mark up the price 40% (I get that these days there are more meaningful differences but the point stands)
This, plus turning everything into subscriptions, is responsible for 90% of the issues I have as a consumer.
Graphics cards seem to be headed in this direction as well - breaking through that last ceiling for maximum fps is going to be like buying a Bentley (if it isn't already), whereas before it was just opting for the V8.
Just going to focus on this one:
> DLSS vs FSR, or DLSS FG and Lossless Scaling.
I've used all of these (at 4K, 120hz, set to "balanced") since they came out, and I just don't understand how people say this.
FSR is a Vaseline-like mess to me; it has its own distinct blurriness. It's not as bad as naive upscaling, and I'll use it if no DLSS is available and the game doesn't run well, but it's distracting.
Lossless is borderline unusable. I don't remember the algorithm's name, but it has a blur similar to FSR. It cannot handle text or UI elements without artifacting (because it's not integrated in the engine, those don't get rendered at native resolution). The frame generation causes almost everything to have a ghost or afterimage - UI elements and the reticle included. It can also reduce your framerate because it's not as optimized. On top of that, the way the program works interferes with HDR pipelines. It is a last resort.
DLSS (3) is, by a large margin, the best offering. It just works and I can't notice any cons. Older versions did have ghosting, but it's been fixed. And I can retroactively fix older games by just swapping the DLL (there's a tool for this on GitHub, actually). I have not tried DLSS 4.
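The DLL-swap trick mentioned above boils down to replacing the game's bundled DLSS runtime (`nvngx_dlss.dll`) with a newer copy, keeping a backup so it's reversible. A minimal sketch, assuming you've already downloaded a newer DLL somewhere (the paths and the `swap_dlss` function name here are illustrative, not from the GitHub tool):

```shell
# swap_dlss GAME_DIR NEW_DLL
# Backs up the game's bundled nvngx_dlss.dll, then copies the newer
# DLL over it. Delete the .bak (or copy it back) to undo the swap.
swap_dlss() {
    game_dir="$1"
    new_dll="$2"
    old="$game_dir/nvngx_dlss.dll"

    # Bail out if this game doesn't ship a DLSS DLL at all.
    [ -f "$old" ] || { echo "no nvngx_dlss.dll in $game_dir" >&2; return 1; }

    # Keep the original so the swap is reversible.
    cp "$old" "$old.bak"
    cp "$new_dll" "$old"
}
```

Some games verify file integrity on launch (or via their launcher's "repair" feature), which will silently undo a swap like this, so dedicated tools also track which games have been patched.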
Not quite $500, but at $650, the 9070 is an absolute monster that outperforms Nvidia's equivalent cards in everything but ray tracing (which you can only turn on with full DLSS framegen and get a blobby mess anyways)
AMD is truly making excellent cards, and with a bit of luck UDNA is even better. But they're in the same situation as Nvidia: they could sell 200 GPUs, ship drivers, maintain them, deal with returns and make $100k... Or just sell a single MI300X to a trusted partner that won't make any waves and still make $100k.
Wafer availability unfortunately rules all, and as it stands, we're lucky neither of them have abandoned their gaming segments for massively profitable AI things.
The fact that we're calling $500 GPUs "midrange" is proof that Nvidia's strategy is working.