Hacker News

Nvidia won, we all lost

878 points by todsacerdoti last Friday at 9:58 PM | 522 comments | view on HN

Comments

__turbobrew__ last Friday at 11:27 PM

> With over 90% of the PC market running on NVIDIA tech, they’re the clear winner of the GPU race. The losers are every single one of us.

I have been rocking AMD GPU ever since the drivers were upstreamed into the linux kernel. No regrets.

I have also realized that there is a lot out there in the world besides video games, and getting all in a huff about it isn’t worth my time or energy. But consumer gotta consoooooom and then cry and outrage when they are exploited instead of just walking away and doing something else.

Same with magic the gathering, the game went to shit and so many people got outraged and in a big huff but they still spend thousands on the hobby. I just stopped playing mtg.

show 18 replies
Arainach yesterday at 1:52 PM

Why was the title of this post changed long after posting to something that doesn't match the article title? This editorializing goes directly against HN Guidelines (but was presumably done by the HN team?)

show 5 replies
neuroelectron last Friday at 11:21 PM

Seems a bit calculated and agreed across the industry. What can really make sense of Microsoft's acquisitions and ruining of billion dollar IPs? It's a manufactured collapse of the gaming industry. They want to centralize control of the market and make it a service based (rent seeking) sector.

I'm not saying they all got together and decided this collectively, but their wonks are probably all saying the same thing. The market is shrinking, and whether it's by design or incompetence, this creates a new opportunity to acquire it wholesale for pennies on the dollar, build a wall around it, and charge for entry. It's a natural result of games requiring NVidia developers for driver tuning, of bitcoin/AI, and of buying out capacity to prevent competitors.

The wildcard I can't fit into this puzzle is Valve. They have a huge opportunity here but they also might be convinced that they have already saturated the market and will read the writing on the wall.

show 8 replies
dagaci yesterday at 2:03 PM

Jensen has managed to lean into every market boom in a reasonable amount of time with his GPUs and tech (hardware and software). No doubt he will be there when the next boom kicks off too.

Microsoft fails consistently ... even when offered a lead on the plate... it fails, but these failures are eventually corrected for by the momentum of its massive business units.

Apple is just very very late... but this failure can be eventually corrected for by its unbeatable astroturfing units.

Perhaps AMD is too small to keep up everywhere it should. But compared to the rest, AMD is a fast follower. Why Intel is where it is is a mystery to me, but I'm quite happy about its demise and failures :D

Being angry about NVIDIA is not giving enough credit to NVIDIA for being on-time and even leading the charge in the first place.

Everyone should remember that NVIDIA also leads into the markets that it dominates.

show 4 replies
strictnein yesterday at 12:14 AM

This really makes no sense:

> This in turn sparked rumors about NVIDIA purposefully keeping stock low to make it look like the cards are in high demand to drive prices. And sure enough, on secondary markets, the cards go way above MSRP

Nvidia doesn't earn more money when cards are sold above MSRP, but they get almost all the hate for it. Why would they set themselves up for that?

Scalpers are a retail-wide problem. Acting like Nvidia has the insight or ability to prevent them is just silly. People may not believe this, but retailers hate it as well and spend millions of dollars trying to combat it. They would have sold the product either way, but scalping results in the retailer's customers being mad and becoming some other company's customers, which are both major negatives.

show 7 replies
cherioo last Friday at 10:56 PM

High-end GPUs have, over the last 5 years, slowly turned from an enthusiast product into a luxury product.

5 or maybe 10 years ago, a high-end GPU was needed to run games at reasonably eye-candy settings. In 2025, $500 mid-range GPUs are more than enough. Folks all over can barely tell the difference between High and Ultra settings, DLSS vs FSR, or DLSS FG and Lossless Scaling. There's just no point competing at the $500 price point any more; Nvidia has largely given up there, ceding it to the AMD-built consoles and integrated graphics like AMD APUs, which offer good value at the low end, mid range, and high end.

Maybe the rumored Nvidia PC, or the Switch 2, can bring some resurgence.

show 7 replies
snitty last Friday at 11:29 PM

NVIDIA is, and will be for at least the next year or two, supply constrained. They only have so much capacity at TSMC for all their chips, and the lion's share of that is going to enterprise chips, which sell for an order of magnitude more than the consumer chips.

It's hard to get too offended by them shirking the consumer market right now when they're printing money with their enterprise business.

show 4 replies
Kon5ole yesterday at 7:37 AM

TSMC can only make about as many Nvidia chips as OpenAI and the other AI guys want to buy. Nvidia releases GPUs made from basically the shaving leftovers of the OpenAI products, which makes them limited in supply and expensive.

So gamers have to pay much more and wait much longer than before, which they resent.

Some youtubers make content that profits from the resentment, so they play fast and loose with the fundamental reasons in order to make gamers even more resentful. Nvidia has "crazy prices," they say.

But they're clearly not crazy. $2000 GPUs appear in quantities of 50+ from time to time at stores here, but they sell out in minutes. Lowering the prices would be crazy.

show 2 replies
monster_truck last Friday at 11:06 PM

Remember when Nvidia got caught dropping 2 bits of color information to beat ATI in benchmarks? I still can't believe anyone has trusted them since! That is an insane thing to do considering the purpose of the product.

For as long as they have competition, I will support those companies instead. If they all fail, I guess I will start one. My spite for them knows no limits

show 2 replies
leakycap last Friday at 10:07 PM

This article goes much deeper than I expected, and is a nice recap of the last few years of "green" gpu drama.

Liars or not, the performance has not been there for me in any of my use cases, from personal to professional.

A system from 2017/2018 with an 8700K and an 8GB 2080 performs so closely to the top end, expensive systems today that it makes almost no sense to upgrade at MSRP+markup unless your system is older than this.

Unless you need specific features only on more recent cards, there are very few use cases I can think of needing more than a 30 series card right now.

show 2 replies
rkagerer yesterday at 1:22 AM

I am a volunteer firefighter and hold a degree in electrical engineering. The shenanigans with their shunt resistors, and the ensuing melting cables, are in my view criminal. Any engineer worth their salt would recognize that pushing 600W through a bunch of small cables, with no contingency if some of them have failed, is just asking for trouble. These assholes are going to set someone's house on fire.
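
For a rough sense of the margin (back of the envelope, assuming the commonly cited ~9.5 A per-pin rating for the six 12 V pins, which is my assumption, not a measured figure):

    # Rough margin check for 600 W through the 16-pin connector.
    # Assumptions: 6 current-carrying 12 V pins, ~9.5 A rated each,
    # and no per-pin current balancing on the card side.
    POWER_W, VOLTS, PINS, PIN_RATING_A = 600, 12, 6, 9.5

    total_a = POWER_W / VOLTS               # 50 A total
    per_pin = total_a / PINS                # ~8.3 A per pin if current shares evenly
    per_pin_one_bad = total_a / (PINS - 1)  # ~10 A per pin if one contact fails

    print(f"all pins good:  {per_pin:.1f} A per pin (rated {PIN_RATING_A} A)")
    print(f"one pin failed: {per_pin_one_bad:.1f} A per pin, already over the rating")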

I hope they get hit with a class action lawsuit and are forced to recall and properly fix these products before anyone dies as a result of their shoddy engineering.

show 4 replies
ionwake last Friday at 10:56 PM

I don't want to jump on Nvidia, but I found it super weird when they clearly remote-controlled a Disney bot onto the stage and claimed it was all real-time AI, which was clearly impossible given the lack of latency and, weirdly, the bot verifying its correct stage position relative to the presenter. It was obviously the Disney bot just being controlled by someone off stage.

I found it super alarming, because why would they fake something on stage to the extent of just lying? I know Steve Jobs had backup phones, but claiming a robot is autonomous when it isn't just feels scammy to me.

It reminded me of when Tesla had remote-controlled Optimus bots. I mean, I think that's awesome, like super cool, but clearly the users thought the robots were autonomous during that dinner party.

I have no idea why I seem to be the only person bothered by "stage lies" to this level. Tbh even the Tesla bots weren't claimed to be autonomous, so actually I should never have mentioned them, but it explains the "not real" vibe.

Not meaning to disparage, just explaining my perception as a European. Maybe it's just me though!

EDIT > I'm kinda surprised by the weak arguments in the replies. I love both companies, I am just offering POSITIVE feedback: it's important (in my eyes) to be careful not to pretend in certain specific ways, or it makes the viewer question the foundation (which we all know is SOLID and good).

EDIT 2 > There actually is a good rebuttal in the replies, although apparently I have "reading comprehension skill deficiencies". It's just my POV that they were insinuating the robot was aware of its surroundings, which is fair enough.

show 7 replies
kldg today at 1:38 AM

the big reason I upgrade GPUs these days is for more VRAM for LLMs and diffusion models. I don't care (or need to care, really) as much about gaming -- along with great Proton support, running things from a midrange Linux-based gaming PC I have shoved in my home server rack works great via Steam's Remote Play (NoMachine also pretty good), but I play strategy/spreadsheet games, not twitchy FPS games.

my most recent upgrade was for a 4090, but that gives me only 24GB VRAM, and it's too expensive to justify buying two of them. I also have an antique kepler datacenter GPU, but Nvidia cut driver support a long while ago, making software quite a pain to get sorted. there's a nonzero chance I will wind up importing a Moore Threads GPU for next purchase; Nvidia's just way too expensive, and I don't need blazing fast speeds given most of my workloads run well inside the time I'm sleeping, but I can't be running at the speed of CPU; I need everything to fit into VRAM. I'd alternately be stoked for Intel to cater to me. $1500, 48GB+ VRAM, good pytorch support; make it happen, somebody.
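
rough math on the fits-in-VRAM constraint (model sizes and the overhead factor below are my own ballpark assumptions, not measurements):

    # back-of-the-envelope: does a model fit in a given VRAM budget?
    # overhead is a guess covering KV cache, activations, and CUDA context.
    def vram_needed_gb(params_billion, bytes_per_param, overhead=1.2):
        return params_billion * bytes_per_param * overhead

    for label, params, bpp in [("13B @ 4-bit", 13, 0.5),
                               ("13B @ fp16", 13, 2.0),
                               ("70B @ 4-bit", 70, 0.5)]:
        need = vram_needed_gb(params, bpp)
        print(f"{label}: ~{need:.0f} GB -> {'fits' if need <= 24 else 'does not fit'} in 24 GB")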

Nextgrid last Friday at 11:04 PM

I wonder if the 12VHPWR connector is intentionally defective to prevent large-scale use of those consumer cards in server/datacenter contexts?

The failure rate is just barely acceptable in a consumer use-case with a single card, but with multiple cards the probability of failure (which takes down the whole machine, as there's no way to hot-swap the card) makes it unusable.
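
Just to put numbers on that scaling (the per-card failure rate here is made up purely for illustration):

    # Chance that at least one connector in a box fails, assuming independent
    # failures per card; 1% per card is an invented illustrative rate.
    def p_any_failure(p_card, n_cards):
        return 1 - (1 - p_card) ** n_cards

    for n in (1, 4, 8):
        print(f"{n} card(s): {p_any_failure(0.01, n):.1%} chance of a bad connector")
    # 1.0%, ~3.9%, ~7.7% -- tolerable in a desktop, ugly across a rack you can't hot-swap.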

I can't otherwise see why they'd persevere on that stupid connector when better alternatives exist.

show 4 replies
DeepYogurt yesterday at 3:44 AM

> And I hate that they’re getting away with it, time and time again, for over seven years.

Nvidia's been at this way longer than 7 years. They were cheating at benchmarks to control a narrative back in 2003. https://tech.slashdot.org/story/03/05/23/1516220/futuremark-...

hiAndrewQuinn yesterday at 3:49 PM

To anyone who remembers econ 101 it's hard to read something like "scalper bots scoop up all of the new units as soon as they're launched" and not conclude that Nvidia itself is simply pricing the units they sell too low.
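
A minimal sketch of that econ-101 point, with an invented linear demand curve: if the MSRP sits below the price that clears the quantity actually shipped, the gap goes to whoever reaches the checkout first, i.e. the bots.

    # Toy linear demand: quantity demanded = a - b * price. All numbers invented.
    a, b = 200_000, 80        # intercept (units) and slope (units per dollar)
    supply = 50_000           # units actually shipped at launch
    msrp = 1_000

    clearing_price = (a - supply) / b       # price where demand == supply
    print(f"market-clearing price: ${clearing_price:,.0f}")          # $1,875
    print(f"per-unit gap captured by scalpers: ${clearing_price - msrp:,.0f}")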

ryao last Friday at 10:48 PM

> The RTX 50 series are the second generation of NVIDIA cards to use the 12VHPWR connector.

This is wrong. The 50 series uses 12V-2x6, not 12VHPWR. The 30 series was the first to use 12VHPWR. The 40 series was the second to use 12VHPWR and the first to use 12V-2x6. The 50 series was the second to use 12V-2x6. The female connectors are what changed in 12V-2x6; the male connectors are identical between 12V-2x6 and 12VHPWR.

show 1 reply
mcdeltat yesterday at 4:22 AM

Anyone else getting a bit disillusioned with the whole tech hardware improvements thing? Seems like every year we get less improvement for higher cost and the use cases become less useful. Like the whole industry is becoming a rent seeking exercise with diminishing returns. I used to follow hardware improvements and now largely don't because I realised I (and probably most of us) don't need it.

It's staggering that we are throwing so many resources at marginal improvements for things like gaming, and I say that as someone whose main hobby used to be gaming. Ray tracing, path tracing, DLSS, etc. at a price point of $3000 just for the GPU - who cares, when a 2010 cel-shaded game running on an upmarket toaster gave me the utmost joy? And the AI use cases don't impress me either - seems like all we do each generation is burn more power to shove more data through and pray for an improvement (collecting sweet $$$ in the meantime).

Another commenter here said it well, there's just so much more you can do with your life than follow along with this drama.

show 4 replies
reichstein yesterday at 7:04 AM

Aks. "Every beef anyone has ever had with Nvidia in one outrage friendly article."

If you want to hate on Nvidia, there'll be something for you in there.

An entire section on 12vhpwr connectors, with no mention of 12V-2x6.

A lot of "OMG Monopoly" and "why won't people buy AMD" without considering that maybe ... AMD cards are not considered by the general public to be as good _where it counts_. (Like benefit per Watt, aka heat.) Maybe it's all perception, but then AMD should work on that perception. If you want the cooler CPU/GPU, perception is that that's Intel/Nvidia. That's reason enough for me, and many others.

Availability isn't great, I'll admit that, if you don't want to settle for a 5060.

porphyra last Friday at 11:21 PM

The article complains about issues with consumer GPUs but those are nowadays relegated to being merely a side hobby project of Nvidia, whose core business is enterprise AI chips. Anyway Nvidia still has no significant competition from AMD on either front so they are still getting away with this.

Deceptive marketing aside, it's true that it's sad that we can't get 4K 60 Hz with ray tracing on current hardware without some kind of AI denoising and upscaling, but ray tracing is really just _profoundly_ hard, so I can't really blame anyone for not having figured out how to put it in a consumer PC yet. There's a reason why Pixar movies need huge render farms that take lots of time per frame. We would probably sooner get Gaussian splatting and real-time diffusion models in games than nice full-resolution ray tracing tbh.
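
Some rough numbers on why (samples-per-pixel and bounce counts below are ballpark assumptions for offline-quality output, not measured figures):

    # Order-of-magnitude cost of brute-force path tracing at 4K / 60 Hz.
    pixels = 3840 * 2160
    fps = 60
    samples_per_pixel = 1000   # offline renderers often use hundreds to thousands
    bounces = 4                # rays traced per sample path

    rays_per_sec = pixels * fps * samples_per_pixel * bounces
    print(f"{rays_per_sec:.1e} ray-scene intersections per second")  # ~2.0e12
    # Real-time GPUs budget roughly 1-2 rays per pixel per frame and rely on
    # denoising/upscaling to bridge the gap, which is where DLSS-style
    # reconstruction comes in.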

show 1 reply
tom_m today at 12:15 AM

Know what really kicks me in the nuts? Stupid kid me didn't buy Nvidia back in like 2002, at $16 or something, when I told my father to. He did, and he holds it to this day. Fortunately that means taking care of him is easier haha, but dang, I should have gotten some too.

Dylan16807 yesterday at 1:39 AM

> The competing open standard is FreeSync, spearheaded by AMD. Since 2019, NVIDIA also supports FreeSync, but under their “G-Sync Compatible” branding. Personally, I wouldn’t bother with G-Sync when a competing, open standard exists and differences are negligible[4].

Open is good, but the open standard itself is not enough. You need some kind of testing/certification, which is built into the G-Sync process. AMD does have a FreeSync certification program now, which is good.

If you rely on just the standard, some manufacturers get really lazy. One of my screens technically supports FreeSync but I turned it off day one because it has a narrow range and flickers very badly.

yunyu last Friday at 11:04 PM

If you are a gamer, you are no longer NVIDIA's most important customer.

show 4 replies
liendolucas yesterday at 10:28 AM

I haven't read the whole article but a few things to remark:

* The prices for Nvidia GPUs are insane. For that money you can have an extremely good PC with a good non-Nvidia GPU.

* The physical GPU sizes are massive; even letting the card rest on a horizontal motherboard looks scary.

* Nvidia still has issues with melting cables? I heard about those some years ago and thought it was a solved problem.

* Proprietary frameworks like CUDA and others are going to fall at some point; it's just a matter of time.

It looks as if Nvidia at the moment is only looking at the AI market (which, as a personal belief, has to burst at some point) and simply does not care about the non-AI GPU market at all.

I remember many many years ago, when I was a teenager and 3dfx was the dominant graphics card manufacturer, that John Carmack prophetically predicted in a gaming computer magazine (the article was about Quake I) that the future wasn't going to be 3dfx and Glide. Some years passed and, effectively, 3dfx was gone.

Perhaps this is just the beginning of the same story that happened with 3dfx. I think AMD and Intel have a huge opportunity to balance the market and bring Nvidia down, both in the AI and gaming spaces.

I have only heard excellent things about Intel's ARC GPUs in other HN threads, and if I need to build a new desktop PC from scratch there's no way I'm paying the prices that Nvidia is pushing onto the market; I'll definitely look at Intel or AMD.

show 1 reply
frollogaston last Friday at 11:32 PM

Because they won't sell you an in-demand high-end GPU for cheap? Well TS

show 1 reply
nickdothutton yesterday at 2:42 PM

It has been decades since I did any electronics, and even then only as a hobby doing self-build projects, but the power feed management (obviously a key part of such a high current and expensive component in a system) is shameful.

parketi yesterday at 6:16 PM

Here's my take on video cards in general. I love NVIDIA cards for all-out performance. You simply can't beat them. And until someone does, they will not change. I have owned AMD and Intel cards as well and played mainly FPS games like Doom, Quake, Crysis, Medal of Honor, COD, etc. All of them perform better on NVIDIA. But I have noticed a change.

Each year those performance margins seem to narrow. I paid over $1000 for my RTX 4080 Super. That's ridiculous. No video card should cost over $1000. So the next time I "upgrade," it won't be NVIDIA. I'll probably go back to AMD or Intel.

I would love to see Intel continue to develop video cards that are high performance and affordable. There is a huge market for those unicorns. AMD's model seems to be slightly less performance for slightly less money. Intel, on the other hand, is offering performance on par with AMD and sometimes NVIDIA for far less money - a winning formula.

NVIDIA got too greedy. They overplayed their hand. Time for Intel to focus on development and fill the gaping void of price for performance metrics.

Nifty3929 yesterday at 4:06 PM

I just don't think NVidia cares all that much about its gaming cards, except to the extent that they don't want to cede too much ground to AMD and want to preserve their image in that market for now. Basically they don't want to lose their legions of gaming fans that got them started, and who still carry the torch. But they'll produce the minimum number of gaming cards needed to accomplish that.

Otherwise the money is in the datacenter (AI/HPC) cards.

snarfy yesterday at 12:41 PM

I'm a gamer and love my AMD GPU. I do not give a shit about ray tracing, frame generation, or 4K gaming. I can play all modern FPS games at 500+ fps. I really wish the market wasn't so trendy and people bought what worked for them.

show 1 reply
tricheco yesterday at 7:41 PM

> The RTX 4090 was massive, a real heccin chonker

Every line of the article convinces me I'm reading bad rage bait, every comment in the thread confirms it's working.

The article provides a nice list of grievances from the "optimized youtube channel tech expert" sphere ("doink" face and arrow in the thumbnail or GTFO), and none of them really stick. Except for the part where nVidia is clearly leaving money on the table: from the 5080 up no one can compete, with or without "fake frames", at any price. I'd love to take the dividends on the sale of the top 3 cards, but that money is going to scalpers.

If nvidia is winning, it's because competitors and regulators are letting them.

fracus yesterday at 2:11 AM

This was an efficient, well-written TKO.

show 1 reply
voxleone yesterday at 1:07 AM

It’s reasonable to argue that NVIDIA has a de facto monopoly in the field of GPU-accelerated compute, especially due to CUDA (Compute Unified Device Architecture). While not a legal monopoly in the strict antitrust sense (yet), in practice, NVIDIA's control over the GPU compute ecosystem — particularly in AI, HPC, and increasingly in professional content creation — is extraordinarily dominant.

show 3 replies
xgkickt yesterday at 10:15 PM

AMD's openness has been a positive in the games industry. I only wish they too made ARM-based APUs.

scrubs yesterday at 12:36 AM

Another perspective: Nvidia customer support on their Mellanox purchase ... is total crap. It's the worst of corporate America ... paper-pushing bureaucratic guys who slow-roll stuff ... getting to a smart person behind the customer reps requires one to be an ape in a bad mood 5x ... I think they're so used to that now that unless you go crazy mode their take is ... well, I guess he wasn't serious about his ask and he dropped it.

Here's another Nvidia/Mellanox BS problem: many Mellanox NICs are finalized or post-assembled by, say, HP. So if you have an HP "Mellanox" NIC, Nvidia washes their hands of anything detailed. It's not ours; HP could have done anything to it, what do we know? So one phones HP ... and they have no clue either, because it's really not their IP or their drivers.

It's a total cluster-bleep, and more and more it's why corporate America sucks.

show 2 replies
yalok yesterday at 4:52 AM

A friend of mine is a SW developer at Nvidia, working on their drivers. He was complaining lately that he was required to fix a few bugs in the driver code for the new card (RTX?) while not being provided with the actual hardware. His pleas to be sent this HW were ignored, but the demand to fix the bugs by a deadline kept being pushed.

He actually ended up buying older but somewhat similar used hardware with his personal money, to be able to do his work.

Not even sure if he was eventually able to expense it, but I wouldn't be surprised if not, knowing how big-company bureaucracy works...

musebox35 yesterday at 6:41 AM

With the rise of LLM training, Nvidia's main revenue stream switched to datacenter GPUs (>10x gaming revenue). I wonder whether this has affected the quality of these consumer cards, including both their design and production processes:

https://stockanalysis.com/stocks/nvda/metrics/revenue-by-seg...

mrkramer yesterday at 2:38 PM

Probably the next big thing will be Chinese GPUs that are the same quality as NVIDIA GPUs but at least 10-20% cheaper, aaand we will have to wait maybe 5-10 years for that.

FeepingCreature last Friday at 11:39 PM

Oh man, you haven't gotten into their AI benchmark bullshittery. There's factors of 4x on their numbers that are basically invented whole cloth by switching units.
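
As an illustration of the unit switching (the baseline figure is invented; the 2x factors are the usual levers, lower precision and structured sparsity):

    # How the same silicon gets quoted at 4x the "TFLOPS" without getting faster.
    fp16_dense = 500                 # hypothetical honest baseline, in TFLOPS
    fp8_dense = fp16_dense * 2       # halve the precision -> double the quoted rate
    fp8_sparse = fp8_dense * 2       # assume 2:4 structured sparsity -> double again

    print(f"FP16 dense baseline: {fp16_dense} TFLOPS")
    print(f"FP8 + sparsity headline: {fp8_sparse} TFLOPS "
          f"({fp8_sparse // fp16_dense}x on paper, same chip)")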

show 1 reply
Ancapistani yesterday at 12:20 AM

I disagree with some of the article's points - primarily, that nVidia's drivers were ever "good" - but I agree with the gist.

I have a 4070 Ti right now. I use it for inference and VR gaming on a Pimax Crystal (2880x2880x2). In War Thunder I get ~60 FPS. I’d love to be able to upgrade to a card with at least 16GB of VRAM and better graphics performance… but as far as I can tell, such a card does not exist at any price.

zoobab yesterday at 2:57 PM

Not enough VRAM to load big LLMs, in order not to compete with their expensive high end. Market segmentation, it's called.

spoaceman7777 yesterday at 12:55 AM

The real issue here is actually harebrained youtubers stirring up drama for views. That's 80% of the problem. And their viewers (and readers, for that which makes it into print) eat it up.

Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly, posting to social media, and youtubers jumping on the trend for likes.

These are 99% user error issues drummed up by non-professionals (and, in some cases, people paid by 3rd party vendors to protect those vendors' reputation).

And the complaints about transient performance issues with drivers, drummed up into apocalyptic scenarios, again, by youtubers who are putting this stuff under a microscope for views, are universal across every single hardware and software product. Everything.

Claiming "DLSS is snakeoil", and similar things are just an expression of the complete lack of understanding of the people involved in these pot-stirring contests. Like... the technique obviously couldn't magically multiply the ability of hardware to generate frames using the primary method. It is exactly as advertised. It uses machine learning to approximate it. And it's some fantastic technology, that is now ubiquitous across the industry. Support and quality will increase over time, just like every _quality_ hardware product does during its early lifespan.

It's all so stupid and rooted in greed by those seeking ad-money, and those lacking in basic sense or experience in what they're talking about and doing. Embarrassing for the author to so publicly admit to eating up social media whinging.

show 2 replies
PoshBreeze yesterday at 5:49 AM

> The RTX 4090 was massive, a real heccin chonker. It was so huge in fact, that it kicked off the trend of needing support brackets to keep the GPU from sagging and straining the PCIe slot.

This isn't true. People were buying brackets with 10 series cards.

avipars yesterday at 6:10 PM

If only NVIDIA could use their enterprise solution on consumer hardware.

tonyhart7 yesterday at 6:27 AM

Consumer GPUs have felt like a "paper launch" for the past few years.

It's like they're purposely not selling them, because they allocated 80% of their production to enterprise only.

I just hope the new fabs come online as early as possible, because these prices are insane.

jes5199 yesterday at 1:55 AM

with Intel also shitting the bed, it seems like AMD is poised to pick up “traditional computing” while everybody else runs off to chase the new gold rush. Presumably there’s still some money in desktops and gaming rigs?

dofubej last Friday at 11:14 PM

> With over 90% of the PC market running on NVIDIA tech, they’re the clear winner of the GPU race. The losers are every single one of us.

Of course the fact that we overwhelmingly chose the better option means that… we are worse off or something?

show 3 replies
alganet last Friday at 11:09 PM

Right now, all silicon talk is bullshit. It has been for a while.

It became obvious when old e-waste Xeons were turned into viable, usable machines, years ago.

Something is obviously wrong with this entire industry, and I cannot wait for it to pop. THIS will be the excitement everyone is looking for.

show 2 replies
shmerl last Friday at 11:52 PM

> ... NVENC are pretty much indispensable

What's so special about NVENC that Vulkan video or VAAPI can't provide?

> AMD also has accelerated video transcoding tech but for some reason nobody seems to be willing to implement it into their products

OBS works with VAAPI fine. Looking forward to them adding Vulkan video as an option.

Either way, as a Linux gamer I haven't touched Nvidia in years. AMD is a way better experience.

View 27 more comments