Hacker News

Nvidia won, we all lost

914 points by todsacerdoti last Friday at 9:58 PM | 543 comments

Comments

benreesman last Friday at 11:40 PM

The thing is, company culture is a real thing. And some cultures are invasive/contagious like kudzu both internally to the company and into adjacent companies that they get comped against. The people get to thinking a certain way, they move around between adjacent companies at far higher rates than to more distant parts of their field, the executives start sitting on one another's boards, before you know it a whole segment is enshittified, and customers feel like captives in an exploitation machine instead of parties to a mutually beneficial transaction in which trade increases the wealth of all.

And you can build mythologies around falsehoods to further reinforce it: "I have a legal obligation to maximize shareholder value." No buddy, you have some very specific restrictions on your ability to sell the company to your cousin (ha!) for a handful of glass beads. You have a legal obligation to bin your wafers the way it says on your own box, but that doesn't seem to bother you.

These days I get a machine like the excellent ASUS ProArt P16 (grab one of those before they're all gone if you can) with a little 4060 or 4070 in it that can boot up PyTorch and make sure the model will run forwards and backwards at a contrived size, and then go rent a GB200 or whatever from Latitude or someone (seriously check out Latitude, they're great), or maybe one of those wildly competitive L40-series Fly machines (Fly whips the llama's ass like nothing since Winamp, check them out too). The GMKtec EVO-X1 is a pretty capable little ROCm inference machine for under $1,000, and its big brother is nipping at the heels of a DGX Spark under $2k. There is good stuff out there, but it's all from non-incumbent angles.
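That local smoke test takes a handful of lines. A minimal sketch, assuming a toy stand-in model (the architecture, layer sizes, and batch here are placeholders for whatever you actually plan to train):

```python
import torch

# Toy stand-in model; swap in the architecture you actually plan to train.
model = torch.nn.Sequential(
    torch.nn.Linear(64, 128),
    torch.nn.ReLU(),
    torch.nn.Linear(128, 10),
)

# Contrived size: a tiny batch just to prove the graph is wired correctly.
x = torch.randn(4, 64)
y = torch.randint(0, 10, (4,))

loss = torch.nn.functional.cross_entropy(model(x), y)  # forward pass
loss.backward()                                        # backward pass

# Every parameter should have received a gradient before you pay for big-GPU time.
assert all(p.grad is not None for p in model.parameters())
print("forward/backward OK at toy size")
```

Once that passes on the laptop's little 4060/4070 (or even CPU), scaling the same script up on a rented GB200 is mostly a matter of changing sizes and devices.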

I don't game anymore, but if I did I would be paying a lot of attention to Arc; I've heard great things.

Fuck the cloud and their ancient Xeon SKUs costing more than Latitude charges for 5 GHz EPYC. Fuck the NVIDIA gaming retail rat race; it's an electrical as well as a moral hazard in 2025.

It's a shame we all have to be tricky to get what used to be a halfway fair deal 5-10 years ago (and 20 years ago they passed a HUGE part of the scaling bonanza down to the consumer), but it's possible to compute well in 2025.

oilkillsbirds last Friday at 11:39 PM

Nobody’s going to read this, but this article and sentiment is utter anti-corporate bullshit, and the vastly congruent responses show that none of you have watched the historical development of GPGPU, or do any serious work on GPUs, or keep up with the open work of nvidia researchers.

The spoiled gamer mentality is getting old for those of us that actually work daily in GPGPU across industries, develop with RTX kit, do AI research, etc.

Yes, they've had some marketing and technical flubs, as any giant publicly traded company will, but their balance of research-driven development alongside corporate profit necessities is unmatched.

amatecha yesterday at 6:32 AM

Uhh, these 12VHPWR connectors seem like a serious fire risk. How are they not being recalled? I just got a 5060 Ti, and now I'm wishing I'd gone AMD instead.. what the hell :(

Whoa, the stuff covered in the rest of the post is just as egregious. Wow! Maybe time to figure out which AMD model compares performance-wise and sell this thing, jeez.

Havoc yesterday at 12:33 PM

They're not full of shit; they're just doing what a for-profit co in a dominant position does.

In other news, I hope Intel pulls their thumb out of their ass, because AMD is crushing it and that's gonna end the same way.

fithisux yesterday at 2:57 PM

NVidia won?

Not for me. I prefer Intel's offerings. Open and Linux friendly.

I even hope they release next-gen RISC-V boards with Intel graphics.

bigyabai last Friday at 10:50 PM

> Pretty much all upscalers force TAA for anti-aliasing and it makes the entire image on the screen look blurry as fuck the lower the resolution is.

I feel like this is a misunderstanding, though I admit I'm splitting hairs here. DLSS is a form of TAA, and so are FSR and most other modern upscalers. You generally don't need an extra anti-aliasing pipeline if you're getting an artificially supersampled image.

We've seen this technique variably developed across the lifespan of realtime raster graphics; first with checkerboard rendering, then TAA, then now DLSS/frame generation. It has upsides and downsides, and some TAA implementations were actually really good for the time.
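The common core of all these techniques is temporal accumulation: blend each new frame into a running history buffer. A minimal NumPy sketch, deliberately omitting the motion-vector reprojection and history clamping that real TAA/DLSS implementations need (the image sizes and noise model are contrived):

```python
import numpy as np

def temporal_accumulate(history, current, alpha=0.1):
    """Blend the new frame into the running history (exponential moving average)."""
    return alpha * current + (1.0 - alpha) * history

rng = np.random.default_rng(0)
truth = np.full((8, 8), 0.5)                        # the "ground truth" image
noisy = lambda: truth + rng.normal(0, 0.2, (8, 8))  # each raw frame is aliased/noisy

history = noisy()
first_frame_error = np.abs(history - truth).mean()
for _ in range(50):                                 # accumulate over many frames
    history = temporal_accumulate(history, noisy())

# The accumulated image is far closer to the truth than any single frame.
# That's the upside; the downside (ghosting/blur) shows up once things move
# and the stale history no longer matches the current frame.
assert np.abs(history - truth).mean() < first_frame_error
```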

d00mB0t last Friday at 10:05 PM

Sounds about right :D

andrewstuart yesterday at 1:01 AM

All symptoms of being number one.

Customers don’t matter, the company matters.

Competition sorts out such attitudes quick smart, but AMD never misses a chance to copy Nvidia's strategy in any way, and Intel is well behind.

So for now, you’ll eat what Jensen feeds you.

sonicvrooom last Friday at 11:51 PM

it would be "just" capitalist to call these fuckers out for real, on the smallest level.

you are safe.

system2 last Friday at 11:11 PM

Why does the hero image of this website say "Made with GIMP"? I've never seen a web banner saying "Made with Photoshop" or anything similar.

delduca last Friday at 11:19 PM

Nothing new; it's just enshittification.

WhereIsTheTruth yesterday at 4:43 AM

Call it delusions or conspiracy theories, whatever, I don't care, but it seems to me that NVIDIA wants to vendor-lock the whole industry.

If all game developers come to rely on NVIDIA technology, the industry as a whole puts customers in a position where they are forced to give in.

The public's perception of RTX's softwarization (DLSS), and the fact that NVIDIA gets to coin the technical terms, says it all.

They have a long term plan, and that plan is:

- make all the money possible

- destroy all competition

- vendor lock the whole world

When I see that, I can't help myself but to think something is fishy:

https://i.imgur.com/WBwg6qQ.png

ksec yesterday at 2:14 AM

>How is it that one can supply customers with enough stock on launch consistently for decades, and the other can’t?

I guess the author is too young and didn't go through the iPhone 2G to iPhone 6 era. Also worth remembering it wasn't too long ago that Nvidia was sitting on nearly ONE full year of GPU stock unsold. That completely changed how Nvidia does supply chain management and forecasting, which unfortunately had a negative impact all the way to the 50 series. I believe they have since changed and the next gen should be better prepared. But you can only do so much when AI demand is seemingly unlimited.

>The PC, as gaming platform, has long been held in high regards for its backwards compatibility. With the RTX 50 series, NVIDIA broke that going forward. PhysX.....

Glide? What about all the audio driver APIs before that? As much as I wish everything were backward compatible, that is just not how the world works. Just like with any old game, you need some fiddling to get it to work. And they even made the code available, so people can actually do something rather than resort to emulation or reverse engineering.

>That, to me, was a warning sign that maybe, just maybe, ray tracing was introduced prematurely and half-baked.

Unfortunately that is not how it works. Do we want to go back over the era from pre-3dfx to today and count how many ideas we thought were great for 3D acceleration, only for them to be replaced by better ideas or implementations? These ideas were good on paper but didn't work well. We then learn from them and iterate.

>Now they’re doing an even more computationally expensive version of ray tracing: path tracing. So all the generational improvements we could’ve had are nullified again......

How about: path tracing is simply a better technology? Game developers also don't have to use any of this tech; the article acts as if Nvidia forces all games to use it. Gamers want better graphics quality, and artists and graphics assets are already by far the most expensive item in gaming, with the cost still increasing. Hardware improvements are what let that be achieved at lower cost (to game developers).

>Never mind that frame generation introduces input lag that NVIDIA needs to counter-balance with their “Reflex” technology,

No. That is not why "Reflex" tech was invented. Nvidia spends R&D on 1000 fps monitors as well, and potentially sub-1ms-frame monitors. They have always been latency sensitive.

------------------------------

I have no idea how modern gamers became what they are today, and this isn't the first time I have read this sort of thing, even on HN. You don't have to buy Nvidia; you have AMD and now Intel (again). Basically I can summarise it in one line: gamers want Nvidia's best GPU for the lowest price possible, or a price they think is acceptable, without understanding market dynamics or anything about supply chains and manufacturing. They also want higher "generational" performance, like 2x every 2 years. And if they don't get it, it is Nvidia's fault. Not TSMC, not Cadence, not Tokyo Electron, not Isaac Newton or the laws of physics. But Nvidia.

Nvidia's PR tactics aren't exactly new in the industry. Every single brand does something similar. Do I like it? No. But unfortunately that is how the game is played. And Apple is by far the worst offender.

I do sympathise with the cable issue, though. And it's not the first time Nvidia has had thermal issues. But then again, they are also the ones constantly pushing the boundary forward, and AFAIK the issues aren't as bad as on the 40 series, though some YouTubers seem to be making a bigger issue of it than most. Supply issues will get better, but TSMC 3nm is fully booked. The only possible solution would be to make consumer GPUs less capable at AI workloads, or to have AI GPUs on the leading-edge node and consumer GPUs always a node behind, to split the capacity problem. I would imagine that is part of the reason why TSMC is accelerating 3nm capacity increases on US soil. Nvidia is now also large enough, and has enough cash, to take on more risk.

jekwoooooe last Friday at 11:23 PM

This guy makes some good points, but he clearly has a bone to pick. Calling DLSS snake oil was where I stopped reading.

827a yesterday at 3:51 AM

Here's something I don't understand: why is it that when I go look at DigitalOcean's GPU Droplet options, they don't offer any Blackwell chips? [1] I thought Blackwell was supposed to be the game-changing hyperchip that carried AI into the next generation, but the best many providers still offer are Hopper H100s? Where are all the Blackwell chips? It's been oodles of months.

Apparently AWS has them available in the P6 instance type, but the only configuration they offer has 2TB of memory and costs... $113/hr [2]? Like, what is going on at Nvidia?

Where the heck is Project Digits? Like, I'm developing this shadow opinion that Nvidia actually hasn't built anything new in three years, but they fill the void by talking about hypothetical newtech that no one can actually buy + things their customers have built with the actually good stuff they built three years ago. Like, consumers can never buy Blackwell because "oh Enterprises have bought them all up" then when Microsoft tries to buy any they say "Amazon bought them all up" and vice-versa. Something really fishy is going on over there. Time to short.

[1] https://www.digitalocean.com/products/gpu-droplets

[2] https://aws.amazon.com/ec2/pricing/on-demand/

ls-a last Friday at 11:11 PM

Finally someone

honeybadger1 last Friday at 11:12 PM

A bit hyperbolic

johnklos yesterday at 1:35 AM

I'm so happy to see someone calling NVIDIA out for their bullshit. The current state of GPU programming sucks, and that's just an example of the problems with the GPU market today.

The lack of open source anything for GPU programming makes me want to throw my hands up and just do Apple. It feels much more open than pretending that there's anything open about CUDA on Linux.

jdprgm last Friday at 11:37 PM

The 4090 was released coming up on 3 years ago and is currently going for about 25% over launch MSRP, USED. GPUs are literally an appreciating asset. It is complete insanity and an infuriating situation for an average consumer.

I honestly don't know why nvidia didn't just suspend their consumer line entirely. It's clearly no longer a significant revenue source and they have thoroughly destroyed consumer goodwill over the past 5 years.

Sweepi yesterday at 7:15 AM

Nvidia is full of shit, but this article is full of shit, too. A lot of human slop, some examples:

- 12VHPWR is not at fault / the issue. As the article itself points out, the missing power-balancing circuit is to blame. The 3090 Ti had both 12VHPWR and the balancing circuit and ran flawlessly.

- Nvidia G-Sync: total non-issue. Native G-Sync is dead. Since 2023, ~1000 FreeSync monitors have been released, versus 3(!!) native G-Sync monitors.

- The RTX 4000 series is not still expensive; it is expensive again. It was much cheaper a year before the RTX 5000 release.

- Anti-Sag Brackets were a thing way before RTX 4000

another_kel last Friday at 11:34 PM

I’m sorry but this framing is insane

> So 7 years into ray traced real-time computer graphics and we’re still nowhere near 4K gaming at 60 FPS, even at $1,999.

The guy is complaining that a product can't live up to his standard, while dismissing a barely noticeable trade-off that could make it possible, because it's «fake».

jdthedisciple yesterday at 9:12 AM

Read this in good faith but I don't see how it's supposed to be Nvidia's fault?

How could Nvidia realistically stop scalper bots?

DarkmSparks last Friday at 11:53 PM

I sometimes wonder if people getting this salty over "fake" frames actually realise every frame is fake, even in native mode. Neither is more "real" than the other; it's just different.