Hacker News

bluGill today at 6:42 PM

I question the depreciation assumptions. Those GPUs will be obsolete in 5 years, but whether the newer ones will be enough better to be worth replacing them is an open question. CPUs stopped getting exponentially faster 20 years ago (they are faster, but not the jumps the 1990s gave us).


Replies

rlpb today at 6:46 PM

> Those GPUs will be obsolete in 5 years, but whether the newer ones will be enough better to be worth replacing them is an open question

Doesn't one follow from the other? If newer GPUs aren't worth an upgrade, then surely the old ones aren't obsolete by definition.

Negitivefrags today at 6:52 PM

I recently compared performance per dollar for CPUs and GPUs, using benchmarks from today vs 10 years ago, and surprisingly, CPUs had much bigger gains (roughly the kind of comparison sketched below). Until I saw that for myself, I thought exactly the same thing as you.

It seems shocking given that all the hype is around GPUs.

This probably wouldn't be true for AI-specific workloads, because one of the other things that happened there in the last 10 years was optimising specifically for math with lower-precision floats.
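For reference, a minimal sketch of the kind of perf-per-dollar comparison described above. The scores and prices below are placeholders invented purely to show the shape of the calculation, not real benchmark data:

    # Hypothetical sketch: comparing perf-per-dollar gains over a decade.
    # All scores and prices are placeholders, not real benchmark results.

    def perf_per_dollar(benchmark_score: float, price_usd: float) -> float:
        """Benchmark score per dollar of hardware cost."""
        return benchmark_score / price_usd

    def generational_gain(old_score: float, old_price: float,
                          new_score: float, new_price: float) -> float:
        """How many times better the newer part is on perf-per-dollar."""
        return perf_per_dollar(new_score, new_price) / perf_per_dollar(old_score, old_price)

    # Placeholder inputs purely to show the shape of the calculation:
    cpu_gain = generational_gain(old_score=1_000, old_price=300, new_score=8_000, new_price=400)
    gpu_gain = generational_gain(old_score=5_000, old_price=650, new_score=20_000, new_price=1_600)
    print(f"CPU perf/$ gain: {cpu_gain:.1f}x, GPU perf/$ gain: {gpu_gain:.1f}x")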

maxglute today at 6:55 PM

I think the real issue is that current costs and demand mean Nvidia is gouging on GPU prices, so the hardware:power-consumption cost split is 70:20 instead of 50:40 (with 10 for the rest of the datacenter). The reality is that GPUs are a serendipitous, path-dependent lock-in from gaming -> mining. TPUs are more power efficient. If the bubble pops and demand for compute goes down, Nvidia + TSMC will still be around, but the premium on next-gen AI-first bespoke hardware will revert towards the mean, and we're looking at hardware that's 50% less expensive (no AI-race scarcity tax, i.e. 75% Nvidia margins) and uses 20% less power/opex. All of a sudden, existing data centers become unprofitable stranded assets even if they can be stretched past 5 years.
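A rough sketch of the arithmetic above, taking the 70:20:10 cost split and the 50% hardware / 20% power reductions as given; these are the comment's illustrative ratios, not measured figures:

    # Rough sketch of the cost arithmetic in the comment above.
    # The split (70:20:10) and the discounts (50% hardware, 20% power) are
    # the commenter's illustrative figures, not measured data.

    def total_cost(hardware: float, power: float, other: float) -> float:
        """Total cost from its three components (arbitrary units)."""
        return hardware + power + other

    # Today: hardware-heavy split driven by scarcity-priced GPUs (70:20:10).
    today = total_cost(hardware=70, power=20, other=10)

    # Hypothetical post-bubble build: hardware 50% cheaper, power/opex 20% lower.
    later = total_cost(hardware=70 * 0.5, power=20 * 0.8, other=10)

    print(f"today: {today}, later: {later}, saving: {1 - later / today:.0%}")
    # -> today: 100, later: 61.0, saving: 39%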

lo_zamoyski today at 6:47 PM

> Those GPUs will be obsolete in 5 years, but whether the newer ones will be enough better to be worth replacing them

Then they won't be obsolete.