I question the depreciation assumption. Those GPUs will be obsolete in 5 years, but whether the newer ones will be enough better to be worth replacing them is an open question. CPUs stopped getting exponentially faster 20 years ago (they're still faster, but not the jumps we saw in the 1990s).
I recently compared performance per dollar on benchmarks for CPUs and GPUs, today vs. 10 years ago, and surprisingly, CPUs had much bigger gains. Until I saw that for myself, I thought exactly the same thing as you.
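For anyone curious what that comparison looks like concretely, here's a minimal sketch of the arithmetic. The benchmark scores and prices below are made-up placeholders, not the actual figures I used; they only show how the gain ratio is computed.

```python
# Rough sketch of the comparison, with made-up illustrative numbers
# (not real benchmark data): performance-per-dollar gain over ~10 years.

def perf_per_dollar_gain(old_score, old_price, new_score, new_price):
    """Ratio of today's performance-per-dollar to the old part's."""
    return (new_score / new_price) / (old_score / old_price)

# Hypothetical placeholder figures purely to show the calculation:
cpu_gain = perf_per_dollar_gain(old_score=10_000, old_price=300,
                                new_score=120_000, new_price=400)
gpu_gain = perf_per_dollar_gain(old_score=5_000, old_price=550,
                                new_score=30_000, new_price=1_600)

print(f"CPU perf/$ gain: {cpu_gain:.1f}x")   # ~9.0x with these numbers
print(f"GPU perf/$ gain: {gpu_gain:.1f}x")   # ~2.1x with these numbers
```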
It seems shocking given that all the hype is around GPUs.
This probably wouldn't be true for AI-specific workloads, because one of the other things that happened there in the last 10 years was optimising specifically for math with lower-precision floats.
I think the real issue is current costs/demand: Nvidia gouging on GPU prices means the hardware-to-power cost split is roughly 70:20 instead of 50:40 (with 10 for the rest of the datacenter). The reality is that GPUs are a serendipitous, path-dependent lock-in from gaming -> mining. TPUs are more power efficient. If the bubble pops and demand for compute goes down, Nvidia + TSMC will still be around, but the next-gen, AI-first bespoke hardware premium will revert toward the mean, and we're looking at ~50% less expensive hardware (no AI-race scarcity tax, i.e. ~75% Nvidia margins) that uses ~20% less power/opex. All of a sudden, existing data centers become unprofitable stranded assets even if they can be stretched past 5 years.
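To make the stranded-asset claim concrete, here's a back-of-the-envelope sketch. The 70/20/10 cost split and the 50% hardware / 20% power reductions are the rough assumptions stated above, not measured figures.

```python
# Back-of-the-envelope sketch of the cost argument above.
# The 70/20/10 split and the 50% / 20% reductions are the rough
# assumptions from the comment, not measured data.

current = {"hardware": 70, "power": 20, "rest_of_dc": 10}

post_bubble = {
    "hardware": current["hardware"] * 0.5,   # scarcity premium gone
    "power": current["power"] * 0.8,         # more efficient bespoke silicon
    "rest_of_dc": current["rest_of_dc"],     # unchanged
}

old_total = sum(current.values())        # 100
new_total = sum(post_bubble.values())    # 35 + 16 + 10 = 61

print(f"Relative cost of a new build: {new_total / old_total:.0%}")  # ~61%
# A competitor delivering the same compute at ~61% of your cost is what
# turns the existing GPU fleet into a stranded asset.
```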
> Those GPUs will be obsolete in 5 years, but whether the newer ones will be enough better to be worth replacing them
Then they won't be obsolete.
> Those GPUs will be obsolete in 5 years, but whether the newer ones will be enough better to be worth replacing them is an open question
Doesn't one follow from the other? If newer GPUs aren't worth an upgrade, then surely the old ones aren't obsolete by definition.