Hacker News

IBM CEO says there is 'no way' spending on AI data centers will pay off

105 points by nabla9 today at 6:10 PM | 135 comments

Comments

pjdesno today at 8:50 PM

> $8 trillion of CapEx means you need roughly $800 billion of profit just to pay for the interest

That assumes you can just sit back and gather those returns indefinitely. But half of that capital expenditure will be spent on equipment that depreciates in 5 years, so you're jumping on a treadmill that sucks up $800B/yr before you pay a dime of interest.
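
Back-of-the-envelope, the treadmill plus the interest looks something like the sketch below. Every constant is an assumption pulled from this thread: $8T of total CapEx, the ~10% cost of capital implied by the "$800 billion of interest" quote, and half of the CapEx going to hardware written off over 5 years.

```python
# Rough hurdle math under the thread's assumptions (not reported figures).
CAPEX_TOTAL = 8e12          # $8 trillion total CapEx, from the quoted estimate
INTEREST_RATE = 0.10        # implied by "$800 billion of profit just to pay the interest"
HW_SHARE = 0.5              # assumed: half of CapEx goes to short-lived equipment
HW_LIFETIME_YEARS = 5       # depreciation horizon cited in the article

interest_per_year = CAPEX_TOTAL * INTEREST_RATE
depreciation_per_year = CAPEX_TOTAL * HW_SHARE / HW_LIFETIME_YEARS

print(f"Interest alone:     ${interest_per_year / 1e9:,.0f}B/yr")
print(f"Hardware treadmill: ${depreciation_per_year / 1e9:,.0f}B/yr")
print(f"Combined hurdle:    ${(interest_per_year + depreciation_per_year) / 1e9:,.0f}B/yr")
```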

scroot today at 6:51 PM

As an elder millennial, I just don't know what to say. That a once-in-a-generation allocation of capital should go towards... whatever this all will be, is certainly tragic given the current state of the world and its problems. Can't help but see it as the latest in a lifelong series of baffling, high-stakes decisions of dubious social benefit that necessarily have global consequences.

Octoth0rpe today at 6:47 PM

> Krishna also referenced the depreciation of the AI chips inside data centers as another factor: "You've got to use it all in five years because at that point, you've got to throw it away and refill it," he said

This doesn't seem correct to me, or at least is built on several shaky assumptions. You would only have to 'refill' your hardware if:

- AI accelerator cards all start dying around the 5 year mark, which is possible given the heat density/cooling needs, but doesn't seem all that likely.

- Technology advances such that only the absolute newest cards can be used to run _any_ model profitably, which only seems likely if we see some pretty radical advances in efficiency. Otherwise, assuming your hardware is stable after 5 years of burn-in, it seems like you could continue to run older models on it at only the cost of the floorspace/power. Maybe you need new cards for new models for some reason (maybe a new fp format that only new cards support? some magic amount of RAM? etc.), but it seems like there may be room for revenue via older/less capable models at a discounted rate (rough break-even sketch below).
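
For that last point, a rough break-even check on running older cards. Every number here is a hypothetical placeholder (assumed card power draw, PUE, electricity price, and a made-up discounted rate per GPU-hour); it only illustrates the shape of the argument, not real economics.

```python
# Hypothetical break-even check for keeping depreciated accelerators in service.
CARD_POWER_KW = 0.7            # assumed draw of an aging accelerator (~700 W)
PUE = 1.3                      # assumed datacenter overhead (cooling, losses)
POWER_PRICE_PER_KWH = 0.08     # assumed industrial electricity price, $/kWh
REVENUE_PER_GPU_HOUR = 0.50    # hypothetical discounted rate for older models

power_cost_per_hour = CARD_POWER_KW * PUE * POWER_PRICE_PER_KWH
margin_per_hour = REVENUE_PER_GPU_HOUR - power_cost_per_hour

print(f"Power cost: ${power_cost_per_hour:.3f}/GPU-hour")
print(f"Margin:     ${margin_per_hour:.3f}/GPU-hour")
# A positive margin only says the card covers its power bill; it says nothing
# about recovering the original purchase price.
```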

criddell today at 6:51 PM

> But AGI will require "more technologies than the current LLM path," Krishna said. He proposed fusing hard knowledge with LLMs as a possible future path.

And then what? These always read a little like the underpants gnomes business model (1. Collect underpants, 2. ???, 3. Profit). It seems to me that the AGI business models require one company to have exclusive access to an AGI model. The reality is that it will likely spread rapidly and broadly.

If AGI is everywhere, what's step 2? It seems like everything AGI generated will have a value of near zero.

myaccountonhn today at 6:46 PM

> In an October letter to the White House's Office of Science and Technology Policy, OpenAI CEO Sam Altman recommended that the US add 100 gigawatts in energy capacity every year.

> Krishna also referenced the depreciation of the AI chips inside data centers as another factor: "You've got to use it all in five years because at that point, you've got to throw it away and refill it," he said.

And people think the climate concerns of AI are overblown. The US currently has ~1,300 GW of energy capacity, so that's a huge increase each year.
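
A quick sketch of what that recommendation implies, using only the figures above (the ~1,300 GW baseline is this comment's estimate, not an official number):

```python
# Growth implied by adding 100 GW per year to ~1,300 GW of existing US capacity.
us_capacity_gw = 1300        # commenter's estimate of current US capacity
added_per_year_gw = 100      # Altman's recommended annual addition

for year in range(1, 6):
    total = us_capacity_gw + added_per_year_gw * year
    growth_pct = added_per_year_gw * year / us_capacity_gw * 100
    print(f"Year {year}: {total} GW total (+{growth_pct:.0f}% over today's capacity)")
```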

bluGill today at 6:42 PM

I question the depreciation. Those GPUs will be obsolete in 5 years, but whether the newer ones will be enough better to be worth replacing them is an open question. CPUs stopped getting exponentially faster 20 years ago (they are faster, but not the jumps the 1990s got).

skeeter2020 today at 6:57 PM

The interesting macro view on what's happening is to compare a mature data center operation (specifically a commoditized one) with the utility business. The margins here, and in similar industries with big infra build-out costs (ex: rail), are quite small. Historically the businesses have not done well; I can't really imagine what happens when tech companies who've only ever known huge, juicy margins experience low single-digit returns on billions of investment.

mbreese today at 7:22 PM

I would add an addendum to this -- there is no way the announced spending on AI data centers will all come to fruition. I have no doubt that there will be a massive build-out of infrastructure, but it can't reach the levels that have been announced. The power requirements alone will stop that from happening.

badmonster today at 7:15 PM

He's right to question the economics. The AI infrastructure buildout resembles the dot-com era's excess fiber deployment - valuable long-term, but many individual bets will fail spectacularly. Utilization rates and actual revenue models matter more than GPU count.

Ekaros today at 8:24 PM

How much of Nvidia's price is based on a 5-year replacement cycle? If that stops or slows along with new demand, could it also affect things? Not that 5 years seems like a very long horizon now.

ic_fly2 today at 6:52 PM

IBM might not have a data strategy or AI plan but he isn’t wrong on the inability to generate a profit.

A bit of napkin math: NVIDIA claims 0.4 J per token for their latest generation. A 1 GW plant with 80% utilisation can therefore produce about 6.29 × 10^16 tokens a year.

There are ~10^14 tokens on the internet. ~10^19 tokens have been spoken by humans… so far.
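
Redoing that napkin math explicitly, using only the figures in this comment (0.4 J/token, a 1 GW plant, 80% utilisation):

```python
# Reproduce the comment's napkin math: tokens per year from a 1 GW plant.
JOULES_PER_TOKEN = 0.4              # NVIDIA's claimed figure, per the comment
PLANT_POWER_W = 1e9                 # 1 GW
UTILISATION = 0.8                   # assumed 80% utilisation
SECONDS_PER_YEAR = 365.25 * 24 * 3600

energy_per_year_j = PLANT_POWER_W * UTILISATION * SECONDS_PER_YEAR
tokens_per_year = energy_per_year_j / JOULES_PER_TOKEN

print(f"Tokens per year: {tokens_per_year:.2e}")                      # ~6.3e16
print(f"Multiples of ~1e14 internet tokens: {tokens_per_year / 1e14:,.0f}x")
```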

kenjackson today at 6:36 PM

I don't understand the math behind how we compute $80b for a gigawatt datacenter. What are the costs in that $80b? I literally don't understand how to get to that number -- I'm not questioning its validity. What percent is power consumption, versus land cost, versus building and infrastructure, versus GPUs, versus people, etc.?

nashashmi today at 7:41 PM

Don't worry. The same servers will be used for other computing purposes. And maybe that will be profitable. Maybe it will be beneficial to others. But this cycle of investment and loss is a form of wealth redistribution. Some benefit.

The banks and lenders always benefit.

eitally today at 7:05 PM

At some point, I wonder if any of the big guys have considered becoming grid operators. The vision Google had for community fiber (Google Fiber, which mostly fizzled out due to regulatory hurdles) could be somewhat paralleled with the idea of operating a regional electrical grid.

Animats today at 7:21 PM

How much has actually been spent on AI data centers vs. amounts committed or talked about? That is, if construction slows down sharply, what's total spend?

maxglute today at 6:49 PM

How long can AI GPUs stretch? An optimistic 10 years and we're still looking at $400B+ of profit to cover interest. As a depreciating asset, silicon is closer to tulips than to rail or fiber.

bluGill today at 6:40 PM

This is likely correct overall, but it can still pay off in specific cases. However, those are not blind investments; they are targeted, with a planned business model.

wmf today at 6:38 PM

$8T may be too big of an estimate. Sure you can take OpenAI's $1.4T and multiply it by N but the other labs do not spend as much as OpenAI.

jmclnx today at 7:08 PM

I guess he is looking directly at IBM's cash cow, the mainframe business.

But I think he is correct; we will see. I still believe AI will not give the CEOs what they really want: no labor, or very cheap labor.

qwertyuiop_ today at 6:42 PM

The question no one seems to be answering is: what will be the EOL for these newer GPUs being churned out by NVIDIA? What % of annual capital expenditure is GPU refresh? Will they be perpetually replaced as NVIDIA comes up with newer architectures and the AI companies chase the proverbial lure?

parapatelsukh today at 6:46 PM

The spending will be more than paid off, since the taxpayer is the lender of last resort. There are too many funny names among the investors/creditors; a lot of mountains in Germany and similar, ya know.

devmor today at 7:05 PM

I suppose it depends on your definition of "pay off".

It will pay off for the people investing in it, when the US government inevitably bails them out. There is a reason Zuckerberg, Huang, etc are so keen on attending White House dinners.

It certainly won't pay off for the American public.

oxqbldpxo today at 7:29 PM

FB playbook. Act (spend) then say sorry.

verdverm today at 6:16 PM

The IBM CEO is steering a broken ship that hasn't improved course; he's not someone whose words you should take seriously.

1. They missed the AI wave (they hired me to teach Watson law, only to lay me off 5 weeks later; one cause of the serious talent issues over there)

2. They bought most of their data centers (as companies); they have no idea about building and operating one, not at the scale the "competitors" are operating at
