I just hope that when (if) the hype is over, we can repurpose the capacity for something useful (e.g. drug discovery).
I just wish we forced every new data center to be built with renewables or something. The marginal cost over a conventional data center can’t be that big compared to the total cost, and these companies can afford it. Maybe it can help advance the next generation of small modular nuclear reactors or something.
I don't see how you can make the argument that a large portion of the funds used for AI capex was diverted from other investments (starving other industries), while simultaneously applying the economic multiplier to the whole sum when going from the investments to the GDP impact.
Surely you only get one of the two, because for diverted investments the multiplier applies equally on both sides of the equation.
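A toy calculation of what I mean, with entirely made-up numbers and assuming a single uniform multiplier: if part of the capex is diverted rather than new spending, the multiplier applies to the foregone investment too, so it cancels on that portion.

```python
# Toy illustration with made-up numbers: if AI capex is diverted from other
# investment, a uniform multiplier applies to both the gain and the loss.
multiplier = 1.5          # assumed economic multiplier (hypothetical)
ai_capex = 400e9          # assumed annual AI capex in USD (hypothetical)
diverted_share = 0.6      # assumed fraction diverted from other investment

new_spending = ai_capex * (1 - diverted_share)
diverted = ai_capex * diverted_share

# Gross impact if you (incorrectly) apply the multiplier to the whole sum:
gross_impact = multiplier * ai_capex
# Net impact once the foregone investment is subtracted:
net_impact = multiplier * ai_capex - multiplier * diverted

print(f"gross: ${gross_impact / 1e9:.0f}B")
print(f"net:   ${net_impact / 1e9:.0f}B (same as multiplier * new spending only)")
```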
The main argument builds on the assumption that the economy is a zero-sum game, when it clearly is not. Just because we invest these resources in AI does not mean we could mobilize the same capital for other pursuits.
AI is being built out today precisely because the value returned is expected to be massive. I would argue this value will be far bigger than anything the railroads ever delivered.
Overspending will happen, for sure, in certain geographies or for specialty hardware, maybe even capacity will outpace demand for a while, but I don’t think the author makes a good case that we are there yet.
That is why I stated that transistor improvements, what was previously known as Moore's law, will continue for at least another 10 years. The smartphone carried us from 2008 to 2023. The money being spent today is already invested in the next 2-3 years of semiconductor manufacturing: that is 2nm or A20 this year, and A18 / 14 in two years' time. There is enough momentum towards A10 and A8 by 2030-2032. Even if things slow down by then, it is enough to run until 2035 unless something catastrophic like WW3 or a market collapse happens.
That said, even if we somehow reach A5 in 2035, we are only at about a 12x density increase. Including system packaging, chiplet, and interconnect advancements might push this to 30-40x. That is still a far cry from the 1000x to 10000x compute demand projected by a lot of AI companies. And that is assuming memory bandwidth could scale with it.
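A back-of-the-envelope sketch of that gap, using only the rough numbers above (all of them assumptions, not a forecast):

```python
# Rough back-of-envelope using the numbers above (assumptions, not a forecast).
density_gain_2035 = 12            # assumed transistor density increase by ~2035
system_gain = 40                  # assumed gain with packaging/chiplets/interconnect

demand_low, demand_high = 1_000, 10_000   # claimed compute demand growth from AI companies

gap_low = demand_low / system_gain
gap_high = demand_high / system_gain
print(f"density alone: ~{density_gain_2035}x, system level: ~{system_gain}x")
print(f"remaining gap: {gap_low:.0f}x to {gap_high:.0f}x, which would have to come "
      "from more silicon, more power, or better algorithms")
```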
The counterintuitive part of automation is that it removes parts of the economy rather than making the economy bigger. You end up with more goods but the value people assign to them goes down as they don't provide additional social advantage.
For example, at one point nails were 0.5% of the economy; today owning a nail factory is a low-margin business with no social status.
Similarly, frontend software development will get automated, and the share of the economy and the social status associated with it will shrink.
Since social status is a zero-sum game, people will shift spending to other areas where status can still be gained.
I'm waiting for the shoe to drop when someone comes out with an FPGA optimized for reconfigurable computing and lowers the cost of LLM compute by 90% or more.
Fails to talk about the opportunity cost.
Can you annualize Nvidia's Q1 results simply by multiplying them by 4?
What, if anything, would it take to actually change the market's perception and make it see that expectations may not be met in a significant way?
Now, is this AI CapEx or data and IT CapEx? Because everyone and their mother is labeling regular data centers as AI data centers.
Apparently the telecom boom was in 2020? What am I missing?
Using sus statistics to draw weird conclusions.
The premise of AI, and certainly what a large subset of executives and investors believe, is that AI will provide a significant productivity increase to a significant part of the workforce.
If 30% of the work is done 10% faster, that leaves roughly a 3% gain for other economic activities. If that is true, the CapEx is justified.
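To spell out that arithmetic (using the numbers above plus an assumed labor share of GDP; this is a toy check, not a real model):

```python
# Toy check of the claim above. The labor share is an assumption I'm adding,
# not something from the article.
share_of_work_affected = 0.30   # 30% of work touched by AI
speedup = 0.10                  # that work is done 10% faster
freed_labor = share_of_work_affected * speedup   # ~3% of labor time freed

ai_capex_share_of_gdp = 0.012   # ~1.2% of GDP, the figure discussed in the article
labor_share_of_gdp = 0.6        # assumed labor share of GDP (hypothetical)

gdp_gain = freed_labor * labor_share_of_gdp
print(f"freed labor time: {freed_labor:.1%}")
print(f"rough GDP uplift: {gdp_gain:.1%} per year vs capex of {ai_capex_share_of_gdp:.1%}")
```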
> the scale and pace of capital deployment into a rapidly depreciating technology is remarkable
That’s an interesting perspective. It does feel a bit like we’re setting money on fire.
And they're not even making paperclips, as in https://www.google.com/search?q=ai+paperclipping
The 1880s figure of 6% on railroads is an interesting number; I didn't know it was that much.
I hear AI data centers are consuming more power than the entire country of Argentina /s
But I don't hear anyone worried about the massive power consumption, even though there's no clear indication of whether this is a net positive for our society.
I don't know... 1.2% of GDP just doesn't seem that extreme to me. Certainly nowhere near "eating the economy" level compared to other transformative technologies or programs like:
- Apollo program: 4%
- Railroads: 6% (mentioned by the author)
- Covid stimulus: 27%
- WW2 defense: 40%