This makes no sense. It takes a complex industrial society to keep that tech going. The supply chain to make GPUs would not survive even a modest disruption in the world economy. It's probably the most fragile thing we currently manufacture.
This is also why I'm skeptical of claims that it would be impossible (or nearly so) for governments to meaningfully regulate AI R&D/deployment (regardless of whether or not they should). The "you can't regulate math" arguments. Yeah, you can't regulate math, but using the math depends on some of the most complex technologies humanity has produced, with key components handled by only one or a few companies in only a handful of countries (US, China, Taiwan, South Korea, Netherlands, maybe Japan?). US-China cooperation could probably achieve any level of regulation they want up to and including "shut it all down now." Likely? Of course not. But also not impossible if the US and China both felt sufficiently threatened by AI.
The only thing that IMO would be really hard to regulate is the distribution of open-weight models that already exist when regulations come into effect, though I imagine even that would be substantially curtailed by severe enough penalties for distributing them.
This is the best argument I’ve heard against it, so thanks.
My anxiety orbits entirely around the scale of AI compute we've reached and the widespread sentiment that there's drastic room for improvement, the rapidly advancing state of the art in robotics, and the massive potential for disruption of the middle and lower classes' stake in society. Not to mention the general sentiment that the economy matters more than people's well-being in 99.9% of scenarios.
Who's to say it has to keep moving forward? Companies are buying up massive numbers of GPUs in this AI race, a move that's widely questioned because next year's GPUs might render the current ones obsolete[0], so there would probably be plenty of GPUs to go around if the CEO demanded it (prior to collapse). Operating datacenters in a collapsed society would probably be out of the question: the power grid might be unreliable, global networks might be down, and securing many datacenters would be difficult. But there's at least one public record of a billionaire building his own underground bunker with off-grid power generation and enough room for his own little datacenter inside[1]. "Ordinary" people will acquire 32GB GPUs or Mac Studios for local open-source LLM inference, so it seems likely billionaires would just take the next step up for their bunkers and run their company's proprietary weights on decommissioned compute clusters.
[0] https://www.cnbc.com/2025/11/14/ai-gpu-depreciation-coreweav... [1] https://www.businessinsider.com/mark-zuckerberg-hawaii-under...
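The "32GB GPU vs. Mac Studio" distinction comes down to simple arithmetic on model size. A rough sketch (the 20% overhead factor for KV cache and activations is my own assumption, not a measured figure):

```python
# Back-of-envelope memory estimate for local LLM inference.
# Assumes quantized weights dominate memory use; the overhead factor
# for KV cache and activations is a rough assumption.

def inference_memory_gb(params_billion: float, bits_per_weight: int,
                        overhead: float = 0.2) -> float:
    """Approximate GB needed to hold a model's weights plus runtime overhead."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9

# A 70B-parameter model at 4-bit quantization:
print(round(inference_memory_gb(70, 4), 1))  # -> 42.0
```

At ~42 GB, a 4-bit 70B model overflows a 32GB GPU but fits comfortably in a Mac Studio's unified memory, which is why both show up as the entry points for local inference; a datacenter-scale model in full precision is another order of magnitude again, hence the "next step up" of decommissioned clusters.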
If you're an AI company and you believe your own hype (as Musk seems to), you'll probably believe you can automate everything from digging minerals out of the ground all the way up to making the semiconductors in the robots that dig the minerals.
As you may infer from my use of the word "hype", I do not think we are close to such generality at a high enough quality level to actually do this.