I'm not saying data center buildouts can't overshoot demand, but AI compute is different from fiber buildout. The more compute you have, the smarter the AI. You can use the compute to let the AI think longer on a solution (maybe hours, days, or weeks). You can run multiple AI agents simultaneously and have them work together or check each other's work. You can train better models, and run inference on them, with more compute.
So there is always a use for more compute to solve problems.
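One concrete version of the "agents checking each other's work" idea is self-consistency voting. A minimal sketch, with a simulated model standing in for real LLM calls (the 60% accuracy figure and the candidate answers are made up for illustration):

```python
import random
from collections import Counter

def ask_agent(question: str) -> str:
    # Hypothetical stand-in for one LLM call: a model that answers
    # correctly 60% of the time and guesses otherwise.
    return "42" if random.random() < 0.6 else random.choice(["41", "43", "44"])

def majority_vote(question: str, n_agents: int) -> str:
    # Spend extra compute: sample n_agents independent answers
    # and keep the most common one (self-consistency voting).
    answers = [ask_agent(question) for _ in range(n_agents)]
    return Counter(answers).most_common(1)[0][0]

# More agents -> the vote is right far more often than a single call.
trials = 1000
for n in (1, 5, 25):
    wins = sum(majority_vote("q", n) == "42" for _ in range(trials))
    print(f"{n:>2} agents: {wins / trials:.0%} correct")
```

The point isn't the specific numbers; it's that accuracy keeps climbing as you throw more parallel samples at the same question, which is one way extra compute stays useful.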
Fiber installations can overshoot relatively easily. No matter how much fiber you install, that 4K movie isn't going to change, and consumers' roughly three hours of daily watch time isn't going to change either.
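Back-of-envelope on that ceiling, with assumed figures (a 25 Mbps 4K stream, three hours of viewing, two concurrent streams at peak):

```python
# Rough ceiling on household video demand (all figures are assumptions).
stream_mbps = 25          # typical 4K stream bitrate
hours_per_day = 3         # daily watch time
streams_at_once = 2       # two concurrent viewers at peak

peak_demand_mbps = stream_mbps * streams_at_once
daily_gb = stream_mbps / 8 * 3600 * hours_per_day * streams_at_once / 1000

print(f"peak demand: {peak_demand_mbps} Mbps")   # 50 Mbps
print(f"daily transfer: {daily_gb:.1f} GB")      # ~67.5 GB
# A gigabit fiber line is ~20x the peak need; the rest sits idle.
```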
You can't really use more compute, because power is already the bottleneck. Datacenter buildouts are now being measured in GW, which tells you everything you need to know. Newer hardware will be a lot more power-efficient, but it will also be highly scarce for exactly that reason.
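A rough sense of why GW is the telling unit: divide site power by per-chip power. The wattage and overhead figures below are assumptions, not vendor numbers:

```python
# How many accelerators can a 1 GW datacenter feed? (assumed figures)
site_power_w = 1e9        # 1 GW campus
chip_power_w = 1_000      # ~1 kW per accelerator, board + memory
pue = 1.3                 # overhead: cooling, power conversion losses

chips = site_power_w / (chip_power_w * pue)
print(f"~{chips:,.0f} accelerators")  # ~769,231
# Power, not silicon, sets the hard cap on deployable compute.
```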
Did you pay attention in your computer science classes? There are problems you can't simply brute-force: you can throw all the computing power you want at them, and the computation still won't terminate before the heat death of the universe. An LLM can only output a convolution of its training data. That's its plateau. It can't solve new problems; it can only reproduce a solution that already exists. Compute can make it faster to narrow in on that existing solution, but it can't make the LLM smarter.
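To put the brute-force point in numbers: exhaustively searching a 128-bit keyspace, even at a generous 10^18 guesses per second, versus the age of the universe:

```python
keyspace = 2 ** 128            # candidate keys to try
ops_per_second = 1e18          # an exaflop of guesses, generously
seconds_per_year = 3.15e7

years = keyspace / ops_per_second / seconds_per_year
print(f"search time: {years:.1e} years")   # ~1.1e13 years
print("universe age: ~1.4e10 years")       # search takes ~800x longer
# No realistic amount of compute closes an exponential gap.
```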