Data centers are overrated; local AI is what's necessary for humanoid (and other) robots, which will be the most economically impactful use case.
You definitely still need data centers to train the models that you’ll run locally. Also if we achieve AGI you can bet it won’t be available to run locally at first.
Isn't it better to control robots from the data center? You can get ~30 ms round trips to most urban centers, which is low enough latency for most tasks. You also get lighter, cheaper robots with better battery life, and higher utilization of the compute (e.g. the GPU isn't sitting there doing nothing while the user is sleeping), which means a lower cost to the consumer for the same end result.
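A rough back-of-the-envelope on the utilization point (every number here is an illustrative assumption, not a measurement): if one robot only needs GPU time a few hours a day, a shared data-center GPU amortizes its cost over several robots instead of sitting idle onboard.

    # Back-of-the-envelope GPU cost per useful hour (all numbers are assumptions).
    gpu_capex_usd = 2000.0        # assumed hardware cost, same GPU either way
    lifetime_days = 4 * 365       # assumed useful life of the hardware
    active_hours_per_day = 6.0    # assumed hours/day one robot actually needs compute

    # Onboard GPU: one per robot, idle the rest of the day.
    onboard = gpu_capex_usd / (lifetime_days * active_hours_per_day)

    # Data-center GPU: time-sliced across robots with non-overlapping active hours.
    robots_per_gpu = 3
    shared = gpu_capex_usd / (lifetime_days * active_hours_per_day * robots_per_gpu)

    print(f"onboard: ${onboard:.3f}/active hour, shared: ${shared:.3f}/active hour")
    # With these assumptions the shared GPU is robots_per_gpu times cheaper per
    # useful hour, before counting networking, cooling, and data-center overhead.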
For self-driving you need edge compute because a few milliseconds of latency is a safety risk, but for many applications I don't see why you'd want that.
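To put numbers on that, here's a tiny sketch of how far things move during one ~30 ms round trip (the speeds are illustrative assumptions):

    # How far does the controlled system move during one network round trip?
    round_trip_s = 0.030                     # ~30 ms round trip, as above

    speeds_m_per_s = {
        "robot arm reaching": 1.0,           # assumed typical manipulation speed
        "humanoid walking": 1.5,             # assumed walking speed
        "car at highway speed": 30.0,        # ~108 km/h
    }

    for task, v in speeds_m_per_s.items():
        drift_cm = v * round_trip_s * 100
        print(f"{task}: ~{drift_cm:.0f} cm of motion per 30 ms round trip")
    # A few cm for a slow arm is usually tolerable; ~90 cm for a car is a safety
    # problem, which is why driving needs a local control loop.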
You probably still need to train the initial models in data centers, with the local hardware mostly being used to run the trained models. At most, we'd augment trained models with local data storage on the device.
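A minimal sketch of what "augment trained models with local data" could look like, assuming a frozen backbone shipped from the data center and a small adapter fit on locally stored examples (the backbone and data below are random stand-ins so the snippet runs self-contained, not a real model):

    import numpy as np

    # Stand-in for a backbone trained in the data center and shipped to the robot.
    rng = np.random.default_rng(0)
    W_backbone = rng.normal(size=(64, 16))

    def backbone_features(obs):
        """Frozen features from the pre-trained model (never updated locally)."""
        return np.tanh(obs @ W_backbone)

    # Locally stored data: observations and correction labels collected on-device.
    local_obs = rng.normal(size=(200, 64))
    local_labels = rng.normal(size=(200, 4))

    # Fit only a small linear adapter head on the local data (ridge regression),
    # leaving the backbone untouched.
    X = backbone_features(local_obs)
    lam = 1e-2
    W_adapter = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ local_labels)

    def local_policy(obs):
        return backbone_features(obs) @ W_adapter

    print(local_policy(rng.normal(size=(1, 64))).shape)   # (1, 4)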
If compute continues to become cheaper, local training might be feasible in 20 years.
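For a rough sense of scale on that timeline (the halving period is an assumption, not a forecast):

    # If cost per unit of training compute halves every `halving_years`,
    # how much cheaper is the same training run after 20 years?
    halving_years = 2.5          # assumed rate of cost decline
    years = 20
    factor = 2 ** (years / halving_years)
    print(f"~{factor:.0f}x cheaper")   # ~256x with these assumptions
    # Whether that's enough depends on how large a model you want to train locally.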