> low latency connectivity
That's not exactly easy. I doubt on-device training will become much of a thing, but on-device inference is desirable in all sorts of distributed use cases. We're still a long way off from reliable internet everywhere, especially once you start pushing large quantities of sensor data down the pipe.
I can't even get reliable internet on my phone in the centre of London.