Software design is not really my wheelhouse, so I can't comment meaningfully on that, but on the networking side I can very confidently say it was a poor architecture. You simply cannot assume that all of your clients are going to be 1) non-malicious and 2) going to work exactly as you think they will.
Link saturation is one of the first things that comes to mind in this situation, and at these speeds QoS would be trivial to implement even on cheap consumer hardware.
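For context, the core of that kind of QoS shaping is usually just a token bucket: refill credit at the target rate, let traffic through while credit remains, and drop or queue it otherwise. Here's a minimal, purely illustrative Python sketch (class and parameter names are my own invention, not any particular router's implementation):

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter -- the mechanism behind simple QoS shaping."""

    def __init__(self, rate_bytes_per_sec, burst_bytes):
        self.rate = rate_bytes_per_sec   # sustained rate
        self.capacity = burst_bytes      # maximum short-term burst
        self.tokens = burst_bytes        # start with a full bucket
        self.last = time.monotonic()

    def allow(self, packet_bytes):
        now = time.monotonic()
        # Refill tokens for the elapsed time, capped at the burst capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True   # forward the packet
        return False      # over budget: drop or queue it

# e.g. shape one client to ~1 MB/s with a 64 KB burst allowance
bucket = TokenBucket(rate_bytes_per_sec=1_000_000, burst_bytes=64_000)
print(bucket.allow(1500))  # a single MTU-sized packet fits in the burst
```

This is the same idea behind Linux's `tbf` qdisc; at consumer-broadband speeds the bookkeeping is so cheap that even low-end hardware can do it per-client.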
Well, on the software design side, there are plenty of scenarios where undocumented behavior crops up after an unexpected network interruption. In the example above, Windows can even pre-download updates on metered connections during one time period, then install those updates during another. The customers really can't take the blame for that, IMO.
I think software quality across society has deteriorated rapidly, and it is mostly because software design has been devalued. No one expects quality from software, everyone "understands there are bugs", and some like to take advantage of that. And so the Overton window gets pushed toward "broken forever, good luck holding the bag if you use it" rather than the more realistic "occasionally needs a restart, if and only if you hit an issue, and it takes less than 10 seconds with minimal data loss".