I still find it funny that my pretty new Nvidia card takes several seconds to switch between resolutions while my 35-year-old Atari could switch resolution several times per screen refresh. I made a simple game that showed full-color low resolution on the top three quarters of the screen, while the bottom quarter had 4 colors and double the horizontal resolution to fit more text. So the resolution was changed 120 times per second without slowing down the CPU much at all.
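For anyone curious, the trick is basically two one-byte writes per frame, timed by interrupts. Here is a minimal C sketch of the idea, assuming an Atari ST-style split (0xFFFF8260 is the ST shifter's documented resolution register; the split line, handler names and the omitted interrupt installation are simplified for illustration, not my original code):

    /* Mid-frame resolution switch, sketched in C.
     * A raster-timed interrupt rewrites the shifter's resolution
     * register part-way down the picture, and the vertical-blank
     * handler puts it back before the next frame starts.
     */

    #include <stdint.h>

    /* ST shifter resolution register:
     * 0 = low (320x200, 16 colours), 1 = medium (640x200, 4 colours),
     * 2 = high (640x400, mono). */
    #define SHIFTER_RES ((volatile uint8_t *)0xFFFF8260UL)

    /* On the ST, MFP Timer B in event-count mode ticks once per
     * displayed scanline, so loading it with 150 makes it fire three
     * quarters of the way down a 200-line frame. */
    #define SPLIT_LINE 150

    /* Timer B interrupt: we have reached SPLIT_LINE, switch the rest
     * of the frame to medium res (double horizontal resolution,
     * 4 colours) for the text area. */
    void timer_b_handler(void)
    {
        *SHIFTER_RES = 1;
    }

    /* Vertical blank: restore low res so the top three quarters of
     * the next frame are drawn in 16-colour low res again. Together
     * the two writes happen 120 times a second at a 60 Hz refresh,
     * and each one is a single byte store, so the CPU barely
     * notices. */
    void vbl_handler(void)
    {
        *SHIFTER_RES = 0;
    }

The real work is installing those handlers on the HBL/Timer B and VBL vectors, which I have left out here; the point is just that a "resolution change" on that hardware is nothing more than poking one register at the right moment.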
Well, the Nvidia card would have no trouble doing that at the refresh rate, resolution and colour depth the Atari was using, if that were the target. Ataris were great for their time, no doubt, but to optimise for very high frame rates, colour depths and resolutions, a modern graphics API does some setup that takes a bit of time, so that the common case works very smoothly after that.