I think having any kind of "scaling" preference focuses too much on the technical aspect. It could be narrowed down to one setting like "zoom level" or just "size." This would mean that all UI elements change size in exact proportion to one another. Ideally, rendering should happen at the exact resolution of the display, and scaling, as in resizing a bitmap with bilinear interpolation or whatever, shouldn't need to be part of the pipeline except for outdated legacy programs.
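To illustrate what I mean (just a toy sketch, not any real toolkit's API): the single user-facing setting becomes one scale factor applied when logical layout units are converted to device pixels, so everything is rendered once, at native resolution.

    /* Toy sketch, not from any real toolkit: one "size" setting becomes a
     * single scale factor applied during logical-to-device conversion. */
    typedef struct { double x, y, w, h; } LogicalRect;
    typedef struct { int x, y, w, h; } DeviceRect;

    static int to_px(double v, double scale) { return (int)(v * scale + 0.5); }

    DeviceRect to_device(LogicalRect r, double size /* e.g. 1.25 for "125%" */)
    {
        DeviceRect d = { to_px(r.x, size), to_px(r.y, size),
                         to_px(r.w, size), to_px(r.h, size) };
        return d;
    }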
In the past, the problem with UI toolkits doing proportional sizing was that they used bitmaps for UI elements. Since newer versions of Qt and GTK 4 render programmatically, they can do it the right way. Windows mostly does this too, even with Win32, as long as you're using the newer themes. macOS is the only one that has assets prerendered at integer factors everywhere and needs to perform framebuffer scaling to change sizes. But Apple doesn't care, because they don't want you using third-party monitors anyway.
Edit: I'm not sure about Apple's new theme. Maybe this is their transition point away from fixed asset sizes.
Using vector pipelines isn't new, of course: Windows has been doing DPI-independent rendering almost from the beginning with GDI. The actual issue with GDI's scaling is all about text: for something to be "scalable," it has to maintain its proportions when the scale factor changes, and that was not the case for text in Win32/GDI, due to pixel grid fitting. Because of this, it was common in the Windows XP era to see ill-sized text when changing the DPI to anything other than 96, resulting in things being cut off and generally broken. Also, although the rendering itself was DPI-independent and scalable, that doesn't mean applications handled scalable rendering properly themselves, when they did things like deal with pixels directly or what have you. If you did this again today, you could almost certainly account for that and make an API much harder to misuse. HTML applications really have to try to not be resolution-independent, for example.
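For a rough idea of what "accounting for it" looks like on modern Windows (a hedged sketch: GetDpiForWindow and WM_DPICHANGED are real Win32 APIs on Windows 10+, while the relayout helper here is just a placeholder I made up):

    /* Sketch of per-monitor DPI handling in Win32. The system tells the app
     * its window's DPI and suggests a new window rectangle when it changes. */
    #include <windows.h>

    static void relayout(HWND hwnd, UINT dpi)
    {
        /* Scale 96-dpi design values to the window's actual DPI. */
        int margin = MulDiv(8, dpi, 96);
        (void)hwnd; (void)margin; /* ...reposition child controls here... */
    }

    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
    {
        switch (msg) {
        case WM_CREATE:
            relayout(hwnd, GetDpiForWindow(hwnd));
            return 0;
        case WM_DPICHANGED: {
            /* lParam carries the suggested window rect for the new DPI. */
            RECT *suggested = (RECT *)lp;
            SetWindowPos(hwnd, NULL, suggested->left, suggested->top,
                         suggested->right - suggested->left,
                         suggested->bottom - suggested->top,
                         SWP_NOZORDER | SWP_NOACTIVATE);
            relayout(hwnd, HIWORD(wp)); /* new DPI is packed into wParam */
            return 0;
        }
        }
        return DefWindowProc(hwnd, msg, wp, lp);
    }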
In practice, Windows and macOS both fall back to bitmap scaling when necessary: macOS scales the whole framebuffer, while Windows scales windows individually.
Can you do an entire windowing pipeline where it's vectors all the way until the actual compositing? Well, sure! We were kind of close in the pre-compositing era sometimes. Is it worth it to do so? I don't think so, for now. Most desktop displays are made up of standard-ish pixels, so buffers full of pixels make a very good primitive. Making the surfaces themselves out of pixels seems like a fine approach, and the scaling problem is relatively easy to solve if you start with a clean slate. The fact that an all-vector pipeline handles the "window splitting across outputs" case slightly better is not a particularly strong draw; I don't believe most users actually want to use windows split across outputs, it's just better UX if things at least appear correct. Same thing for legacy apps, really: if you run an old app that doesn't support scaling, it's still better for it to work and appear blurry than to be tiny and unusable.
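To be concrete about the legacy-app case (a toy compositor-side sketch, not actual Wayland or Windows code): a buffer rendered at 1x just gets stretched to the right physical size, blurry but usable, instead of being shown tiny.

    /* Toy sketch: decide how big to present a client buffer on an output.
     * A scale-aware client rendered at the output's scale; a legacy client
     * rendered at 1x, so the compositor upsamples its buffer to cover the
     * same physical area (blurry, but correctly sized). */
    typedef struct { int width, height; int buffer_scale; } Surface;

    void presented_size(const Surface *s, int output_scale,
                        int *out_w, int *out_h)
    {
        *out_w = s->width  * output_scale / s->buffer_scale;
        *out_h = s->height * output_scale / s->buffer_scale;
        /* If buffer_scale < output_scale, fill that area with a scaled
         * (e.g. bilinear) copy of the buffer. */
    }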
What to make of this? Well, the desktop platform hasn't moved fast; the last ten years of progress amount to little more than surface changes. So barring an unforeseen shift, I think we can expect, with relatively minor concessions, that the desktops we use 10 to 20 years from now won't be that different from what we have today; what we have today isn't even that different from what we already had 20 years ago as it is. And you can see that in people's attitudes: why fix what isn't broken? That's the sentiment of people who believe in an X11 future.

Of course, in practice there's nothing particularly wrong with trying to keep bashing X11 into modernity; with much pain, they definitely managed to take X.org and make it shockingly good. Ironically, if some of the same people working on Wayland today had put less work into keeping X.org working well, the case for Wayland would be much stronger by now. Still, I really feel like roughly nobody actually wants to sit there and try to wedge HDR or DPI virtualization into X11, and retooling X11 without regard for backwards compatibility is somewhat silly: if you're going to break old apps, you may as well just start fresh.

Wayland has always had plenty of problems, yet I've always bet on it as the most likely option, simply because it makes the most sense to me and I don't see any showstoppers that look insurmountable. Lo and behold, the issues remaining for Wayland adoption seem to have become more and more minor. KDE maintains a nice list of the more serious remaining drawbacks. It used to be a whole hell of a lot larger!
https://community.kde.org/Plasma/Wayland_Known_Significant_I...
> Windows mostly does this, too, even with win32 as long as you're using the newer themes.
Win32 controls have always been DPI-independent, as far back as Windows 95. There has been DPI-choice UX as part of the "advanced" display settings all along.
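For reference, the classic pattern was to query the system DPI and scale your own pixel values against the 96-dpi baseline (a minimal sketch; GetDeviceCaps with LOGPIXELSY and MulDiv go back to those early versions, the layout value here is made up):

    /* Minimal sketch of classic Win32/GDI DPI handling. */
    #include <windows.h>

    int scaled_height_px(HWND hwnd, int height_at_96dpi)
    {
        HDC hdc = GetDC(hwnd);
        int dpi_y = GetDeviceCaps(hdc, LOGPIXELSY);  /* system DPI, e.g. 96 or 120 */
        ReleaseDC(hwnd, hdc);
        return MulDiv(height_at_96dpi, dpi_y, 96);   /* scale a 96-dpi design value */
    }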