> I think Windows is the only other one which really does this properly
Windows is the only one that does this properly.
Windows handles high pixel density on a per-application, per-display basis, which is the most fine-grained approach. It's also easy to opt in with reasonably modern frameworks: just add the necessary key to the application manifest and you're done. [1]
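For anyone curious what that looks like in practice, here's a minimal sketch in raw Win32, assuming Windows 10 1703+. The manifest route [1] is still preferred because it takes effect before the first window exists; the call below is the programmatic equivalent, and `WM_DPICHANGED` is how the window reacts when it moves to a display with a different scale factor.

```c
// Sketch: opting a Win32 app into per-monitor-V2 DPI awareness in code
// and reacting to per-display DPI changes. Compile with: cl dpi.c user32.lib
#define WINVER 0x0A00
#define _WIN32_WINNT 0x0A00
#include <windows.h>

static LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp) {
    switch (msg) {
    case WM_DPICHANGED: {
        // Windows hands us a suggested window rectangle already sized for
        // the display the window just moved to.
        RECT *r = (RECT *)lp;
        SetWindowPos(hwnd, NULL, r->left, r->top,
                     r->right - r->left, r->bottom - r->top,
                     SWP_NOZORDER | SWP_NOACTIVATE);
        return 0;
    }
    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wp, lp);
}

int main(void) {
    // Programmatic equivalent of the PerMonitorV2 manifest key.
    SetProcessDpiAwarenessContext(DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2);

    WNDCLASS wc = {0};
    wc.lpfnWndProc = WndProc;
    wc.hInstance = GetModuleHandle(NULL);
    wc.lpszClassName = "dpi-demo";
    RegisterClass(&wc);

    HWND hwnd = CreateWindow("dpi-demo", "DPI demo", WS_OVERLAPPEDWINDOW,
                             CW_USEDEFAULT, CW_USEDEFAULT, 640, 480,
                             NULL, NULL, wc.hInstance, NULL);
    ShowWindow(hwnd, SW_SHOW);

    MSG m;
    while (GetMessage(&m, NULL, 0, 0) > 0) {
        TranslateMessage(&m);
        DispatchMessage(&m);
    }
    return 0;
}
```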
Linux + Xorg has a single global pixel density scale factor. KDE/Qt handle this OK; GNOME/GTK break when the DPI is not an integer multiple of 96 (i.e., the scale factor is not an integer) and fall back to raster scaling.
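A quick sketch of where that single global factor lives: toolkits read the Xft.dpi resource from the X resource database, and there is exactly one value no matter how many displays are attached, which is why mixed-DPI setups suffer under Xorg.

```c
// Sketch: reading the global Xft.dpi value the way toolkits do.
// Compile with: cc xftdpi.c -lX11
#include <X11/Xlib.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    // XGetDefault reads from the resource database (xrdb / ~/.Xresources).
    const char *v = XGetDefault(dpy, "Xft", "dpi");
    double dpi = v ? atof(v) : 96.0;   // 96 dpi == scale factor 1.0

    printf("Xft.dpi = %.1f (scale factor %.2f)\n", dpi, dpi / 96.0);
    XCloseDisplay(dpy);
    return 0;
}
```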
Linux + Wayland has per-display scale factors (see the sketch below), but Chromium, GNOME, and GTK break the same way as under Xorg. KDE/Qt fare a bit better, but I'm quite certain the taskbar icons are sharper on Xorg than on Wayland; I think this boils down to subpixel rendering not being enabled.
And of course, in theory every application on Linux can handle high pixel density, but getting the ideal result requires a zoo of environment variables and command-line arguments.
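To make the per-display part concrete, here's a minimal client sketch that listens for each output's scale factor; note that core Wayland only advertises an integer scale per wl_output, and fractional scaling rides on the newer fractional-scale-v1 protocol, which is exactly where toolkits diverge.

```c
// Sketch: receiving each display's integer scale via wl_output's "scale"
// event. A client renders its buffer at that scale and then calls
// wl_surface_set_buffer_scale(). Compile with: cc outputs.c -lwayland-client
#include <wayland-client.h>
#include <stdio.h>
#include <string.h>

static void on_geometry(void *d, struct wl_output *o, int32_t x, int32_t y,
                        int32_t pw, int32_t ph, int32_t subpixel,
                        const char *make, const char *model, int32_t t) {}
static void on_mode(void *d, struct wl_output *o, uint32_t flags,
                    int32_t w, int32_t h, int32_t refresh) {}
static void on_done(void *d, struct wl_output *o) {}

static void on_scale(void *d, struct wl_output *o, int32_t factor) {
    // The compositor's per-output integer scale hint.
    printf("output %p: scale %d\n", (void *)o, factor);
}

static const struct wl_output_listener output_listener = {
    .geometry = on_geometry, .mode = on_mode,
    .done = on_done, .scale = on_scale,
};

static void on_global(void *d, struct wl_registry *reg, uint32_t name,
                      const char *iface, uint32_t version) {
    if (strcmp(iface, "wl_output") == 0 && version >= 2) {
        struct wl_output *out =
            wl_registry_bind(reg, name, &wl_output_interface, 2);
        wl_output_add_listener(out, &output_listener, NULL);
    }
}
static void on_global_remove(void *d, struct wl_registry *r, uint32_t n) {}

static const struct wl_registry_listener registry_listener = {
    on_global, on_global_remove,
};

int main(void) {
    struct wl_display *dpy = wl_display_connect(NULL);
    if (!dpy) return 1;
    struct wl_registry *reg = wl_display_get_registry(dpy);
    wl_registry_add_listener(reg, &registry_listener, NULL);
    wl_display_roundtrip(dpy);   // fetch globals
    wl_display_roundtrip(dpy);   // fetch per-output events
    wl_display_disconnect(dpy);
    return 0;
}
```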
On macOS, if the pixel density of the target display is at least some Apple-blessed number they consider 'Retina', the 'Retina' scaled resolutions are enabled. For scaled resolutions where twice the logical size doesn't match the physical resolution, the framebuffer is rendered at four times the pixel count of the chosen logical resolution (twice in each dimension), and the final result is raster-scaled back down to the physical resolution with some sinc/Lanczos-like algorithm. This shows up as ringing artifacts, which are very obvious around high-contrast, thin features like text.
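To put hypothetical numbers on that arithmetic (the "looks like 1680x1050" scaled mode on a 2880x1800 panel is one common case):

```c
// Sketch of the scaled-mode arithmetic described above, with one
// illustrative case: "looks like 1680x1050" on a 2880x1800 panel.
#include <stdio.h>

int main(void) {
    int logical_w = 1680, logical_h = 1050;  // chosen "looks like" size
    int panel_w   = 2880, panel_h   = 1800;  // physical pixels

    // Render into a backing store at 2x the logical size in each dimension...
    int fb_w = logical_w * 2, fb_h = logical_h * 2;   // 3360x2100

    // ...then filter that framebuffer down to the panel resolution.
    double ratio = (double)panel_w / fb_w;            // ~0.857, non-integer

    printf("framebuffer %dx%d -> panel %dx%d (downscale %.3f)\n",
           fb_w, fb_h, panel_w, panel_h, ratio);
    return 0;
}
```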
At non-Retina pixel densities, there is no concept of a 'scaling factor' at all; you can choose another resolution, but it will be raster-scaled (usually up) with some bi/trilinear filtering, and the entire screen is blurry. The last time Windows used such brute-force rendering was Windows XP, 25 years ago.
[1]: https://learn.microsoft.com/en-gb/windows/win32/hidpi/settin...
> Windows is the only one that does this properly. Windows handles high pixel density on a per-application, per-display basis.
This is not our [0] experience. macOS handles things on a per-section-of-window, per-application, per-display basis. You can split a window across two monitors at two different DPIs, and it will display perfectly. This does not happen on Windows, or we have not found the right way to make it work thus far.
[0] ardour.org
> then the final result is raster-scaled with some sinc/Lanczos algorithm back down to the physical resolution. This shows up as ringing artifacts, which are very obvious with high-contrast, thin regions like text.
I don't think this is true. I use non-integer scaling on my Mac since I like the UI to be just a little bit bigger, and I have never observed ringing or any other specific artifacts around text, nor have I ever heard this complaint before. I assume it's just bilinear or bicubic unless you have evidence otherwise? The only complaint people tend to make is ever-so-slight additional blurriness, which barely matters at Retina resolutions.
> The last time Windows had such brute-force rendering was in Windows XP, 25 years ago.
To be fair, UXGA was a thing 20 years ago. I don't think it makes sense for Apple to care all that much about low-DPI monitors: they don't sell any, and they wouldn't be acceptable to most Apple users, who have had crisp displays available for more than 10 years now. I wouldn't be surprised if the share of Apple users on low-DPI displays is in the single digits.
This is a surprising opinion to encounter, given my experience with scaling on Windows, where something as simple as taking my laptop off its dock (going from desktop monitors to the laptop screen) causes applications to become blurry, and they stay blurry even after I return the laptop to the dock. Or how scaling makes the edges of some maximized windows show up on the adjacent screen. Or how all manner of subtle positioning and size bugs crop up.
Is this more of an aspirational thing, in that Windows supports "doing it right", and with time and effort by the right people, more and more applications could be drawn correctly?
[edit] I guess so; I see your comment about setting registry keys to make stuff work in Microsoft's own programs. That aligns more closely with my experience.
> Windows is the only one that does this properly.
How can you say this when applications render either minuscule or gigantic, with their contents totally out of proportion either way, seemingly at random?
I don’t have to pull out a magnifying glass to notice those issues.
ChromeOS also does fractional scaling properly, because Chrome does it properly. The scale factor is propagated through the rendering stack so that content is rasterized at the correct scale from the beginning, instead of being rasterized at an integer scale factor and downscaled later. It also takes subpixel rendering into account, which affects things like which elements can be squashed into GPU-texture-backed layers.
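A toy sketch of the difference being described (not Chromium's actual code): rastering directly at the fractional device scale factor versus rastering at the next integer scale and resampling afterwards.

```c
// Sketch: raster-at-final-scale vs integer-scale-then-downscale.
#include <math.h>
#include <stdio.h>

int main(void) {
    double device_scale = 1.5;           // fractional scale factor
    int layout_w = 200, layout_h = 100;  // size in device-independent px

    // Raster once, directly at the final scale, so geometry and text
    // snap to real device pixels (and subpixel AA stays valid).
    int raster_w = (int)ceil(layout_w * device_scale);   // 300
    int raster_h = (int)ceil(layout_h * device_scale);   // 150

    // The avoided approach: raster at 2x, then resample 2.0 -> 1.5,
    // which blurs hinted text and breaks subpixel antialiasing.
    int big_w = layout_w * 2, big_h = layout_h * 2;      // 400x200
    double resample = device_scale / 2.0;                // 0.75

    printf("direct: %dx%d; integer-then-downscale: %dx%d * %.2f\n",
           raster_w, raster_h, big_w, big_h, resample);
    return 0;
}
```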
I think Android does it properly too, because it has to handle an entire zoo of screen sizes and resolutions, although it doesn't have the issue of dealing with subpixel rendering.
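For reference, here's Android's documented density arithmetic sketched in C: a density-independent pixel (dp) is defined against a 160 dpi baseline, so px = dp * (dpi / 160), which is how the same layout stays physically sized across the zoo of screens.

```c
// Sketch of Android's dp-to-px conversion (160 dpi == mdpi == density 1.0).
#include <stdio.h>

static double dp_to_px(double dp, double dpi) {
    return dp * (dpi / 160.0);
}

int main(void) {
    // The same 48dp touch target on three common density buckets:
    printf("48dp @ 160dpi (mdpi)  = %.0f px\n", dp_to_px(48, 160));  // 48
    printf("48dp @ 320dpi (xhdpi) = %.0f px\n", dp_to_px(48, 320));  // 96
    printf("48dp @ 420dpi        = %.0f px\n", dp_to_px(48, 420));   // 126
    return 0;
}
```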