
jsheard yesterday at 3:46 PM

> Wayland has fractional scaling as a sort-of workaround if you can tolerate the entire screen being blurry. Every other major OS can deal with this.

I think Windows is the only other one which really does this properly; macOS also does the hack where it simulates fractional scales by rendering at an integer scale at a non-native resolution, then scaling the result down.


Replies

delta_p_delta_x yesterday at 4:05 PM

> I think Windows is the only other one which really does this properly

Windows is the only one that does this properly.

Windows handles high pixel density on a per-application, per-display basis, which is the most fine-grained approach. It's pretty easy to opt in on reasonably modern frameworks, too: just add the necessary key to the application manifest and you're done. [1]
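
To make that concrete, here's a minimal Win32 sketch of the same opt-in done in code rather than via the manifest. The window class name and sizing are illustrative, not from [1]; PerMonitorV2 needs Windows 10 1703+:

    #define _WIN32_WINNT 0x0A00  /* for the PerMonitorV2 context + GetDpiForWindow */
    #include <windows.h>

    static LRESULT CALLBACK WndProc(HWND h, UINT m, WPARAM w, LPARAM l) {
        switch (m) {
        case WM_DPICHANGED: {
            /* Fired when the window crosses onto a display with a different DPI.
               lParam carries a suggested rect already sized for the new scale. */
            const RECT *r = (const RECT *)l;
            SetWindowPos(h, NULL, r->left, r->top,
                         r->right - r->left, r->bottom - r->top,
                         SWP_NOZORDER | SWP_NOACTIVATE);
            return 0;
        }
        case WM_DESTROY:
            PostQuitMessage(0);
            return 0;
        }
        return DefWindowProcW(h, m, w, l);
    }

    int WINAPI WinMain(HINSTANCE inst, HINSTANCE prev, LPSTR cmd, int show) {
        /* Programmatic equivalent of the PerMonitorV2 manifest key. */
        SetProcessDpiAwarenessContext(DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2);

        WNDCLASSW wc = {0};
        wc.lpfnWndProc   = WndProc;
        wc.hInstance     = inst;
        wc.hCursor       = LoadCursor(NULL, IDC_ARROW);
        wc.lpszClassName = L"DpiDemo";
        RegisterClassW(&wc);

        HWND hwnd = CreateWindowW(L"DpiDemo", L"DPI demo", WS_OVERLAPPEDWINDOW,
                                  CW_USEDEFAULT, CW_USEDEFAULT, 640, 480,
                                  NULL, NULL, inst, NULL);
        ShowWindow(hwnd, show);

        /* Returns 96 * scale for whichever monitor currently hosts the window. */
        UINT dpi = GetDpiForWindow(hwnd);
        (void)dpi;

        MSG msg;
        while (GetMessageW(&msg, NULL, 0, 0) > 0) {
            TranslateMessage(&msg);
            DispatchMessageW(&msg);
        }
        return 0;
    }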

Linux + Xorg has a single global pixel-density scale factor. KDE/Qt handle this OK; GNOME/GTK break when the DPI is not an integer multiple of 96 (i.e. when the scale factor is not an integer) and fall back to raster scaling.
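
The usual global knob is Xft.dpi. A minimal example, assuming you want roughly 1.5x (the value is a DPI, so 1.5 x 96 = 144):

    ! ~/.Xresources -- one density for every attached display
    Xft.dpi: 144
    ! apply with: xrdb -merge ~/.Xresources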

Linux + Wayland has per-display scale factors, but Chromium, GNOME, and GTK break in the same way as under Xorg. KDE/Qt are a bit better, but I'm quite certain the taskbar icons are sharper on Xorg than they are on Wayland; I think that boils down to subpixel rendering not being enabled.

And of course, in theory every application on Linux can handle high pixel density, but there is a zoo of environment variables and command-line arguments that need to be passed for the ideal result; a sample of the usual suspects is below.
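
These names are real, but which combination actually works depends on the toolkit version and compositor, so treat this as a sketch rather than a recipe:

    # GTK: integer scale plus a fractional text-scale fudge factor
    export GDK_SCALE=2
    export GDK_DPI_SCALE=0.75      # net effect: ~1.5x

    # Qt: per-screen auto-detection, or one forced global factor
    export QT_AUTO_SCREEN_SCALE_FACTOR=1
    export QT_SCALE_FACTOR=1.5

    # Chromium: forced device scale, and/or native Wayland
    chromium --force-device-scale-factor=1.5 --ozone-platform=wayland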

On macOS, if the pixel density of the target display is at least some Apple-blessed number that they consider 'Retina', the scaled 'Retina' resolutions are enabled. When the chosen logical resolution is not an exact integer fraction of the physical one, the framebuffer is rendered at four times the logical resolution (twice in each dimension), and the final result is raster-scaled with some sinc/Lanczos-style algorithm back down to the physical resolution. This shows up as ringing artifacts, which are very obvious around high-contrast, thin regions like text.
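
A worked example with typical (assumed) numbers, for a 27-inch 5K panel at 5120x2880 physical:

    'looks like' setting:  3200x1800  (effective scale 5120 / 3200 = 1.6)
    actually rendered at:  6400x3600  (2x in each dimension)
    then downsampled:      6400x3600 -> 5120x2880 via the filtering above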

On displays that don't meet that 'Retina' threshold, there is zero concept of a 'scaling factor' whatsoever; you can choose another resolution, but it will be raster-scaled (usually up) with some bi/trilinear filtering, and the entire screen is blurry. The last time Windows shipped such brute-force scaling was Windows XP, some 25 years ago.

[1]: https://learn.microsoft.com/en-gb/windows/win32/hidpi/settin...

tracker1 yesterday at 4:43 PM

That's roughly what I did for my ANSI console/viewer... I started with EGA resolution, and each EGA pixel renders as 3x4 in its buffer, then a minor blur, then it's scaled to fit the render area. The effect is really good down to about 960px wide, which is a bit bigger in real pixels than the original... at 640px wide it's a little hard to make out the actual pixels... but it's the best way I could think of to handle the non-square pixels of original EGA or VGA. I went with EGA because the ratio is slightly cleaner IMO; it's also what OG RIPterm used.
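
Sanity-checking that 3x4 choice, assuming the standard 640x350 EGA mode on a 4:3 display:

    640x350 at 3 wide x 4 tall  ->  1920x1400 buffer
    1920:1400 = 1.37:1, vs. a 4:3 display's 1.33:1
    so the tall sub-blocks roughly undo EGA's non-square pixels,
    and the final scale-to-fit absorbs the remaining ~3%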

amluto yesterday at 8:03 PM

I have precisely one Windows thing I use regularly, and it has a giant window that needs lots of pixels, and I use it over Remote Desktop. The results are erratic and frequently awful.