
adrian_b • yesterday at 10:43 PM

As a Linux user, I am confused when I hear other people talking about "scaling" and even more when they talk about being able to use only a restricted set of values for "scaling".

For well over a decade I have not used any monitor with a resolution below 4K on Linux. I have never used any kind of "scaling" and would not want to, because by definition it means lower image quality than the display could deliver.

In the X Window System, as in any other decent graphical interface system, the sizes of graphic elements, e.g. fonts or document pages, should be specified in physical length units, e.g. typographic points, millimeters or inches.

The graphic system knows the dots-per-inch value of the monitor (using either a value configured by the user or the value read from the monitor EDID when the monitor is initialized). When graphic elements such as letters are rasterized, the algorithm uses the dimensions in length units and the DPI value to generate the corresponding bitmap.
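As a minimal sketch of that computation (the helper name is my own, not part of any real rasterizer's API): a typographic point is 1/72 inch, so the pixel size follows directly from the point size and the DPI.

    # A typographic point is 1/72 inch, so the rasterizer only needs the
    # requested size in points and the monitor DPI to pick the bitmap size.
    def points_to_pixels(size_pt: float, dpi: float) -> float:
        return size_pt / 72.0 * dpi

    # The same 12 pt font on a ~96 dpi panel and on a ~163 dpi (27" 4K) panel:
    print(points_to_pixels(12, 96))    # ~16 px: fewer, coarser pixels
    print(points_to_pixels(12, 163))   # ~27 px: more pixels, same physical size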

"Scaling" normally refers to the scaling of a bitmap into another bitmap with a greater resolution, which can be done either by pixel interpolation or by pixel duplication. This is the wrong place for increasing the size of an image that has been generated by the rasterization of fonts and of vector graphics. The right place for dimension control is during the rasterization process, because only there this can be done without image quality loss.

Thus there should be no "scaling"; one should just take care that the monitor DPI is configured correctly, in which case the size of the graphic elements on the screen will be independent of the resolution of the connected monitor. Using a monitor with a higher resolution must result in more beautiful letters, not in smaller letters.

Windows got this wrong with its scaling factor for fonts, but at least XFCE on Linux does this right, so I can set whatever DPI value I want, e.g. 137 dpi, 179 dpi, or any other value.

If you configure the exact DPI value of your monitor, then the dimensions of a text or picture on the screen will be equal to those of the same text or picture when printed on paper.

You may want bigger text on screen than on paper, because you normally sit farther from the monitor than the distance at which you would hold a sheet of paper or a book in your hand.

For this, you must set a bigger DPI value than the real one, so that the rasterizer will believe that your screen is smaller than it really is and will draw bigger letters to compensate.

For instance, I set 216 dpi for a Dell 27 inch 4K monitor, which magnifies the images on screen by about 4/3 compared with their printed size. This has nothing to do with "scaling". The rasterizer just uses the 216 dpi value, for example when rasterizing a 12 point font, so that the computed bitmap has the desired size, which is greater than the printed size by the factor I have chosen.
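A quick check of those numbers (assuming the usual 3840x2160 panel; figures are rounded):

    import math

    # Physical pixel density of a 27" 3840x2160 panel.
    physical_dpi = math.hypot(3840, 2160) / 27.0
    print(round(physical_dpi))            # ~163 dpi

    # Magnification obtained by telling the rasterizer the screen is 216 dpi.
    print(216 / physical_dpi)             # ~1.32, i.e. roughly 4/3

    # A 12 pt font rasterized at the configured value:
    print(12 / 72 * 216)                  # 36 px, instead of ~27 px at the real 163 dpi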


Replies

wolvoleo • yesterday at 11:24 PM

It's probably called scaling because that's what other OSes do.

For example macOS just renders at 200% and then scales down to the desired level.
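Roughly, that approach looks like the following sketch (my own illustration; the exact filter macOS uses is not specified here): draw everything at twice the target resolution, then average it down to the panel resolution.

    # Downscale an oversized framebuffer by averaging each 2x2 block into one
    # pixel (a simple box filter; a real compositor may use a better one).
    def downscale_2x(bitmap):
        h, w = len(bitmap), len(bitmap[0])
        return [
            [(bitmap[y][x] + bitmap[y][x + 1] +
              bitmap[y + 1][x] + bitmap[y + 1][x + 1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)
        ]

    rendered_2x = [[0, 0, 255, 255],
                   [0, 0, 255, 255],
                   [255, 255, 0, 0],
                   [255, 255, 0, 0]]
    print(downscale_2x(rendered_2x))      # [[0.0, 255.0], [255.0, 0.0]]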

Linux is indeed way better at this.

kcb • today at 12:39 AM

Doesn't this depend on the application? For example, Electron applications dgaf about this system, render to a bitmap, and then look terrible as a result.