For a long time, the only very high (>200) DPI monitors on the market were Apple's first-party ones and the LG UltraFine, the former being stupidly overpriced and the latter having, as you say, reliability horror stories. I assume the dearth of other options was because macOS doesn't do fractional scaling, only 2x, so only Apple users really needed 5K-at-27" or 6K-at-32" whereas Windows/Linux users can be ok with 150% scaling.
But that's finally changing: several high-DPI monitors came out last year, and even more are coming this year, which should force manufacturers to do better re: both price and reliability. Last year I got a pair of the Asus ProArt 5K monitors, plus a CalDigit Thunderbolt hub, and have been very happy with this setup.
As a Linux user, I am confused when I hear other people talk about "scaling", and even more so when they talk about being restricted to a small set of "scaling" values.
For well over a decade I have not used any monitor with a resolution below 4K on Linux. I have never used any kind of "scaling" and I would not want to, because that by definition means lower image quality than necessary.
In the X Window System, and in any other decent graphical interface system, the sizes of graphic elements, e.g. fonts or document pages, should be specified in length units: typographic points, millimeters, or inches.
The graphic system knows the dots-per-inch value of the monitor (either a value configured by the user or the value read from the monitor's EDID when it is initialized). When graphic elements such as letters are rasterized, the algorithm uses the dimensions in length units together with the DPI value to generate the corresponding bitmap.
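To make this concrete, here is a minimal sketch of the two computations involved (plain Python with hypothetical helper names, not any real toolkit's API): deriving the physical DPI from the EDID-reported size, and converting a font size in points to pixels at that DPI.

    def native_dpi(width_px, width_mm):
        # EDID reports the physical image size in millimeters;
        # 25.4 mm per inch converts that to a pixel density.
        return width_px / (width_mm / 25.4)

    def points_to_pixels(size_pt, dpi):
        # A typographic point is 1/72 inch, so a glyph's pixel
        # size is its length in inches times the density.
        return size_pt / 72 * dpi

    dpi = native_dpi(3840, 598)        # ~163 dpi: a 27" 16:9 4K panel
    print(points_to_pixels(12, dpi))   # a 12 pt glyph is ~27 px tall

The point is that the pixel count falls out of the rasterization itself: a higher-resolution panel just yields more pixels for the same physical glyph size.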
"Scaling" normally refers to the scaling of a bitmap into another bitmap with a greater resolution, which can be done either by pixel interpolation or by pixel duplication. This is the wrong place for increasing the size of an image that has been generated by the rasterization of fonts and of vector graphics. The right place for dimension control is during the rasterization process, because only there this can be done without image quality loss.
Thus there should be no "scaling"; one should just take care that the monitor DPI is configured correctly, in which case the size of the graphic elements on the screen will be independent of the resolution of the connected monitor. Using a monitor with a higher resolution must result in more beautiful letters, not in smaller letters.
Windows got this wrong, with its scaling factor for fonts, but at least in Linux XFCE this is done right, so I can set whatever DPI value I want, e.g. 137 dpi, 179 dpi, or any other value.
If you configure the exact DPI value of your monitor, then the dimensions of a text or picture on the screen will be equal to those of the same text or picture when printed on paper.
One may want text to be bigger on screen than on paper, because you normally sit farther from a monitor than the distance at which you would hold a sheet of paper or a book.
For this, you set a DPI value bigger than the real one, so that the rasterizer believes your screen is smaller and draws bigger letters to compensate.
For instance, I set 216 dpi for a Dell 27-inch 4K monitor, which magnifies the image on screen by about 4/3 compared with its printed size. This has nothing to do with "scaling": the rasterizer simply uses the 216 dpi value, for example when rasterizing a 12-point font, so that the computed bitmap has the desired size, greater than its printed size by the factor I chose.
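The 4/3 figure follows directly from the panel geometry (a quick check, assuming the standard 27" 16:9 4K dimensions):

    import math

    real_dpi = math.hypot(3840, 2160) / 27   # ~163.2 dpi native density
    print(216 / real_dpi)                    # ~1.32, i.e. roughly 4/3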
> I assume the dearth of other options was because macOS doesn't do fractional scaling
Except it does? I have a 14" MBP with a 3024x1964 display. By default it renders at 2x for an effective 1512x982, but I can also select 1800x1169, 1352x878, 1147x745, or 1024x665. So it certainly does have fractional scaling options.
If you connect a 4K (2160p) monitor, you can go down or up from the default 1080p doubling (https://www.howtogeek.com/why-your-mac-shows-the-wrong-resol...). If you select an effective 2560x1440 on a 4K screen, that's 150% scaling (3840/2560 = 1.5) rather than 2x (https://appleinsider.com/inside/macos/tips/what-is-display-s..., see the image comparing "native 2x scaling" to "appears like 2560x1440").