Hacker News

cycomanic · last Tuesday at 10:01 PM · 4 replies

That's great, but why do we use a "scale factor" in the first place? We had a perfectly fitting metric in DPI; why can't I set the desired DPI for every monitor, instead of having to calculate some arbitrary scale factor?

I'm generally a strong Wayland proponent and believe it's a big step forward over X in many ways, but some decisions just make me scratch my head.


Replies

zokier · last Tuesday at 10:50 PM

DPI (or PPI) is an absolute measurement; a scale factor is intentionally relative. Different circumstances call for different scale-factor-to-DPI ratios: most software doesn't care whether a given UI element is exactly x mm in size, only that its UI elements scale consistently with the rest of the system.

Basically, the scale factor neatly encapsulates viewing distance, user eyesight, dexterity, preference, input-device accuracy, and many other variables. It is easier to have a human say how big or small they want things than to have a gazillion flags for individual attributes plus some complicated heuristic to deduce the scale from them.
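
To make the relationship concrete: assuming the conventional 96 DPI baseline, the two notations carry the same information and are trivially interconvertible; the scale factor just drops the pretense of physical units. A minimal sketch in C (the baseline and function names are illustrative, not from any real compositor):

    #include <stdio.h>

    #define BASELINE_DPI 96.0  /* conventional "100%" density */

    /* The same preference can be expressed either way: */
    double scale_from_dpi(double desired_dpi) { return desired_dpi / BASELINE_DPI; }
    double dpi_from_scale(double scale)       { return scale * BASELINE_DPI; }

    int main(void) {
        printf("%.2f\n", scale_from_dpi(144.0)); /* 1.50 */
        printf("%.0f\n", dpi_from_scale(1.25));  /* 120 */
        return 0;
    }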

sho_hn · last Tuesday at 10:40 PM

The end-user UIs don't ask you to calculate anything. Typically they have a slider from 100% to, say, 400% and let you set this to something like 145%.

This may take some getting used to if you're familiar with DPI and already know the value you like, but for non-technical users it's more approachable. Not everyone knows DPI or how many dots they want to their inches.

That the 145% is 1.45 under the hood is really an implementation detail.
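
If you're curious about that detail: Wayland's fractional scale protocol (wp-fractional-scale-v1) doesn't even send a float; the preferred scale crosses the wire as an integer numerator over a fixed denominator of 120. A sketch of the round trip under that encoding:

    #include <math.h>
    #include <stdint.h>
    #include <stdio.h>

    /* wp_fractional_scale_v1 expresses scale as N/120. */
    uint32_t scale_to_wire(double scale)  { return (uint32_t)round(scale * 120.0); }
    double   wire_to_scale(uint32_t wire) { return wire / 120.0; }

    int main(void) {
        printf("%u\n",   scale_to_wire(1.45)); /* 174: the 145% slider */
        printf("%.2f\n", wire_to_scale(174));  /* 1.45 */
        return 0;
    }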

MadnessASAP · last Tuesday at 11:50 PM

I'm not privy to the discussions that happened during protocol development, but using a scale factor within the protocol seems more practical to me.

Not all displays report their DPI accurately (or can at all, as with projectors). Not all users, myself included, know their monitor's DPI. Finally, the scaling algorithm will ultimately use a scale factor, so at the protocol level that might as well be what is passed.

There is of course nothing stopping a display-management widget/settings page/application from asking for a DPI and then converting it to a scale factor; I just don't know of any that do.
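
The glue for such a widget would be tiny anyway; a hypothetical sketch, assuming the usual 96 DPI baseline and snapping to the 1/120 steps fractional scaling works in:

    #include <math.h>
    #include <stdio.h>

    /* Hypothetical settings-page helper: turn a user-entered DPI into
     * the nearest scale factor the compositor can actually apply.
     * The 96 DPI baseline and 1/120 granularity are assumptions. */
    double scale_for_dpi(double user_dpi) {
        double raw = user_dpi / 96.0;
        return round(raw * 120.0) / 120.0; /* snap to 1/120 steps */
    }

    int main(void) {
        printf("%.4f\n", scale_for_dpi(110.0)); /* 1.1500 */
        return 0;
    }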

Dylan16807 · yesterday at 3:40 AM

> We had a perfectly fitting metric in DPI; why can't I set the desired DPI for every monitor, instead of having to calculate some arbitrary scale factor?

Because certain ratios (integer scales, or simple fractions like 1.25 and 1.5) work a lot better than others, and calculating the exact DPI that lands on one of those ratios is a lot harder than just estimating the scale factor you want.

Also, estimating a scale factor is more reliable: it doesn't depend on the display reporting its physical size correctly, the way a DPI calculation does.
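
To see why the ratios matter, compare how logical sizes land on device pixels; the 109 DPI figure below is purely illustrative:

    #include <stdio.h>

    int main(void) {
        double logical = 24.0;                    /* a logical size */
        printf("%.4f\n", logical * 2.0);          /* 48.0000: exact */
        printf("%.4f\n", logical * 1.5);          /* 36.0000: exact */
        printf("%.4f\n", logical * (109.0/96.0)); /* 27.2500: must be rounded */
        return 0;
    }

Integer and simple fractional scales keep most sizes on whole device pixels; a scale derived from an arbitrary measured DPI almost never does, so everything gets rounded and seams or blur creep in.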