I'm not privy to what discussions happened during the protocol development. However, using a scale factor within the protocol seems more practical to me.
Not all displays accurately report their DPI (or even can, e.g. projectors). Not all users, myself included, know their monitor's DPI. Finally, the scaling algorithm will ultimately use a scale factor, so at the protocol level that might as well be what is passed.
There is of course nothing stopping a display management widget/settings page/application from asking for a DPI and converting it to a scale factor; I just don't know of any that do.
As I replied to the other poster, I don't think DPI should necessarily be the exposed metric, but I do think we should use something device-independent as our reference point, e.g. make 100% = 96 DPI.
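To make that concrete, here is a minimal sketch (purely illustrative, not anything a protocol or toolkit actually specifies) of how a settings application could turn a reported DPI into a scale factor using the 96 DPI = 100% reference point; the `scale_from_dpi` helper and the 25% rounding steps are my own assumptions.

```c
#include <math.h>
#include <stdio.h>

/* Hypothetical helper: derive a scale factor from a reported DPI,
 * treating 96 DPI as the 100% reference point. Rounds to 25% steps,
 * which is one common choice; a real compositor or settings app
 * might use a different granularity or fractional-scaling policy. */
static double scale_from_dpi(double dpi)
{
    double raw = dpi / 96.0;                  /* 96 DPI -> 1.0 (100%) */
    double snapped = round(raw * 4.0) / 4.0;  /* snap to 0.25 steps   */
    return snapped < 1.0 ? 1.0 : snapped;     /* never go below 100%  */
}

int main(void)
{
    /* e.g. a 13.3" 1920x1080 laptop panel is roughly 166 DPI */
    printf("166 DPI -> %.2fx scale\n", scale_from_dpi(166.0));
    /* a typical 24" 1920x1080 desktop monitor is roughly 92 DPI */
    printf("92 DPI  -> %.2fx scale\n", scale_from_dpi(92.0));
    return 0;
}
```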
I can guarantee that it is surprising to non-technical users (and a source of frustration for technical users) that the scale factor and UI element size can be completely different on two otherwise identical laptops that differ only in display resolution, which is quite common. It's also unpredictable which one will end up with the larger UI elements. Generally, I believe UI should behave as predictably as possible.