Most "HDR" monitors are junk that can't display HDR. The HDR formats/signals are designed for brightness levels and viewing conditions that nobody uses.
The end result is complete chaos: every piece of the pipeline does something wrong, and then the software tries to compensate by emitting doubly wrong data, without even having reliable information about what it's supposed to be compensating for.
https://docs.google.com/document/d/1A__vvTDKXt4qcuCcSN-vLzcQ...
What we really need are standards that everybody follows. The reason normal displays work so well is that everyone settled on sRGB, and as long as a display gets close to that, say 95% sRGB, everyone except maybe a few graphic designers will have an equivalent experience.
But HDR is a minefield of different display qualities, color spaces, and standards. It's no wonder that nobody gets it right and everyone ends up confused.
HDR content on a display with a peak brightness of 2000 nits will look completely different than on a display with 800 nits, yet both get to claim they are HDR.
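Here's a minimal sketch of why that happens, assuming the content is encoded with the common PQ (SMPTE ST 2084) transfer curve and that the monitor just clips to its panel peak (real monitors tone-map in all sorts of undocumented ways, so this is only illustrative):

```python
def pq_to_nits(code: float) -> float:
    """Decode a normalized PQ code value (0..1) to absolute luminance in nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = code ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def displayed_nits(code: float, peak_nits: float) -> float:
    """Naive monitor behaviour: clip anything above the panel's peak brightness."""
    return min(pq_to_nits(code), peak_nits)

# A highlight mastered at roughly 1000 nits (PQ code ~0.75):
code = 0.75
print(round(pq_to_nits(code)))            # ~983 nits: what the signal intends
print(round(displayed_nits(code, 2000)))  # ~983 nits: 2000-nit panel shows it as intended
print(round(displayed_nits(code, 800)))   # 800 nits: 800-nit panel crushes the highlight
```

The signal encodes absolute luminance, so the same bits land very differently depending on how much headroom the panel actually has.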
We should have a standard analogous to color space coverage. Set, say, 2000 nits as 100% of HDR. Then a 2000-nit display gets to claim it's 100% HDR, an 800-nit display gets to claim 40% HDR, etc. A 2500-nit display could even use 125% HDR in its marketing.
It's still not perfect - some displays (OLEDs in particular) can only hit their peak brightness over a small portion of the screen. But it would be an improvement.
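A rough sketch of how such a rating could work, assuming 2000 nits as the 100% reference point (that number is just the one picked above, nothing standardized):

```python
REFERENCE_NITS = 2000  # assumed 100% point for the proposed rating

def hdr_rating(peak_nits: float) -> float:
    """Express a display's measured peak brightness as a percentage of the reference."""
    return peak_nits / REFERENCE_NITS * 100

print(hdr_rating(2000))  # 100.0 -> "100% HDR"
print(hdr_rating(800))   # 40.0  -> "40% HDR"
print(hdr_rating(2500))  # 125.0 -> "125% HDR"
```

A more honest version would probably quote two numbers, full-screen sustained and small-window peak, precisely because of the OLED caveat above.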