HDR on displays is largely uncomfortable for me. The brightest HDR whites should be reserved for things like the sun itself and caustics, not white walls in indoor photos.
As for tone mapping, I think the examples they show lean way too far toward a flat, low-local-contrast look for my taste.
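To put numbers on the headroom point: here's a minimal Python sketch of the PQ encoding (SMPTE ST 2084), which covers 0 to 10,000 nits of absolute luminance, with reference white at 203 nits per ITU-R BT.2408. The luminance values assigned to the scene elements are my own illustrative picks.

```python
# Sketch: how much PQ signal headroom sits above "diffuse white".
# SMPTE ST 2084 (PQ) encodes absolute luminance from 0 to 10,000 nits;
# ITU-R BT.2408 puts reference/graphics white at 203 nits, so everything
# above that is headroom meant for highlights (sun, speculars, caustics).

M1 = 2610 / 16384          # PQ constants from SMPTE ST 2084
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Absolute luminance (cd/m^2) -> PQ signal value in [0, 1]."""
    y = max(nits, 0.0) / 10000.0
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

for label, nits in [("deep shadow", 0.1), ("indoor wall", 80),
                    ("reference white", 203), ("specular highlight", 1000),
                    ("the sun / caustics", 10000)]:
    print(f"{label:>20}: {nits:>7g} nits -> PQ {pq_encode(nits):.3f}")
```

Reference white lands around PQ 0.58, so roughly the top 40% of the code range is headroom meant for suns and speculars. That's exactly why a white wall mastered up in that band reads like a flashlight instead of a wall.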
Most "HDR" monitors are junk that can't display HDR. The HDR formats/signals are designed for brightness levels and viewing conditions that nobody uses.
The end result is complete chaos. Every piece of the pipeline does something wrong, and then the software tries to compensate by emitting doubly wrong data, without even having reliable information about what it needs to compensate for.
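As a sketch of what "doubly wrong" looks like in practice: suppose the app tone-maps for the 1000-nit peak the display advertises, and the display doesn't trust the signal and applies its own roll-off on top anyway. The Reinhard-style shoulder below is a made-up stand-in, not any real vendor's curve.

```python
# Sketch of double compensation: the app compresses scene luminance for
# the peak the display claims (1000 nits), then the display second-guesses
# the signal and compresses it again. The highlight ends up far below
# where either stage alone would have put it.

def rolloff(nits: float, peak: float, knee: float = 100.0) -> float:
    """Pass linear light below the knee, compress the rest toward `peak`."""
    if nits <= knee:
        return nits
    x = nits - knee
    span = peak - knee
    return knee + span * x / (x + span)   # asymptotically approaches `peak`

scene = 4000.0                            # scene-referred highlight, nits
once  = rolloff(scene, peak=1000.0)       # what the app intends to show
twice = rolloff(once, peak=1000.0)        # display re-compresses the result

print(f"app output:    {once:6.0f} nits")   # ~831 nits
print(f"after display: {twice:6.0f} nits")  # ~503 nits, doubly compressed
```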
https://docs.google.com/document/d/1A__vvTDKXt4qcuCcSN-vLzcQ...
There's a pretty good video on YouTube (more than one, actually) that explains how careless use of HDR in modern cinema is destroying the look and feel of the movies we used to like.
Everything is flattened, contrast is eliminated, lights that should be "burned white" for a cinematic feel are pulled back to "reasonable" brightness, and really deep blacks are turned into flat greys. The end result is the flat, washed-out look of movies like Wicked. It's often correlated with CGI-heavy movies, but in reality it's starting to affect every movie.
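To make the complaint concrete, here's a toy Python comparison of the two treatments. Both curves are illustrative stand-ins, not taken from any real grading pipeline: one burns highlights to white and crushes blacks, the other lifts blacks and pulls highlights back toward the middle.

```python
# Sketch of the "flat HDR grade" complaint: a hard SDR-style filmic
# treatment lets highlights burn to white and keeps blacks crushed,
# while a timid HDR grade lands everything mid-range.

def filmic_sdr(x: float) -> float:
    """Hard shoulder: past 0.8 burns to white, toe below 0.05 crushes to black."""
    if x >= 0.8:
        return 1.0
    if x <= 0.05:
        return 0.0
    return (x - 0.05) / 0.75

def timid_hdr(x: float) -> float:
    """Lift blacks, pull highlights back: everything lands mid-range."""
    return 0.1 + 0.7 * x

for label, x in [("deep black", 0.02), ("midtone", 0.4), ("blown light", 0.95)]:
    print(f"{label:>11}: filmic {filmic_sdr(x):.2f} vs flat-HDR {timid_hdr(x):.2f}")
```

The deep black comes out at 0.11 instead of 0.00 (flat grey), and the blown light at 0.77 instead of 1.00 (no burn), which is exactly the look being described.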
Apparently HDR is really hard to get right, and it seems to be even worse in video games.
I'm a huge fan of Helldivers 2, but playing the game in HDR gives me a headache: the muzzle flash of high-RPM weapons on a screen that goes up to 240 Hz is basically a continuous flashbang for my eyes.
For a while, No Man's Sky in HDR mode basically had the color saturation of every planet dialed up to 11.
The only game I've enjoyed in HDR was a console port, Returnal. Its use of HDR brights was minimal and tasteful, often reserved for certain particle effects.