Is there a consensus definition of what counts as "HDR" in a display? What is the "standard dynamic range" of a typical TV or computer monitor? Is it roughly the same for devices of the same age?
My understanding is that most SDR TVs and computer monitors peak at about 200-300 nits (i.e. cd/m²). Is that the correct measure of a display's range? That is, the brightest white is 300 nits brighter than the darkest black?
That isn't a correct definition. Human perception of brightness is roughly logarithmic, so what matters is the ratio between the brightest white and the darkest black rather than the difference in nits, which means how deep the blacks get matters just as much as the peak. For a good HDR experience you need a monitor that reaches 600 nits at a bare minimum, but which can also get very close to 0 nits (e.g. via OLED or, less optimally, local dimming).
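To make the ratio-vs-difference point concrete, here is a small Python sketch that expresses dynamic range as a contrast ratio and in stops (doublings of luminance). The specific peak and black-level figures are assumptions chosen for illustration, not measurements of any particular display:

```python
import math

def contrast_stats(peak_nits: float, black_nits: float) -> tuple[float, float]:
    """Return (contrast ratio, dynamic range in stops) for a display.

    Because brightness perception is roughly logarithmic, dynamic range is
    better described by the ratio of peak white to black level (or its log)
    than by the difference in nits.
    """
    ratio = peak_nits / black_nits
    stops = math.log2(ratio)  # each stop is one doubling of luminance
    return ratio, stops

# Assumed illustrative figures: a 300-nit SDR LCD with a 0.3-nit black floor,
# vs. a 600-nit OLED whose blacks are near 0 (clamped to 0.0005 nits here
# to avoid dividing by zero).
for name, peak, black in [("SDR LCD", 300.0, 0.3), ("HDR OLED", 600.0, 0.0005)]:
    ratio, stops = contrast_stats(peak, black)
    print(f"{name}: {ratio:,.0f}:1 contrast, ~{stops:.1f} stops")
```

Run this and the OLED comes out with far more usable dynamic range than the LCD even though its peak is only twice as bright, because its black level is so much lower.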
Yes, and not just static but dynamic properties: https://displayhdr.org/performance-criteria/