That’s a strange claim because the first digital HDR capture devices were film scanners (for example the Cineon equipment used by the motion picture industry in the 1990s).
Film provided a higher dynamic range than digital sensors, and professionals wanted to capture that for image editing.
Sure, it wasn’t terribly deep HDR by today’s standards. Cineon used 10 bits per channel with the white point at code value 685 (and a log color space). That’s still a lot more range and superwhite latitude than you got with standard 8-bpc YUV video.
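As a rough illustration of where that superwhite latitude comes from, here is a minimal sketch of the commonly published Cineon/DPX log-to-linear conversion (black at code 95, white at code 685, 0.002 density per code value, 0.6 negative gamma). These parameters are the widely documented defaults, not something stated above; the point is just that code values above 685 decode to linear values greater than reference white.

```python
# Sketch of the commonly documented Cineon 10-bit log-to-linear conversion.
# Parameters (black = 95, white = 685, 0.002 density/code value, 0.6 gamma)
# are the widely published defaults; this is illustrative, not authoritative.

BLACK_CODE = 95
WHITE_CODE = 685
DENSITY_PER_CODE = 0.002
NEGATIVE_GAMMA = 0.6

def cineon_to_linear(code: int) -> float:
    """Map a 10-bit Cineon code value (0..1023) to scene-linear light,
    normalized so the white point (685) decodes to 1.0."""
    offset = 10 ** ((BLACK_CODE - WHITE_CODE) * DENSITY_PER_CODE / NEGATIVE_GAMMA)
    value = 10 ** ((code - WHITE_CODE) * DENSITY_PER_CODE / NEGATIVE_GAMMA)
    return (value - offset) / (1.0 - offset)

if __name__ == "__main__":
    for code in (95, 445, 685, 1023):
        print(code, round(cineon_to_linear(code), 3))
    # 685 decodes to 1.0, while 1023 decodes to roughly 13.5x reference white:
    # that headroom above "white" is the superwhite latitude.
```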
They didn’t call that “HDR” at the time, and it wasn’t based on the idea of recording radiance or other absolute physical units.
I’m certain physicists had high-dynamic-range digital cameras before Cineon, and they were working in absolute physical units. That would be a stronger example.
You bring up an important point that is completely lost in the HDR discussion: this is about color resolution at least as much as it’s about range, if not more so. I can use 10 bits for a [0..1] range just as easily as I can use 4 bits to represent quantized values from 0 to 10^9. Talking only about the range of the captured scene leaves out most of the story, and all of the important parts. We’ve had outdoor photography, high-quality film, and the ability to control exposure for a long time, and that doesn’t explain what “HDR” is.
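To make that concrete, here is a small sketch using the hypothetical numbers from the paragraph above (neither encoding corresponds to a real format): one encoding spends 10 bits on a narrow [0, 1] range, the other spends 4 bits on a 0..10^9 range. The first has fine color resolution but no headroom; the second has enormous range but steps far too coarse to be useful for imaging.

```python
# Hypothetical comparison: bit depth (precision) vs. encoded range.
# Neither encoding is a real format; the point is only that range and
# quantization step size are independent choices.

def step_size(bits: int, max_value: float) -> float:
    """Size of one quantization step for a uniform encoding of [0, max_value]."""
    return max_value / (2 ** bits - 1)

narrow_fine = step_size(bits=10, max_value=1.0)   # 10 bits spread over [0, 1]
wide_coarse = step_size(bits=4, max_value=1e9)    # 4 bits spread over [0, 1e9]

print(f"10-bit [0..1]   step ~ {narrow_fine:.6f}")  # ~0.000978: fine gradations, tiny range
print(f"4-bit  [0..1e9] step ~ {wide_coarse:.3e}")  # ~6.7e+07: huge range, useless precision
```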