They didn’t call that “HDR” at the time, and it wasn’t based on the idea of recording radiance or other absolute physical units.
I’m certain physicists had high-dynamic-range digital cameras before Cineon, and they were working in absolute physical units. That would be a stronger example.
You bring up an important point that is completely lost in the HDR discussion: this is about color resolution at least as much as it’s about range, if not more so. I can use 10 bits for a [0..1] range just as easily as I can use 4 bits to represent quantized values from 0 to 10^9. Talking only about the range of the captured scene leaves out most of the story, and all of the important parts. We’ve had outdoor photography, high-quality film, and the ability to control exposure for a long time, and that doesn’t explain what “HDR” is.
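To make that concrete, here's a rough sketch (names and numbers are mine, purely illustrative) of why bit depth and range are independent knobs: the bit count buys resolution *within* whatever range you choose to encode, nothing more.

```python
# Illustrative only: bit depth and encoded range are independent choices.
# One encoder spends 10 bits on a [0, 1] range, the other spends 4 bits
# on [0, 1e9]. The bit count sets the step size within the chosen range,
# not the range itself.

def quantize(value, lo, hi, bits):
    """Map a value in [lo, hi] to an integer code of the given bit depth."""
    levels = (1 << bits) - 1
    t = min(max((value - lo) / (hi - lo), 0.0), 1.0)
    return round(t * levels)

def dequantize(code, lo, hi, bits):
    """Recover an approximate value from its integer code."""
    levels = (1 << bits) - 1
    return lo + (code / levels) * (hi - lo)

# 10 bits over [0, 1]: small range, fine steps (~0.001 apart).
print(dequantize(quantize(0.5, 0.0, 1.0, 10), 0.0, 1.0, 10))   # ~0.5

# 4 bits over [0, 1e9]: huge range, only 16 coarse steps (~6.7e7 apart).
print(dequantize(quantize(0.5, 0.0, 1e9, 4), 0.0, 1e9, 4))     # 0.0
```

The 4-bit encoder "covers" a billion-to-one range but can only distinguish 16 values across it, which is exactly why the captured range alone tells you so little.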
It certainly was called HDR when those Cineon files were processed in a linear light workflow. And film was the only capture source available that could provide sufficient dynamic range, so IMO that makes it “HDR”.
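For anyone curious what that linear-light step looks like, here's a minimal sketch of a Cineon log-to-linear conversion using the commonly published Kodak parameters (reference white 685, reference black 95, 0.002 density units per code value, 0.6 gamma). Those constants are assumptions on my part; real pipelines tuned them per show.

```python
# Hedged sketch: converting a 10-bit Cineon printing-density code value to
# normalized scene-linear light. The constants below are the commonly
# published Kodak defaults, assumed here for illustration.

REF_WHITE = 685          # code value mapped to linear 1.0
REF_BLACK = 95           # code value mapped to linear 0.0
DENSITY_PER_CODE = 0.002 # density units per code value
FILM_GAMMA = 0.6

def cineon_to_linear(code):
    """Map a 10-bit Cineon log code value to normalized scene-linear light."""
    black = 10 ** ((REF_BLACK - REF_WHITE) * DENSITY_PER_CODE / FILM_GAMMA)
    lin = 10 ** ((code - REF_WHITE) * DENSITY_PER_CODE / FILM_GAMMA)
    return (lin - black) / (1.0 - black)

print(cineon_to_linear(95))    # ~0.0 (reference black)
print(cineon_to_linear(685))   # 1.0 (reference white)
print(cineon_to_linear(1023))  # >1.0: detail above reference white survives
```

The point being that code values above reference white decode to linear values well above 1.0, which is where the extra dynamic range captured by the negative lives.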
But I agree that the term is such a wide umbrella that almost anything qualifies. Fifteen years ago you could add a few superwhite glows and a bit of tone mapping on 8-bpc and people called that look HDR.
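Something like this, roughly (a hand-wavy sketch of my own, not any particular package's glow or tone-mapping pass):

```python
# Illustrative only: the "fake HDR" look described above. Take float pixel
# values that exceed 1.0 (superwhites), add a glow from the over-range
# energy, then tone map back into 8 bits with a simple Reinhard-style
# curve x / (1 + x).

def fake_hdr_look(pixels, glow_strength=0.3):
    """Tone map a list of scene-linear floats down to 8-bit code values."""
    # Average energy above 1.0 stands in for a blurred glow pass.
    overbright = [max(p - 1.0, 0.0) for p in pixels]
    glow = glow_strength * sum(overbright) / len(pixels)

    out = []
    for p in pixels:
        v = p + glow                # add the glow back in
        v = v / (1.0 + v)           # Reinhard-style curve, maps [0, inf) into [0, 1)
        out.append(round(v * 255))  # quantize to 8 bits per channel
    return out

print(fake_hdr_look([0.1, 0.5, 1.0, 4.0, 12.0]))  # e.g. [124, 146, 165, 211, 237]
```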
It was called "extended dynamic range" by ILM when they published the OpenEXR spec (2003):
> OpenEXR (www.openexr.net), its previously proprietary extended dynamic range image file format, to the open source community
https://web.archive.org/web/20170721234341/http://www.openex...
And "larger dynamic range" by Rea & Jeffrey (1990):
> With γ = 1 there is equal brightness resolution over the entire unsaturated image at the expense of a larger dynamic range within a given image. Finally, the automatic gain control, AGC, was disabled so that the input/output relation would be constant over the full range of scene luminances.
https://doi.org/10.1080/00994480.1990.10747942
I'm not sure when everyone settled on "high" rather than "large" or "extended", but certainly 'adjective dynamic range' is near-universal.