That's true on the web, as well; HDR images on web pages have this problem.
It's not obvious whether there's any automated way to reliably detect the difference between "use of HDR" and "abuse of HDR". But you could probably catch the most egregious cases, like "every single pixel in the video has brightness above 80%".
Funnily enough, HDR hardware already has to detect this situation, because most HDR monitors literally do not have the power circuitry or cooling to deliver a full white screen at maximum brightness.
My idea: for each frame, convert the image to grayscale, then count what percentage of pixels are above the standard white level. If more than 20% of the image is brighter than SDR white, tone-map the whole video down to the SDR white point.
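To make that concrete, here's a rough sketch of what I mean in Python/NumPy. The frame format (linear Rec. 2020 RGB in nits), the 203-nit SDR reference white (from BT.2408), and the crude scale-everything-down "tone map" are all just assumptions for illustration, not what any real player actually does:

```python
import numpy as np

SDR_WHITE_NITS = 203.0   # assumed SDR reference white (BT.2408)
ABUSE_FRACTION = 0.20    # fraction of pixels allowed above SDR white

# Rec. 2020 luma coefficients for the "grayscale" step
REC2020_LUMA = np.array([0.2627, 0.6780, 0.0593])

def frame_luminance(rgb_linear: np.ndarray) -> np.ndarray:
    """Grayscale step: HxWx3 linear RGB (in nits) -> HxW luminance."""
    return rgb_linear @ REC2020_LUMA

def frame_abuses_hdr(rgb_linear: np.ndarray) -> bool:
    """True if more than 20% of pixels sit above SDR reference white."""
    luma = frame_luminance(rgb_linear)
    return np.mean(luma > SDR_WHITE_NITS) > ABUSE_FRACTION

def tone_map_to_sdr(rgb_linear: np.ndarray) -> np.ndarray:
    """Crude 'tone map': scale the frame so its peak lands at SDR white."""
    peak = frame_luminance(rgb_linear).max()
    if peak <= SDR_WHITE_NITS:
        return rgb_linear
    return rgb_linear * (SDR_WHITE_NITS / peak)

def process_video(frames):
    """If any frame trips the detector, tone-map the whole video down."""
    frames = list(frames)
    if any(frame_abuses_hdr(f) for f in frames):
        frames = [tone_map_to_sdr(f) for f in frames]
    return frames
```

A real implementation would want a smarter tone-mapping curve than a flat scale, but the detection part is basically just a per-frame histogram check.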
> It's not obvious whether there's any automated way to reliably detect the difference between "use of HDR" and "abuse of HDR".
That sounds like a job our new AI overlords could probably handle. (But that might be overkill.)