> Our eyes can see both just fine.
This gets to a gaming rant of mine: our natural vision can handle these things because our eyes scan sections of the scene with constant adjustment (light level, focus) while our brain composites them into what feels like a single moment.
However, certain effects in games (e.g. "HDR" and depth of field) instead reduce the fidelity of the experience. These features limp along only while our gaze is aimed at the exact spot the software expects. If you glance anywhere else in the scene, you instead perceive an unrealistically wrong coloration or blur that frustratingly persists no matter how much you squint. These problems will remain until gaze-tracking support becomes standard.
So ultimately these features reduce the realism of the experience. They make it less like being there and more like watching a second-hand movie recorded on flawed video cameras. The distinction is even clearer when "film grain" is added on top.
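For context on why the effect misbehaves: what games typically ship as "HDR"/eye adaptation is a global auto-exposure pass driven by the average luminance of the whole frame, not by where you are actually looking. Here is a minimal sketch of that kind of pass; the function name, adaptation constant, and Reinhard curve are illustrative, not any particular engine's code.

```python
import numpy as np

def eye_adaptation_tonemap(hdr_frame, prev_exposure, dt, adaptation_rate=1.5):
    """Sketch of a global auto-exposure + Reinhard tonemap pass.

    hdr_frame: float32 array of linear scene radiance, shape (H, W, 3).
    prev_exposure: exposure used on the previous frame.
    Key point: exposure is derived from ONE scene-wide average,
    not from wherever the player's eye happens to be.
    """
    # Log-average luminance of the whole frame (the "key" of the scene).
    lum = (0.2126 * hdr_frame[..., 0]
           + 0.7152 * hdr_frame[..., 1]
           + 0.0722 * hdr_frame[..., 2])
    avg_lum = np.exp(np.mean(np.log(lum + 1e-6)))

    # Smoothly adapt toward the target exposure, mimicking the eye adjusting.
    target_exposure = 0.18 / max(avg_lum, 1e-6)
    exposure = prev_exposure + (target_exposure - prev_exposure) * (1.0 - np.exp(-dt * adaptation_rate))

    # Reinhard tone map to displayable range using that single global exposure.
    exposed = hdr_frame * exposure
    sdr = exposed / (1.0 + exposed)
    return np.clip(sdr, 0.0, 1.0), exposure
```

Because the exposure is a single scene-wide value, a shadowed corner you happen to glance at stays crushed as long as the metering assumes you are looking at the bright part of the frame, which is roughly the complaint above; gaze tracking would let the metering follow your eye instead.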
I'm with you on depth of field, but I don't understand why you think HDR reduces the fidelity of a game.
If you have a good display (e.g. an OLED), then the brights are brighter and there is simultaneously more detail in the blacks. Why do you think that is worse than SDR?
I had a similar complaint with the few 3D things I watched back when that was being hyped (e.g., when Avatar came out in cinemas, and when 3D home TVs briefly seemed to become a thing 15 years ago). It felt like Hollywood was giving me the freedom to immerse myself, but simultaneously trying to constrain that freedom and force me to look at specific things in specific ways. I don't know what the specific solution is, but it struck me that we need to adopt lessons from live stage productions more than from cinema if we really want people to think what they're seeing is real.
HDR, not "HDR", is the biggest leap in gaming visuals made in the last 10 years, I think.
Sure, you need a good HDR-capable display and a native HDR game (or RTX HDR), but the results are pretty awesome.
This is why I always turn off these settings immediately when I turn on any video game for the first time. I could never put my finger on why I didn't like them, but the camera analogy is perfect.
Ok. I like depth of field and prefer it.
https://www.realtimerendering.com/blog/thought-for-the-day/
It's crazy that post is 15 years old. As the OP and this post get at, HDR isn't really a good description of what's happening: "HDR" often means one or more of at least three different things (capture, storage, and presentation). It's just a sticker slapped on for advertising.
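To make those three meanings concrete, here's a toy sketch: the same scene-referred pixels can be captured and stored as linear floats, then presented either tone-mapped down to SDR or PQ-encoded for an HDR10 display. The sample values and the crude SDR rolloff are made up for illustration; the PQ constants are the standard SMPTE ST 2084 ones.

```python
import numpy as np

# PQ (SMPTE ST 2084) constants, used in the HDR10 "presentation" step.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def present_sdr(linear_nits):
    """Presentation for an SDR display: crude tone map, then gamma-encode."""
    sdr = linear_nits / (linear_nits + 100.0)        # simple Reinhard-style rolloff
    return np.clip(sdr, 0.0, 1.0) ** (1.0 / 2.2)

def present_hdr10(linear_nits):
    """Presentation for an HDR10 display: PQ-encode absolute luminance (0..10000 nits)."""
    y = np.clip(linear_nits / 10000.0, 0.0, 1.0)
    return ((C1 + C2 * y**M1) / (1.0 + C3 * y**M1)) ** M2

# "Capture"/"storage": scene-referred linear values, e.g. shadow, midtone, bright highlight (in nits).
scene = np.array([0.05, 100.0, 4000.0])
print("stored (linear float):", scene)
print("SDR presentation:     ", present_sdr(scene))
print("HDR10 presentation:   ", present_hdr10(scene))
```

Marketing tends to call any one of those steps "HDR," which is why the label says so little on its own.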
Things like lens flares, motion blur, film grain, and shallow depth of field mimic cameras, not what being there is like, but from a narrative perspective we experience a lot of these things through TV and film. It's visual shorthand, like Star Wars or Battlestar Galactica copying WWII dogfight footage even though that's less like what it would be like if you were there. High-FPS television can feel cheap while 24fps can feel premium and "filmic."
Often those limitations are in place so the experience is consistent for everyone. Games will have you set brightness and contrast; I had friends who would crank everything up to avoid jump scares and to clearly see objects intended to be hidden in shadows. Another reason for consistent presentation is to prevent unfair advantages in multiplayer.