One thing I wonder about is whether PC gaming is splitting into two distinct tiers: a high end for people with thousands to spend on their rig and for studios that are graphics pathfinders (id, Remedy, 4A, etc.), and a wider market of cheaper/older systems and studios going for broad appeal. I know the market isn't going to be neatly divided; it's more of a blurry, ugly continuum.
The past few years (2018, with the introduction of RT and upscaling/reconstruction, seems as good a milestone as any) feel like a transition period we're not out of yet, similar to the tail end of the DX9/PlayStation 3/Xbox 360 era, when some studios were shipping 64-bit and DX11 as optional modes, almost as if PC was their prototyping platform for when they completed the jump with PS4/Xbox One and more mature PC implementations. It wouldn't surprise me if it takes a few more years, and titles built to target the next generation of consoles, before it's all settled.
Once the "path tracing" that the current top end Nvidia cards can pull off reaches mainstream it will settle down. The PS6 isn't going to be doing path tracing because the hardware for that is being decided now. I'd guess PS7 time frame. It will take console level hardware pricing to bring the gaming GPU prices down.
I understand the reasons for moving to real-time ray-tracing. It is much easier for development, and apparently the data for baked/pre-rendered lighting in these big open worlds was getting out of hand, especially with multiple time-of-day passes.
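To put rough numbers on "getting out of hand", here's a back-of-envelope sketch in Python; every figure in it (world size, texel density, bytes per texel, pass count) is an illustrative assumption of mine, not a number from any shipped game:

```python
def baked_lightmap_bytes(world_area_m2, texels_per_meter, bytes_per_texel,
                         time_of_day_passes):
    """Rough total storage for baked lightmaps covering a whole world."""
    texels = world_area_m2 * texels_per_meter ** 2
    return texels * bytes_per_texel * time_of_day_passes

size = baked_lightmap_bytes(
    world_area_m2=8_000 * 8_000,   # hypothetical 8 km x 8 km open world
    texels_per_meter=4,            # assumed lightmap texel density
    bytes_per_texel=8,             # assumed HDR color + directionality
    time_of_day_passes=6,          # assumed dawn/noon/dusk/night/etc. bakes
)
print(f"~{size / 2**30:.0f} GiB of baked lighting data")  # ~46 GiB
```

Even with modest assumptions you land in the tens of gigabytes just for lighting, and every time-of-day pass multiplies it, so you can see why studios want to compute it at runtime instead.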
But it is only the "path tracing" that top-end Nvidia GPUs can do that actually matches the detail of baked lighting.
The standard ray-tracing in the latest Doom, for instance, supports only a very limited number of entities that actually emit light into a scene. I guess there is the main global-illumination source, but many of the smaller lighting details in the scene don't emit light at all. That is a step backward compared to baked lighting.
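To be clearer about what I mean by "entities that actually emit light": below is a minimal sketch of the kind of per-frame light budgeting a renderer might do, where only the strongest few emitters become real ray-traced lights and the rest are just glowing textures. All the names, numbers, and the scoring heuristic are my own invention, not anything from id's engine:

```python
from dataclasses import dataclass

@dataclass
class Emitter:
    name: str
    intensity: float
    distance_to_camera: float

def select_rt_lights(emitters, budget):
    """Keep only the `budget` strongest emitters (ranked by a rough
    inverse-square contribution heuristic) as real light sources."""
    scored = sorted(
        emitters,
        key=lambda e: e.intensity / max(e.distance_to_camera, 1.0) ** 2,
        reverse=True,
    )
    return scored[:budget], scored[budget:]

scene = [
    Emitter("sun_gi", 1e6, 1.0),           # main global-illumination source
    Emitter("plasma_bolt", 40.0, 2.0),     # player projectile
    Emitter("wall_panel_glow", 5.0, 3.0),  # decorative emissive surface
    Emitter("distant_sign", 200.0, 80.0),
]

lit, glow_only = select_rt_lights(scene, budget=1)
print("RT lights:", [e.name for e in lit])        # ['sun_gi']
print("Glow only:", [e.name for e in glow_only])  # plasma bolt among them
```

With a tight enough budget, even the plasma bolt falls out of the light list and just glows without illuminating anything around it, which is exactly the regression I'm describing.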
Even shots from the plasma weapon don't cast any light into the scene with the standard ray-tracing, something Quake 3 was already doing.