TVs don't support DisplayPort, which makes Linux PCs like the Steam Machine inferior console replacements if you want high refresh rates. Many TVs now support 4K/120 Hz with VRR, and the PS5 and Xbox Series X support those modes as well.
(Some games support 120 Hz directly, but it's also used to present a 40 Hz image in a 120 Hz container, improving input latency for games that can't hit 60 at high graphics quality.)
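A quick sketch of the arithmetic behind that 40 Hz mode (just the standard frame-time math, not figures from the comment):

```python
# Why "40 Hz in a 120 Hz container" helps: 40 divides evenly into 120,
# so each frame is held for exactly 3 refresh cycles (no judder), and
# frame time drops versus the usual 30-fps-in-60-Hz fallback.

def frame_time_ms(fps: float) -> float:
    """Time each frame is on screen, in milliseconds."""
    return 1000 / fps

print(f"30 fps frame time: {frame_time_ms(30):.1f} ms")   # 33.3 ms
print(f"40 fps frame time: {frame_time_ms(40):.1f} ms")   # 25.0 ms
print(f"refresh cycles per frame at 120 Hz: {120 // 40}")  # 3
```

The even division is the key point: 40 fps on a 60 Hz panel would need uneven 1- and 2-cycle holds, while a 120 Hz container shows every frame for the same duration.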
Correction: you can get 4K@120 Hz with HDMI 2.0, but you won't get full 4:4:4 chroma; 4:2:0 subsampling will be forced instead.
In my case I have an HTPC running Linux with a Radeon 6600, connected via HDMI to a 4K@120 Hz-capable TV, and honestly, at that sitting distance/TV size and with 2x DPI scaling, you can't tell any chroma subsampling is happening. It is of course a huge problem in a desktop setting, and even worse if you try 1x DPI scaling.
What you will lose, however, is the newer forms of VRR, and the link may be unstable with frequent dropouts.
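The bandwidth arithmetic behind that correction can be sketched roughly. The figures below are simplifying assumptions for illustration: 8 bits per component, blanking intervals approximated as a flat 20% overhead, and HDMI 2.0's effective data rate taken as ~14.4 Gbit/s (18 Gbit/s raw TMDS minus 8b/10b encoding overhead).

```python
# Rough sanity check of why HDMI 2.0 forces 4:2:0 at 4K@120 Hz.
# Assumptions: 8-bit color, flat 20% blanking overhead, 14.4 Gbit/s
# effective HDMI 2.0 payload rate. Real CTA timings differ slightly.

def video_bitrate_gbps(width, height, fps, bits_per_pixel,
                       blanking_overhead=0.20):
    """Approximate link bitrate in Gbit/s for a given video mode."""
    return width * height * fps * bits_per_pixel * (1 + blanking_overhead) / 1e9

HDMI_2_0_EFFECTIVE_GBPS = 14.4

# 4:4:4 keeps full chroma: 24 bits/pixel at 8-bit depth.
full_chroma = video_bitrate_gbps(3840, 2160, 120, 24)
# 4:2:0 averages 12 bits/pixel (chroma shared across 4 pixels).
subsampled = video_bitrate_gbps(3840, 2160, 120, 12)

print(f"4K@120 4:4:4: {full_chroma:.1f} Gbit/s")  # 28.7, well over 14.4
print(f"4K@120 4:2:0: {subsampled:.1f} Gbit/s")   # 14.3, just fits
```

Under these assumptions, full-chroma 4K@120 needs roughly double what HDMI 2.0 can carry, while 4:2:0 halves the chroma data and squeezes just under the limit, which is why the link forces it.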
Why don't TVs support DisplayPort? If HDMI 2.1 support is limited, a TV with DisplayPort sounds like an obvious choice.
I thought audio might be the reason, but as far as I can tell, DisplayPort supports that too.