I'm surprised AV1 usage is only at 30%. Is AV1 so demanding that Netflix clients without AV1 hardware acceleration capabilities would be overwhelmed by it?
There are a lot of 10-year-old TVs/Fire Sticks still in use whose CPU maxes out just running the UI; they rely exclusively on hardware decoding for every codec (they couldn't decode even H.264 in software). Imagine a super-budget phone from ~2012 and you'll have some idea of the hardware capability we're dealing with.
Compression gains will mostly benefit the streaming platform's bills/infra unless you're trying to stream 4K 60fps over hotel wifi (or you can't hardware-decode last-gen codecs either). Apparently streaming platforms still value user experience enough not to heat their users' rooms for no observable improvement. Besides, a TV CPU can barely decode a PNG still image in software; software video decoding of any kind is simply impossible.
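Rough numbers to make that concrete (assuming the commonly cited ~20-30% bitrate savings of AV1 over HEVC/VP9 at comparable quality): a 4K stream at ~15 Mbps drops to maybe 10-12 Mbps. Multiply that across a CDN serving millions of concurrent streams and it's serious money; on a 100 Mbps home connection it changes nothing you can perceive.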
If you are on a mobile device, decoding without hardware assistance might not overwhelm the processor outright, but it might drain your battery unnecessarily fast?
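That's the usual trade-off: a fixed-function decoder block is dedicated silicon that sips power, while grinding through the same math on the general-purpose cores burns substantially more, so sustained software decode shows up on the battery meter first.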
TV manufacturers don't want high-end chips in their sets... fixed-function hardware decoding is precisely what lets them get away with cheaper chips.
Thanks to dav1d's [1] lovingly hand-crafted SIMD assembly it's actually possible to play back AV1 reasonably without hardware acceleration, but basically yes: hardware AV1 decode support starts roughly with Snapdragon 8 onwards, Google Tensor G3 onwards, and the NVIDIA RTX 3000 series onwards. All relatively new.
[1] https://code.videolan.org/videolan/dav1d
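For the curious, here is roughly what driving dav1d looks like: a minimal sketch assuming the dav1d 1.x C API, with simplified input handling (real code demuxes a container such as IVF and retries dav1d_send_data on EAGAIN); decode_temporal_unit is just an illustrative name.

```c
#include <dav1d/dav1d.h>
#include <errno.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Decode one temporal unit of raw AV1 OBUs and print any finished frames. */
static int decode_temporal_unit(Dav1dContext *ctx,
                                const uint8_t *obu, size_t obu_size) {
    Dav1dData data = { 0 };
    uint8_t *buf = dav1d_data_create(&data, obu_size); /* dav1d owns this buffer */
    if (!buf) return -1;
    memcpy(buf, obu, obu_size);

    int res = dav1d_send_data(ctx, &data);
    if (res < 0 && res != DAV1D_ERR(EAGAIN)) return res;

    Dav1dPicture pic;
    memset(&pic, 0, sizeof(pic)); /* dav1d requires a zeroed picture struct */
    while ((res = dav1d_get_picture(ctx, &pic)) == 0) {
        /* pic.data[0..2] now hold the planar YUV frame */
        printf("decoded %dx%d frame\n", pic.p.w, pic.p.h);
        dav1d_picture_unref(&pic); /* hand the buffer back to the decoder */
    }
    /* EAGAIN just means "feed me more input", not an error */
    return res == DAV1D_ERR(EAGAIN) ? 0 : res;
}

int main(void) {
    Dav1dSettings s;
    dav1d_default_settings(&s);
    s.n_threads = 0; /* 0 = pick automatically; the SIMD paths do the
                        heavy lifting either way */

    Dav1dContext *ctx;
    if (dav1d_open(&ctx, &s) < 0) return 1;
    /* ... call decode_temporal_unit() for each frame's OBUs ... */
    dav1d_close(&ctx); /* frees the context */
    return 0;
}
```

Build with `cc demo.c $(pkg-config --cflags --libs dav1d)`. The whole point of those hand-written SIMD paths is that this loop can hit realtime on CPUs that would choke on a generic C implementation.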