I get that to properly test a cable, you need that level of accuracy, but for home use, couldn’t you get away with a source and a receiver that are far cheaper?
If a USB4 device can output a USB4 stream and the receiver can check that stream for errors, isn’t that sufficient?
At some point you end up testing the peripheral and/or the host rather than the cable. For example, cables often state that they can handle up to 240W (the USB-PD EPR maximum, 48V at 5A) ... but no 240W USB-PD chip has ever gone into production -- you won't even find one at the hottest USB-PD trade shows[0] in China.
It could be reasonable to let computers trigger a data-throughput test: the peripheral would state "I support up to 40Gbps of receiving/sending", and then send a simple pattern that can be generated on the fly. But a lot of devices can't receive/send that 80Gbps of data (40 each way) for long enough to perform a decent test - the storage, RAM, buffers, etc. get depleted or become bottlenecks.
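For what it's worth, the "simple pattern" part is cheap. Here's a minimal sketch of what a generate-on-the-fly pattern could look like -- PRBS-31 is a common choice in SerDes link testing, though I don't know what USB4 compliance actually mandates, and the seed and loopback here are purely illustrative:

    #include <stdint.h>
    #include <stdio.h>

    /* PRBS-31 (x^31 + x^28 + 1): a pseudo-random pattern of the sort
     * link testers use. Both ends can generate it on the fly from a
     * one-word state, so neither side needs storage or big buffers. */
    static inline uint32_t prbs31_next(uint32_t state)
    {
        uint32_t newbit = ((state >> 30) ^ (state >> 27)) & 1;
        return ((state << 1) | newbit) & 0x7FFFFFFF;
    }

    int main(void)
    {
        uint32_t tx = 1, rx = 1;   /* both ends agree on a seed */
        long errors = 0;

        for (long i = 0; i < 1000000; i++) {
            tx = prbs31_next(tx);
            uint32_t bit = tx & 1; /* in a real test this bit crosses the cable */
            rx = prbs31_next(rx);  /* receiver predicts the same bit... */
            if ((rx & 1) != bit)   /* ...so any mismatch is a link error */
                errors++;
        }
        printf("bit errors: %ld\n", errors);  /* 0 here; nonzero on a bad link */
        return 0;
    }

The generator is a few gates' worth of logic; the hard part is, as above, sustaining the data rate around it.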
If you know enough to accurately interpret the measurements you'd get from that, you know enough to write your own program to try to send 80Gbps from one computer to another, using DMA to process it in real time without hitting storage (which a lot of peripherals likely don't have the CPU to accomplish).
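To make that concrete, here's a hedged sketch of the receiving half of such a program, using ordinary TCP sockets -- the port number, buffer size, and byte pattern are all made up, and real 40/80Gbps rates would need zero-copy or kernel-bypass I/O rather than read() -- but it shows the shape: regenerate the expected bytes locally and compare in RAM, so storage is never touched:

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <sys/socket.h>
    #include <unistd.h>

    #define PORT  9000         /* illustrative, not part of any spec */
    #define BUFSZ (1 << 20)    /* 1 MiB read chunks */

    int main(void)
    {
        int srv = socket(AF_INET, SOCK_STREAM, 0);
        struct sockaddr_in addr = {0};
        addr.sin_family      = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port        = htons(PORT);
        if (srv < 0 || bind(srv, (struct sockaddr *)&addr, sizeof addr) < 0) {
            perror("socket/bind");
            return 1;
        }
        listen(srv, 1);
        int conn = accept(srv, NULL, NULL);

        static uint8_t buf[BUFSZ];
        uint64_t expected = 0, total = 0, errors = 0;
        ssize_t n;
        while ((n = read(conn, buf, sizeof buf)) > 0) {
            /* Regenerate the expected byte locally and compare in
             * memory; nothing is ever written to disk. The sender is
             * the mirror image: fill a buffer from the same counter
             * and write() it in a loop. (Trivial counter pattern here;
             * a real test would use a PRBS like the one above.) */
            for (ssize_t i = 0; i < n; i++, expected++)
                if (buf[i] != (uint8_t)expected)
                    errors++;
            total += n;
        }
        printf("received %llu bytes, %llu mismatched\n",
               (unsigned long long)total, (unsigned long long)errors);
        close(conn);
        close(srv);
        return 0;
    }

Even interpreting that program's output correctly (is the bottleneck the cable, the NIC, the kernel, the CPU?) takes the same expertise the built-in test would assume.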
If you don't know enough to write those test applications, you probably don't know enough to interpret the results of a built-in test function either, and the measurements would confuse and frustrate a lot of well-meaning, nerdy, but under-educated consumers who'd make assumptions about why they're not actually getting the rated speed.
Idk, my opinion doesn't go one way or the other here. Perhaps I myself don't quite know enough to be a good judge of that concept.
0: https://asiachargingexpo.com