
niobe · today at 12:59 AM

So that table is using distance as a proxy for signal-to-noise ratio (SNR), and SNR is what really matters.

Each data rate in the standard uses a different encoding technique. "Faster" encodings cram more data into a given transmission interval but require a higher signal-to-noise ratio to be received without error. Since SNR declines with distance, you can get a rough idea of what data rate you'll be able to receive at a given distance from the transmitter.
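
To make that distance-to-rate intuition concrete, here's a minimal sketch using a log-distance path-loss model and a table of rough per-encoding SNR thresholds. Every constant in it (transmit power, noise floor, path-loss exponent, thresholds) is an assumed round number for illustration, not a figure from the 802.11 standard:

    import math

    # Illustrative sketch only: every constant below is an assumed round number,
    # not a value taken from the 802.11 standard.
    TX_POWER_DBM = 20          # assumed transmit power
    NOISE_FLOOR_DBM = -90      # assumed noise floor
    PATH_LOSS_EXPONENT = 3.0   # assumed indoor propagation
    REF_LOSS_DB = 40           # assumed path loss at 1 m

    # (min SNR in dB, encoding) -- rough illustrative thresholds
    MCS_TABLE = [
        (25, "256-QAM 5/6 (fastest)"),
        (18, "64-QAM 3/4"),
        (10, "16-QAM 1/2"),
        (4,  "QPSK 1/2"),
        (0,  "BPSK 1/2 (most robust)"),
    ]

    def snr_at(distance_m):
        """Log-distance path loss: SNR falls off as distance grows."""
        path_loss = REF_LOSS_DB + 10 * PATH_LOSS_EXPONENT * math.log10(max(distance_m, 1.0))
        return TX_POWER_DBM - path_loss - NOISE_FLOOR_DBM

    def best_encoding(snr_db):
        """Pick the fastest encoding whose SNR requirement is met."""
        for min_snr, label in MCS_TABLE:
            if snr_db >= min_snr:
                return label
        return "out of range"

    for d in (2, 10, 30, 60, 120):
        snr = snr_at(d)
        print(f"{d:>3} m: SNR ~{snr:5.1f} dB -> {best_encoding(snr)}")

The exact numbers don't matter; the shape does: each step down in SNR knocks you to a slower, more robust encoding.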

However, people and vendors focus far too much on maximum throughput. I've seen data showing that even in the best conditions, clients spend only about 1% of their time transmitting or receiving at the highest data rates, because they are dynamically adjusting the data rate based on the perceived SNR.
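
A toy sketch of what that dynamic adjustment can look like, assuming a simple SNR-threshold scheme with a little hysteresis (real drivers mostly use probing/loss-based rate control, and the rates and thresholds here are invented):

    # Toy SNR-driven rate selection with hysteresis; rates/thresholds are assumed.
    STEPS = [(0, 60), (4, 120), (10, 240), (18, 480), (25, 960)]  # (min SNR dB, Mb/s)

    class RateController:
        def __init__(self):
            self.idx = 0  # start at the most robust rate

        def update(self, snr_db):
            # Step up one rate only with a 2 dB margin over the next threshold;
            # step down as soon as the current threshold is no longer met.
            if self.idx + 1 < len(STEPS) and snr_db >= STEPS[self.idx + 1][0] + 2:
                self.idx += 1
            elif self.idx > 0 and snr_db < STEPS[self.idx][0]:
                self.idx -= 1
            return STEPS[self.idx][1]

    rc = RateController()
    for snr in (28, 27, 20, 14, 9, 16, 26):
        print(f"SNR {snr:>2} dB -> {rc.update(snr)} Mb/s")

The point is just that the chosen rate tracks a moving SNR, so the headline rate only shows up in the moments when conditions are near-ideal.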

Individual clients' peak throughput also works against _aggregate_ throughput when talking about wireless networks with multiple users. If you have 100 clients, do you want one to be able to dominate the others, or everyone to get a more or less equal share? These peak speeds assume configurations that I would never deploy in practice, because they favour individual users and cripple aggregate throughput - things like 160 MHz wide channels.
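
As a back-of-the-envelope illustration of how the sharing policy drives the aggregate number (the client mix and PHY rates below are made up, and protocol overhead is ignored):

    # Idealised arithmetic for 100 clients on one AP.
    rates_mbps = [960] * 10 + [240] * 60 + [60] * 30   # assumed per-client PHY rates
    n = len(rates_mbps)

    # Equal airtime share: each client transmits 1/n of the time at its own rate,
    # so per-client throughput is rate/n and the aggregate is the mean PHY rate.
    aggregate_airtime = sum(rates_mbps) / n
    print(f"equal airtime: aggregate ~{aggregate_airtime:.0f} Mb/s "
          f"(best client ~{max(rates_mbps) / n:.1f}, worst ~{min(rates_mbps) / n:.1f} Mb/s)")

    # Equal throughput (everyone pushes the same bytes): slow clients eat most of
    # the airtime and the aggregate collapses toward the harmonic mean of the rates.
    aggregate_equal_tp = n / sum(1 / r for r in rates_mbps)
    print(f"equal throughput: aggregate ~{aggregate_equal_tp:.0f} Mb/s")

Neither split looks anything like the sticker speed once a hundred clients are sharing the medium.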

But the sticker speed is what sells.


Replies

cortesoft · today at 2:30 AM

There are a lot of people who are the only ones using their Wi-Fi, so they probably don't care about the performance for anyone else.
