The article has data points for n=10, 100, 1000, and 10000. Taking (t(10000) - t(10)) / (t(1000) - t(10)) would eliminate the constant factor; for a linear algorithm we'd expect a ratio of about (10000 - 10) / (1000 - 10) ≈ 10.09.
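To make the cancellation concrete, here's a minimal sketch assuming a linear cost model t(n) = c + a*n; the constants c and a are arbitrary placeholders, not values from the article's measurements:

    # Hypothetical linear timing model: constant startup overhead c
    # plus per-entry cost a. Both values are made-up placeholders.
    def t(n, c=5.0, a=0.02):
        return c + a * n

    # Differencing cancels c, and the ratio then cancels a as well:
    ratio = (t(10_000) - t(10)) / (t(1_000) - t(10))
    print(ratio)  # 9990 / 990 = 10.0909..., regardless of c and a

So a measured ratio below ~10.09 suggests slightly sublinear scaling, and one above it suggests superlinear overhead.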
But for lsr it's 9.34, while the other tools have factors close to 10.09 or higher. Since ls has to sort its output (unless -f is specified), I wouldn't be too surprised by a little superlinearity there, sorting being O(n log n) rather than O(n).
https://docs.google.com/spreadsheets/d/1EAYua3B3UeTGBtAejPw2...