> The times seem sublinear, 10k files is less than 10x 1k files
Two points are not enough to call it sublinear. It might very well be a constant overhead that matters less and less as the linear term grows.
Or in other words: with per-file cost n and constant overhead C, 10000n + C < 10000(n + C), so the measured ratio comes out below 10000x even for a perfectly linear tool.
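To make that concrete, here's a toy calculation (the per-file cost and startup overhead are made-up numbers, not lsr's real timings) showing how a fixed startup cost drags the measured ratio below 10x even for a perfectly linear tool:

```python
# Illustrative only: both constants are invented for the example.
per_file_us = 5.0    # hypothetical cost per file, microseconds
startup_us = 2000.0  # hypothetical fixed overhead (process start, etc.)

def total_us(n_files: int) -> float:
    """Total time of a perfectly linear tool with constant startup cost."""
    return startup_us + per_file_us * n_files

ratio = total_us(10_000) / total_us(1_000)
print(f"10k/1k ratio: {ratio:.2f}")  # ~7.43, well under 10, yet the tool is linear
```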
The article has data points for n = 10, 100, 1,000, and 10,000. Taking (T(10,000) - T(10)) / (T(1,000) - T(10)) eliminates the constant factor, and for a linear algorithm we'd expect a factor of 9,990/990 ≈ 10.09.
But for lsr, it's 9.34. The other tools have factors close to 10.09 or higher. Since ls has to sort its output (unless -U is specified), I'd not be too surprised by a little superlinearity from the O(n log n) sort.
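For reference, the factor computation is just this; the timings below are placeholders, the real numbers are in the spreadsheet linked below:

```python
# Hypothetical timings in ms for n = 10, 1_000, 10_000 files -- placeholders,
# not the article's actual measurements.
t = {10: 1.2, 1_000: 4.0, 10_000: 27.4}

# Subtracting T(10) cancels the constant term: for T(n) = a*n + C,
# (T(10_000) - T(10)) / (T(1_000) - T(10)) = 9_990 / 990 ≈ 10.09 if linear.
factor = (t[10_000] - t[10]) / (t[1_000] - t[10])
print(f"measured factor: {factor:.2f} (linear baseline ≈ 10.09)")
```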
https://docs.google.com/spreadsheets/d/1EAYua3B3UeTGBtAejPw2...