Hacker News

shiandow · yesterday at 7:57 PM · 0 replies · view on HN

Both are true in practice, so not unreasonable. That's for graph weights, not sorting.

That said, the finite-alphabet and bounded-length requirements can be softened a bit, even for general sorting algorithms.

I mean, for the kind of lexicographically sortable data we're talking about, you can basically pick a convenient alphabet size at no cost.

And unbounded length is not that big an obstruction. Sure, you are going to need O(n log(n)) comparisons, but you can't compare data of unbounded length in constant time anyway. In the end you take an amount of time at least proportional to the total amount of data, which is optimal up to a constant factor. And if you fiddle with radix sort enough, you can get it within something similar.
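To make the "proportional to the amount of data" point concrete, here's a rough MSD radix sort sketch for ASCII strings (the function name and structure are my own illustration, not something from the thread): each recursion level looks at one character position, so the total work tracks the total number of characters rather than n log(n) full-string comparisons.

```python
def msd_radix_sort(strings, depth=0):
    """Sort ASCII strings by bucketing on the character at `depth`.

    Illustrative sketch: work is roughly proportional to the total
    number of characters across all strings, since each level of
    recursion inspects one character position per string.
    """
    if len(strings) <= 1:
        return strings
    # Strings that end before `depth` sort before all longer ones.
    done = [s for s in strings if len(s) <= depth]
    buckets = {}
    for s in strings:
        if len(s) > depth:
            buckets.setdefault(s[depth], []).append(s)
    out = done
    # Bucket keys come from a bounded alphabet (at most 128 for ASCII),
    # so sorting them is O(1) per level.
    for ch in sorted(buckets):
        out.extend(msd_radix_sort(buckets[ch], depth + 1))
    return out
```

Note how the bounded-alphabet assumption only shows up in the bucket loop; that's the part you can tune by picking a convenient alphabet size.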

Basic ASCII strings and tuples aren't that big an obstruction. Fractions are more complicated.
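One tiny example of why fractions are awkward, assuming the obvious (numerator, denominator) encoding: their numeric order disagrees with the lexicographic order of that encoding, so a radix-style digit-by-digit pass over the pair sorts them wrong.

```python
from fractions import Fraction

# Lexicographically, (1, 2) comes before (1, 3)...
assert (1, 2) < (1, 3)
# ...but numerically the order is reversed: 1/2 > 1/3.
assert Fraction(1, 2) > Fraction(1, 3)
```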

Really, the O(n log(n)) for comparison-based sorts and the O(N) for radix sort mean different things. One counts comparisons as a function of the number of elements; the other is closer to the number of operations per amount of data. Though that assumes O(1) swaps, which is technically incorrect for data that doesn't fit in a 64-bit word.