Hacker News

dheera yesterday at 6:57 PM

I hate big O notation. It should be O(N) = N^(1/3)

That way O is the function and it's a function of N.

The way it's currently notated, O is effectively the inverse function.


Replies

pfortuny yesterday at 7:02 PM

Yes, the notation seems wrong, but that's because O(...) is a set, not a function. The function is what goes inside.

So it should be f ∈ O(...) instead of f = ...

(don't know how to write the "belongs" symbol on iOS).

tsimionescu yesterday at 9:40 PM

O(N) is not a function, though, so the notation is doing a good job of saying this. When we say "the complexity class of an algorithm is O(log N)", we mean "the function WorstCaseComplexity(N) for that algorithm is in the class O(log N)". The function WorstCaseComplexity(N) measures how much time the algorithm will take for an input of size N in the worst-case scenario. We can also say "the average-case complexity of quicksort is O(N log N)", which means "the function AverageCaseComplexity(N) for quicksort is in the class O(N log N)", where AverageCaseComplexity(N) is a function that measures how much time quicksort needs to finish for an "average" input of size N.
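
For concreteness, a minimal sketch (a hypothetical linear-search example; the helper names are made up) of the kind of function WorstCaseComplexity(N) refers to:

    # Hypothetical example: count the comparisons linear search performs.
    def linear_search(items, target):
        comparisons = 0
        for x in items:
            comparisons += 1
            if x == target:
                break
        return comparisons

    # Worst case: the target is absent, so all n elements get compared.
    def worst_case_complexity(n):
        return linear_search(list(range(n)), -1)

    print(worst_case_complexity(1000))  # 1000 comparisons, i.e. this function is in O(N)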

Saying that a function f(n) is in the class O(g(n)) means that there exist an M and a C such that f(n) < C * g(n) for any n > M. That is, it means that, past some point, f(n) is always lower than C * g(n), for some constant C. For example, we can say the function f(n) = 2n + 1 is in the class O(n), because 2n + 1 < 7n for any n > 1. Technically, we can also say that our f(n) is in the complexity class O(2^n), because 2n + 1 < 2^n for any n >= 3, but people don't normally do this. Technically, what we typically care about is not just any upper bound O(g(n)); it's more of a "least upper bound", which is closer to Θ(n).
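
Just as a sketch, that bound can be sanity-checked numerically (the constants C = 7 and M = 1 are the ones from the example above):

    # Check that f(n) = 2n + 1 stays below C * g(n) = 7n for every n past the cutoff M = 1.
    f = lambda n: 2 * n + 1
    g = lambda n: n
    C, M = 7, 1

    assert all(f(n) < C * g(n) for n in range(M + 1, 10_000))
    print("2n + 1 is in O(n): the bound holds for every tested n >", M)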

purplesyringa yesterday at 8:39 PM

No it shouldn't. The function you're talking about is typically called T(N), for "time". The problem is that you can't write T(N) = N^(1/3), because it isn't exactly N^(1/3) -- for one thing, it's only known up to a constant factor, and for another, it's only an upper bound. Big-O solves both of these issues: T(N) = O(N^(1/3)) means that the function T(N) grows at most as fast as N^(1/3) (i.e., it expresses a relationship between the two functions T(N) and N^(1/3)). The "T(N) =" is often left silent, since it's clear when we're talking about time, so at the end you just get O(N^(1/3)).
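
A tiny sketch of that, with a made-up T(N) that has a constant factor and some fixed overhead (the numbers are purely illustrative):

    # Made-up running time: a constant factor times N^(1/3), plus some fixed overhead.
    def T(n):
        return 5 * n ** (1 / 3) + 40

    # T(N) = O(N^(1/3)): some constant C and cutoff M make T(n) <= C * n^(1/3) beyond M.
    C, M = 6, 100_000
    assert all(T(n) <= C * n ** (1 / 3) for n in range(M, M + 50_000))
    print("T(N) grows no faster than N^(1/3), up to the constant factor C =", C)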