It's true though. Time complexity is a relative measure. Big-O notation is an upper bound, not a lower one. You don't take the fastest storage (L1 cache) as the baseline `x` and then say it takes some f(n) > x time to access storage because it's not in the fastest cache. This is a complete misrepresentation of Big-O notation.
When we say something is O(1), we're saying there is a constant upper bound on the time to compute the algorithm: it takes no longer to perform as `n` becomes large. O(1) simply means the worst-case time is independent of `n`. If the algorithm accesses storage tiers with varying latencies, then the bound is set by the slowest medium.
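A minimal sketch of that point (my own illustration, assuming CPython's `dict` and the standard `timeit` module, not anything from the original claim): the per-lookup cost rises as the table spills out of cache, but it's capped by main-memory latency rather than continuing to grow with `n`.

```python
# Hypothetical micro-benchmark: per-lookup cost of a dict at two sizes.
# The larger table pays more cache misses, but the cost is capped by DRAM
# latency -- it does not keep growing with n. Numbers are machine-dependent.
import random
import timeit

for n in (1_000, 2_000_000):
    table = {i: i for i in range(n)}
    probes = [random.randrange(n) for _ in range(100_000)]
    t = timeit.timeit(lambda: [table[k] for k in probes], number=10)
    print(f"n={n:>9,}: {t / (len(probes) * 10) * 1e9:.1f} ns per lookup")
```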
For small `n` (i.e., what might fit into cache), Big-O notation is not all that useful. When `n` is tiny, an O(n) algorithm can outperform an O(1) one. The notation tells us nothing about how efficient a particular implementation is. It is mainly useful when `n` can grow large, where whatever micro-optimizations we, or the machine, might perform are dominated by `n`.
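As a rough illustration of that small-`n` crossover (again my own sketch, not something the above benchmarks): a linear scan over a handful of elements skips hashing and reads one contiguous cache line, so it can be competitive with, or beat, the "asymptotically better" hash lookup. Whether it actually wins depends on the runtime and the machine.

```python
# Hypothetical comparison: an O(n) list scan vs an O(1) dict lookup at tiny n,
# both done via C-level membership tests so interpreter overhead doesn't drown
# the comparison. The scan avoids hashing and stays in one cache line; for
# small enough n it can be competitive or faster. Results vary by machine.
import timeit

n = 4
small_list = list(range(n))               # O(n) membership: linear scan
small_dict = {k: None for k in range(n)}  # O(1) membership: hash lookup
target = n - 1                            # worst case for the scan

scan = timeit.timeit(lambda: target in small_list, number=2_000_000)
hashed = timeit.timeit(lambda: target in small_dict, number=2_000_000)
print(f"list scan:   {scan:.3f}s")
print(f"dict lookup: {hashed:.3f}s")
```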