
Capricorn2481 | last Tuesday at 11:46 PM

> Lots of programmers use DSA knowledge on a daily basis - and if you aren't you're probably writing bad code. I see a lot of O(n^2) code or worse making apps slow for no reason

I don't think one can seriously argue that. This is as much a meme as anything. I know it's popular to rag on devs writing inefficient software, but there are plenty of apps with functions where a user couldn't possibly notice the difference between O(n^2) and O(1). You wouldn't take the time to make everything O(1) for no speedup just because someone told you that's what good code is; that's just wasting dev time.

In fact, one of the first things you learn is that O(1) can be slower. Constant time is not good if the constant is big and n is small.
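To make that concrete, here's a minimal Python timing sketch (the sizes and values are made up for illustration, and actual timings depend on your machine and runtime) comparing an O(n) list membership test against an O(1) set lookup at small and large n:

    import timeit

    small_list = list(range(5))
    small_set = set(small_list)
    big_list = list(range(100_000))
    big_set = set(big_list)

    cases = [
        ("list, n=5", "4 in small_list"),
        ("set,  n=5", "4 in small_set"),
        ("list, n=100_000", "99_999 in big_list"),
        ("set,  n=100_000", "99_999 in big_set"),
    ]

    for label, stmt in cases:
        # Each statement runs 100k times: at n=5 the two are in the same
        # ballpark, at n=100_000 the O(n) scan falls far behind.
        t = timeit.timeit(stmt, globals=globals(), number=100_000)
        print(f"{label:>16}: {t:.4f}s")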


Replies

sfn42 | last Wednesday at 12:06 AM

Obviously I'm not talking about the cases where it doesn't matter. I'm talking about the cases where it does.

I fixed one where a report took 25 minutes to generate, and after swapping an O(n^2) list lookup for a dict it took less than 5. Still embarrassingly slow, but a lot better.
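The shape of that fix, roughly (hypothetical field names, not the actual report code): build a dict keyed by id once, then do constant-time lookups instead of scanning a list inside the loop.

    # Hypothetical sketch of the pattern, not the actual report code.
    # Before: for every order, scan the whole customer list -> O(n * m).
    def build_report_slow(orders, customers):
        rows = []
        for order in orders:
            for customer in customers:  # linear scan per order
                if customer["id"] == order["customer_id"]:
                    rows.append((order["id"], customer["name"]))
                    break
        return rows

    # After: index customers by id once, then O(1) lookups -> O(n + m).
    def build_report_fast(orders, customers):
        by_id = {c["id"]: c for c in customers}
        return [
            (order["id"], by_id[order["customer_id"]]["name"])
            for order in orders
            if order["customer_id"] in by_id
        ]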

There are also a lot of cases where it didn't matter when the dev wrote it and they had 400 rows in the db, but 5 years later there are a lot more rows, so now it matters.
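Back-of-the-envelope (the 400 is from above, the larger row counts are made up): an O(n^2) pass over the table grows like this.

    # Rough arithmetic only; the larger row counts are assumptions for illustration.
    for n in (400, 10_000, 400_000):
        print(f"n={n:>7}: ~{n * n:>16,} comparisons")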

Doesn't cost anything to just use a better algorithm. It usually takes exactly the same amount of time to write, and even if it's marginally slower at small n values, who cares? I don't give a shit about saving nanoseconds. I care about the runtime blowup that happens when you don't consider what happens when the input grows.

For small inputs it doesn't matter what you do. Everything is fast when the input is small. That's why it makes sense to prefer low complexity by default.