Obviously I'm not talking about the cases where it doesn't matter. I'm talking about the cases where it does.
I fixed one where a report took 25 minutes to generate; after swapping an O(n^2) list lookup for a dict it took less than 5. Still embarrassingly slow, but a lot better.
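Roughly the shape of that fix (the actual report code was different, and the names here are made up for illustration): scanning a list once per row is O(n^2) overall, building a dict keyed by id up front makes each lookup O(1).

```python
# Before: scan the whole customer list for every report row -> O(n^2) overall
def attach_customers_slow(rows, customers):
    out = []
    for row in rows:
        customer = next(c for c in customers if c["id"] == row["customer_id"])
        out.append({**row, "customer_name": customer["name"]})
    return out

# After: build a dict keyed by id once, then O(1) lookups per row -> O(n) overall
def attach_customers_fast(rows, customers):
    by_id = {c["id"]: c for c in customers}
    return [
        {**row, "customer_name": by_id[row["customer_id"]]["name"]}
        for row in rows
    ]
```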
There are also a lot of cases where it didn't matter when the dev wrote it and they had 400 rows in the db, but 5 years later there's a lot more data, so now it matters.
Doesn't cost anything to just use a better algorithm. Usually takes exactly the same amount of time to write, and even if it's marginally slower at small n values, who cares? I don't give a shit about saving nanoseconds. I care about the quadratic-or-worse time waste that happens when you don't consider what happens when the input grows.
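A sketch of what "same amount of time to write" means in practice (hypothetical names, just to show the point): the fast version is basically the same code, you just pick a container with the right lookup cost.

```python
# Slow: list membership is O(len(blocklist)) for every check
def flag_blocked_slow(items, blocklist):
    return [x for x in items if x in blocklist]

# Fast: build a set once, membership checks are O(1) on average
def flag_blocked_fast(items, blocklist):
    blocked = set(blocklist)
    return [x for x in items if x in blocked]
```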
For small inputs it doesn't matter what you do; everything is fast when n is small. That's exactly why it makes sense to prefer low complexity by default: it costs you nothing now and it saves you when the input grows.