It was a very long time ago, but during a programming competition one of the warm-up questions involved a modified sudoku puzzle. The naive algorithmic solution was too slow, the fancy algorithm took quite a bit of effort... and then there were people who realised that the time threshold for max points was higher than what a brute-force check of all possible boards needed. (I wasn't one of them.)
This generalises to a few situations where going faster just doesn't matter. For example, for many CLI tools it matters whether they finish in 1s or 10s. But once you're down to 10ms vs 100ms, you can ask "is anyone ever likely to run this in a loop on a massive amount of data?" And if the answer is yes, "should they write their own optimised version then?"