Overall complexity (work required) is a conserved quantity. You can move it around and claim that a new algorithm has reduced the complexity, but in essence you have just shifted it elsewhere.
Also, whether a problem has polynomial or exponential complexity depends on what you treat as the problem space. The complexity of computing b^c is polynomial if b is the problem space and exponential if c is the problem space.
The complexity of the traveling salesman problem depends on what you consider the problem space - the number of cities or the number of connections between them.
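To make the b^c point concrete, here's a minimal sketch (my own illustration, not anyone's canonical algorithm): naive exponentiation performs c multiplications, which is linear in the *value* of c but exponential in the number of bits needed to write c down - the same algorithm looks polynomial or exponential depending on how you measure the input.

```python
def pow_naive(b, c):
    """Compute b**c by repeated multiplication, counting multiplications."""
    result, mults = 1, 0
    for _ in range(c):
        result *= b
        mults += 1
    return result, mults

value, mults = pow_naive(3, 1024)
# mults == 1024: linear in the value of c,
# but c fits in 11 bits, so the count is exponential
# in the bit-length of c (roughly 2**len(bin(c)[2:]))
```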
> The complexity of the traveling salesman problem depends on what you consider the problem space - the number of cities or the number of connections between them.
lmao what?
I'm not sure if you're just choosing intentionally obtuse verbiage or if you're actually saying something completely incoherent.
"Overall Complexity" is a meaningless term without you defining it. I suppose you mean that there is some lower bound limit to the amount of work relative to a model of computation, but this is a trivial statement, because the limit is always at least no work.
For many problems we don't know the least upper bound, so even an interesting formulation of your idea is not necessarily true: work need not be conserved at the least upper bound, because that bound may not be attainable and an epsilon improvement might always be possible.
Finally, algorithmic complexity is the wrong analogy for reverse mathematics anyway.