> And the "second" in this definition means what people usually understand as a second, as in the duration is always the same.
Umm, what? In Unix time some values span two SI seconds, which is the crux of the problem. In UTC every second is a proper SI second, including an inserted leap second like 23:59:60. Unix time, by contrast, pretends leap seconds don't exist, so around a leap second the counter takes two SI seconds to increment by one.
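To make that concrete, here's a minimal sketch of the POSIX-style conversion (hypothetical helper `posix_timestamp`, written out by hand rather than taken from any library). Because the formula treats every day as exactly 86400 seconds, the leap second 2016-12-31T23:59:60 UTC and the ordinary second 2017-01-01T00:00:00 UTC land on the same Unix value:

```python
import datetime

# POSIX Unix time pretends every day is exactly 86400 seconds long,
# so a positive leap second and the second right after it collapse
# onto the same timestamp.
def posix_timestamp(y, mo, d, h, mi, s):
    # whole days since the epoch, then the fixed 86400-seconds-per-day formula
    days = (datetime.date(y, mo, d) - datetime.date(1970, 1, 1)).days
    return days * 86400 + h * 3600 + mi * 60 + s

# Leap second 2016-12-31T23:59:60 UTC vs the ordinary next second:
leap = posix_timestamp(2016, 12, 31, 23, 59, 60)
after = posix_timestamp(2017, 1, 1, 0, 0, 0)
print(leap, after)  # both print 1483228800
```

So the single value 1483228800 covers two SI seconds of real elapsed time, which is exactly the "some values span two seconds" point.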
Really? Now I'm confused