But it's correct. It's "a" count. Just not the count that you might always expect. And the "second" in this definition means what people usually understand as a second, as in the duration is always the same. That's all, and it's pretty useful imho.
> And the "second" in this definition means what people usually understand as a second, as in the duration is always the same.
Umm what? In Unix time some values span two SI seconds (the ones around a positive leap second), which is the crux of the problem. In UTC every second is a proper nice SI second. In Unix time the value increments after one or two SI seconds.
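A quick sketch of why that happens (Python; the helper name and the hard-coded day number are mine, the formula is just the simplified POSIX rule that pretends leap seconds don't exist):

    # Simplified POSIX rule: seconds since the epoch, assuming every
    # UTC day has exactly 86400 seconds (i.e. ignoring leap seconds).
    def posix_timestamp(days_since_epoch: int, seconds_into_day: int) -> int:
        return days_since_epoch * 86400 + seconds_into_day

    DAY_2016_12_31 = 17166  # days from 1970-01-01 to 2016-12-31

    # Last two UTC seconds of 2016 (that day ended with a leap second, 23:59:60)
    print(posix_timestamp(DAY_2016_12_31, 23*3600 + 59*60 + 59))  # 1483228799
    print(posix_timestamp(DAY_2016_12_31, 23*3600 + 59*60 + 60))  # 1483228800
    # First second of 2017: the same value again, so the Unix value
    # 1483228800 covers two SI seconds of real time.
    print(posix_timestamp(DAY_2016_12_31 + 1, 0))                 # 1483228800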
> And the "second" in this definition means what people usually understand as a second, as in the duration is always the same.
Umm what? In Unix time some values span two seconds, which is the crux of the problem. In UTC every second is a proper nice SI second. In Unix time the value increments every one or two SI seconds.