Ha ha ha! We're saying the same thing, I think, maybe, but it's the "10s of milliseconds" of inconsistency that I am referring to. We can measure things in, as you well state, a "breathtaking" way, but are then reduced to being reality's mad, overworked note keepers, and ALL of the clocks are still ALWAYS wrong, with the very definition of time becoming something we must describe as "reality's ongoing mistake, that we must correct for". I will indulge myself and go a bit farther into woooooo territory: this new ability to keep time so accurately is something that resists entropy by creating order where there is none.
Frankly, there is nothing woo about it.
Even without venturing into the realms of Quantum Mechanics or Chaotic Dynamics, all measurements are, by physical definition, only good up to the resolution of the measuring instrument. So they always come with inherent +/- error bars. Sometimes these error bars are made explicit; at other times they are elided, often on the assumption that they are obvious.
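To make that concrete, here's a minimal sketch in Python of a hypothetical clock with a fixed tick. The 10 ms resolution and the sample readings are invented for illustration, but they show why every reading carries an inherent error bar of +/- half a tick, no matter how carefully you read it:

```python
# Hypothetical clock with a fixed tick: every reading is quantized,
# so it carries an inherent error of up to +/- half a tick.
RESOLUTION = 0.010  # clock tick in seconds (10 ms) -- assumed value

def read_clock(true_time: float) -> float:
    """Return true_time rounded to the nearest clock tick."""
    return round(true_time / RESOLUTION) * RESOLUTION

for true_time in (1.0000, 1.0037, 1.0049, 1.0051):
    measured = read_clock(true_time)
    error_ms = (measured - true_time) * 1000
    print(f"true={true_time:.4f}s  measured={measured:.3f}s  "
          f"error={error_ms:+.1f}ms  (bound: +/-{RESOLUTION / 2 * 1000:.1f}ms)")
```

No amount of care in reading shrinks the bound; only a finer instrument does.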
Models are always simplifications. That is precisely why they are useful. (A 1:1 scale map is not very useful, especially when we already have one.) Models are obtained by ignoring many known disturbances whose effects one deems not to exceed a tolerance bound.
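A toy illustration of that last point, with made-up numbers: a free-fall model that ignores air drag (a known disturbance) stays useful exactly as long as the neglected effect sits under a stated tolerance bound. The drag coefficient and the 5 cm tolerance below are invented for the sketch:

```python
G = 9.81          # m/s^2, gravitational acceleration
DRAG = 0.02       # 1/m, quadratic drag coefficient -- invented for this sketch
TOLERANCE = 0.05  # m, accept the simple model while its error stays under 5 cm

def drop_simple(t: float) -> float:
    """Simplified model: distance fallen, ignoring drag."""
    return 0.5 * G * t * t

def drop_with_drag(t: float, dt: float = 1e-4) -> float:
    """Stand-in for 'reality': Euler-integrate the fall with quadratic drag."""
    v = x = 0.0
    for _ in range(int(t / dt)):
        v += (G - DRAG * v * v) * dt
        x += v * dt
    return x

for t in (0.5, 1.0, 2.0):
    err = abs(drop_simple(t) - drop_with_drag(t))
    verdict = "within tolerance" if err < TOLERANCE else "exceeds tolerance"
    print(f"t={t:.1f}s  model error={err:.3f}m  -> {verdict}")
```

For short drops the dragless model is fine; let the fall run long enough and the ignored disturbance blows past the bound, and the simplification stops being a useful model.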