I've never checked, but how much does a typical PC's or server's clock actually drift over a week or a month? I always assumed it was well under a second.
I've seen some new ThinkPads lose a minute a month and others (the old ThinkPads) keep within a second of NTP over an entire year. It depends.
Several seconds per week is normal. Oscillator accuracy is roughly on the order of 10 PPM, which would correspond to 6 seconds per week.
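For anyone who wants the arithmetic spelled out, here's a quick Python sketch (the 10 PPM figure is from above; the 100 PPM case for a cheap RC oscillator is my own rough assumption):

```python
# Back-of-the-envelope drift for a given oscillator accuracy in PPM.
SECONDS_PER_WEEK = 7 * 24 * 3600  # 604,800 s

def drift_seconds(ppm: float, interval_s: float = SECONDS_PER_WEEK) -> float:
    """Worst-case drift over interval_s for an oscillator off by ppm parts per million."""
    return interval_s * ppm / 1_000_000

print(drift_seconds(10))              # ~6.05 s/week at 10 PPM (typical crystal)
print(drift_seconds(100, 24 * 3600))  # ~8.6 s/day at 100 PPM (assumed cheap RC oscillator)
```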
I have an extremely cheap and extremely low power automatic cat feeder - it's been running on 2 D batteries for 18 months. I just reset it after it had drifted 19 minutes, so about 1 minute a month, or 15 seconds a week!
Clocks do drift, and seconds per week is definitely possible. Electronic devices have internal clocks of varying quality, and the cheaper the clock, the more it drifts; I think small cheap microcontrollers can drift seconds per day.