The way Google implemented leap seconds wasn't by sticking a 23:59:60 second at the end of 31st Dec. The way they did it was more interesting.
What they did instead was to "smear" it across the day, lengthening every second on 31st Dec by 1/86400 of a second. A skew that small is well within NTP's tolerance, so computers could carry on doing what they do without throwing errors.
Edit: They smeared it from the noon before the leap second to the noon after, i.e. 31st Dec 12pm to 1st Jan 12pm.
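For the curious, here's a minimal sketch of a 24-hour linear smear like the one described above (the helper is hypothetical, not Google's production code):

    SMEAR_WINDOW = 86_400  # smeared seconds shown while 86_401 SI seconds elapse

    def smeared_elapsed(real_elapsed: float) -> float:
        # Seconds shown by the smeared clock, given real seconds since the
        # window opened at noon. Each displayed second is 1/86400 longer
        # than an SI second, so a whole leap second is absorbed by the end.
        return real_elapsed * SMEAR_WINDOW / (SMEAR_WINDOW + 1)

    # Halfway through the window (the leap second itself, at midnight),
    # the smeared clock lags true UTC by exactly half a second:
    print(43_200.5 - smeared_elapsed(43_200.5))  # 0.5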
Two things that aren't really covered:
- system clock drift. Google's instances keep accurate time using atomic clocks in the datacenter, with leap seconds smeared over a day. For accurate duration measurements, this can matter.
- consider how the time information is consumed. For a photo-sharing site, the best info to keep with each photo is a location and a local datetime. Then, even if some of this is missing, a New Year's Eve photo still reads as close to midnight without consulting its timezone or location. I had this case and opted for string representations that wouldn't be automatically adjusted; converting to the viewer's local time isn't useful. A sketch of this follows below.
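A minimal sketch of that approach (field names are made up for illustration):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class PhotoTimestamp:
        # Local wall-clock time as the photographer saw it, kept as a plain
        # string so nothing "helpfully" converts it to the viewer's zone.
        local_datetime: str                # e.g. "2024-12-31 23:58"
        location: Optional[str] = None     # e.g. "Sydney, Australia"
        utc_instant: Optional[str] = None  # only if reliably known

    photo = PhotoTimestamp("2024-12-31 23:58", "Sydney, Australia")
    # Reads as "just before midnight" no matter where the viewer is,
    # even when the location or UTC info is missing.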
Very nice write-up! But I think your point that time doesn't need to be a mess is refuted by all the points you made.
I know you had to limit the length of the post, but time is an interest of mine, so here are a couple more points you may find interesting:
UTC is not an acronym. The story I heard was that the English acronym would be "CUT" (the name is "Coordinated Universal Time") and the French complained, while the French acronym would be "TUC" and the English-speaking committee members complained, so they settled on something that wasn't pronounceable in either. (FYI, "ISO" isn't an acronym either!)
Leap seconds caused such havoc (especially in data centers) that they are being abandoned: the current plan is no new leap seconds from 2035 at the latest. (What happens after that is anyone's guess.) But for now, you can rest easy and ignore them.
I have a short list of time (and NTP) related links at <https://wpollock.com/Cts2322.htm#NTP>.
> One way the website could handle this is by storing the user's exact input 2026-06-19 07:00, and also store the UTC+0 version of that datetime (if we assumed that the timezone rules won't change); this way, we can keep using the UTC+0 datetime for all logic, and we can recompute that UTC+0 datetime once we detect that the time rules for that timezone have changed.
Well, how do we know what timezone "2026-06-19 07:00" is in, so that we can tell that the time rules for that timezone have changed, if we do not store the timezone?
Additionally, how do we really "detect that the time rules for that timezone have changed"? We can stay informed, sure, but is there a way to automate this?
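One workable automation, sketched under the assumption that you store the IANA zone ID (not just an offset) next to the user's input: after every tzdata update, recompute the UTC instant and compare it with the stored one.

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    def recompute_utc(stored_local: str, zone_id: str) -> datetime:
        # Re-derive the UTC instant from the user's original wall-clock
        # input. Run after each tzdata update; if the result differs from
        # the stored UTC value, the zone's rules changed for this datetime.
        local = datetime.fromisoformat(stored_local).replace(tzinfo=ZoneInfo(zone_id))
        return local.astimezone(timezone.utc)

    # "Europe/Berlin" is just an example zone here:
    print(recompute_utc("2026-06-19 07:00", "Europe/Berlin"))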
> How does general relativity relate to the idea of time being a universal, linear, forward-moving "entity"?
TAI provides a time coordinate generated by taking the weighted average of the proper times of 450 world lines tracked by atomic clocks. Like any other time coordinate, it provides a temporal orientation but no time coordinate could be described as "universal" or "linear" in general relativity. It would be a good approximation to proper time experienced by most terrestrial observers.
Note that general relativity doesn't add much over special relativity here (the different atomic clocks have different velocities and accelerations due to altitude, and so have relative differences in proper time along their world lines). If you already have a sufficiently general notion of spacetime coordinates, the additional curvature of general relativity over Minkowski space is simply one more effect changing the relation between coordinate time and proper time.
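To make the weighted-average idea concrete, a toy illustration (nothing like the real BIPM algorithm; the readings and weights are invented):

    def paper_timescale(readings: list[float], weights: list[float]) -> float:
        # A "paper clock": the weighted average of individual clock
        # readings, with weights reflecting each clock's estimated stability.
        return sum(r * w for r, w in zip(readings, weights)) / sum(weights)

    # Three clocks disagreeing at the nanosecond level:
    print(paper_timescale([1.000000001, 0.999999998, 1.000000002],
                          [0.5, 0.2, 0.3]))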
As many others have said, time and calendars are messy, and there is often no correct solution, just a bunch of trade-offs. Jon Skeet's "Storing UTC is not a Silver Bullet" (https://codeblog.jonskeet.uk/2019/03/27/storing-utc-is-not-a...) was very influential for me in realizing some of the subtleties in what a point in time means for a user, and how that should influence the design of a system.
> other epochs work too (e.g. Apollo_Time in Jai uses the Apollo 11 rocket landing at July 20, 1969 20:17:40 UTC).
I see someone else is a Vernor Vinge fan.
But it's kind of a wild choice for an epoch when you're very likely to be interfacing with systems whose epoch starts approximately five months later.
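The offset is at least easy to derive; a small sketch (helper name invented):

    from datetime import datetime, timezone

    APOLLO_EPOCH = datetime(1969, 7, 20, 20, 17, 40, tzinfo=timezone.utc)
    UNIX_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
    OFFSET = (UNIX_EPOCH - APOLLO_EPOCH).total_seconds()  # 14_182_940.0

    def apollo_from_unix(unix_seconds: float) -> float:
        # Seconds since the Apollo 11 landing, given a Unix timestamp.
        return unix_seconds + OFFSET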
One thing I found out when programming a timeline lately is that year zero doesn't exist: the calendar goes straight from -1 to 1. Which looks very weird if you want to display intervals of e.g. 500 years: -1001, -501, -1, 500, 1000, etc.
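Astronomical year numbering (what ISO 8601 uses) papers over this by defining a year 0 equal to 1 BC; a quick sketch of the mapping, which reproduces exactly the skewed labels above:

    def historical_year(astronomical: int) -> str:
        # Astronomical numbering has a year 0 (= 1 BC); the historical
        # BC/AD scale jumps straight from 1 BC to AD 1.
        return f"{1 - astronomical} BC" if astronomical <= 0 else f"AD {astronomical}"

    print([historical_year(y) for y in range(-1000, 1001, 500)])
    # ['1001 BC', '501 BC', '1 BC', 'AD 500', 'AD 1000']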
It is a pet peeve of mine, but any statement implying that Unix time is a count of seconds since the epoch is annoyingly misleading and perpetuates that misconception. Imho a better mental model for Unix time is that it has two parts, days since epoch * 86400 and seconds since midnight, which get added together.
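In code, the two-part model looks like this (a sketch, not any library's API):

    def unix_time(days_since_epoch: int, seconds_since_utc_midnight: float) -> float:
        # Unix time pretends every day has exactly 86_400 seconds, so leap
        # seconds never show up: the count is smaller than the true number
        # of elapsed SI seconds by the leap seconds inserted since 1972.
        return days_since_epoch * 86_400 + seconds_since_utc_midnight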
Nice post. I think about time... all the time haha. There's another source you might enjoy (Re: your NTP and synchronization question) from TigerBeetle: [Implementing Time](https://www.youtube.com/watch?v=QtNmGqWe73g)
The absl library has a great write-up on time programming: https://abseil.io/docs/cpp/guides/time
I never really took time seriously until one of my cron jobs skipped execution because of daylight saving. That was the moment I realized how tricky time actually is.
This article explains it really well. The part about leap seconds especially got me. We literally have to smear time to keep servers from crashing. That’s kind of insane.
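For anyone curious why the cron job skipped: on a spring-forward night, the scheduled wall time simply never occurs. A small demonstration (zone and date chosen for illustration):

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    # On the US spring-forward night, clocks jump from 02:00 to 03:00,
    # so a 02:30 job has no 02:30 to fire at.
    tz = ZoneInfo("America/New_York")
    local = datetime(2025, 3, 9, 2, 30, tzinfo=tz)  # a nonexistent wall time
    print(local.astimezone(timezone.utc).astimezone(tz))
    # 2025-03-09 03:30:00-04:00 -- it doesn't round-trip back to 02:30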
> What explains the slowdown in IANA timezone database updates?
My guess is that, with our increasing dependence on digital systems, the edge cases where these rules aren't properly updated cause increasing amounts of pain "for no good reason".
In Brazil we recently changed our DST rules, it was around 2017/2018. It caused a lot of confusion. I was working with a system where these changes were really important, so I was aware of this change ahead of time. But there are a lot of systems running without too much human intervention, and they are mostly forgotten until someone notices a problem.
It’s quite different from how I think about time, as a programmer. I treat human time and timezones as approximate. Fortunately I’ve been spared from working on calendar/scheduling for humans, which sounds awful for all the reasons mentioned.
Instead I mostly use time for durations and for happens-before relationships. I still use Unix flavor timestamps, but if I can I ensure monotonicity (in case of backward jumps) and never trust timestamps from untrusted sources (usually: another node on the network). It often makes more sense to record the time a message was received than trusting the sender.
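A minimal sketch of that monotonicity clamp (the class name is invented; for pure durations, time.monotonic() is the simpler tool):

    import time

    class MonotonicStamper:
        # Wall-clock timestamps that never run backwards, even if NTP
        # steps the system clock back under us.
        def __init__(self) -> None:
            self._last = float("-inf")

        def now(self) -> float:
            t = time.time()
            if t <= self._last:
                t = self._last + 1e-6  # nudge just past the last value
            self._last = t
            return t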
That said, I am fortunate to not have to deal with complicated happens-before relationships in distributed computing. I recall reading the Spanner paper for the first time and being amazed how they handled time windows.
Glad OP discussed the daylight savings nightmare.
But I hate how, when I stack my yearly weather charts, every four years either the graph is off by one day (so it is 1/366th narrower and the month delimiters don't line up perfectly) or I have to duplicate Feb 28th so there is no discontinuity in the lines. Still not sure how to represent that, but it sure bugs me.
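One compromise (a sketch, not a full answer): plot each date at its fractional position within its own year, so leap and common years occupy the same horizontal span and no day needs duplicating; month boundaries then drift by well under a day.

    import calendar
    from datetime import date

    def year_fraction(d: date) -> float:
        # Map a date to [0, 1) within its own year.
        days_in_year = 366 if calendar.isleap(d.year) else 365
        return (d.timetuple().tm_yday - 1) / days_in_year

    print(year_fraction(date(2024, 12, 31)))  # 0.99727...
    print(year_fraction(date(2025, 12, 31)))  # 0.99726... (nearly identical)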
We don’t have much trouble yet with relativistic temporal distortions, but Earth’s motion causes us to lose about 0.152 seconds per year relative to the Solar system. Likewise we lose about 8.5 seconds per year relative to the Milky Way. I wonder when we’re going to start to care. Presumably there would be consideration of such issues while dealing with interplanetary spacecraft, timing burns and such.
I think this is one of my favourite write-ups on HN for a while. I miss seeing more things like this.
I’m all about monotonic time everywhere after having seen too many badly configured time-sync settings. :)
... humans don't generally say
"Wanna grab lunch at 1,748,718,000 seconds from the Unix epoch?"
I'm totally going to start doing that now.
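If you do, this is the decoder ring (assuming that particular timestamp was intended):

    from datetime import datetime, timezone

    print(datetime.fromtimestamp(1_748_718_000, tz=timezone.utc))
    # 2025-05-31 19:00:00+00:00, i.e. noon on the US west coast. Lunch!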
>Two important concepts for describing time are "durations" and "instants"
The standard name for a duration in physics is a "period", written uppercase T (lowercase t being a point in time), which, curiously enough, is the inverse of a frequency: T = 1/f, so a 50 Hz mains cycle has a period of 1/50 = 0.02 s. A period can also be thought of as an interval [t0, t1] with T = t1 - t0.
> The concept of "absolute time" (or "physical/universal time") refers to these instants, which are unique and precisely represent moments in time, irrespective of concepts like calendars and timezones.
Funnily enough, you mean the opposite. An absolute time physically does not exist, just as an absolute distance does not: there is no kilometer zero. Every measurement is relative to another; in the case of time you might measure relative to the birth of (our Lord and saviour) Jesus Christ. You never have a time "irrespective" of something else, and if you think you do, you are probably referring to a period with an implicit origin. For example, if I say a length of 3 m, I mean an object whose distance from one end to the other is 3 m. And if I say 4 minutes of a song, I mean that the end is 4 minutes after the start, in the same way that a direction might be represented by the 2D vector [1,1] only because we are assuming a relationship to [0,0].
That said, it's clear that you have a lot of knowledge about calendars from the practical software experience of implementing time features in global products; I'm just explaining time from the completely different framework of classical physics, which is of course of little use when trying to figure out whether 6 PM in Buenos Aires and 1 PM six months from now in California are the same time.
Obligatory falsehoods programmers believe about time:
https://gist.github.com/timvisee/fcda9bbdff88d45cc9061606b4b...
In a nutshell: if you believe anything about time, you're wrong. There is always an exception, and an exception to the exception. And then Doc Brown runs you over with the DeLorean.
> What's the history of human timekeeping? Particularly before the Gregorian calendar, what historical records do we have for who was tracking/tallying the days elapsed over time? How did people coordinate on the current date globally (if at all)? How did local mean time (LMT) work in the past?
Ooh, this is a really interesting topic!
Okay, so the first thing to keep in mind is that there are three very important cyclical processes that play a fundamental role in human timekeeping and have done so since well before anything we could detect archaeologically: the daily solar cycle, the lunar cycle (whence the month), and the solar year. All of these are measurable with mark 1 human eyeballs and nothing more technologically advanced than a marking stick.
For most of human history, the fundamental unit of time from which all other time units are defined is the day. Even in the SI system, the second wasn't redefined in terms of something more fundamental than the Earth's kinematics until about 60 years ago. In several cultures, the daylight and nighttime hours were each subdivided into a fixed number of periods, which means that the length of the local equivalent of an 'hour' varied depending on the day of the year.
Now, calendars specifically refer to the systems for counting multiple days, and they break down into three main categories: lunar calendars, which look only at the lunar cycle and don't care about aligning with the solar year; lunisolar calendars, which insert leap months to keep the lunar cycle vaguely aligned with the solar year (since a year is about 12.4 lunations long); and solar calendars, which don't try to align with the lunations (although you usually still end up with subdivisions akin to the approximate length of a lunation). Most calendars are actually lunisolar, probably because lunations are relatively easy to calibrate (when you can go outside and see the first hint of a new moon, you start the new month) but one of the purposes of a calendar is also to keep track of seasons for planting, so some degree of solar alignment is necessary.
If you're following the history of the Western calendrical tradition, the antecedent of the Gregorian calendar is the Julian calendar, promulgated by Julius Caesar as an adaptation of the Egyptian solar calendar for the Romans after a series of civil wars caused the officials to neglect the addition of the requisite leap months. In a hilarious historical example of a fencepost error, the number of years between leap years was confused, and his successor Augustus had to fix the calendar to have a leap year every 4th year instead of every 3rd year, but small details. I should also point out that, while the Julian calendar found wide purchase in Christendom, that didn't mean it was handled consistently: the day the year started varied from country to country, with some countries preferring Christmas as New Year's Day and others preferring dates as late as Easter itself, which isn't even a fixed day every year. The standardization of January 1 as New Year's Day isn't really universal until countries start adopting the Gregorian calendar (and the transition between the Julian and Gregorian calendars is not smooth at all).
Counting years is even more diverse and, quite frankly, annoying. The most common year-numbering scheme is regnal numbering: it's the 10th year of King Such-and-Such's reign. Putting together an absolute chronology in such a situation requires accurate lists of kings and the like that are often lacking; there are essentially perennial conflicts in Ancient Near East studies over how to map those dates to ones we'd be more comfortable with. If you think that's too orderly, you could just name years after significant events (this is essentially how Winter Counts work in Native American cultures); the Roman consular system works on that basis. If you're lucky, sometimes people also had an absolute epoch-based year number, as modern people largely agree that it's the year 2025 (or Romans using 'AUC', dating from the mythical founding of Rome), but this tends not to be the dominant mode of year numbering for most of recorded human history.
Time is a mess. Always. The author only scratched the surface of all the issues. Even if we exclude the relativistic time dilation that affects GPS/GNSS satellites (whether due to the difference in gravitational pull or to their relative speed over ground), it's still a mess.
Timezones, sure. But what about before timezones came into use? Or even halfway through: which timezone applies, considering Königsberg used CET when it was part of Germany but switched to EET after it became Russian? There are even timezones offset from the whole hour by 45 minutes, like Nepal's UTC+5:45.
And don't get me started on daylight saving time. There's been at least one instance where DST was, and was not, in use in Lebanon at the same time! Good luck booking an appointment...
Not to mention the transition from the Julian calendar to the Gregorian, which took place over many, many years, varying from country to country, as defined by the country borders at the time...
We've even had countries that forgot to insert a leap day in certain years, causing March 1 to occur on different days altogether for a couple of years.
Time is a mess. Is, always has been, and always will be.