> The algorithm provides accurate results over a period of ±1.89 Trillion years
I'm placing my bets that in a few thousand years we'll have changed calendar systems entirely, haha
but really interesting to see the insane methods used to achieve this
> I'm placing my bets that in a few thousand years we'll have changed calendar systems entirely, haha
Given that the Chronostrife will occur in around 40_000 years (give or take 2_000), I somewhat doubt that </humor>
The calendar system has already changed once. So the algorithm won't give correct dates — meaning the dates people actually used — earlier than the switchover. And there isn't even a single switchover date, since different countries adopted the Gregorian calendar at different times.
Wouldn’t it be accurate for that as well? Unless we change to base 10 time units or something. Then we all have a lot of work to do.
But if it’s just about starting over from 0 being the AI apocalypse or something, I’m sure it’ll be more manageable, and the fix could hopefully be done on a cave wall using a flint spear tip.
Maybe not in a few thousand years, but given the deceleration of the Earth’s rotation around its axis, mostly due to tidal friction with the moon, in a couple hundred thousand years our leap-day count will stop making sense. In roughly a million years, day length will have increased such that the year length will be close to 365.0 days.
I therefore agree that a trillion years of accuracy for broken-down date calculation has little practical relevance. The question is whether the calculation could be made even more efficient by reducing it to 32-bit, or maybe even 16-bit, arithmetic.
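I don't know whether the article's particular implementation can be narrowed, but as a point of comparison, here's a sketch in C of the well-known civil-from-days conversion (Howard Hinnant's algorithm) done entirely in 32-bit integers — the function name and signature here are my own:

```c
#include <stdint.h>

/* Days since the Unix epoch (1970-01-01) to year/month/day, following
 * Howard Hinnant's civil_from_days algorithm, restricted to 32-bit ints.
 * A signed 32-bit day count covers roughly ±5.8 million years, far longer
 * than the Gregorian leap rule will stay astronomically meaningful. */
static void civil_from_days(int32_t z, int32_t *y, uint32_t *m, uint32_t *d)
{
    z += 719468;                        /* shift epoch to 0000-03-01       */
    int32_t era = (z >= 0 ? z : z - 146096) / 146097;   /* 400-year eras   */
    uint32_t doe = (uint32_t)(z - era * 146097);        /* day of era      */
    uint32_t yoe = (doe - doe / 1460 + doe / 36524 - doe / 146096) / 365;
    int32_t yr  = (int32_t)yoe + era * 400;
    uint32_t doy = doe - (365 * yoe + yoe / 4 - yoe / 100); /* day of year */
    uint32_t mp  = (5 * doy + 2) / 153;                 /* [0,11], Mar = 0 */
    *d = doy - (153 * mp + 2) / 5 + 1;                  /* [1, 31]         */
    *m = mp < 10 ? mp + 3 : mp - 9;                     /* [1, 12]         */
    *y = yr + (*m <= 2);                /* Jan/Feb belong to the next year */
}
```

A 16-bit day count, on the other hand, only spans about ±89 years around the epoch (2^15 days ≈ 89.7 years), so 32 bits is probably the practical floor for the day count itself, whatever happens to the intermediates.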