Programmer Humor, posted 14 Dec 2023
EDIT: NVM I'm a goddamn idiot, Unix Time's handling of leap seconds is moronic and makes everything I said below wrong.
Unix Time is an appropriate tool for measuring time intervals, since it does not factor in leap seconds or any astronomical phenomenon and is therefore monotonically increasing... If T1 and/or T2 are given in another format, then it can get very hairy to do the conversion to an epoch time like Unix time, sure.

The alt-text pokes fun at the fact that, due to relativity, time moves at different speeds at astronomical scales. However, I would argue that this is irrelevant, as the comic itself talks about "Anyone who's worked on datetime systems", vanishingly few of whom ever have to account for relativity (the only non-research use case being GPS, AFAIK).
While the comic is funny, under a few conditions (all of which are reasonable assumptions for any real use case),

((time_t) t2) - ((time_t) t1)

is precise well within the error margin of the available tools. Expanding the problem space to take into account relativistic phenomena would be a mistake in almost every case, and you're not getting the job.

Clock misalignment comes up pretty frequently in some networking and networking-esque applications. Otherwise, yeah, the edge cases are indeed on the edge.
Subsecond precision comes up often in common applications too, but you can just expand out to milliseconds or whatever.
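Concretely, the subtraction above is about as simple as date math ever gets. A minimal C sketch (the timestamp values are arbitrary, made-up examples):

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    /* Two arbitrary epoch timestamps, e.g. read from a log. */
    time_t t1 = 1702600000;
    time_t t2 = 1702600950;

    /* difftime() subtracts two time_t values portably; the C standard
       does not guarantee that time_t is a plain integer type. */
    double interval = difftime(t2, t1);
    printf("interval: %.0f seconds\n", interval);  /* prints 950 */
    return 0;
}
```

For subsecond precision you'd reach for clock_gettime(CLOCK_REALTIME, ...) and do the same arithmetic in nanoseconds.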
When you say Unix time does not include leap seconds, you are drawing exactly the wrong conclusion. Unix time is not a monotonically increasing number of seconds since the Epoch, because it excludes those seconds which are marked as leap seconds in UTC. I.e., the real elapsed time between the Epoch and now is larger than the current Unix time shows (by exactly the number of intervening leap seconds). See e.g. https://en.wikipedia.org/wiki/Unix_time#Leap_seconds
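A concrete illustration: a leap second was inserted at the end of 2016-12-31, so that UTC day lasted 86401 SI seconds, yet Unix time reports exactly 86400. A sketch (note that timegm() is a common glibc/BSD extension, not ISO C):

```c
#define _DEFAULT_SOURCE
#include <stdio.h>
#include <time.h>

int main(void) {
    struct tm before = {0}, after = {0};

    /* 2016-12-31 00:00:00 UTC -- a leap second was inserted at the
       end of this day, so it actually lasted 86401 SI seconds. */
    before.tm_year = 2016 - 1900; before.tm_mon = 11; before.tm_mday = 31;

    /* 2017-01-01 00:00:00 UTC */
    after.tm_year = 2017 - 1900; after.tm_mon = 0; after.tm_mday = 1;

    double diff = difftime(timegm(&after), timegm(&before));

    /* Prints 86400: Unix time pretends every UTC day has exactly
       86400 seconds, so the leap second simply vanishes. */
    printf("%.0f\n", diff);
    return 0;
}
```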
Aight I'm just dumb then. Now the question is who the fuck thought this was a good idea? Probably someone so naive they thought it'd make time conversions easy...
Unix time fails to work for the 'simple' case of timezones entirely. It's not meant for timezone-based data, and therefore Unix time in one timezone subtracted from Unix time in another timezone will most likely give completely incorrect results. Even in the same timezone it will give incorrect results; see the 'simple' case of a country jumping across the international date line. Typically they skip entire days, none of which Unix time will account for, as that would require not just timezone data but location data as well.
You misunderstand what Unix Time is. It's the number of seconds since 1970-01-01T00:00+00:00. It's always relative to UTC. And the number of seconds since epoch is always the same regardless of where you are on Earth.
As I write this it's 1702600950 for you, for me, and in Sydney. Timezones (and DST, and leap seconds, and other political fuckery) only play a role once you want to convert 1702600950 into a "human" datetime. It corresponds to 2023-12-15 00:42:30 UTC and 2023-12-14 16:42:30 PST (and the only sane and reliable way to do the conversion is to use a library which depends on the tzdata).
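For the conversion itself, a minimal POSIX C sketch (America/Los_Angeles is just one example zone, picked to match the PST rendering above; the local result depends entirely on the system's tzdata):

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void) {
    time_t t = 1702600950;  /* the timestamp from the comment above */
    char buf[64];
    struct tm tm_utc, tm_local;

    /* Broken-down UTC time: no timezone data needed. */
    gmtime_r(&t, &tm_utc);
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", &tm_utc);
    puts(buf);

    /* Local time: the result depends entirely on the tzdata that
       localtime_r() consults via the TZ environment variable. */
    setenv("TZ", "America/Los_Angeles", 1);
    tzset();
    localtime_r(&t, &tm_local);
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S %Z", &tm_local);
    puts(buf);
    return 0;
}
```

The same epoch value yields "2023-12-15 00:42:30 UTC" and "2023-12-14 16:42:30 PST"; change TZ and only the second line changes.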