Where’d all the time go?
It’s starting to fly.
See how the hands go
Waving goodbye!
Dr. Dog
Here we are: the fall of 2021 and my final quarter at Oregon State University. Since I’d rather not write a boring retrospective on how much I’ve learned & grown (yawn — call me when somebody learns how to un-grow), and since the class this blog is named for is still in the process of kicking off, I’m going to use my first post as an opportunity to talk about a topic that interests me: time.
To be specific, I’m going to make this post about Unix time, the coolest, most intuitive, and easiest-to-read form of timekeeping. To be even more specific, this post is about how Unix time will ruin everything and leave us sifting through sand on a barren Earth after it launches the technological apocalypse.
(…Maybe.)

Photo by Tusik Only on Unsplash
Unix time
The time right now is 1,633,029,524. Or at least it was; by the time I finished typing it, it wasn't anymore. So what is Unix time? That part's simple: it's the number of seconds elapsed since 00:00:00 UTC on January 1st, 1970 (also referred to as seconds since the Epoch 1). Why do we use it? Well, that part's pretty intuitive, too: it's a nice, discrete whole number that's easy to store on a computer as a 32-bit integer.
Every day we move 86,400 seconds further from the start of the Unix epoch (tick, tock), and the number in our computers grows ever larger. But what happens when that number gets too large for our computers, or at least too large for our 32-bit integers? Will it ever get that big? The answer is yes, and sooner than you might expect. Enter:
The Year 2038 Problem

Photo by the blowup on Unsplash
As you’ve probably already guessed, the Year 2038 Problem, also known by the significantly more punk rock name the Epochalypse 2, is Unix’s Y2K problem. At exactly 03:14:07 UTC on January 19th, 2038, Unix timestamps stored as signed 32-bit integers will overflow, wrapping around to the minimum representable value, -2,147,483,648. It’ll look something like this:

Counted from the epoch’s start date of January 1st, 1970, a Unix timestamp of -2,147,483,648 would throw us all the way back to Friday, December 13th, 1901 (and if that isn’t bad enough, you’d still have work tomorrow: people didn’t start getting Saturdays off until at least 1908 3).
While this clearly can’t be a good thing, is it something we ought to worry about? Are we headed for a dark day of planes tumbling from the skies (to cooperate with planes not having existed in 1901, obviously), or will this simply be a long, boring slog where the Peter Gibbonses of the world go around updating lines of code in dated applications?

Photo by Zoltan Tasi on Unsplash
The fallout & the solution
First things first, what systems are actually going to be affected by this? As The Guardian puts it,
The computers that have the potential to cause the biggest issues are those embedded systems that cannot be upgraded. They are used in many transportation systems and other long-living devices, equipment such as stability control systems in cars or other isolated computer-based systems.
[. . .]
Those embedded systems that are affected are likely to have to be completely replaced as the software can’t simply be upgraded.
Samuel Gibbs. “Is the Year 2038 problem the new Y2K bug?”
for The Guardian, December 17th, 2014.
This isn’t an ideal turn of events, but it seems manageable if we accept occasional losses as a matter of course. Besides, in some cases the problem will take care of itself: take the Opportunity rover, for example. For a while, there was major concern that the clock in its 32-bit Unix ground control system would roll over at the end of the epoch and cause Opportunity to spaz out, since Opportunity was only programmed to understand time as a positive number 4. Now Opportunity is dead 5. Problem solved.

Photo by Nicolas Lobos on Unsplash
While that may sound like a joke, a large part of our mitigation strategy is, in fact, to stay the course and accept that by 2038 most systems will be obsoleted in favor of ones that avoid the issue entirely. The existence of 64-bit processors, now the norm for computers, allows for storage of Unix timestamps large enough to handle “20 times the current age of the universe, which comes to around 292 billion years” 6.
The work ahead
While the 64-bit news is comforting, plenty of software and programming languages have yet to bridge the gap and will have to be updated or rebuilt to accommodate 64-bit timestamps, and for that problem there exists no single straightforward solution.
One of the bigger problem areas is C, which is used in much of Linux’s kernel programming as well as many of those embedded systems mentioned earlier. The signed 32-bit time_t type, present in some implementations of C, will have to either:
a.) be abandoned in favor of a new, sufficiently large type,
b.) have its signed 32-bit integer storage replaced with an unsigned 32-bit integer or a signed 64-bit integer (an unsigned 32-bit integer, which has a larger positive range because it doesn’t represent negative numbers, would only buy us until the year 2106), or
c.) have its differences in older programs wrangled with compatibility functions (such as _time32() and _time64()) 7.
In the end, it will likely be some combination of all those things as developers come up with solutions that make the most sense for them.

Photo by Noah Buscher on Unsplash
While there are myriad more examples of the individual problems that will have to be grappled with in order to prepare for 2038, it stands to reason that we will cope with the bulk of them uneventfully as we did with Y2K. In the end, the problems share many similarities, and the solutions employed the first time around provide a foundation from which we can draw insight as we continue to handle the situation.
One parting thought comes to mind: as we pay yet again for the consequences born of the limitations of the past, an important lesson repeats itself. Build your software for the future, because you never know how long it’ll be around.