Countdown to Y2038 bug

[Live countdown to January 19, 2038 at 03:14:07 UTC]

What is the Y2038 bug?

The Y2038 bug is a computer bug related to the way computers store and handle dates. It is also known as the "Year 2038 problem", the "Unix Millennium bug", or the "Epochalypse".

How does a computer store a date?

To store a date, computers use a numerical value to represent the year, month, day, hours, minutes and seconds. This numerical value is often based on the number of seconds that have passed since a specific point in time, known as the "epoch". Different operating systems use different epochs, but a common one is January 1, 1970 at 00:00:00 UTC.
This means that, most of the time, computers represent a date as the number of seconds that have passed since January 1, 1970 at 00:00:00 UTC.
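As a small Python sketch, this is how a date maps to its count of seconds since the epoch (the date used here is the Y2038 cutoff itself):

```python
from datetime import datetime, timezone

# The epoch: January 1, 1970 at 00:00:00 UTC.
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)

# Any date can be expressed as seconds elapsed since the epoch.
date = datetime(2038, 1, 19, 3, 14, 7, tzinfo=timezone.utc)
seconds = int((date - epoch).total_seconds())
print(seconds)  # 2147483647
```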

This value, the number of seconds elapsed since January 1, 1970 at 00:00:00 UTC, is called the Unix timestamp.

A computer stores numbers in a binary form, which is a system of representing numbers using only two digits: 0 and 1. These digits are known as "bits".

Written in binary, it currently takes 31 bits to store a Unix timestamp.
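You can see this with a couple of lines of Python, using a sample timestamp from late 2023:

```python
# A sample Unix timestamp: November 14, 2023 at 22:13:20 UTC.
timestamp = 1_700_000_000

print(format(timestamp, "b"))  # the timestamp written in binary
print(timestamp.bit_length())  # 31 bits
```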

Inside a computer, memory is made of bytes, which are groups of 8 bits. Therefore, to store a date (which is actually a number), a computer uses 4 bytes: 31 bits for the value plus 1 bit for the sign of the number (0 for positive, 1 for negative).

With this configuration, computers cannot use more than 31 bits to store the date!

Why size matters

Here is an example:

Let's say that we want to store a number in a computer. We use 4 bits to store it.

0000 = 0

0001 = 1

0010 = 2

0011 = 3


The maximum value that can be stored in 4 bits is 1111, which is equal to 15.

But what happens if we decide to keep incrementing the bits? If that happens, then the binary value will "overflow" and restart back at 0000.

Yes. For a computer, if a number is stored in 4 bits: 15 + 1 = 0
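This wrap-around is easy to simulate in Python by masking the result down to 4 bits (Python's own integers never overflow, so the masking is done by hand):

```python
# Simulate a 4-bit counter: keep only the lowest 4 bits after incrementing.
value = 15                     # 1111, the maximum for 4 bits
value = (value + 1) & 0b1111   # mask back down to 4 bits
print(value)                   # 0 -- the counter has overflowed
```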

Now, if we go back to our date: The maximum value that can be stored in 31 bits is 2147483647. This is equal to January 19, 2038 at 03:14:07 UTC.

If we try to add one more second to this date, the computer will overflow again. But because computers use one more bit for the sign (+/-), instead of restarting at 0, the value wraps around to -2147483648, which corresponds to December 13, 1901 at 20:45:52 UTC!

Since computers don't handle time travel well, this is a problem.
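The same trick simulates a signed 32-bit timestamp, using a small helper to fold a Python integer into the signed 32-bit range:

```python
def to_int32(n):
    """Fold an integer into the signed 32-bit range [-2**31, 2**31 - 1]."""
    n &= 0xFFFFFFFF                  # keep only the low 32 bits
    return n - 2**32 if n >= 2**31 else n

max_time = 2**31 - 1                 # 2147483647 = 2038-01-19 03:14:07 UTC
print(to_int32(max_time + 1))        # -2147483648 = 1901-12-13 20:45:52 UTC
```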

Is it already happening?

Yes, and it will happen more and more often. Software already handles dates in the future (e.g. to calculate a loan payment schedule). So, even though it is not 2038 yet, we are already experiencing the 2038 problem.

A recent real example of the 2038 problem documented on Wikipedia:

In May 2006, reports surfaced of an early Y2038 problem in the AOLserver software. The software specified that a database request should "never" time out by setting a timeout date one billion seconds in the future. One billion seconds after 21:27:28 on 12 May 2006 is beyond the 2038 cutoff date, so after this date the timeout calculation overflowed and produced a timeout date that was actually in the past, causing the software to crash.
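This failure mode can be sketched in Python, simulating the 32-bit arithmetic by hand (the timestamp below is an illustrative moment on May 13, 2006, not the exact value from the report):

```python
def to_int32(n):
    """Fold an integer into the signed 32-bit range [-2**31, 2**31 - 1]."""
    n &= 0xFFFFFFFF
    return n - 2**32 if n >= 2**31 else n

now = 1_147_483_648                         # 2006-05-13 01:27:28 UTC
deadline = to_int32(now + 1_000_000_000)    # "never" time out

print(deadline)        # -2147483648: the deadline overflowed
print(deadline < now)  # True -- the "future" timeout is already in the past
```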

What can I do about it?

There is a solution: instead of 32 bits, computers should use 64 bits (i.e. 8 bytes) to store dates. A signed 64-bit timestamp will not overflow for about 292 billion years.
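A quick back-of-the-envelope check of that figure in Python:

```python
# Maximum value of a signed 64-bit integer.
max_int64 = 2**63 - 1

# How many years of seconds that can hold (365.25 days per year).
years = max_int64 / (365.25 * 24 * 3600)
print(f"{years:.3e}")  # on the order of 2.9e11, i.e. ~292 billion years
```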

As a user

As a user, you can ensure that your devices and software are kept up to date with the latest patches and updates. As soon as your software is fixed to use 64 bits to store dates, you'll be safe from the Y2038 bug.

As a developer

As a developer, there are a few steps you can take to prepare for the Y2038 bug:

Use 64-bit types (such as a 64-bit time_t) to store timestamps, and avoid casting them down to 32-bit integers.
Check your databases, file formats and network protocols for fields that store dates in 32 bits.
Test your software with dates beyond January 19, 2038.

Y2106 bug

The Y2106 bug is a similar problem, expected to occur in the year 2106, when the numerical value used to represent dates will overflow again. The difference is that some systems treat the timestamp as unsigned, using the full 32 bits for the value with no sign bit, which postpones the overflow to February 7, 2106 at 06:28:15 UTC.

The Y2106 bug is expected to be less severe than the Y2038 bug, as it is expected that most systems will have been updated to use a 64-bit integer to represent dates by that time.
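On a platform with 64-bit time handling, Python can compute the exact moment the unsigned 32-bit counter runs out:

```python
from datetime import datetime, timezone

# Maximum value of an unsigned 32-bit integer.
max_uint32 = 2**32 - 1

print(datetime.fromtimestamp(max_uint32, tz=timezone.utc))
# 2106-02-07 06:28:15+00:00
```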

Y2K bug

The Y2K bug, or "Millennium bug", was a similar problem that occurred at the turn of the millennium, when many systems were not prepared to handle the transition from the year 1999 to the year 2000. The Y2K bug was caused by the fact that many systems used a two-digit representation of the year, which meant that the year 2000 was represented as "00". This could have caused issues when these systems attempted to process dates after the year 2000, as the two-digit representation of the year would have been indistinguishable from the year 1900. The Y2K bug was largely fixed before it could cause significant issues.