Question

The explanation I got was that it was common practice for programmers to represent the year with just two digits. But why would anyone do that? If anything, I'd imagine it would take more effort to make a program roll back to 1900 instead of going on to 2000.


Solution

A premium on storage space + lack of foresight = the Y2K bug.

Saving bytes was very important on many older systems. Plus, a common fallacy in software development is "no one's going to be using this in X years". So why not save those bytes now? 10, 20, 30 years down the line this will SURELY have been scrapped for a completely new system.
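A minimal sketch (in Python, purely for illustration; the function names and record format are hypothetical) of why two-digit year storage breaks at the century rollover:

```python
def to_two_digit_year(year):
    """Keep only the last two digits of the year -- the space-saving trick."""
    return year % 100

def years_elapsed(start_yy, end_yy):
    """Naive subtraction with no century handling, as in much legacy code."""
    return end_yy - start_yy

# An account opened in 1985, checked in 1999: works fine.
opened = to_two_digit_year(1985)                        # stored as 85
print(years_elapsed(opened, to_two_digit_year(1999)))   # 14

# The same account checked in 2000: "00" minus "85" gives -85,
# as if the clock had rolled back to 1900.
print(years_elapsed(opened, to_two_digit_year(2000)))   # -85
```

The arithmetic itself is correct for any two dates within the same century, which is exactly why the bug stayed invisible until the rollover approached.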

To quote Lex Luthor -- "Wrong."

OTHER TIPS

Why? Their question was likely, "Why not?" If it saved a few bits in a world where memory usage was significantly more limited, then they figured they may as well save that space.

Obviously, the "why not" was because "your software might actually be in use for significant amounts of time." Some programmers had the foresight to plan ahead, but not all.

The story goes that this was back in the day when 1 kilobyte of RAM cost more than $1,000. Omitting the extra digits meant real money savings.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow