So I don't use four digits, of course.
Many old computer programs use only two digits to represent dates, recording 1998, for example, as 98.
The root of the Year 2000 problem is that for decades programmers used only two digits to express years.
The problem stems from decades of using two digits to represent the year in dates.
As has been amply reported, many computer programs and operating systems use only two digits to represent years.
In computers that use two digits to indicate the year, 2000 is often indistinguishable from 1900.
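A minimal sketch of that collision, in Python, with a hypothetical two_digit_year helper standing in for the old storage convention:

    from datetime import date

    def two_digit_year(d: date) -> int:
        # Keep only the last two digits of the year, as many old programs did.
        return d.year % 100

    # 1900 and 2000 collapse to the same stored value, 0.
    print(two_digit_year(date(1900, 1, 1)) == two_digit_year(date(2000, 1, 1)))  # True

    # A naive age calculation built on the stored values then goes wrong:
    # someone born in 1965 (stored as 65) appears to be 0 - 65 = -65 years old in 2000.
    print(two_digit_year(date(2000, 6, 1)) - 65)  # -65 instead of 35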
The binary numeral system is a way to write numbers using only two digits: 0 and 1.
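For example, Python's built-ins bin and int convert between decimal and binary notation:

    # Thirteen in binary: 1*8 + 1*4 + 0*2 + 1*1 = 13.
    print(bin(13))         # '0b1101'
    print(int("1101", 2))  # 13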
The threat stems entirely from the fact, by now well known, that computer programs frequently use two digits to represent years, such as 98 for 1998.
Cutting corners by using two digits to signify the year was what got us into this problem in the first place.
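One remediation commonly applied to such two-digit data (an illustration only, not something the sentences above prescribe) was a "windowing" rule that guesses the century from a pivot value; a minimal sketch with a hypothetical expand_year helper and an arbitrarily chosen pivot of 70:

    def expand_year(yy: int, pivot: int = 70) -> int:
        # Two-digit values at or above the pivot are read as 19xx,
        # values below it as 20xx.
        return 1900 + yy if yy >= pivot else 2000 + yy

    print(expand_year(98))  # 1998
    print(expand_year(5))   # 2005 -- wrong if the record really meant 1905
    print(expand_year(70))  # 1970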
Take any four-digit number, using at least two different digits.