Question
The famous Y2K bug occurred because some old programs used two decimal digits to store years. This became a problem in the year 2000, because such programs had no way to tell whether "00" meant 1900 or 2000.
A similar problem will occur for Java programs when the number of milliseconds since the beginning of 1970 exceeds the capacity of a long. In what year will this occur, given that the maximum value of a long is 9,223,372,036,854,775,807? What if getTime() returned an int, which has a maximum value of 2,147,483,647? What about those UNIX/C systems which use an int to store the number of seconds since the beginning of 1970?
Explanation / Answer
Part 1: Overflow of the time value when a long is used (Java's getTime() returns milliseconds since the beginning of 1970):
Maximum value of a long: 9,223,372,036,854,775,807
Number of milliseconds in one year = 365 * 24 * 60 * 60 * 1000 = 31,536,000,000 ms
Years until overflow: 9,223,372,036,854,775,807 / 31,536,000,000 ≈ 292,471,208 years
So a long will not overflow until roughly 292 million years after 1970, around the year 292,473,178.
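As a quick check on that arithmetic, here is a small sketch (not part of the original answer; the class name is just for illustration) that performs the same division using Long.MAX_VALUE:

public class LongMillisOverflow {
    public static void main(String[] args) {
        long maxMillis = Long.MAX_VALUE;                  // 9,223,372,036,854,775,807 ms
        long millisPerYear = 365L * 24 * 60 * 60 * 1000;  // 31,536,000,000 ms in a 365-day year
        long years = maxMillis / millisPerYear;           // about 292,471,208 years
        System.out.println("Years until overflow: " + years);
        System.out.println("Approximate overflow year: " + (1970 + years));
    }
}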
Part 2: If an int were used to store the milliseconds instead, the maximum value is 2,147,483,647 ms. That is only 2,147,483,647 / 31,536,000,000 ≈ 0.068 years, or about 24.8 days, so the value would overflow before the end of January 1970 and never even reach 1971.
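A similar illustrative sketch (again with a hypothetical class name) shows how quickly Integer.MAX_VALUE milliseconds runs out:

public class IntMillisOverflow {
    public static void main(String[] args) {
        long maxMillis = Integer.MAX_VALUE;        // 2,147,483,647 ms
        long millisPerDay = 24L * 60 * 60 * 1000;  // 86,400,000 ms per day
        double days = (double) maxMillis / millisPerDay;
        System.out.println("Days until overflow: " + days);  // about 24.8 days
    }
}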
Part 3: On UNIX/C systems that use an int to store the number of seconds since the beginning of 1970:
Years until overflow: 2,147,483,647 / 31,536,000 ≈ 68.1 years
So the overflow occurs about 68 years after 1970, in the year 2038 (the well-known Year 2038 problem).
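Here is a hedged sketch of the same computation for the 32-bit seconds counter (class name chosen only for illustration):

public class IntSecondsOverflow {
    public static void main(String[] args) {
        long maxSeconds = Integer.MAX_VALUE;        // 2,147,483,647 seconds
        long secondsPerYear = 365L * 24 * 60 * 60;  // 31,536,000 seconds in a 365-day year
        long years = maxSeconds / secondsPerYear;   // about 68 years
        System.out.println("Overflow year: approximately " + (1970 + years)); // 2038
    }
}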