RE: Year 2000 computer problem!

Patrick Rodgers <prodgers(--nospam--at)earthlink.net> wrote:

>This does not deal with leap years correctly.  If I remember correctly,
>the magic year is set to 1972, and then a similar addition/subtraction is
>performed.  You still need to deal with the dates past 2049.  I know that
>is not our problem, but that is the thinking in the late 1970s and 1980s
>that caused this problem.

Good point regarding the leap years; I hadn't considered that.  If 1972 is
the magic year at which all the leap years line up properly, then the
solution might look like this:

1. Subtract 72 from the two-digit year code in all your records.  (This
may take some work - perhaps a macro could be written to do it
automatically.)

2. Insert the following logic in your program everywhere the year is
input as a two-digit code (a C sketch of steps 2, 3, and 5 follows
this list):

	If YEAR >= 72, Then YEAR = YEAR - 72
	If YEAR < 72, Then YEAR = YEAR + 28

3. Insert the following logic in your program everywhere the year is
retrieved as a two-digit code:

	If YEAR >= 28, Then YEAR = YEAR - 28
	If YEAR < 28, Then YEAR = YEAR + 72

4. Insert a warning that all dates must be between 1972 and 2071.

5. If your program prints "19" or adds "1900" to the year code when it
is displayed or printed, then you need to insert logic so that it prints
"20" or adds "2000" instead whenever the two-digit year is less than 72.


Regarding your other point, I think that all of these programs will be
obsolete by the year 2049, let alone 2071.  These simple changes provide
an immediate fix to the Y2K problem for those programs that cannot
easily be changed from a two-digit to a four-digit year format.  Of
course, the long-term solution, at least until the year 9999, is to
write the programs to take four digits for the years.
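
As a quick check on the 1972 "magic year" claim: assuming the old code
tests leap years with a plain divide-by-four on its two-digit value, the
offset stored after step 1 passes that test exactly when the real year
is a leap year.  This works only because 2000 is a leap year and neither
1900 nor 2100 falls inside the 1972-2071 window.  A small C program (my
own, just to verify) confirms it:

	#include <stdio.h>

	/* Full Gregorian leap-year rule. */
	int is_leap(int year)
	{
	    return (year % 4 == 0 && year % 100 != 0) || (year % 400 == 0);
	}

	int main(void)
	{
	    int year, mismatches = 0;

	    for (year = 1972; year <= 2071; year++) {
	        int offset = year - 1972;   /* value stored after step 1 */
	        if ((offset % 4 == 0) != is_leap(year))
	            mismatches++;
	    }
	    printf("%d mismatches in 1972-2071\n", mismatches);  /* prints 0 */
	    return 0;
	}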


Michael S. Davis, P.E.