TechWhirl (TECHWR-L) is a resource for technical writing and technical communications professionals of all experience levels and in all industries to share their experiences and acquire information.
For two decades, technical communicators have turned to TechWhirl to ask and answer questions about the always-changing world of technical communications, such as tools, skills, career paths, methodologies, and emerging industries. The TechWhirl Archives and magazine, created for, by and about technical writers, offer a wealth of knowledge to everyone with an interest in any aspect of technical communications.
Subject: Re: TW role in Y2K issues
From: Maynard Hogg <maynard -at- GOL -dot- COM>
Date: Sun, 11 May 1997 08:07:29 +0900
At [Wed, 7 May 1997 14:10:58 -0700]
Steve Fouts <stefou -at- ESKIMO -dot- COM> spouted forth:
> Most IBM compatible PCs are incapable of handling the change.
> So any PC software that uses the system clock to tell it what
> day it is could be in a world of hurt. Interestingly, Mac OS
> was designed in 1984 to correctly handle dates until the middle
> of the next century.
Typical nonsense from a rabid Mac lover! (For the record, I have an
ancient Mac IIvi.)
(a) The DOS real-time clock (hardware) date functions (Int 1Ah, funcs 04h
and 05h) use four BCD fields: century, year, month, and day.
(b) The MS-DOS kernel system date functions (Int 21h funcs 2Ah and 2Bh)
use a 16-bit year value starting at 1980 (a decade later than Unix's 1970
epoch) with a built-in limit of 2099 (119 years). (Hopefully, DOS--and
Windows--will be long gone by then.)
(c) MS-DOS directory entries (Int 21h funcs 5700h and 5701h) reserve
seven bits (bits 15-9) for the year since 1980 (i.e., enough for 128
years).
There's nothing wrong with the operating systems or (despite the
woefully ignorant scaremongering in the press) the hardware. It's lazy
programmers who are at fault.
> Beyond that, I get to do a few updates as the programs switch from YY
> to YYYY, and the rest of the headache is very firmly in IS. Just where
> are we going to find space in a packed data array to store the
> century? Sure glad it's not my job.
Here Steve falls into the same trap that caused the problem in the first
place: just because the full year requires four digits/characters in its
*ASCII* representation (1997 = 31393937h) doesn't mean you have to
store it that way inside the computer.
When I first studied computers in the Sixties, the problem was solved on
IBM mainframes with binary coded decimal, which requires only half as
many bytes--i.e., 1997 = 1997h. (I hope it's obvious how simple it is to
pack and unpack such numbers. Note also that BCD support lives on in the
Intel 80x86 architecture.) Alas, generations of database programmers
have failed to take advantage of even this, preferring the simple
expedient of storing only the last two digits of the year in ASCII--hence
the problem.
Not content with a year range spanning 10,000 years (0000h-9999h) in
only two bytes, Unix went to plain binary, which covers more than
65,000 years in the same two bytes. (At a price: all the converting back
and forth to human-readable format takes time.)
===
Maynard Hogg
#306, 4-30-10 Yoga, Setagaya-ku, Tokyo, Japan 158
Fax: +81-3-3700-7399
Internet: maynard -at- gol -dot- com http://www2.gol.com/users/maynard/
Unsolicited commercial electronic mail sent to this address will be
proofread at a cost of US$200/hour (half-hour minimum).