
Some of America’s top computer programming talent will gather in Chicago Tuesday to deal with a looming crisis over an issue that usually is left to the clergy.

We’re talking end-of-time millennium-type stuff here, or at least a digital gnashing of teeth among the world’s computer database droids as the year 2000 dawns.

The topic is known in computer industry circles as nothing less than the “Year 2000 Holocaust,” the meeting’s sponsor, a California-based consulting company called ADPAC Corp., noted in a press release announcing the session.

This rather strange crisis, which is much more pressing than many of the potential victims seem to realize, most directly afflicts big-time mainframe computer operations.

But, as we’ll soon see, it also has much to tell the rest of us about the strange state of the personal computing scene in this era of massively upsized computing power in a massively downsized workplace.

There is, it turns out, substantial fear and trembling in the ranks of the data-processing departments at some of the world’s greatest financial and business institutions about what’s going to happen one clock tick past midnight on Dec. 31, 1999.

At that instant, unknown billions of computer records, dealing with everything from mortgages to pay raises, driver’s license renewal times to medicine expiration dates, suddenly will carry incorrect data.

The problem is that, starting in the 1960s, computer experts spent nearly three decades writing programs in which precious mainframe memory space was saved by expressing the years in calendar dates as just the last two digits. Thus 95 stands for 1995.

This scrimping to save two bytes or 16 bits of memory seems ludicrous today when even the cheapest home computer comes out of the store with 4 million bytes of Random Access Memory, which is space for 32 million bits.
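The space-saving trick, and the ambiguity it creates, can be sketched in a few lines of code (a hypothetical illustration of the practice, not code from any actual mainframe system):

```python
def two_digit_year(year):
    """Store a year the way many 1960s-era programs did: last two digits only."""
    return year % 100

# 1995 and 2095 collapse to the same stored value -- the century is lost.
print(two_digit_year(1995))  # 95
print(two_digit_year(2095))  # 95
```

Once the century is thrown away at storage time, no later program can recover it from the data alone.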

Bits, it should be remembered, are the 0s and 1s that are the only language computers understand. Computing power thus is best measured in terms of how many bits the machine can pump out in the form of data or on-screen displays of text and pictures.

The enormous abundance of bits coursing through the innards of a modern-day desktop personal computer lies at the heart of the computer/information revolution.

But when the revolution dawned in the era of lumbering air-conditioned mainframe giants, memory was precious and economies like just using the last two digits for the year were commonplace.

Thus Sunday, Feb. 26, 1995, is expressed as 2-26-95 on the great bulk of the 6,000 mainframe computers now being operated by major businesses and government agencies around the world.

But after the millennium arrives, that 95 at the end could either be 1995 or 2095. The expression 2-26-95 thus becomes worthless. Actually, it’s worse than worthless. It’s a hazard.

If your retirement date is the year 2010, the computers at many of the places that hold your records likely will show you worked for a mere 10 years in your entire life.
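The arithmetic failure is easy to demonstrate. Suppose a personnel program computes years of service by subtracting two-digit year fields (a hypothetical sketch of the logic, not code from any real system):

```python
def years_of_service(hire_yy, current_yy):
    # Two-digit subtraction, as many legacy programs did it.
    return current_yy - hire_yy

# An employee hired in 1965, checked in 1999: correct.
print(years_of_service(65, 99))  # 34

# The same employee checked in 2000, stored as "00": nonsense.
print(years_of_service(65, 0))   # -65
```

Depending on how a given program handles the negative result, the record shows an absurdly short career or an outright error.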

Likewise, some corporate billing systems will wake up Monday morning, Jan. 3, 2000 (New Year's Day 2000 falls on a Saturday) and note that their computer records show some of the debt has been on the books for more than 100 years.

The computers then will mark these valuable receivables as lost causes and suspend all further billing and collection efforts.

This sort of stuff obviously is major league bad news for banks, insurance companies, brokerage houses, credit card companies, government agencies and legions of other places where computer programmers traditionally have saved precious memory space by abbreviating the year in their data.

In a recent technical paper on the problem in the journal Application Development Trends, William Ulrich, president of Tactical Strategy Group Inc., a software consulting company, described this post-millennial scenario:

“When credit card systems check expiration dates during approval processing, they will show an expiration date that is prior to the current date and reject the transaction. People will not be able to buy on credit.

“Insurance systems will reject claims upon determining that policies have expired. Motor vehicle departments will experience a rash of expired driver licenses. . . .”
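Ulrich's credit card scenario boils down to a simple comparison that goes wrong. A minimal sketch of the logic, assuming two-digit year fields (not code from any actual approval system):

```python
def card_is_valid(expiry_yy, current_yy):
    # Approve only if the card has not yet expired -- two-digit comparison.
    return expiry_yy >= current_yy

# December 1999: a card expiring in 2001, stored as "01",
# looks as if it expired 98 years ago.
print(card_is_valid(1, 99))  # False -- transaction rejected
```

The card is perfectly good; only the comparison, blind to the century, says otherwise.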

OK, you say. So if it’s broke, then let’s just fix it.

Well, it turns out it isn't that easy.

Peter Harris, chief executive of ADPAC, the sponsor of the Chicago meeting, explained that enormous amounts of computer code written over the last three decades carry the bad date information.

Fixing it will require a process called recompiling, and recompiling has much in common with putting the toothpaste back into the tube.

A typical business-strength mainframe computing application can contain a million lines or more of the coded instructions to the machine that were written by the original programmers.

To fix software that carries the millennium bug, the code must be run through a decompiling program that converts the computer code into languages like Cobol in which it originally was written.

These languages allow programmers to write instructions in something resembling English. Once the instructions are written in human form, they are compiled into 0s and 1s.

So once the code that was compiled with the millennium bug written into it is decompiled, it must be fixed to allow four digits rather than two in what is called the date field.

Then all places in the software where the date field was used to perform other calculations must be fixed to accommodate four digits rather than two.

Finally, the program must be recompiled back into the binary code of bits that the computer understands and reloaded into the machines.
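Once the source is recovered, the date fix itself can take one of two forms: widen every two-digit field to four digits, or apply a "windowing" rule that infers the century from the stored value. The sketch below shows windowing with an assumed pivot of 50 (the pivot choice is hypothetical; real systems picked values suited to their own data):

```python
def expand_year(yy, pivot=50):
    """Infer the century from a two-digit year using a fixed pivot.
    Values below the pivot are read as 20xx, the rest as 19xx."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(95))  # 1995
print(expand_year(10))  # 2010
```

Windowing avoids reformatting every stored record, but it only postpones the ambiguity: a pivot of 50 misreads any genuine date before 1950 or after 2049.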

Harris and Ulrich said that because of the sheer volume of work to be done, most experts believe it will take several years to review the software now in use and find the bad date fields.

And these experts note that the current climate in business, particularly in already brutally downsized computer departments, works against finding a solution until after the computational chickens come home to roost.

“The average CIO (corporate information officer) doesn’t even last five years on the job, and this person probably isn’t going to improve longevity by telling the board that something is broken and needs to be fixed when the board wants to find new ways to make money,” Harris said.

Thus, as the years ticked by, computer executives kept putting off any millennium fixes on the theory that things would somehow work out even if they didn’t repent. I told you this was a job for the clergy.

In recent years, for example, CIOs embraced downsizing their mainframe operations into networks of personal computers, and predicted that by the time the century turned, the mainframes would be in landfills.

But studies by the prestigious Gartner Group now show the bulk of the 6,000 mainframes still in heavy use will continue to serve as the heart of American business well past 2000.

Meanwhile, the programmers who started things rolling in the 1960s knew they would be able to retire when the millennium makes the software they wrote go beddy bye. If you were 30 in 1965, for example, you will turn 65 in 2000.

Of course, the computer at work will show that you just started with the company and cancel your pension. But, hey, nobody ever said we wouldn’t have to pay the price for progress.

———-

Tribune computer writer James Coates can be reached via the Internet at jcoates1@aol.com