Chicago Tribune

The threat of algorithm sabotage becomes vivid in light of an actual case of an algorithm gone bad, one of the most important of the many high-tech lessons in naval warfare learned from the 1982 British-Argentine Falklands War. During that war, a vital flaw surfaced in the guidance software controlling one of the basic systems British warships relied on for defense against aerial attack: The guidance algorithm could not cope with a combat situation involving two Argentine aircraft attacking along closely parallel courses. Faced with the dilemma of which plane to shoot at first, a contingency apparently unexpected and hence unplanned for, the software simply shut down the defense system, leaving a British frigate exposed to a bomb hit even as it watched another ship it was escorting burn and sink in the icy South Atlantic. Although this failure arose from simple oversight, not sabotage, its effect was no less severe.
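The kind of failure described above can be sketched in a few lines. What follows is a hypothetical illustration, not the actual guidance code: a naive target-selection routine that assumes there will always be a single most urgent threat, and has no rule for breaking a tie between two equally threatening attackers.

```python
# Hypothetical sketch of a target-selection algorithm with an error of
# omission: the tie between two equally urgent threats is unplanned for.

def select_target(threats):
    """Pick the single most urgent threat (smallest time-to-impact).

    Returns None -- in effect, the system gives up -- when two threats
    are indistinguishable, the contingency the designers never planned for.
    """
    if not threats:
        return None
    best = min(threats, key=lambda t: t["time_to_impact"])
    ties = [t for t in threats if t["time_to_impact"] == best["time_to_impact"]]
    if len(ties) > 1:
        return None  # the unhandled case: the defense simply shuts down
    return best

# A lone attacker is handled; two on closely parallel courses are not.
print(select_target([{"id": "A", "time_to_impact": 12.0}]))
print(select_target([{"id": "A", "time_to_impact": 12.0},
                     {"id": "B", "time_to_impact": 12.0}]))
```

Note that every test against a single attacker would pass; only the two-attacker geometry exposes the omission, which is exactly why such flaws can survive until combat.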

In Clancy's "Red Storm Rising," a fictional account of contingency planning for a future clash between U.S. and Soviet naval forces, the U.S. aircraft carrier Nimitz is knocked out of the war by the same kind of glitch in a shipboard antimissile system. In both the real and the fictional cases the software code itself was flawless. It was the underlying algorithm that was fatally flawed, through a subtle error of omission, not commission.

If such a crucial accidental flaw went undetected until after the British went into combat in the Falklands, what sort of subtle but potent defects could be deliberately built into, and hidden inside, the far more elaborate software needed to coordinate the full sweep of SDI? And who would be better suited to devise and implant such bugs than an SDI insider?

It is not even necessary to attack the software of SDI's "battle management" system, whose designers and builders are likely to be among the most closely guarded and compartmentalized, for security purposes, in the SDI development effort. More indirect sabotage could be just as effective. One possible avenue is neutralizing the "testing" software intended to check out the final operational software. As past experience with seemingly unending major glitches in the U.S. Worldwide Military Command and Control System (WWMCCS) suggests, an overlooked unplanned bug may have consequences just as unpleasant as those of a deliberately planted one.

Significantly, testing software is generally excluded from the widely quoted figure of 10 million lines of code as a measure of SDI software's complexity. When testing needs are fully accounted for, the true scale of SDI software may be much larger. In the case of the earlier SAGE air-defense system, for example, only about one-quarter of its software was devoted to supporting actual air-defense operations; the other three-quarters served testing and other ancillary purposes.

Still other software is needed to operate key subsidiary systems. Accidental computer glitches have repeatedly forced postponement of U.S. space-shuttle launches. How many shuttle-type launches, and supporting software programs, might ultimately be needed to create and keep operational the complex network of orbiting battle stations and space-based sensors on which SDI's performance will critically depend?

The extreme complexity of SDI's software in all its aspects suggests that significant bugs may be virtually impossible to trace, even after some future software saboteur is caught. Software engineering environments have been likened to oceans, and as the last 50 years of antisubmarine warfare make clear, in such complex environments it is frequently far easier to hide things than to find them. Very possibly a future SDI software saboteur, working with the most expert advice the other side can provide, may not himself know the full extent or ramifications of his actions. Logic bombs may not surface for years, or only when actual fighting begins.
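Why a dormant logic bomb is so hard to find can be suggested with a deliberately trivial sketch. Everything here is invented for illustration: the point is that the sabotaged routine looks like ordinary code and returns correct answers in every peacetime test, because its trigger condition is never true until combat.

```python
# Hypothetical illustration of a dormant logic bomb: the routine is
# correct in all testing; the planted condition fires only in wartime.

def compute_intercept_time(range_km, closing_speed_kms, wartime=False):
    """Time (seconds) until a threat at range_km closes at closing_speed_kms."""
    if wartime and range_km > 100:       # dormant trigger, never hit in tests
        return float("inf")              # quietly returns a useless answer
    return range_km / closing_speed_kms  # correct result in every peacetime run

print(compute_intercept_time(150.0, 3.0))                 # peacetime: 50.0
print(compute_intercept_time(150.0, 3.0, wartime=True))   # trigger fires: inf
```

A real implant would be far better hidden than an explicit flag, but the structural lesson is the same: test suites exercise only the conditions their authors anticipate.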

Certain forms of "graymail" may also pose a problem when a future software saboteur is taken into custody. A particularly messy situation would arise if an inside software saboteur were discovered midstream in SDI's development, say, well after tens of billions of dollars had been invested in the system. Few people would be likely ever again to trust it in full.

Another great natural arena for software warfare is the vital field of computerized logistics, which is becoming central to warfare ranging from low-intensity conflict to nuclear war.

Although SDI is geared to the missile age, the art and science of logistics have been an integral part of warfare since Caesar invaded Gaul and George Washington wintered at Valley Forge. Broadly defined as the creation and sustained support of combat forces out of the economy of a nation, logistics ranges from the building, deploying and repairing of weapons to the equipping, feeding and supplying of troops in the field. Bad logistics can be just as disastrous as bad strategy or tactics. When some U.S. artillery units landed in Africa during World War II, for example, they discovered that their boxes of gunsights had been loaded on another ship, which headed back to the United States without unloading them, leaving the artillery units with a net military value of close to zero.

With more than 5 million items from bayonets to lasers represented in existing cataloging programs of the Defense Integrated Data System (DIDS), logistics management for the U.S. Department of Defense is widely recognized as one of the most data-intensive operations in history. Computer hardware and software are being developed to make sure that snafus like the missing gunsights do not recur in a future war. Crucial decisions about the shape of U.S. commitments to computerized logistics are now being made that will have far-reaching effects on U.S. operational readiness and combat effectiveness well into the next century. The major Computer Aided Logistic Support (CALS) proposals of 1984-85, while emphasizing computerization to aid future weapons design, also give one of the best general statements of a comprehensive vision of the fully computerized logistics of the future. This proposed CALS system would unify all existing U.S. military and defense organizations into a single giant computer network for storing and passing on logistics data. The beginnings of this massive effort are already well launched, with some existing logistic computer systems involving 55,000 industry suppliers and 140 government agencies.

Given the immense diversity of people, organizations, technologies and existing partially computerized systems that this CALS proposal seeks to tie together and coordinate, its "paper-free" vision of future logistics is just as ambitious as SDI's concept of strategic defense. To illustrate just one paper-free transaction, suppose a U.S. combat aircraft lands at an overseas base with its electronic defensive systems out of commission. To remedy the situation, a technician promptly uses the CALS computer system to obtain needed service data, stored in the system in purely electronic form. The fault is promptly identified with the aid of advanced software of the "expert system," or "artificial intelligence," kind, now also being discussed for CALS. Another query placed through the computer's "universal spare-parts catalog" digital identifier pinpoints the nearest facility holding the precise replacement part needed to make the repair. The order for the part is then logged into the computer, and, much more rapidly than would be possible today, the part arrives and the aircraft is again ready to fly its missions. Simultaneously, a computer report of the entire servicing transaction flows back to the U.S., dynamically updating databases maintained there to plan future production and logistics support.
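The transaction just described reduces to a short chain of lookups and updates. The sketch below is purely illustrative, with every fault name, part number, depot and function invented for the purpose; it is not a description of any actual CALS design, only of the shape of a paper-free workflow.

```python
# Hypothetical sketch of one "paper-free" CALS-style transaction:
# diagnose, locate a part, order it, and report back -- all by computer.

FAULT_GUIDE = {"ECM jammer dead": "PN-4471"}         # expert-system stand-in
PARTS_CATALOG = {"PN-4471": ["Ramstein", "Yokota"]}  # universal catalog stand-in

def service_aircraft(fault, base, report_log):
    part = FAULT_GUIDE[fault]              # 1. diagnose from electronic service data
    depot = PARTS_CATALOG[part][0]         # 2. pinpoint nearest stocking facility
    order = {"base": base, "part": part, "from": depot}
    report_log.append(order)               # 3. the report that flows back to U.S.
    return part, depot                     #    planning and production databases

log = []  # no paperwork: this list is the entire record of the transaction
print(service_aircraft("ECM jammer dead", "Lakenheath", log))
```

The very features that make this attractive, one shared record driving diagnosis, ordering and planning, are also what give a saboteur with write access to that record such leverage, the point the following paragraphs develop.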

In a manner that many veterans of the U.S. military may find almost surreal, this entire transaction would generate no paperwork at all. The computer would hold and channel all necessary information.

Implemented with appropriate caution, this comprehensive CALS vision has many merits. But even as these broad policies are laid down and commitments crystallize, often in decentralized ways, it is crucial to recognize that the ease of extracting and manipulating computerized logistic data also creates a host of new targets for software warfare.

Some of these targets are of the siphoned-supplies kind that caused the U.S. Army so much trouble in South Korea. Even more recently, a smuggling ring was discovered to be siphoning off spare parts for the U.S. Navy F-14 jet for clandestine shipment to Iran, an activity a Navy audit later found to be possible in part because of computer deficiencies in an inventory-control system used by U.S. Navy aircraft carriers.

Because margins of success in combat are often thin, even a simple form of software warfare directed against supply systems–for example, successfully diverting spare parts from combat units–could make the difference between victory and defeat in a future war.

Increasing the odds that inside saboteurs could be placed successfully is the vast scale of future computerized logistics systems. (CALS itself is expected to spill over into civilian telecommunications channels because the existing Defense Data Network will not be able to handle the projected CALS data volume.) After all, unlike software for a weapon system or for related command, control and communications (C3), logistics software is intended to interface with broad sectors of the civilian economy, involving day-to-day access by tens of thousands of users, from field commanders to factory managers and technicians. A 1977 General Accounting Office critique of a computerized logistics system far less ambitious than CALS highlighted many management complexities that CALS, too, will face, to a much greater degree. Just the highly dynamic problem of keeping authorized CALS user lists up to date is by itself formidable and gives rise to a "security problem of unprecedented dimensions," in the words of the major CALS planning study.

Moreover, CALS' comprehensive agenda affords far more opportunities for algorithm and software sabotage than just diverting spare parts. As engineers come to rely ever more broadly on the tools of CAD/CAM (computer-aided design and computer-aided manufacturing), those tools, too, present opportunities for sophisticated sabotage of weapons in both their design and manufacturing stages.

An inside saboteur might also try to turn a logistic computer network's dynamic strengths against themselves, creating in the network something analogous to a potent software virus capable of generating clouds of false or misleading logistics reports. These reports would attempt to maximize Clausewitz's famous "fog of war," confounding planning and encouraging misdirection of scarce military assets. And in the style of the logic bomb, this virus might be seeded in peacetime and held in reserve by an adversary for a future military crisis. For this reason, software warfare holds the promise of being the first truly militarily effective type of economic warfare.