Over some forty years, the world's nuclear weapons systems have evolved to a state of near hair-trigger sensitivity, owing in part to the complexity of the systems and to the history of international tension. A high premium was placed on speed in these weapons' launch and response systems. Some of these tensions remain: while the United States views the condition for stability as superiority, Russia views it as parity. Added to this are the U.S. assumption of moral superiority and its self-assumed role as global judge, jury, and police in the new world order.
In an environment of distrust, the autonomy of a weapon system already subject to the inherent technological problems of speed, size, and complexity increases the probability of accidental nuclear war. Even if this probability is very small, any risk is unacceptable, because the consequences are so great: a nuclear war would represent the highest possible cost for the entire world. There are several classes of accidental nuclear war; this paper deals with accidents in the information systems attendant on nuclear forces. Such accidents, however, are not independent of human error or deviant behavior.
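The force of this argument is easiest to see when a small annual probability is compounded over decades. The sketch below is illustrative only; the 0.1% annual figure is a hypothetical assumption, not an estimate from this article.

```python
# Illustrative only: how a small annual probability of accidental
# nuclear war compounds over time. The 0.1% annual figure is a
# hypothetical assumption, not an estimate from this article.
def cumulative_risk(annual_probability: float, years: int) -> float:
    """Probability of at least one accident over the span, assuming
    independent, identically risky years."""
    return 1.0 - (1.0 - annual_probability) ** years

for years in (10, 40, 100):
    print(f"{years:3d} years: {cumulative_risk(0.001, years):.1%}")
# 10 years: 1.0%; 40 years: 3.9%; 100 years: 9.5%
```

Even a risk that looks negligible in any single year accumulates steadily, which is why any non-zero probability is treated here as unacceptable.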
Launch on Warning (LOW): The Ultimate Folly
The system most vulnerable to triggering a nuclear war appears to be the Command, Control, Communications, and Intelligence (C3I) system, since it has, in effect, become a surrogate decision-maker. The vast amount of data to be processed has led to the evolution of a huge network of electronic-computer-satellite-radar command and control systems. These systems are programmed to warn of a nuclear attack, to verify it, and to operate the counter-attack and defences. The system's defensive capability rests on validation of an attack by two independent sensor types, satellites and radars (a requirement termed "dual phenomenology"). Among the reasons for potential failure are the sensitivity of the system to international tension, i.e. to the quantity of information flow; the complexity of the system, with its potential for single or multiple link failure; and the fact that it is a prime target. Destroying the C3I system is called a "decapitation strike" and is a preferred first-strike strategy, because it destroys the brain of the body politic, eliminating communications to individual nuclear force commanders. This kind of attack, combined with a "counter-force strike" against long-range missiles, is the essence of the United States' war-winning strategy, begun under Reagan and continued under Clinton. The system is also vulnerable to an electromagnetic pulse attack, initiated by the explosion of a large thermonuclear device in space and leading to massive disruption of all electronic systems. While hardening these systems is possible, the cost seems prohibitive, and the civilian links to C3I strongly resist alteration.
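Dual phenomenology rests on a simple probabilistic argument: if the satellite and radar channels are truly independent, requiring both to agree before a warning is treated as valid multiplies their false-alarm rates together. A minimal sketch, with hypothetical per-sensor false-alarm probabilities chosen purely for illustration:

```python
# A minimal sketch of the "dual phenomenology" rule: a warning counts
# only if two independent sensor types agree. The per-sensor
# false-alarm probabilities below are hypothetical illustrations.
P_FALSE_SATELLITE = 0.01  # chance the satellite channel reports a phantom
P_FALSE_RADAR = 0.01      # chance the radar channel reports a phantom

def validated(satellite_warning: bool, radar_warning: bool) -> bool:
    """An attack is treated as confirmed only when both channels agree."""
    return satellite_warning and radar_warning

# Under independence, the combined false-alarm probability is the
# product of the individual probabilities:
combined = P_FALSE_SATELLITE * P_FALSE_RADAR
print(f"combined false-alarm probability: {combined:.4%}")  # 0.0100%
```

The weakness of the scheme is the independence assumption: crisis-driven traffic, a shared software fault, or a common date error can correlate the two channels and defeat the multiplication.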
There is an intrinsic conflict between the priority of rapid response to valid threats and the time required for validation to avoid a false launch. In strategic doctrine an unauthorized or accidental launch is termed a Type I error, while a failure to launch on predetermined targets is a Type II error. Each requires a different set of controls, a different chain of command, and opposite information flows. It is impossible to minimize both types of error simultaneously. The problem has been exacerbated by the introduction of an even more rapid response system.
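This is the familiar trade-off of signal-detection theory: a single decision threshold can be moved to reduce one type of error only at the cost of the other. A minimal sketch, with hypothetical Gaussian sensor readings:

```python
# A minimal sketch of the Type I / Type II trade-off at a single
# detection threshold. The Gaussian parameters are hypothetical.
from statistics import NormalDist

noise = NormalDist(mu=0.0, sigma=1.0)   # sensor reading when no attack
attack = NormalDist(mu=3.0, sigma=1.0)  # sensor reading during a real attack

for threshold in (1.0, 2.0, 3.0):
    type_1 = 1.0 - noise.cdf(threshold)  # false launch: noise exceeds threshold
    type_2 = attack.cdf(threshold)       # failure to launch: signal falls below
    print(f"threshold {threshold:.1f}: "
          f"Type I {type_1:.1%}, Type II {type_2:.1%}")
# Raising the threshold suppresses Type I errors but inflates Type II
# errors, and vice versa; both cannot be minimized at once.
```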
The U.S. nuclear command system has entered the 21st century. REACT (Rapid Execution and Combat Targeting), as it is euphemistically named, has now replaced the 1960s-vintage command-and-control launch system throughout the entire U.S. land-based nuclear ICBM force. REACT is touted as the perfect system for interpreting a presidential directive to launch.
With REACT, all command sequences move at the speed of light rather than depending on voice messages passed between administrative and operational personnel. The cost of REACT is a modest $680 million. By implication, U.S. military strategy has moved to the assured destruction of its enemies.
During a recent CNN interview, former Senator Sam Nunn, a staunch supporter of the policies that led to the present situation, confirmed the danger of accidental nuclear war, asserting that it is greater now than it was during the Cold War (CNN, 8 Feb. 1999). The same CNN program recounted the near-miss of January 1995, when a peaceful Norwegian scientific rocket was launched and activated the Russian early-warning system as if the launch were a Trident strike. The window for the decision to launch is only ten minutes. Russian President Boris Yeltsin took eight minutes to decide it was not a real attack, leaving the world two minutes from doomsday, in this case a launch of Russian nuclear weapons in response to a misidentified radar track.
Bruce Blair, an expert on accidental nuclear war, confirmed that two minutes is insufficient for hot-line mediation. The U.S. Strategic Command in Omaha is not prepared for a Russian mistake, and the current Clinton-Yeltsin agreements on this problem are cosmetic. Russian bases are suffering serious deterioration of their command and control; their radars and satellites are in disarray; soldiers have gone unpaid for long periods, and some units cannot even pay their electricity bills. There is only one solution, Nunn and Blair agreed: to take the nuclear arsenals of both countries off hair-trigger alert, not merely to re-target them as the current agreement provides.
The System is Not Alert
In 1983 alone, "false alerts" or alarms occurred on an average of every two or three days, for a total of about 250. During a political crisis involving the superpowers, obviously the most dangerous period, such false alerts took between six and eight minutes to resolve. The collection, processing, and transmission of the initial signal takes about two to three minutes. If we assume that it takes at least that time to validate this signal, then we again have a threshold of about seven minutes, below which a "launch-on-warning" or "launch-them-or-lose them" posture is adopted. I do not have the space to list here all recorded false alerts. In 1960, for example, one was caused by lunar effects; geese have also caused false alerts. One serious false alert in 1979 was caused by microchip failure and a major false alarm lasting a full six minutes occurred when a technician mistakenly mounted a training tape of a Soviet attack on an operational computer at NORAD in 1980.
If a false alert indicated that a nuclear missile was on a flight path to Washington or Moscow, with a calculated flight time of five minutes, there would barely be time to use the "hot line." The result: a major nuclear war.
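The arithmetic of the decision window can be made explicit. A minimal sketch using the timing figures quoted above (two to three minutes to collect and transmit the initial signal, at least as long again to validate it); the four-minute hot-line figure is a hypothetical assumption:

```python
# A minimal sketch of the launch-on-warning decision window, using the
# timing figures quoted in the text. All values are in minutes; the
# hot-line figure is a hypothetical assumption.
DETECTION_TIME = 3.0    # collect, process, and transmit the initial signal
VALIDATION_TIME = 3.0   # confirm the signal via a second sensor type
HOTLINE_TIME = 4.0      # assumed minimum for meaningful hot-line contact

def decision_margin(flight_time: float) -> float:
    """Minutes left for a human decision after detection and validation."""
    return flight_time - DETECTION_TIME - VALIDATION_TIME

for flight_time in (30.0, 10.0, 5.0):
    margin = decision_margin(flight_time)
    hotline = "possible" if margin >= HOTLINE_TIME else "impossible"
    print(f"flight time {flight_time:4.0f} min: "
          f"margin {margin:5.1f} min, hot line {hotline}")
# A five-minute flight time leaves a negative margin: the warning cannot
# even be validated before impact, which is what forces the
# launch-on-warning posture described above.
```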
The Millennium Problem
Because emerging technologies, more than any other social activity, serve the direct interest of economic growth, there has been an established tendency to ignore their potentially undesirable side-effects, social and environmental. From pesticides to automobiles, this failure of prior assessment has exacted unacceptable social costs. The case of the "Year 2000" problem (Y2K) is both typical of this failure and in some ways unique. While the emergence of cyberspace and the internet was not fully foreseen, the failure to anticipate the impact of coding the year with only two digits appears in retrospect almost inexplicable. This error, coupled with the proliferation of the ubiquitous chip in everything from kitchen devices to intercontinental missiles with nuclear warheads, is potentially a large hazard. Some of these computing devices are deeply embedded in very complex information systems, such as air travel, electrical generating plants, nuclear forces, and financial services. Disruptions in any of these systems could have very serious consequences. It was recently reported that a woman born in 1896 was notified to attend kindergarten next year, a minor error with a portent of things to come.
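The kindergarten story follows directly from the two-digit representation. A minimal sketch of the fault, assuming a legacy system that stores only the last two digits of the year:

```python
# A minimal sketch of the two-digit year (Y2K) fault. A system storing
# only the last two digits of the year cannot tell 1896 from 1996, so
# its date arithmetic silently goes wrong across century boundaries.
def age_two_digit(birth_year_2d: int, current_year_2d: int) -> int:
    """Age as computed by a legacy system with two-digit years."""
    return current_year_2d - birth_year_2d  # wrong across centuries

# A woman born in 1896 ("96"), evaluated in 1999 ("99"):
print(age_two_digit(96, 99))  # 3 -- the system sees a toddler due for
                              # kindergarten; she is actually 103

# After 1 January 2000 ("00") the same arithmetic turns negative:
print(age_two_digit(96, 0))   # -96
```

The same silent wrap-around in an embedded chip that gates an alarm or schedules a maintenance check is what makes the fault dangerous in complex systems.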
Again, as with all technological issues, the Y2K problem has its optimists and pessimists, both the self-serving and the true believers. Excluding the doomsday prophets, two aspects of the problem are largely a matter of agreement in these debates. One is the time factor: we may have waited too long to correct all the problems. The other is that the United States and the other Western industrialized countries are more likely than not to have the problem under control before 1 January 2000, while this is by no means certain for Russia; there is also some residual doubt about the U.S. Department of Defense. The command and control of nuclear weapons is the most significant of the Y2K problems. The existence of two nuclear forces, in the United States and Russia, coupled with a near hair-trigger alert system, is fraught with extreme danger.
When we combine the emergence of the U.S. REACT system, the real probability of false alerts deriving from the Y2K problem, and the political and economic chaos in Russia, there is only one rational option: the de-alerting of U.S. and Russian nuclear forces until it is certain that both systems have totally overcome the problem. The Russian system is the more likely to fail through the Y2K problem. Boris Yeltsin is obviously an ailing leader, and many in Russia are calling for his resignation. Yet even when he was hospitalized in January 1999, he insisted on keeping his finger on the nuclear button. There is no confidence that Russia will have solved the Y2K problem by 1 January 2000.
For all of the above reasons, it is important that the major nuclear powers institute a general "de-alerting" over a period beginning in December 1999 and continuing until the Y2K problem has been solved in all nuclear weapon systems. To this end, we are calling for a worldwide movement of citizens, peace NGOs, and concerned groups to promote such a "de-alerting." Such a movement could parallel Abolition 2000 and use the name De-alert 2000. It would be important for groups such as Generals for Peace and the Canberra Commission to support this movement.
Professor Knelman lives in Vancouver.
Peace Magazine May-June 1999, page 18. Some rights reserved.