Several thousand nuclear weapons in Russia and the USA, and several hundred in four other nuclear weapon states, are still on full alert, with warning systems and some aspects of their launch and control systems run by computers. For 40 years the world has survived all the false alarms and computer malfunctions of this disaster waiting to happen, by luck and good management.
Little has changed since the Cold War ended, despite statements by U.S. and Russian presidents about "de-targeting." Bruce Blair of the Brookings Institution referred in testimony to the U.S. Congress to: "the serious false alarm in January 1995, triggered by the firing of a Norwegian scientific rocket, which for the first time in Russian history triggered a strategic alert of their LOW [launch on warning] forces, an emergency nuclear decision conference involving their president and other national command authorities, and the activation of their famous nuclear suitcases."
From 1 January 2000, the operation of every computer system anywhere that handles a two-digit year will be uncertain. Expert "fixes" cannot be guaranteed to work under every combination of circumstances. Could a malfunction trigger an accidental nuclear war? No expert can guarantee that it could not. Satellite navigation for cruise missiles and intercontinental rockets is one example where a date input is required. Could a test rocket flight, or a cruise missile launched by the U.S. military against terrorists, land in quite the wrong place - or appear to a computer-operated early warning system as if it were going to do so?
I know of no way of estimating whether the increase in the risk will be large or small, but it must be an increase. My interest in this matter is not so much the actual increase of the risk - there has been an unacceptable risk of accidental war throughout the nuclear deterrence era - as that this threat could become the trigger which makes governments of the nuclear powers listen to the concerns of their citizens, and abolish it.
The way to make the risk zero is to remove all warheads from their delivery systems. Computers can issue warnings or even command a launch, but they cannot bolt warheads back in position, and that gives everybody time to stop and think. A major degree of "de-alerting" would go some way toward achieving the same. These things can be done quickly and are verifiable. Fixing all possible computer bugs is going to take much longer than the 15 months that are left, however long they may already have been working on it, and the fixes can never be guaranteed.
We need an international campaign based on well-informed statements. The question has been discussed by the Science for Peace Board, and we plan to talk to Pugwash for a start. Physicians for Global Survival will discuss it with International Physicians for the Prevention of Nuclear War and its affiliates in many countries.
To make the governments of the nuclear weapon states change their policies needs a strong public and NGO movement. For a start we need a powerful statement by respected computer experts. The statement from David Parnas, former Science for Peace president, confirms my view that there is a real problem, with no guaranteed computer solution. You can contact the Science for Peace Board at <firstname.lastname@example.org> or me at <email@example.com> .
Dr. Phillips is a retired radiologist in Hamilton, Ontario.
The newspapers are filled with simple explanations and illustrations of the set of computer program bugs known as the Y2K or Year 2000 problem. These examples are kept simple so that non-programmers can understand them. Unfortunately, such examples make the problem sound as if it would be easy to find and fix. That is not necessarily so. Many of the problems are more subtle than the examples usually published.
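A minimal sketch of the kind of simple example the newspapers print (written in Python for readability; the real Y2K failures lived in COBOL programs, databases, and embedded firmware, and the function name here is illustrative only). An interval computed from two-digit years is correct within one century and goes wrong the moment the current year rolls over to "00":

```python
def years_since(start_yy: int, current_yy: int) -> int:
    """Interval between two two-digit years, as stored in many
    legacy records. Correct within one century; wrong across 2000."""
    return current_yy - start_yy

# A record from 1995 checked in 1999: the interval is right.
print(years_since(95, 99))   # 4

# The same record checked in 2000 (stored as "00"): the interval
# goes negative, as if the record were 95 years in the future.
print(years_since(95, 0))    # -95
```

Published examples stop at this level because it is easy to grasp, which is precisely what makes the problem sound easier to find and fix than it is.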
Computer programs are complex constructions, and they are full of bugs because they are difficult to understand. When a problem is discovered, it often takes weeks to find it and additional weeks to fix it. Very often, the "fixed" program is still not right and requires further repair after the revised program is put into service. The Y2K problem is no easier to fix than other bugs.
It is not always easy to determine whether or not a program is sensitive to dates. Sometimes programs that should not be date sensitive are subject to failure because they have components that were written for other purposes. Sometimes programs that are not sensitive to dates exchange data with programs that are date-sensitive and will fail when those "partner" programs fail. Many of these programs are poorly documented, their authors are no longer around, and there is nobody who understands them.
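As a sketch of one of these subtler failure modes (hypothetical record format, Python for illustration): a program that merely sorts records is not obviously doing date arithmetic at all, yet if a "partner" program supplies dates with two-digit years, records from 2000 sort as though they were the oldest on file:

```python
# Records received from a partner program, dated YY-MM-DD with
# two-digit years (a hypothetical but typical legacy format).
records = ["97-06-15", "99-12-31", "00-01-01"]

# The sorting program contains no date logic of its own, yet plain
# string ordering misplaces the year-2000 record: it sorts first,
# as though it were dated 1900.
print(sorted(records))
# ['00-01-01', '97-06-15', '99-12-31']
```

Nothing in the sorting code mentions a date, so a search for date-handling routines would never flag it; the failure arrives with the data.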
The U.S. military establishment depends on computers for communications, for intelligence, and for control of weapons. Failure of its systems could endanger us all. The U.S. DoD, and other military organisations, owe the public the assurance that they are doing what is needed to examine all of their systems and to repair and test them. The only way to be confident that there will be no serious problems is to disconnect the systems until their behavior in the year 2000 can be observed.
Professor David Lorge Parnas, Department of Computing, McMaster University.