In the 1990s computer analysts began to warn that the date change from 1999 to 2000 at the millennium could cause some computer systems to malfunction. This potential problem, which soon became known as the Y2K (Year 2000) bug, attracted much media attention and a good deal of conspiracy speculation.
Many computer systems designed in the 1970s and 1980s stored the year as only two digits, and so could not represent dates beyond 1999. Industry analysts feared that some computers would not handle the rollover from 1999 to 2000 correctly, and would either produce errors (mistakenly interpreting the date as 1900) or shut down altogether.
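The flaw can be sketched in a few lines. This is a hypothetical illustration (the function name and scenario are invented for the example), showing how arithmetic on two-digit years goes wrong at the rollover:

```python
# Hypothetical sketch of the Y2K flaw: a record stores the year as
# two digits, as many systems of the 1970s and 1980s did to save storage.
def years_elapsed(start_yy: int, end_yy: int) -> int:
    """Naive elapsed-time calculation on two-digit years."""
    return end_yy - start_yy

# A loan opened in 1995, checked in 1999: the answer is correct.
print(years_elapsed(95, 99))  # 4

# The same loan checked in 2000, stored as "00": the rollover makes
# the elapsed time negative, as if the end date were the year 1900.
print(years_elapsed(95, 0))   # -95, not 5
```

Calculations like this fed billing, interest, scheduling, and expiry logic, which is why a two-digit date field could propagate errors far beyond the date display itself.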
The Y2K “bug” was therefore not a computer virus as such, but a design flaw that hadn’t been anticipated in the early years of software development. Governments and businesses around the world spent huge sums of money in the years leading up to the millennium ensuring that their computer systems would be Y2K-compliant; it is estimated that the U.S. government spent $8.5 billion, and U.S. business around $100 billion.
Conspiracy theories took one of two main forms. On the one hand, many on the religious far right who were already concerned about the apocalyptic connotations of the millennium warned that a global meltdown of computer systems caused by the Y2K bug would result in civil mayhem, in turn provoking the U.S. government to declare martial law and the Federal Emergency Management Agency (FEMA) to initiate rumored plans for a complete erosion of citizens' rights with the imposition of the so-called New World Order.
One posting to the Internet newsgroup alt.conspiracy in 1997, for example, warned, “In the year 2000, the United States Government, NATO and its armed forces will cease to exist, as will most of the organized economy, when all of the needed mainframe computers (and a good many others in consequence) become useless.”
The writer went on to predict that “a third to two thirds of the population of Europe and the United States will die of thirst, starvation and disease, anarchy and crime” (Wilson).
These fearful scenarios tapped into the existing apocalyptic, survivalist mentality, and a small but committed number of Americans stockpiled food and took to their basements or their cabins in the woods in the run-up to the year 2000.
The other form that conspiracy theories took was a cynical—some might say clear-sighted—dismissal of the Y2K problem as a panic promoted by the computer industry (and whipped up by the media and a compliant government) in order to fool everyone into spending a fortune on unnecessary technical fixes. This theory was circulated a good deal on the Internet, but was only held by a minority.
However, many more people began to question the scale of the reported problem and the cost of fixing it, and many urban legends and rumors muddied the waters further. One common rumor, for example, suggested that most video recorders would not recognize the date change but would work properly if set to 1972, a year whose calendar corresponds day for day to that of 2000, and that the manufacturers were deliberately keeping this fact quiet. Unfortunately, the warning about the malfunction, the proposed quick fix, and the hint of a conspiracy of silence by the manufacturers were all off the mark.
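The one accurate kernel in the rumor was the calendar fact itself: 1972 and 2000 are both leap years beginning on the same weekday, so their calendars line up exactly. A short check confirms this (the malfunction claim and the conspiracy claim, of course, remain false regardless):

```python
import calendar
from datetime import date

# 1972 and 2000 are both leap years...
print(calendar.isleap(1972), calendar.isleap(2000))  # True True

# ...and both begin on the same weekday (Saturday),
# so every date falls on the same weekday in both years.
print(date(1972, 1, 1).weekday() == date(2000, 1, 1).weekday())  # True

for month, day in [(2, 29), (7, 4), (12, 31)]:
    assert date(1972, month, day).weekday() == date(2000, month, day).weekday()
```

This day-for-day correspondence is why setting a device's clock back to 1972 would at least display the correct weekdays, even though no such workaround was actually needed.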
According to an opinion poll reported in the Wall Street Journal on 7 September 1999, most Americans did not believe that the Y2K bug was deliberately created (as some of the far right claimed), but some 15 percent of Americans believed that "a person and/or company is hiding the solution to the Y2K bug." Of those holding this belief, 60 percent named Microsoft as the guilty party, while most of the rest blamed the government.
In the end, of course, very little happened at the millennium. Some power-generating companies in the United States, for example, reported that their computer systems had briefly malfunctioned, but then the date was quickly reset and the systems were restored; and there were other minor, temporary glitches (for instance, some slot machines at a Delaware race track shut down, but the problem was quickly fixed [CNN]).
It became apparent very soon that there was to be no global computer systems crash, although some commentators—in the tradition of endlessly deferred millennial warnings—pointed out that the problem would not kick in until the return to work after the long bank holiday, and then that the true millennium would not begin until 2001.
Although the first group of conspiracy theorists outlined above had little to say in the wake of the nonevents of 1 January 2000, some of the second group continued to question whether the threat had been unnecessarily exaggerated. The computer industry and the U.S. government, however, insisted that disaster had been avoided only because vast sums of money had been well spent on curing the problem in advance.
As the sociologist Anthony Giddens has argued, in issuing warnings about potential disaster scenarios (the AIDS epidemic and mad cow disease are the obvious examples) the authorities are in a no-win situation: either they downplay the potential danger, but then if something bad does happen they will be accused of a cover-up; or they give full credence to the scientific predictions, but, because people change their behavior accordingly and the disaster is averted, they are vulnerable to accusations that the problem was deliberately exaggerated in the first place.