By AMY L. FAIRCHILD, DAVID MERRITT JOHNS and KAVITA SIVARAMAKRISHNAN
Anxiety: We worry. A gallery of contributors count the ways.
In September of 1873, United States Senator J.R. West of Louisiana received a telegram from his home state whose terse lines spoke of abject desperation:
The people are panic-stricken. All that could have left. The poor are nearly all on our hands; no money in the city treasury. All pecuniary aid will be thankfully received. Fever increasing. (Signed) Samuel Levy, Mayor
A wave of yellow fever had swept through Shreveport, leaving in its wake a gash of death and disorder. It was one of many unwanted visits from Yellow Jack in the years after the Civil War — a plague whose cause was unknown but popularly connected to the exchange of infected bedding and clothing. What was certain was that death from yellow fever arrived in a horrible fashion: internal hemorrhages brought on by organ failure gave rise to projectile vomiting of a dark mix of mucus and blood. Victims sometimes wandered from their homes as the last stages of delirium set in, and it was not uncommon to find dead bodies in the streets. Corpses were hardly recognizable. Coffins lined the streets.
Social isolation — at all costs — was embraced as the means of defense. Citizens living in uninfected areas sometimes took up arms to impose shotgun quarantines to fend off outsiders. In Jackson, Miss., residents ripped up railroad tracks leading into the city.
“Indignation is at fever heat here,” stated a news account, “and the people say that if necessary … they will burn every bridge between here and Vicksburg.” Terror radiated from the Deep South. An 1888 telegram received by the Post Office Department in Washington, D.C., from the Postmaster in Cairo, Ill., warned that the “country below is in the hands of a howling mob.” It would be more than a decade before Walter Reed and colleagues discovered that yellow fever is transmitted by mosquitoes. In the meantime, panic prevailed.
Much more recently, a very different disease panic struck the nation: earlier this month, news outlets reported that the country was in the grip of three emerging flu or flulike epidemics. “I think we’re right on the cusp of a major flu season, and there’s going to be some panic, unfortunately,” warned one infectious disease specialist. Emergency rooms were mobbed with sick patients. New York State and the City of Boston declared public health emergencies. “The entire country is already in somewhat of a panic about its fevers and runny noses,” asserted New York magazine. “People are starting to panic because of all the news reports,” said the public health director of Natick, Mass.
This winter’s flu scare hardly compares to the panics of the 19th century. Perhaps we have forgotten what a cataclysmic panic looks like. This welcome state of panic amnesia is a credit to our watchful health departments, which for the past 150 years have taken up the difficult task of both disease and panic prevention. But even as health officials sought to manage an “excited and terrified public mind” with swift intervention and precise information, they helped to transform panic into an elusive culprit capable of taking on different guises, of moving in new circles.
The fright-and-flight experience of panic that characterized yellow fever and cholera outbreaks began to change around the turn of the century as scientists identified the microbes that caused disease. Permanent health departments sprouted in cities and towns. The confidence that came with the new germ theory seemed to provide shelter from the panicky days of old. The public health pioneer William Sedgwick summed up the scientific triumph of the bacteriological age: “Before 1880 we knew nothing; after 1890 we knew it all.” For the public this meant a new “freedom from fear,” asserted one news report. “So many things have been done for the protection of the health of mankind that fear is being driven further and further into the background.”
Governments increasingly saw it as their duty to prevent both disease and disease-related panic. During the 1918 flu pandemic, which killed more than 500,000 Americans, the New York City health commissioner, Royal Copeland, wrote that his aim was “to prevent panic, hysteria, mental disturbance, and thus to protect the public from the conditions of mind that in itself predisposes to physical ills.”
While there was plenty of talk about panic in 1918, what is most striking is how little of it was actually happening. Panic was almost a dirty word — a reaction that might be expected from ignorant immigrant communities but not from educated citizens. Doctors from Rhode Island to Louisiana warned people not to give themselves over to “dismal imaginations.” To worry was to provide “a fertile field for attack.” Health officials joined in the chorus: the best preventive was maintaining “a fearless and hopeful attitude of mind.”
The New York City Department of Health proclaimed victory over both disease and panic in the wake of the 1947 smallpox outbreak. With what the health commissioner described as the “intelligent cooperation of the public,” the department successfully administered more than 6 million vaccinations in the space of just a month. Smallpox was limited to 12 individuals; two died. “There has been no panic,” The New York Times reported. “At no time was there any cause to fear an epidemic — such is the vigilance of the Department of Health.”
But even the absence of hysteria could not dispel the nightmare of panic for many officials. Just two years after the 1947 smallpox campaign, the Soviets detonated their first nuclear bomb. Val Peterson, President Eisenhower’s first head of the Federal Civil Defense Administration, made the case that panic was “the ultimate weapon.” Panic could “produce a chain reaction more deeply destructive than any explosive known…. Mass panic — not the A-bomb — may be the easiest way to win a battle, the cheapest way to win a war.” Remarkably, social science research from the end of the cold war through the present age of bioterrorism suggests that people have not been gripped by mass panic for more than a century. What was required, argued a group at King’s College London in 2006, was “dispelling the myth of a panic prone public.”
By the 1970s, to many observers the only place panic seemed to be breaking out was in the statehouse. Swine flu became emblematic of official overreaction. In 1976, the Ford administration’s massive inoculation effort to combat what became known as the “epidemic that never was” — an outbreak that produced only one death — met with derision from both scientists and the press. Even a high-ranking member of the Centers for Disease Control called it a “monstrous tragedy,” and a press post-mortem on the controversy deemed it “a panicky overreaction to a minimal threat.”
Deeply stung by these accusations, the C.D.C. hesitated during the early years of the 1980s, when the sentinel cases of a devastating immune disorder were first reported in young, otherwise healthy gay men. A void in leadership, combined with an unknown, deadly disease identified within an already stigmatized group, actually generated calls for more panic. A California referendum sponsored by the Prevent AIDS Now Initiative Committee — PANIC — sought to pave the way for mass testing and quarantine. Someone, they argued, needed to sound the alarm. This was just part of what one gay rights leader in San Francisco described as a broad-based “right-wing … field day spreading panic and hatred.”
September 11th and the subsequent anthrax attacks became an occasion for both genuine alarm and deep skepticism about the potential abuses of panic. Critics stopped short of using the word “panic” to describe the Bush administration’s plan to inoculate millions for smallpox, but many health experts felt it was “premature” to distribute the vaccine so widely, as one pox researcher told Science in 2002. A prestigious Institute of Medicine committee asked to evaluate the campaign originally titled its report “Betrayal of Trust.” Ronald Bayer, an ethicist who served on the committee, explained that the original title was meant to “indicate the extent to which members believed the smallpox threat had been exaggerated for political ends.”
Indeed, the smallpox campaign was but a piece of what many on the left saw as a broader effort to enact emergency public health and national security legislation in a moment of national panic. “Constructive public health legislation, which must be federal, cannot be carefully drafted under panic conditions,” wrote the health law scholar George Annas in 2002. “When it is,” he concluded, “it will predictably rely on broad, arbitrary state authority exercised without public accountability.”
Panic provides a rationale for action, sometimes overreaction or even manipulation. As such, it is the subject of heated accusation and denial that can create a swirl of confusion and frustration. Nonetheless, some lessons stand out in the long history of panic. There is no basis for imagining that the frenzied 19th-century reactions to disease are a slumbering beast waiting to be roused. Too much government infrastructure and information stand between populations and unfettered panic. But whether it is flu or anthrax or H.I.V., we just can’t seem to shake that age-old specter of the howling mob.
(Anxiety welcomes submissions at anxiety@nytimes.com. Unfortunately, we can only notify writers whose articles have been accepted for publication.)
Amy L. Fairchild is a professor and the author, most recently, of “Searching Eyes: Privacy, the State, and Disease Surveillance in America.” David Merritt Johns is a journalist and doctoral student. Kavita Sivaramakrishnan is an assistant professor and author of “Old Potions, New Bottles: Recasting Indigenous Medicine in Colonial Punjab.” All are with the Center for the History and Ethics of Public Health at Columbia University’s Mailman School of Public Health.