
Those People

by David Raney



As men grow more alike, each man
feels himself weaker in regard to all the rest.

          —Alexis de Tocqueville

In April 1924, Franz Kafka wrote to a friend from the sanatorium where he would die two months later, complaining that no one would speak to him frankly about his illness: "Verbally I don't learn anything definite, since in discussing tuberculosis... everybody drops into a shy, evasive, glassy-eyed manner of speech." Thirty years later, a decade after publishing The Plague, Albert Camus observed at a symposium that euphemism still ruled the discussion of grave diseases:

In our well-policed society we recognize that an illness is serious from the fact that we don't dare speak of it directly. For a long time, in middle-class families people said no more than that the elder daughter had a "suspicious cough" or that the father had a "growth" because tuberculosis and cancer were looked upon as somewhat shameful maladies.

This remains true to a lesser extent today: if cancer is no longer "the C-word" in conversation, AIDS still shows up as a "lingering illness" in obituaries. But in an age in which diseases, particularly contagious ones, seem to be on every magazine cover, paperback rack, and movie marquee, it can't be said any longer that we're shy or evasive about our maladies. In fact, it would be fair to twist Camus's axiom into a contemporary corollary: in our culture we recognize a phenomenon is serious from the fact that we speak of it as an illness.

An era's predominant concerns will be reflected in its metaphors, so it isn't surprising that computer phenomena and nuclear proliferation are referred to as "infectious." Perhaps it's inevitable such analogies sometimes collapse and veer toward the real. When the computer lexicon, for example, borrowed the word "virus" for a self-replicating program that spreads from machine to machine, it spawned a subgenre of science fiction thrillers like Graham Watkins's Virus, in which superviruses transfer themselves from machine to user and threaten millions. This seems a clear case of anxiety over ambiguous boundaries, here between man and machine, boundaries now called into question daily by developments in robotics and AI. Nonfiction titles on high-tech subjects can take nearly as grim and sensational a tone as their fictional counterparts: Silicon Shock: The Menace of the Computer Invasion, or Computer Viruses, Worms... Killer Programs, and Other Threats to Your System. The language and images of technology, medicine, and fearful fantasy are interdependent.

Language itself, for that matter, has often been conceived of as contagious. William Burroughs famously claimed that "language is a virus" long before we spoke of memes, and memoirist Anwar Accawi tells of his parents referring to cancer as "that disease," not out of a concern for euphemistic delicacy but "because they were afraid that saying the word itself would bring the sickness upon them." The Russian literary theorist Mikhail Bakhtin suggested something similar. Language "lies on the borderline between oneself and the other," he said, "exists in other people's mouths... is populated with the intentions of others." Cognitive scientist Steven Pinker draws the analogy even more clearly: language innovation "must spread and catch on like a contagious disease until it becomes epidemic." Borg alert, Star Trek fans: learning is "contagiously spread from person to person [as] minds become coordinated."

These are fairly recent developments, but fears of certain Others, differing by race and class especially, have been cast in terms of contagion ever since the scientific breakthroughs of the 1800s. Germ theory offered a new way to conceive of boundary loss and shifting identity, and while these fears are hardly exclusive to America, the peculiarities of our culture and history have made contagion an irresistible idiom for expressing them.

One way to understand the American reaction to theories of contagion is to view it through the lens of our conflicted ideas about individuals and crowds. Deep in our national mythos is the glorified figure of the loner: a farmer, backwoodsman, cowboy, gold miner, or some other adventurer living by wit and grit a step from the frontier, needing no organized religion or government to show him the way. At the same time, of course, America has struggled from the beginning to define itself geographically, culturally, and politically, to assert some communal identity. Our democratic institutions have tried to reconcile the same conflicting demands of individual and group, avoiding either anarchy or Tocqueville's "tyranny of the majority."

We resist conformity on principle, it seems, but indulge it in practice. Despite expressions of horror about "mass man" during the Red Scare 1950s, for instance, Americans were already growing increasingly homogeneous in behavior and taste. Max Lerner in his 1957 book America as a Civilization caricatured Americans as robots "performing routinized operations at regular intervals":

They take time out for standardized "coffee breaks" and later a quick standardized lunch, come home at night to eat processed or canned food, and read syndicated columns and comic strips. Dressed in standardized clothes they attend standardized club meetings... They are drafted into standardized armies, and if they escape the death of mechanized warfare they die of highly uniform diseases [and] are buried in standardized graves.

It's worth noting there's nothing uniquely American in this fear of conformity. In 1797 Scotsman John Robison was railing against the French Revolution as a conspiracy to reduce mankind to "one undistinguishable mass." Historian David Potter, though, in his 1963 Commonwealth lectures on "Freedom and Vulnerability," suggests one reason America's vaunted individualism so often expresses itself as conformity. "No other nation," Potter said, "has had the same combination of compelling experiences with pioneering, mass immigration, and urbanization—all of which tended to intensify the fear of isolation and the feeling of dependence on the group."

America has nevertheless tended, except during wartime, to emphasize its pluribus rather than its unum. This ambivalence plays a large part in both our fear of contagion and our insistence on using it to talk about other, unrelated fears. Contagion threatens natural borders (self and other) as well as artificial ones (class, race, nation), in the process imposing a paradoxical combination of difference and unity. It alters individuals but dissolves the distinctions between them, offering both a pariah's isolation and a community of shared symptoms.

But if America has always been divided against itself on the value of individual versus group, there's an important distinction to be made, on the group side, between "people" and "masses." Just as the Greek philosopher Proclus distinguished "the people" (a group "united to itself" and worthy of democracy) from "the populace" (an incoherent rabble), so in our rhetoric "the people" are lauded from the very first words of the Constitution while "crowds," "mobs," or "masses" always threaten social disruption. The difference goes beyond semantics, because the second set of terms generally carries a class stigma. Matthew Arnold labeled culture "an internal condition." He felt, as Lawrence Levine observes, that anything producing "a group atmosphere, a mass ethos, was culturally suspect." An 1894 article in Century magazine weighed in, defining the masses as those delighting in "eating, drinking, smoking... dancing, music of a noisy and lively character," etc. Anyone demonstrating "a permanent taste for higher pleasures," Century concluded, "ceases, ipso facto, to belong to the masses."

It is these "crowds" or "masses," not the noble "people," who get themselves cast as contagious. Military historian John Keegan wrote in The Face of Battle that "a crowd is the antithesis of an army" because crowds harbor "potentially infectious emotion which, if it spreads, is fatal" to discipline. John Adams opined two centuries earlier that America with all its open land might avoid the unruliness of crowds: "Where large numbers live in small places," he said, inevitably there are "contagions of madness and folly."

Our imaginative literature has always made the same connections. Nathaniel Hawthorne in "My Kinsman, Major Molineux," writing in 1832 before germ theory had taken hold, nevertheless describes a gathering mob as a disease symptom: "[It was] as if a dream had broken forth from some feverish brain, and were sweeping visibly through the midnight streets." Mark Twain, writing after that seismic shift, links mobs specifically to infection, claiming that "men in a crowd... don't think for themselves, but become impregnated by contagious sentiments uppermost in the minds of all who happen to be en masse." What's at stake here is not behavior so much as identity: madness, fever, or uncontrolled emotions take us out of ourselves, whether violence follows or not. Expressing this purging of identity as a contagious phenomenon acknowledges that disease, like an engulfing crowd, dissolves the fragile membranes by which we distinguish ourselves from others.

Contagion metaphors have not exactly vanished from our lives since then. The media confronts us daily with the "virus" of sexism, road rage, doubt, war, or witchcraft; with "epidemics" of hate, handguns, eating disorders, even historical novels. Virtually no facet of American life is immune to such treatment (as the phrase "immune to" itself suggests), and the phenomenon goes beyond popular journalism. Essayist William Zinsser remarks that our Alamo martyrs are "immune to the virus of revisionism." Roger Shattuck in Forbidden Knowledge shares imagery with early 20th-century censors—though he presents a more sophisticated argument than the book banners and burners—when he explicitly compares stories that appeal to our violent or prurient interests to "bacterial and viral disease" assaulting "our moral immune systems." Jean Baudrillard claims in The Transparency of Evil that thought itself is "a sort of network of antibodies and natural immune defenses" against a broad spectrum of phenomena rendering us vulnerable to "the evil genie of otherness." These phenomena can be as trivial as fashion fads, which "fade away like epidemics once they have ravaged the imagination," or as potentially catastrophic as terrorism, crack cocaine or computer viruses, all of which, Baudrillard maintains, "hew to the same agenda of virulence and radiation, an agenda whose very power over the imagination is of a viral character."

Obviously real contagion hasn't disappeared, either, in this welter of lazy metaphors and fevered rhetoric. We've seen a resurgence of ancient diseases like tuberculosis and cholera, plus newer horrors like hantavirus, Ebola, Zika, flesh-eating bacteria, and the so-called "X" virus from Sudan. Worldwide, bacterial and viral diseases still kill about 15 million people annually, and ironically enough we are victims, to some extent, of our own success in the battle against them. In barely a century science has learned to slow, halt, kill, or cure most of the older scourges but, as natural selection would predict, those conquests have cleared the field for new entries and occasionally stronger versions of old ones. Drug-resistant strains of strep and staph in hospitals are a frightening example. Another is the set of emerging tropical diseases that have beset us as a result of our destruction of the rainforest, a process that itself only became possible once we devised vaccines and treatments for known tropical killers.

This is not to argue, obviously, for reducing efforts to control disease. But it does point to the likelihood that microscopic enemies will always be with us, tapping deep fears and at the same time questioning our notions of individual identity. Contagion, and the metaphors that deform or translate it into other realms of understanding, will continue to serve as the inflection point where, to borrow poet Louise Glück's phrase, the self ends and "the blur of the world begins."

 
