Jan/Feb 2022  •   Salon

Beyond Good, Evil, and the Split Infinitive

by Thomas J. Hubschman

Artwork by Dale Bridges



I learned moral relativity from a linguistics course I took in my junior year of college. It was one of the few interesting courses I had in that institution of higher learning, and ironically, it taught me something about the real world contradictory to everything the place, a religious college, stood for. That lesson turned out to have as profound an impact on me as did my finding out many years earlier about the way babies were made, though in this case the force of the revelation operated over the course of a couple semesters rather than in a few staggering minutes in a schoolyard.

But revelation it was nevertheless, and that one course changed my understanding of the moral world more profoundly than did the 15 years of religious schooling preceding it. And, unlike its birds-and-bees counterpart, the revelation that there are no linguistic absolutes, no rights or wrongs about how a language is used beyond the way people do in fact use it, became a template I could and would apply to almost every other area of human experience, even if I only did so mostly in retrospect.

Language, the use of it in ordinary daily experience, is as fraught with deep prejudice as anything else in our lives, and probably a good deal more so. After physical appearance—comeliness or lack of it, dress, and, certainly in the United States, color—how someone speaks is the most important indicator of who they are. Their use of language (my own use of "their" to indicate either "her" or "his" will, as an example, rub many readers the wrong way, though its usage is becoming acceptable; it's worth noting it is employed regularly throughout Jane Austen's work), the way someone pronounces words as well as the range of their vocabulary, is the first and sometimes the most decisive basis on which we judge them, at least initially. In just a few moments, we decide whether the person is educated or not, what "race" or ethnicity they belong to, their economic status, as well as several other factors, all of which go into rapid-fire decisions about whether they are a physical or social threat, a suitable employee, a candidate for an endogamous relationship—i.e., one that could lead to marriage without raising eyebrows even in 2021—and all sorts of other possibilities.

I live in and have spent most of my life in or near New York City. In its metropolitan area, which takes in the city's five boroughs plus counties in neighboring New Jersey, Long Island, and parts of upstate New York, there are not just dozens but scores of what professional linguists call idiolects, mutually understandable but consistently different pronunciations and usages of the English language ("dialect" implies vocabulary and other differences so great that someone who speaks a particular dialect is not intelligible to someone who doesn't; we have few if any true dialects in American English). Some of these minor differences are stark and consistent. The upstate and downstate New York pronunciations of the "a" in "banana" are one example. Upstate it has a marked "air" sound, ba-nair-na; downstate, an "eh" sound, ba-neh-na.

To a native New York City ear, an extreme version of the upstate accent, say from Buffalo, can sound comical. The first time I heard it, I thought the speaker, a pretty co-ed, was putting me on. I imagine New York City accents sound just as odd to upstaters and other speakers of "General American." But I doubt they can hear the more subtle differences between a mid-Brooklyn accent and one from south Brooklyn or the Lower East Side of Manhattan.

What's morally significant about all such different usages of language is the social signals they put forth about what sort of person a speaker is. Even New York City natives judge one another by the extent to which someone speaks in a pronounced New York accent as opposed to a more "educated"-sounding, newscaster-type accent (i.e., lack of accent to a New York ear). The NYC accents are all what the French would call déclassés, even to native New Yorkers. The assumption is that if someone has gotten an education and holds a good job, they will "tawk" better than their old neighbors do, i.e., they will have lost, or at least moderated, their natural speech into something less regional.

What I learned in that linguistics course, though, is that every way of speaking a language has a long history behind it and is rarely the result of mispronunciation or of simply not knowing better. When a New Yorker pronounces "ask" as "aks," they are not doing so in ignorance of how it should be pronounced. They are saying the word the way it was spoken in King Alfred's time, more than a thousand years ago. The same is true for so-called double negatives—"I don't have no time for your foolishness"—along with just about every other use of language coming out of a human mouth.

Those usages may be markers today of a lack of schooling or of lower social status. What they are not, though, is "bad English." New Yorkers speak the way we do, just as everyone else in the nation speaks as they do, because the people who settled this city and this nation came from parts of England where those were the local idiolects, the speech not just of ordinary people but of what today we would call the middle class.

Relativism, though, can become seditious when applied to morality in general. Etymologically, "morality" just means "the way things are done." But the word's true meaning is "the right thing," sometimes in an absolute way people will kill and die for. To relativize the rightness or wrongness of a behavior is to question morality's claim to any kind of authority. To assert in the 1950s that homosexual love was just as valid and "natural" as the heterosexual kind was to flout a basic moral principle based on thousands of years of dogma, sacred texts, and common norms. To say women should enjoy full social and economic rights and not be in any way subservient to men was almost as bad. Married women could not get credit cards without their husbands underwriting them until the 1970s, and in some parts of the US well into the 20th century a woman could be "put away," committed to an insane asylum, only at her husband's request; not even a judge could commit her without it.

Obviously, in matters of sexual "preference" as well as women's rights, things have changed. Politicians openly vie for the LGBTQ vote, and women are, at least legally, socially and economically autonomous. But for most of us there is still a "right" and a "wrong" way to use the English language.

We shouldn't underestimate the moral force at work in the right and wrong of proper usage and pronunciation. Most people would not dream of openly criticizing a neighbor or job applicant for using "street talk" or even a double-negative. But they wince inwardly and perhaps have to bite their tongue to keep from offering such correction. And once they have established a close enough relationship with that same person, they no longer hesitate to set them right.

We shouldn't kid ourselves that linguistic morality doesn't matter. We may like to think we are above such superficiality, and perhaps some of us are. But most are not, even though, if we are old enough, we have seen the unthinkable become the norm: "where... at" being an example. In the 1960s most Americans were hearing for the first time educated White people end a sentence beginning with "where" with the word "at," as in "Where are we at?" Prior to that time the construction was unacceptable for at least two reasons: you should not end a sentence with a preposition ("at"), but, probably more important, "where... at" was what uneducated, and especially Black, people (then known as Negroes) said. Because of the social upheaval going on in those days, the fact that it was Black usage gave "where... at" a cachet among those sympathetic to the Civil Rights Movement who also considered Black mores to be "cool" and worthy of imitation. Nowadays presidents and other respected public figures use "where... at" all the time. But before the 1960s, no teacher or parent raised to speak "good" English would let a child use that construction without a quick and firm reprimand.

But that's the point, isn't it? Language, like other mores, changes... or doesn't. The French and Spanish language academies are still setting rules for the way those languages can be used, at least in published discourse, sometimes imposing penalties for public violations of their diktats. But there is no language academy for English beyond what schoolteachers and newspaper editors decide. Hence, there are all sorts of usages in American English (not to mention British English, which has finally given up the fight against Americanisms, even on the BBC) that 40 or 50 years ago would have brought forth mobs of academics with linguistic pitchforks raised high.

The belief that there is a wrong and a right way—not just a standard or general versus a regional or local way—of using language still runs just as deeply in us as it did when we thought of gay people with the moral outrage with which we look upon child molesters today. And therein was and still is the lesson I learned in Professor Donohough's linguistics course: there is a "standard" or accepted way of speaking that has a justifiable reason for its existence—clear and precise communication—but other forms of speech, no matter how outlandish or offensive, are just as valid and have in some cases longer pedigrees than do the standard, acceptable ones.

Ditto for other mores we will all but die for. The nation is currently at loggerheads in such disputes politically, socially, and religiously. And all three areas of discourse are no longer matters of personal choice, if they ever were, but absolutes.

A democracy assumes, if not a belief in moral relativity, at least a commitment to the toleration of opposing and even offensive ideas and their expression. It also assumes a willingness to engage in discussion and debate about those ideas and beliefs. Hence the protection afforded them in the Bill of Rights, amendments added to the Constitution but not an original part of it.

It's a dicey balancing act, with the law protecting everyone's right to think and act as they please on one side of the scale and social cohesion, indeed the existence of a society at all, on the other. Such tolerance had little purchase in so-called Western civilization before said civilization came into contact with indigenous peoples in other parts of the world, especially Northeastern American peoples. Notions of personal autonomy—what we now call "freedoms"—in those indigenous societies, not to mention the absence of social shames like extreme poverty and a degraded status for women, were looked upon by Europeans as serious threats to long-held notions of government and morality. In Christian Europe the twin authorities of church and state assumed the worst about human nature and sacralized those assumptions into brutal codes of law and punishment in this life and the next. Not a good matrix out of which to fashion a society of free men and women.

But American—Huron, Iroquois, et al.—ideas and, more importantly, the success of those societies despite their practice of maximum personal and political freedom, suggested to some European minds the possibility of demanding similar rights for themselves. It could be said (though until now it rarely has been) that the great American experiment, which has never been realized and is now being tested so severely, is actually a European attempt to emulate the freedoms of Native Americans, whose ways those Europeans tried to adopt even as they were exterminating them.

But you can't force freedom on Americans any more than you can impose Western political values on parts of the world with their own traditions and values. The truth is, America is still less free and certainly less tolerant than the societies Europeans first came into contact with in the early 16th century. We pretend we are freedom-loving, but our tolerance for dissent and individualism only extends so far, and our insistence on maximum personal autonomy without governmental or any other kind of restraint is largely play-acting: We know the script, but we are incapable of living it in the real world. We remain, most of us, worshippers of one or another authority under the guise of claiming we stand for American values and American traditions, which turn out to be just New World versions of Church and State.

It's hard not to wonder if we are not ourselves one of those nations we are so eager to democratize but are unable to do so because it's just not in our cultural DNA. If we teach civics to our children and tell them all Americans have the right to all the freedoms we claim to believe in, then turn them loose in a culture that is of another mindset entirely, how can we talk with a straight face about being a free society?

It's a bit like the story told during governmental investigations into police corruption. The young recruit spends weeks being lectured on such subjects as the police's role as protector and defender of all citizens. S/he's told any kind of racial or other discrimination is not to be tolerated and that unnecessary violence, never mind outright abuse, is forbidden. The cadet then graduates and is given an assignment, usually with a seasoned officer. The first thing the older officer tells the new cop is that what s/he learned in the academy is one thing, the real world of policing another. What the cadet was told was not to be tolerated—"tossing" young Black men indiscriminately, without cause, in hopes of finding something illegal on their person or simply to intimidate them—is a necessary part of the job. S/he's told not to harass—not to make random stops if Blacks or Latinos are seen driving expensive cars, for instance—but learns on the job that such stops are almost the only ones they should make. The new cop may also be told by his precinct commander to meet an arrest quota for the month and that s/he, the commander, in effect doesn't care how the officer does it. The days when police officers were openly on the take, shaking down merchants in their precinct on a regular basis, may be over, but the categorization cops still use for non-law-enforcement individuals remains alive and well: "civilians" or "garbage."

Examples from the way legitimate business is run, from politics, and from every other facet of our culture show the same double-faced sets of codes. There is no respect for law or a common morality. In that sense we are no different from a state in which the clergy profess faith in the principles of the gospels, the Torah, or the Koran, but live lives scarcely distinguishable from those of mobsters and pimps. People live not by the morality in which they are indoctrinated but by the mores actually practiced by their society. If human beings flourished in truly democratic, free societies such as the ones Europeans confronted during their conquests of the 16th century, it was because the notions of freedom and equality that obtained in those societies were ingrained, taken in with mother's milk and practiced as organically as if they were genetically passed on, though they were in fact the result of conscious choice over long periods of time and were, to use a much misused term, "self-evident." They didn't need to be sent to school to learn that. But of course a culture that believes human beings are in the grip of evil influences, a "fallen" nature, sin, can never afford to allow its citizens to be truly free, or to even know what freedom is.