Wordslut

by Amanda Montell


  There are two huge flaws in their logic: The first is that using a plural pronoun for a singular meaning is nothing new for English speakers. A few hundred years ago, the second-person you was exclusively a plural; thou was the singular version (e.g., “Thou shalt not kill,” “Thou shalt not lie”). Eventually, you extended to the singular meaning and pushed out thou entirely. Who’s to say the same thing couldn’t happen with they?

  The other key defect in the argument against singular they is that most people already use it so naturally that they don’t even realize they’re doing it. (I’ve used singular they once in this chapter so far—100 points to anyone who finds it.) English speakers have been using they as a singular pronoun to refer to someone whose gender is unknown to them ever since the days of Middle English (“Someone left their goblet in the gatehouse”). If we’re talking grammar rules, singular they was considered perfectly acceptable as a generic third-singular pronoun all the way up until the late eighteenth century. That’s when grammarians decided that people should start using generic he instead. Their reasoning? That’s what they used to do in Latin. (English speakers’ obsession with Latin has inspired a great many of our most confusing grammar rules, like when to use “you and I” versus “you and me.”) Consequently, style books adopted generic he, as did most educators, who quickly convinced themselves that singular they, in any context, was not only grammatically unacceptable but fundamentally “illogical.”

  And yet, millions of everyday people, including plenty of respected writers, ignored the new generic he rule and continued using they as a gender-nonspecific pronoun anyway. Jane Austen was all about singular they and used it precisely seventy-five times throughout her six novels. (Check out this line from Pride and Prejudice: “But to expose the former faults of any person, without knowing what their present feelings were, seemed unjustifiable.”) Add to that all the protests from second-wave feminists who contested that generic he was sexist, and eventually, grammar authorities listened. Today, many reputable grammar sources, like the AP Stylebook, formally endorse singular they, as do influential institutions from Facebook to the government of Canada. Because ultimately, most people agreed that in practical usage, generic they simply works better than generic he, no matter what the books said.

  These days, the only problem anyone seems to have with singular they is when they’re specifically being asked to use it because someone doesn’t identify as either he or she. This is when you start hearing arguments about how it defies basic grammar rules and is just too confusing to bother with. I personally know a few people who’ve argued that if we just came up with a brand-new pronoun, that would solve the whole problem, because then at least you’d always know the speaker was referring to one person, as opposed to multiple. The issue there is that we already tried that, and it didn’t work. Twenty years ago, they was not the most common nonbinary pronoun used in genderqueer communities—instead, it was the gender-neutral singular pronoun ze (pronounced like the letter z). If singular they has had it tough, ze was up against a mountain of pushback. Learning a new word altogether was harder for most English speakers to accept than simply starting to use a word that already exists in a slightly new context, which is all that singular they requires. (Though there is at least one language in modern history where introducing an entirely new pronoun worked. In July 2014 the Swedish gender-neutral third-singular pronoun hen was added to the official dictionary next to han and hon, meaning he and she. Many people adopted hen into their vocabularies with little complaint.)

  I will admit, while it can be difficult to adjust to using they when referring to a specific person, like any new skill, it takes practice, and most people don’t mind if you make an honest mistake. I’ve certainly made my fair share of slipups learning how to use singular they. But the blunders were genuine, no one got upset, and now singular they comes quite naturally.

  For anyone else who would like to step up their pronoun game but is still a little confused, Lal Zimman has an amazing tip: think of people’s pronouns just like you think of their names. You can’t tell a person’s name just by looking at them; if you want to know it, you have to ask, and to argue with their answer would be weird and rude. Everyone has their own individual name, and it may be difficult to remember or pronounce, but it is common courtesy to try your best to learn it. (Just as it would be unreasonable to say, “What? Your name is Chrysanthemum? No, that’s too much for me, I think I’ll call you Bob,” it would be equally bananas to use a pronoun that someone explicitly told you wasn’t theirs.) People are also allowed to change their names whenever they want—if we mess up the new one occasionally, that’s fine, but eventually, everyone just has to accept it, or, again, it would be weird and rude. Indeed, twenty years from now, introducing yourself with your name and your pronouns could become the norm. “Hi, my name is Amanda, she/her—you?” “I’m Sam, they/them. Nice to meet you.” Is that really so mind-boggling?

  To some people that does seem mind-boggling. But for those who outright refuse to learn new pronouns, grammar does not work as a defense, because language scholars know that isn’t really the problem. If you don’t approve of nonbinary identity or feel the need to affirm it, then it’s possible to find a reason to avoid using gender-neutral language no matter what. “This is one of those things where people start with a conclusion and work backward to find an argument,” Lal Zimman says, before telling me a story about how his partner, who uses they/them pronouns, is always butting heads with their mother, who can’t be convinced to get on board. “And their mom constantly just asserts ‘It’s because they is plural for me. It would be so much easier if you just used ze.’ And so, eventually my partner was just like, ‘Okay, use ze then, if that’s really gonna be easier for you,’ and that hasn’t improved her accuracy at all.”

  It makes sense that these structural linguistic changes cause such strong reactions. This isn’t just an issue of gendered pronouns, either: long before nonbinary identity and singular they were a part of the mainstream cultural dialogue, meeting certain grammatical standards was still highly valued by English speakers. It has been for centuries—and historically, this hasn’t had much to do with gender at all. Instead, it has to do with money and social class.

  See, during Europe’s feudal period—the time of lords, ladies, and peasants—if you were born poor, you’d stay poor forever. In those days, learning to talk “properly” was not a thing, because it would be useless. It wouldn’t change anything about your life. But with the end of feudalism in the fifteenth century came new opportunities for class mobility. That’s also around the time the printing press was invented, and with it came the publication of grammatical guides. Now that there was a chance you could possibly become a member of a higher social class, people began to take an interest in learning how to talk like one. Soon, a “standard” form of language was agreed upon by the state and the education system, which reinforced this linguistic hierarchy. Over the centuries, the importance of ascending that hierarchy became more and more culturally embedded.

  In the United States, a mastery of English grammar has become tied to the American dream itself. A friend of mine, who is first-generation American, once told me that when she was growing up, her Japanese father made her put a dollar in a jar every time she used a slang word. “He thought it made me sound low class,” she said. “He was an immigrant.” For folks like my friend’s dad, speaking “proper” English is the way to the big house with the white picket fence. It’s the idea that if you want to be a CEO, you have to sound like one, and not caring about grammar means not caring about your future itself.

  All that said, it’s also true that not all language purists have the same background or whip out their red pen with the same agenda in mind. Plenty of folks oppose singular they due to their social conservatism, but some of the biggest grammar snobs in America actually come from the political left. Deborah Cameron has said that one of the first things she noticed upon joining Twitter in 2014 was how often educated progressive types called upon their superior grammar skills to confront bigots. Take a look at this Twitter exchange from 2016:

  A: As a straight male how would u feel about yr child having a homosexual school teacher?! Who their around for 8hrs of the day?

  B: If a gay teacher teaches my child the difference between they’re, their and there, I’m good.

  In a world of highly divided politics, most of which are voiced online, grammar-based digs like the one in this tweet* have become some of the first projectiles launched to confront racist, homophobic, and xenophobic remarks. A 2016 news story out of the United Kingdom told of a white woman verbally accosting an immigrant woman, who responded to the harassment by saying, “I speak better English than you!” The victim later told reporters that the bigot’s “grammar was appalling.”

  You can’t blame someone in this situation for defending themselves however they can—but you have to ask why claiming to have better grammar than your antagonist is so often the weapon of choice. Linguists posit that this has to do with the notion that bigots are not only depraved, but also stupid, and that the two are connected. “It allows their critics to feel intellectually and culturally as well as morally superior,” Cameron explains. That is a satisfying feeling, to be sure, but the reality is that grammar and morality don’t actually have anything to do with one another, and attacking a bigot’s poor grammar does not itself prove you are a better person. It might prove that you had the opportunity to become more educated than they did, or that you spent a lot of time mastering the rules of standard English. However, the moral significance of what someone says is about the content, not the grammar. As Cameron says, “Hitler wasn’t any less fascist because he could write a coherent sentence.”

  The other problem with policing people’s preposition usage or dangling modifiers is that “poor grammar” is often a criticism hurled at what is really just a nonstandard English dialect. For instance, one might call out a speaker of African-American Vernacular English (AAVE) for using a double negative (“I didn’t say nothing”) without realizing that AAVE is a systematic dialect, and the double negative isn’t a mistake but instead a legitimate part of AAVE grammar. It isn’t something we find in standard English anymore, but it used to be—centuries ago, everyone from Chaucer to Shakespeare to everyday English speakers used double negatives. Again, it wasn’t until those stuffy grammarians from the English standardization period decided we should copy Latin and nix the double negative that it was considered “incorrect.”

  Linguists know that nonstandard forms of a language are not objectively “bad.” The grammatical forms themselves, like saying “he be”* instead of “he is,” are not inherently worse or better than what we learned in English class. They’re simply stigmatized based on how we feel about the type of person using them.

  When highly educated folks engage in grammar policing, they’re basically just doing what misogynists do when they dismiss what a woman is saying because she uses uptalk or vocal fry; it’s another example of judging someone’s speech based on preconceptions of who they are. Discerning listeners can tell that addressing someone’s grammar is often just a way of avoiding the message itself. “Language pedantry is snobbery and snobbery is prejudice,” Cameron says. “And that, IMHO, is nothing to be proud of.”

  There’s something that all of these grammar critics—from the opposers of singular they, to the grammarsplainers on Twitter, to France’s Académie française—have in common. Whatever their political beliefs, they all possess a profound urge to correct or halt change in speech. This is true of most people. Whenever language changes, as when anything in life changes, folks can’t help but feel a little fussy. That’s because language change is frequently a sign of bigger social changes, which makes people anxious. It’s why people above the age of forty have always loathed teen slang, no matter the era: it represents a new generation rising up and taking over. One of my mom’s friends, a guy in his late fifties, recently told me he “hates” so many of today’s popular slang words (shade, lit, G.O.A.T.) because “they do nothing to improve the English language.” What’s funny is that I can almost promise, forty years ago, his parents were saying the exact same thing about cool, bummer, and freaking out, all phrases that have now taken a seat at the table of acceptable English terminology but started out as annoying teen slang.

  The type of language change that’s gotten perhaps the worst reputation is the push toward political correctness. The conservative media has played a big role in painting this concept as a negative, in propagating the idea that in this day and age “you can’t say anything anymore.” The fear is that being forced to use gender-inclusive language, like singular they, Mx. instead of Mrs. and Mr., and friends instead of “boys and girls,” poses a threat to free speech.

  In reality, of course, no one can force anyone to say anything in this country—political correctness does not endanger our freedom of expression at all. The only thing it actually threatens is the notion that we can separate our word choices from our politics—that how we choose to communicate doesn’t say something deeper about who we are. As American English speakers, we are perfectly at liberty to use whatever language we want; we just have to know that our words reveal our social and moral beliefs to some extent. So if one were to use the term comedienne instead of comic or the pronoun she to describe a Ferrari, they could be opening themselves up to criticism, not for flat-out sexism but definitely for expressing an indifference to gender equality. What rubs people the wrong way about political correctness is not that they can’t use certain words anymore, it’s that political neutrality is no longer an option.

  In defense of their objection to linguistic change, some folks will claim that their “brain just doesn’t work that way.” They simply “can’t handle” new rules like gender-neutral pronouns. To that, my answer is this: How about we set up the next generation to have brains that can? “What we really need is to change the way we teach language early in life,” says Lal Zimman. If we considered the ability to easily change the pronoun you use to refer to somebody as a valuable skill, that could be a part of our language arts education. We could incorporate all kinds of gender-inclusive language instruction into our grammar lessons. After all, there’s no reason acquiring linguistic flexibility shouldn’t be as appreciated as being able to know when to use well versus good or your versus you’re.

  In the meantime, we can either do our best to get on board or not—but whatever we choose, we can trust that language will move along its merry way regardless. The bigots and the pedants will be left at the station, and a generation of linguistically bendy, gender-inclusive whizzes will ride off into the sunset.

  I hope to see you there. I hear it’ll be quite the party.

  6

  How to Confuse a Catcaller

  (And Other Ways to Verbally Smash the Patriarchy)

  In India, they call it Eve teasing, which I think is quite poetic. I picture earth’s first man tiptoeing friskily behind earth’s first woman, his fig leaf fluttering in the breeze. In Syria, it’s sometimes called taltish, which isn’t as innocent. This word, with its harsh pair of letter t’s, describes a brisk way of saying something, as if tossed upon the hearer, like a martini to the face. Piropos are famous throughout Latin America: the term comes from the Ancient Greek pyropus, which means “fire-colored.” It is said that Romans appropriated this word to mean “red-colored precious stones,” similar to rubies, which represented the heart, and hence were the stones men gave to the women they were courting. (Those who didn’t have money for these gems gave them pretty words instead.) But the only term I’ve ever used to describe it was invented in eighteenth-century England. There, it referred to the act of heckling vulnerable theater performers: “Nice costume, dandy!”; “Get off the stage!” In English, we call it catcalling.

  So many languages offer a phrase to describe the act of a person (usually a man) shouting sexual comments in the street at someone they don’t know (usually a woman or feminine-presenting person), because in almost every country, you are sure to find it. As much as catcallers claim that their behavior is meant to be flattering (“Where are you going, baby?”; “Damn, look at that ass!”), both social scientists and people who deal with catcalling firsthand can tell that’s not really the intent. As a college student, an age when I would have been thrilled for just about anybody to think I was sexy, I was catcalled wearing everything from a minidress and heels to a matching Halloween pajama set from Duane Reade. The shirt said “Boo!” and so did the catcaller, before asking for my hand in marriage.

  That guy didn’t want to marry me or even make me feel good about myself, but he did want me to hear him and to understand that he had control over me, at least for those few seconds. Because the act of catcalling isn’t really about sex—it’s about power.

  Since the beginning of patriarchy, language has been a primary means through which men have asserted their dominance in order to make sure women and other oppressed genders have no control over what happens to them. And though salaciously taunting strangers in public may be one of the flashiest tactics, it’s hardly the only one. Equally disempowering are the practices of labeling women overemotional, hormonal, crazy, or hysterical* as a way to discredit their experiences, or addressing female colleagues as sweetheart or young lady in a professional setting as a form of (often subconscious) subordination. I once worked in an office where the company’s owner referred to every female employee by her hair color: “You’re early today, blondie.” “How’s that write-up coming, pink?” (We worked alongside a male employee with a zigzag design shaved into the back of his head, but the boss just called him Daniel.)
