
Semicolon


by Cecelia Watson


  I’m hardly the first person to home in on the conventionality of King’s register. It gets a mention in a speech that the novelist, essayist, and English teacher David Foster Wallace liked to give to black students whose writing he perceived to be, unlike King’s, “nonstandard.” Wallace gave this speech one-on-one to students he believed would benefit from it. It’s long, but here’s a taster:

  I don’t know whether anybody’s told you this or not, but when you’re in a college English class you’re basically studying a foreign dialect. This dialect is called Standard Written English. . . . In this country, SWE is perceived as the dialect of education and intelligence and power and prestige, and anybody of any race, ethnicity, religion, or gender who wants to succeed in American culture has got to be able to use SWE. This is just How It Is. You can be glad about it or sad about it or deeply pissed off. You can believe it’s racist and unfair and decide right here and now to spend every waking minute of your adult life arguing against it, and maybe you should, but I’ll tell you something—if you ever want those arguments to get listened to and taken seriously, you’re going to have to communicate them in SWE, because SWE is the dialect our nation uses to talk to itself. African-Americans who’ve become successful and important in US culture know this; that’s why King’s and X’s and Jackson’s speeches are in SWE . . . and why black judges and politicians and journalists and doctors and teachers communicate professionally in SWE. . . . And [STUDENT’S NAME], you’re going to learn to use it too, because I am going to make you.

  This speech, Wallace claims, is in the service of being honest and realistic about the way that language is wrapped up in politics and power. Students should be pressed into choosing to learn what Wallace calls SWE—Standard Written English, or Standard White English, as he acknowledges it might as well be called. The reason students should be required to learn SWE is that they will be at an extreme disadvantage in the world if they do not do so. This is how the world is whether you like it or not, Wallace says, ostensibly congratulating himself on his brave truth telling.

  Apparently a few students who were subjected to this speech were offended by it, and one lodged an official complaint with the university. I have some complaints about it, too. Did Wallace pull all his white students into his office for an in-camera chat about how it is that they might be upholding elitist power structures by speaking and writing SWE? Did Wallace call out his colleagues in the academy for failing to find ways to make room for ideas expressed by people who might not know—or might simply choose not to use—the secret-handshake grammar of the powerful? Did he make participation in SWE a choice as fraught with moral and political implications as he made nonparticipation in that dialect? No, the onus is on the black student to choose, not on Wallace or anyone else to use his power and privilege to help remake the world.

  Later in the essay, he tells the reader that what must have been offensive to the student who complained about his speech was only that he, a privileged white male, was the one making it. Because of his identity, he says, the student just wasn’t able to see the “logic” of his speech. Nice try, DFW, but what logic? So—your argument is “We must use one form of speech, Standard Written English, because that’s the form of speech we always use.” I’m not persuaded that “we are doing this already” is sufficient to justify a claim that “we ought to continue to do this” or “we must continue to do this.” Logic is about uncovering and examining assumptions, not perpetuating them. Maybe there is an intelligent, logical argument to be made for choosing SWE as a shared scholarly and professional dialect—but Wallace didn’t bother making it.

  What is most infuriating about reading Wallace—more infuriating even than his factual errors and logical hiccups, of which there are many in the “American Usage” essay*—is that it seems he was equipped to understand, for instance, that language is part of our self-presentation, crucial to our construction and conception of ourselves. He understood better than most people that language, and the choices we make surrounding it, is political, always. Yet in his “pep-talk” to black students, he didn’t see it as his job to create a world that would be more open to more possible selves than ones like his own. It’s a good thing to make students (and even people who’ve left their student days far behind) aware that there are context-specific costs and benefits in the choice of one English dialect over another. The problem is that Wallace exempts himself (and everyone who already speaks and writes like he does) from responsibility for his own choice, by pretending it isn’t a choice at all.

  It’s an attitude in keeping with Wallace’s self-proclaimed snobbery. He was profoundly a snob—or as he called it, a SNOOT, an acronym* Wallace’s family used for “somebody who knows what dysphemism means and doesn’t mind letting you know it.” A dysphemism is a derogatory term—like snoot, for instance—and you shouldn’t feel bad if you didn’t know that, nor should you feel exceptionally clever if you did and you got the joke. Each of Wallace’s sentences is a stunt of some kind, every clause an Odyssean convolution. For him, being a SNOOT was something to brag about, and it’s crucial to his literary style. When I look back on my own snob days, I feel it’s something to be embarrassed about. Where Wallace sees moral high ground lush with the fruits of knowledge, I see a desolate valley, in which the pleasures of speaking “properly” and following rules have choked out the very basic ethical principle of giving a shit about what other people have to say.

  Wallace cared a lot about language and punctuation, and I have no problem with that. I love that at his book readings, he read his punctuation aloud along with the words, because he put those punctuation marks in his writing for a reason. His enthusiasm for punctuation and for language more broadly is not the problem, and I don’t doubt that he took his role as a teacher of young people seriously. What is problematic to me is the direction in which he chose to channel those interests and concerns: he narrowed the conversation about the politics of language rather than expanding it, by making it one-sided and doing what everyone always does—obligating people who aren’t participating in the status quo to step it up rather than asking the people enforcing the status quo to think about, and justify, their own standards and values.

  So what happens now, if you’re ready to entertain the notion that having plum-picking contests and SNOOTing around isn’t the most admirable way to behave? What’s left, if we aren’t supposed to show our respect and love for language by respecting and loving rules?

  Conclusion

  Against the Rules?

  It’ll be clear to you by now that I disagree with the rule-mongering of the David Foster Wallace types, and the semicolon hatred of Professor Paul Robinson, the guy who feels “morally compromised” when his pinkie finger plunks the semicolon key. The history of punctuation shows that rules can’t be taken for granted as necessary elements of language. For a start, when we consider rules, we have to ask: whose rules? Exactly which collection of rules are we supposed to rely upon and remember, when the fortunes of rule systems have depended on their contradicting one another? For over two centuries, grammar books have preached the gospel of rules, and now, when I talk to friends, students, and colleagues about grammar, they lower their voices confidentially to confess sins: I just don’t ever use the semicolon, because I’m afraid I’ll do it wrong. I sometimes want to use two colons in one sentence, but I’m not allowed. I am very confused about the Oxford comma. Occasionally I’ve been pulled aside after a talk on the semicolon to be told a story about a dogmatic elementary school English teacher still perched on the now-adult student’s shoulder, looking down, judging, even after decades. At times I’ve felt less like a punctuation theorist than like a punctuation therapist.

  Fear, worry, confusion—even if we did manage to agree on one set of rules to follow, we wouldn’t be relieved of our anxieties about punctuation. We still have to worry whether we know the rules and have applied them correctly. We have to worry about situations for which we can’t find a rule that seems applicable, and hope that the authorities on the Chicago Manual’s “Q&A” web page address the oversight immediately. And even if we happen to be very, very good at remembering the rules and applying their most obscure precepts, we have to wonder whether our assiduous application of these details will strike the average reader as mistakes* rather than the markers of precision we hope them to be: if rules are not natural features of language, then they must be shared knowledge in order to bestow on our writing the clarity and precision they promise. Rules have never in human history freed us, and do not now free us, from the pitfalls and challenges of interpreting other people’s words and the anxieties of writing down our own.

  In spite of this problem, the bromide that we need to know rules is constant even among punctuation reformists, and it’s such a deeply entrenched idea that grammar writers rarely (if ever) attempt to justify it. Even the philosopher Theodor Adorno, who wrote a beautiful essay, “Punctuation Marks,” which recovered and built upon the humanist ideal of punctuation as musical, advised that rules have to “echo in the background” even in moments where the author “suspends” those rules. Adorno, like just about everyone else writing about punctuation, seems to have believed that we always process writing in terms of its conformity to, or conflict with, rules. When a rule is broken, these theorists believe, we hear the breakage, whether we realize it consciously or not.

  But if that were true, it sure would be strange that we can read someone like Shakespeare without having our delicate rule-bound constitutions constantly assaulted by “false syntax,” as the early American grammarians called it. Remember those corrections they made to Shakespeare’s verse? Perhaps the experience of reading isn’t so much a matter of hearing “rules” and “not-rules” as it is about immersing ourselves in a text and adjusting to a way of speaking that might be very foreign to us in terms of time period, culture, or genre. That’s a pretty extraordinary skill to have built into our brains, and I’m not sure why we’d want to squeeze it out of ourselves by insisting that language is inherently rule-bound.

  If rules don’t do what they set out to do for us—if rules are just idealizations of language that don’t manage either to help us learn to write well, or to describe why a piece of writing is effective or ineffective—does that mean that rules are totally worthless? Not necessarily. In fact, if we can learn to see past rules as the only framework with which we can understand and learn to use language, we might be able to see what purposes rules could really serve. That is, we can peel away the justification that “rules are really in language” and free ourselves to ask instead, “What good might rules be, even if they aren’t strictly necessary or sufficient?” Rules, considered as frameworks within which to work rather than as boundaries marking the outer limits of rhetorical possibility, might spur creativity, just as a poet might find it productive to work within the strictures of the sonnet form. But we would be making a big mistake to teach that the only “legal” way to write poetry is to write sonnets. The same goes for punctuation rules.

  Perhaps this will be some balm for the souls of some of you who, in spite of the story told in this book, still feel attached to The Elements of Style or Fowler or whatever your preferred grammar tome might be. Even if you accept everything I’ve said in this book about rules, you might still feel, deep down, a love for the idea of grammar rules. But when it comes down to it, I’d wager that the object of your love lies elsewhere. That love is really for the English language, or for orderliness and organization, or for tradition. None of these things is a foolish thing to love. But if we really love English, or if we love the sense of structure that grammar provides, or if we love traditions and a sense of shared linguistic practices across generations, we have to look somewhere else to celebrate that devotion; rules will be, just as they always have been, inadequate to form a protective fence around English. We will never find the rules, unshiftable, unchangeable, and incorruptible. There are no such things.

  It’s worth thinking carefully about the ethical costs of trying to build that fence anyway. A fence keeps things out as surely as it keeps things in. Who is kept out of our conversations, our public life, and our academies by these language-fences? Rules can be an easy, lazy way to put the onus on someone else: if you make a grammar mistake while trying to convey something heartfelt, I can just point out you’ve used a comma splice and I’m excused from confronting what you were saying, since you didn’t say it properly. What if we thought less about rules and more about communication, and considered it our obligation to one another to try to figure out what is really being communicated? Does it truly matter if there’s a grammar error in the email from your intern, or on the sandwich board outside the deli that a new immigrant couple has opened, or in a world leader’s tweet?* I care a lot more about whether the President’s tweets show the values of democracy versus hatred than I do about where he put a comma. During the 2012 elections, when Mitt Romney ran aggressive campaigns misconstruing President Obama’s statements on entrepreneurship, Jon Stewart capped off an indictment of Romney’s strategy with, “Mr. Romney, hanging your attack on a person’s slight grammatical misstep is what people do in an argument when they’re completely fucked and they know they have no argument.” Indeed. Or it’s what they do when they don’t want to be bothered to hear the other side in the first place: a grammar attack is quite simply an ad hominem attack that looks more legitimate because it’s dressed up in a cap and gown.

  Those of you readers who speak English as a first language, like I do, enjoy a remarkable privilege: we speak the most widely spoken language in all the world. This is a wonderful advantage for us; the preeminence of English does as much as airplanes and the Internet do to make the world small enough that we can skip across its circumference in ways both real and virtual that our grandparents wouldn’t have dreamed possible. At the same time, many native English speakers never experience what it’s like to struggle to communicate basic needs to a store clerk; or to be lost on the subway, the air filled with indecipherable phonemes that offer no aid; or to be talked down to as though you’re an idiot by someone who has heard your foreign accent. Having lived abroad myself, I can recall with yesterday’s clarity the casual callousness of a German pharmacist who pretended for five long minutes that she didn’t know I was asking for ibuprofen at the Berlin Hauptbahnhof Apotheke because I said “ich mochte” (I liked) instead of “ich möchte” (I would like); and I can recall with equal clarity the generosity and hospitality of a group of Germans who took the time to have a slow conversation with me in their language while waiting for the TXL bus to come.* I know which type of memory I’d rather be in the mind of a tourist, immigrant, Internet forum poster, or any other English-language learner. Having a more advanced knowledge of a language provides a wonderful opportunity to be welcoming and constructive, if you prioritize communication over a set of fictitious rules.

  What about that semicolon I argued over with my dissertation adviser so many years ago—the semicolon that launched my deep dive into the history of punctuation and led to this book? Looking back on it now, I was right that it was “legal.” Semicolons like the one I deployed are shown in the example sentences for Chicago Manual of Style rule 6.54, which treats sentences that use elision. But even though it was legal, it was a bad semicolon. What my adviser, Bob, was reacting to (although he’ll fight me on this, I’m sure!) was not a lack of rule-following, but the poor rhythm of the sentence, which his practiced ear detected and rejected. That’s the case with so many instances of grammar-rule violations. When your teacher told you that you can’t write a one-sentence paragraph, what he or she really meant was “I needed more evidence here” or “This wasn’t the right spot in this essay for drama.” Great writers use one-sentence paragraphs all the time, but it is true that they don’t always work. It might be more efficient to make a rule against writing a one-sentence paragraph than it is to explain why some particular one-sentence paragraph didn’t work, but it’s misleading.

  Even if they aren’t the basis by which we read and write, punctuation rules can’t just be unthought as though they never existed in the first place. We could not (and perhaps would not want to) go back to a time before there were punctuation rules. But maybe we can think beyond them now, to develop a new, more functional, more ethical philosophy of punctuation: one that would support a richer way of learning, teaching, using, and loving language. At the very least, by reflecting on the history of the commas, colons, question marks, and semicolons that dot our written language, we can gain some of the perspective necessary to properly evaluate the virtues and vices of rules. After all, it’s impossible to confront assumptions that we can’t even see.

  Acknowledgments

  Although brief in length, this book has been long in the making, and countless people have been essential in bringing it to its final full stop.

  Robert J. Richards and Lorraine Daston have been the very best and most generous mentors anyone interested in ideas could have. I’m immensely grateful that they encouraged me to think creatively and to take risks, and for the constancy of their confidence in me.

  Long before my graduate school years, Joan Traffas and Sheila Patrick taught me that the past and present are always in conversation with one another.

  Adrian Johns at the University of Chicago and Françoise Waquet at the Centre National de la Recherche Scientifique in Paris were early advocates for this project, and gamely lent me snippets of their vast expertise in books, language, and the history of the transmission of knowledge. I hope they are reading this book—but not too closely.

 
