The Language Instinct: How the Mind Creates Language


by Steven Pinker


  In language there are no licensed practitioners, but the woods are full of midwives, herbalists, colonic irrigationists, bonesetters, and general-purpose witch doctors, some abysmally ignorant, others with a rich fund of practical knowledge—whom we shall lump together and call shamans. They require our attention not only because they fill a lack but because they are almost the only people who make the news when language begins to cause trouble and someone must answer the cry for help. Sometimes their advice is sound. Sometimes it is worthless, but still it is sought because no one knows where else to turn. We are living in an African village and Albert Schweitzer has not arrived yet.

  So what should be done about usage? Unlike some academics in the 1960s, I am not saying that instruction in standard English grammar and composition is a tool to perpetuate an oppressive white patriarchal capitalist status quo and that The People should be liberated to write however they please. Some aspects of how people express themselves in some settings are worth trying to change. What I am calling for is innocuous: a more thoughtful discussion of language and how people use it, replacing bubbe-maises (old wives’ tales) with the best scientific knowledge available. It is especially important that we not underestimate the sophistication of the actual cause of any instance of language use: the human mind.

  It is ironic that the jeremiads wailing about how sloppy language leads to sloppy thought are themselves hairballs of loosely associated factoids and tangled non sequiturs. All the examples of verbal behavior that the complainer takes exception to for any reason are packed together in one unappealing mass and coughed up as proof of The Decline of the Language: teenage slang, sophistry, regional variations in pronunciation and diction, bureaucratic bafflegab, poor spelling and punctuation, pseudo-errors like hopefully, badly crafted prose, government euphemism, nonstandard grammar like ain’t, misleading advertising, and so on (not to mention deliberate witticisms that go over the complainer’s head).

  I hope to have convinced you of two things. Many prescriptive rules of grammar are just plain dumb and should be deleted from the usage handbooks. And most of standard English is just that, standard, in the same sense that certain units of currency or household voltages are said to be standard. It is just common sense that people should be given every encouragement and opportunity to learn the dialect that has become the standard one in their society and to employ it in many formal settings. But there is no need to use terms like “bad grammar,” “fractured syntax,” and “incorrect usage” when referring to rural and black dialects. Though I am no fan of “politically correct” euphemism (in which, according to the satire, white woman should be replaced by melanin-impoverished person of gender), using terms like “bad grammar” for “nonstandard” is both insulting and scientifically inaccurate.

  As for slang, I’m all for it! Some people worry that slang will somehow “corrupt” the language. We should be so lucky. Most slang lexicons are preciously guarded by their subcultures as membership badges. When given a glimpse into one of these lexicons, no true language-lover can fail to be dazzled by the brilliant wordplay and wit: from medical students (Zorro-belly, crispy critter, prune), rappers (jaw-jacking, dissing), college students (studmuffin, veg out, blow off), surfers (gnarlacious, geeklified), and hackers (to flame, core-dump, crufty). When the more passé terms get cast off and handed down to the mainstream, they often fill expressive gaps in the language beautifully. I don’t know how I ever did without to flame (protest self-righteously), to dis (express disrespect for), and to blow off (dismiss an obligation), and there are thousands of now-unexceptionable English words like clever, fun, sham, banter, mob, stingy, bully, junkie, and jazz that began life as slang. It is especially hypocritical to oppose linguistic innovations reflexively and at the same time to decry the loss of distinctions like lie versus lay on the pretext of preserving expressive power. Vehicles for expressing thought are being created far more quickly than they are being lost.

  There is probably a good explanation for the cult of inarticulateness, where speech is punctuated with you know, like, sort of, I mean, and so on. Everyone maintains a number of ways of speaking that are appropriate to different contexts defined by the status and solidarity they feel with respect to their interlocutor. It seems that younger Americans try to maintain lower levels of social distance than older generations are used to. I know many gifted prose stylists my age whose one-on-one speech is peppered with sort of and you know, their attempt to avoid affecting the stance of the expert who feels entitled to lecture the conversational partner with confident pronouncements. Some people find it grating, but most speakers can turn it off at will, and I find it no worse than the other extreme, certain older academics who hold court during social gatherings, pontificating eloquently to their trapped junior audiences.

  The aspect of language use that is most worth changing is the clarity and style of written prose. Expository writing requires language to express far more complex trains of thought than it was biologically designed to do. Inconsistencies caused by limitations of short-term memory and planning, unnoticed in conversation, are not as tolerable when preserved on a page that is to be perused more leisurely. Also, unlike a conversational partner, a reader will rarely share enough background assumptions to interpolate all the missing premises that make language comprehensible. Overcoming one’s natural egocentrism and trying to anticipate the knowledge state of a generic reader at every stage of the exposition is one of the most important tasks in writing well. All this makes writing a difficult craft that must be mastered through practice, instruction, feedback, and—probably most important—intensive exposure to good examples. There are excellent manuals of composition that discuss these and other skills with great wisdom, like Strunk and White’s The Elements of Style and Williams’s Style: Toward Clarity and Grace. What is most relevant to my point is how removed their practical advice is from the trivia of split infinitives and slang. For example, a banal but universally acknowledged key to good writing is to revise extensively. Good writers go through anywhere from two to twenty drafts before releasing a paper. Anyone who does not appreciate this necessity is going to be a bad writer. Imagine a Jeremiah exclaiming, “Our language today is threatened by an insidious enemy: the youth are not revising their drafts enough times.” Kind of takes the fun out, doesn’t it? It’s not something that can be blamed on television, rock music, shopping mall culture, overpaid athletes, or any of the other signs of the decay of civilization. But if it’s clear writing that we want, this is the kind of homely remedy that is called for.

  Finally, a confession. When I hear someone use disinterested to mean “apathetic,” I am apt to go into a rage. Disinterested (I suppose I must explain that it means “unbiased”) is such a lovely word: it is ever-so-subtly different from impartial or unbiased in implying that the person has no stake in the matter, not that he is merely committed to being even-handed out of personal principle. It gets this fine meaning from its delicate structure: interest means “stake,” as in conflict of interest and financial interest; adding -ed to a noun can make it pertain to someone that owns the referent of that noun, as in moneyed, one-eyed, or hook-nosed; dis- negates the combination. The grammatical logic reveals itself in the similarly structured disadvantaged, disaffected, disillusioned, disjointed, and dispossessed. Since we already have the word uninterested, there can be no reason to rob discerning language-lovers of disinterested by merging their meanings, except as a tacky attempt to sound more high-falutin’. And don’t get me started on fortuitous and parameter…

  Chill out, Professor. The original, eighteenth-century meaning of disinterested turns out to be—yes, “uninterested.” And that, too, makes grammatical sense. The adjective interested meaning “engaged” (related to the participle of the verb to interest) is far more common than the noun interest meaning “stake,” so dis- can be analyzed as simply negating that adjective, as in discourteous, dishonest, disloyal, disreputable, and the parallel dissatisfied and distrusted. But these rationalizations are beside the point. Every component of a language changes over time, and at any moment a language is enduring many losses. But since the human mind does not change over time, the richness of a language is always being replenished. Whenever any of us gets grumpy about some change in usage, we would do well to read the words of Samuel Johnson in the preface to his 1755 dictionary, a reaction to the Jeremiahs of his day:

  Those who have been persuaded to think well of my design, require that it should fix our language, and put a stop to those alterations which time and chance have hitherto been suffered to make in it without opposition. With this consequence I will confess that I have flattered myself for a while; but now begin to fear that I have indulged expectations which neither reason nor experience can justify. When we see men grow old and die at a certain time one after another, from century to century, we laugh at the elixir that promises to prolong life to a thousand years; and with equal justice may the lexicographer be derided, who being able to produce no example of a nation that has preserved their words and phrases from mutability, shall imagine that his dictionary can embalm his language, and secure it from corruption and decay, that it is in his power to change sublunary nature, and clear the world at once from folly, vanity, and affectation. With this hope, however, academies have been instituted, to guard the avenues of their languages, to retain fugitives, and to repulse intruders; but their vigilance and activity have hitherto been vain; sounds are too volatile and subtle for legal restraints; to enchain syllables, and to lash the wind, are equally the undertakings of pride, unwilling to measure its desires by its strength.

  Mind Design

  Early in this book I asked why you should believe that there is a language instinct. Now that I have done my best to convince you that there is one, it is time to ask why you should care. Having a language, of course, is part of what it means to be human, so it is natural to be curious. But having hands that are not occupied in locomotion is even more important to being human, and chances are you would never have made it to the last chapter of a book about the human hand. People are more than curious about language; they are passionate. The reason is obvious. Language is the most accessible part of the mind. People want to know about language because they hope this knowledge will lead to insight about human nature.

  This tie-in animates linguistic research, raising the stakes in arcane technical disagreements and attracting the attention of scholars from far-flung disciplines. Jerry Fodor, the philosopher and experimental psycholinguist, studies whether sentence parsing is an encapsulated mental module or blends in with general intelligence, and he is more honest than most in discussing his interest in the controversy:

  “But look,” you might ask, “why do you care about modules so much? You’ve got tenure; why don’t you take off and go sailing?” This is a perfectly reasonable question and one that I often ask myself…. Roughly, the idea that cognition saturates perception belongs with (and is, indeed, historically connected with) the idea in the philosophy of science that one’s observations are comprehensively determined by one’s theories; with the idea in anthropology that one’s values are comprehensively determined by one’s culture; with the idea in sociology that one’s epistemic commitments, including especially one’s science, are comprehensively determined by one’s class affiliations; and with the idea in linguistics that one’s metaphysics is comprehensively determined by one’s syntax [i.e., the Whorfian hypothesis—SP]. All these ideas imply a kind of relativistic holism: because perception is saturated by cognition, observation by theory, values by culture, science by class, and metaphysics by language, rational criticism of scientific theories, ethical values, metaphysical world-views, or whatever can take place only within the framework of assumptions that—as a matter of geographical, historical, or sociological accident—the interlocutors happen to share. What you can’t do is rationally criticize the framework.

  The thing is: I hate relativism. I hate relativism more than I hate anything else, excepting, maybe, fiberglass powerboats. More to the point, I think that relativism is very probably false. What it overlooks, to put it briefly and crudely, is the fixed structure of human nature. (That is not, of course, a novel insight; on the contrary, the malleability of human nature is a doctrine that relativists are invariably much inclined to stress; see, for example, John Dewey….) Well, in cognitive psychology the claim that there is a fixed structure of human nature traditionally takes the form of an insistence on the heterogeneity of cognitive mechanisms and the rigidity of the cognitive architecture that effects their encapsulation. If there are faculties and modules, then not everything affects everything else; not everything is plastic. Whatever the All is, at least there is more than One of it.

  For Fodor, a sentence perception module that delivers the speaker’s message verbatim, undistorted by the listener’s biases and expectations, is emblematic of a universally structured human mind, the same in all places and times, that would allow people to agree on what is just and true as a matter of objective reality rather than of taste, custom, and self-interest. It is a bit of a stretch, but no one can deny that there is a connection. Modern intellectual life is suffused with a relativism that denies that there is such a thing as a universal human nature, and the existence of a language instinct in any form challenges that denial.

  The doctrine underlying that relativism, the Standard Social Science Model (SSSM), began to dominate intellectual life in the 1920s. It was a fusion of an idea from anthropology and an idea from psychology:

  Whereas animals are rigidly controlled by their biology, human behavior is determined by culture, an autonomous system of symbols and values. Free from biological constraints, cultures can vary from one another arbitrarily and without limit.

  Human infants are born with nothing more than a few reflexes and an ability to learn. Learning is a general-purpose process, used in all domains of knowledge. Children learn their culture through indoctrination, reward and punishment, and role models.

  The SSSM has not only been the foundation of the study of humankind within the academy but also serves as the secular ideology of our age, the position on human nature that any decent person should hold. The alternative, sometimes called “biological determinism,” is said to assign people to fixed slots in the socio-political-economic hierarchy, and to be the cause of many of the horrors of recent centuries: slavery, colonialism, racial and ethnic discrimination, economic and social castes, forced sterilization, sexism, genocide. Two of the most famous founders of the SSSM, the anthropologist Margaret Mead and the psychologist John Watson, clearly had these social implications in mind:

  We are forced to conclude that human nature is almost unbelievably malleable, responding accurately and contrastingly to contrasting cultural conditions…. The members of either or both sexes may, with more or less success in the case of different individuals, be educated to approximate [any temperament]…. If we are to achieve a richer culture, rich in contrasting values, we must recognize the whole gamut of human potentialities, and so weave a less arbitrary social fabric, one in which each diverse human gift will find a fitting place. [Mead, 1935]

  Give me a dozen healthy infants, well-formed, and my own specified world to bring them up in and I’ll guarantee to take any one at random and train him to become any type of specialist I might select—doctor, lawyer, artist, merchant-chief, and yes, even beggarman and thief, regardless of his talents, penchants, tendencies, abilities, vocations, and race of his ancestors. [Watson, 1925]

  At least in the rhetoric of the educated, the SSSM has attained total victory. In polite intellectual conversations and respectable journalism, any generalization about human behavior is carefully prefaced with SSSM shibboleths that distance the speaker from history’s distasteful hereditarians, from medieval kings to Archie Bunker. “Our society,” the discussions begin, even if no other society has been examined. “Socializes us,” they continue, even if the experiences of the child are never considered. “To the role…” they conclude, regardless of the aptness of the metaphor of “role,” a character or part arbitrarily assigned to be played by a performer.

  Very recently, the newsmagazines tell us that “the pendulum is swinging back.” As they describe the appalled pacifist feminist parents of a three-year-old gun nut son and a four-year-old Barbie-doll-obsessed daughter, they remind the reader that hereditary factors cannot be ignored and that all behavior is an interaction between nature and nurture, whose contributions are as inseparable as the length and width of a rectangle in determining its area.

  I would be depressed if what we have learned about the language instinct were folded into the mindless dichotomies of heredity-environment (a.k.a. nature-nurture, nativism-empiricism, innate-acquired, biology-culture), the unhelpful bromides about inextricably intertwined interactions, or the cynical image of a swaying pendulum of scientific fashion. I think that our understanding of language offers a more satisfying way of studying the human mind and human nature.

 
