The Story of Ain't
Fifteen years before Winston Churchill stirred a fuss by saying, “It is me” (instead of It is I), the usage was considered by these linguists to be all right. The redundant expressions from whence and reason why were given a pass. The fading of the subjunctive form was acknowledged but hardly mourned, as I wish I was wonderful, quoted from a J. M. Barrie comedy, slid by with two-thirds of the judges describing it as good informal English. (The formal rule on this would have led to I wish I were wonderful.) The line on singular-plural ambiguities was not entirely upheld, as None of them are here was approved. An unorthodox use of shall was considered acceptable, but Leonard commented, “The whole [will-versus-shall] matter is at present surrounded by a cloud of uncertainty.”
“Current English Usage” became a promotional piece for the new linguistics, especially its open hostility to classroom grammar. Leonard himself, a student of both educational history and modern pedagogical methods, was pitiless when it came to the teaching of rules not supported by actual usage.
“Probably no study is allotted more time and is more barren of results than that from which our grammar schools derive their name. After five or six years of grammar work in the elementary schools; after endless diagramming (which is but parsing in pictorial imagination); after painful memorizing of rules and definitions; and after constant composition of illustrative exercises, many a high school freshman cannot write or speak a decent English sentence.”6
From the time of Samuel Johnson and Jonathan Swift, it had been common to bemoan the state of the language, but with the rise of the new linguistics, the complainers were put on notice to show cause, to buttress their suppositions about what was correct with real-world evidence. This did not, however, bring an end to such complaints. Rather, the complaints had to become far more selective, made with an eye to the historical record. So, bitching about the terrible language of others did not go out of style, even among linguists. On the contrary, language snobbery now required a much finer sense of actual usage.
Chapter 11
Dwight Macdonald grew a goatee and traded his vested suit and tie for a chambray work shirt, which he wore to the office. Yet Macdonald did not suddenly hand his heart to the unionists, the communists, or other radicals. Covering the Communist Party of the U.S.A. for Fortune, he found the communists to be a “maddening race for an outsider because of the great air of mystery and conspiracy they adopt.”1 In July 1934, he, Walker Evans, and another writer, Geoffrey Hellman, visited Camp Nitgedaiget, a Communist Party training center in Westchester County, just outside the city.
The weekend “came near making a fascist out of me.” After bathing with the communists and eating off their dirty dishes, he notes, “the comrades were 99 44/100 percent pure Yiddish and they had that peculiar Yiddish love of living in each other’s laps that you can observe any day at Coney Island.” By Sunday noon, he could take no more of the “squirming mass.” He and his companions, he tells Nancy, “stole away to the Westchester Embassy Club . . . where we bathed in a clean if capitalistic pool and drank a couple of Tom Collinses in capitalistic solitude.”
This story of comic misadventure does not go over well with Nancy.
Four days later, Dwight is backpedaling. “Yes, I agree entirely. . . . Last weekend didn’t change my belief that, all else aside, communism is the only way out of the mess our society is in.” Not only that, he wants her to know that he has been flying his political flag at the office. “The other day I had a two-hour argument with Luce about Fortune not having enough social consciousness.”
That fall, after an announcement in the New York Times, Dwight Macdonald married Nancy Rodman. The tension with Luce, however, did not pass. Dwight’s feelings about Time’s political coverage, which seemed to be lurching rightward along with the Roosevelt administration, did not go unexpressed.
He wrote the staunchly Republican Luce a letter. Time, he said, was not as impartial as it should be and even showed a “bias toward the right.” The initial tone was reasonable. Macdonald said his own politics “happen to be liberal,” but that did not mean he thought Time should have a liberal bias either. Rather, Time should seek to be “truly objective and impartial.”2
After what he called a “few hours’ review” of recent issues, Macdonald had assembled seven examples of Time’s political slant, adding that he thought he could find even more examples of a clear bias if he had the raw material from which other stories had been written. The lecture went on for several pages—about half of it solid criticism, with the other half taken up by sideline whining of the If-I-had-written-this-article variety. But what stands out is the fact that Dwight Macdonald—a thirty-year-old writer who was paid a lordly $10,000 a year (over $150,000 in today’s dollars) at a time of great hardship and want nationwide—doesn’t care in the least if his boss finds him annoying.
Principles were at stake, but for Macdonald, protest was also personal indulgence. As he put it on another occasion, “I can work up a moral indignation quicker than a fat tennis player can work up a sweat.”3
And much of his indignation was on behalf of neither journalistic nor even liberal values. The seventh and last example he drew forth to demonstrate Time’s bias—“Exhibit G,” he called it—concerned a brief article recounting a strange near car accident that turned into a fracas involving the French socialist leader Léon Blum and a funeral procession of monarchists yearning for the restoration of their exiled “king.”
“Pieces like this,” he wrote to Luce, “are one reason for the charge of fascism so frequently made against Time.”
A chaotic incident that it would have taken an Evelyn Waugh to conjure, the near collision and the bizarre episode of violence that followed were captured on film by an amateur cameraman, allowing Time’s writer to describe the whole thing rather vividly in the February 24, 1936, issue. Long on caricature and farce, the article seemed vulgar to Macdonald for its retailing of physical violence. And though its treatment of the monarchists was far from sympathetic, what angered him was its insufficient respect for the leftists involved, who were described as “screaming” with rage on the victim Blum’s behalf.
“Why must radicals always be presented as screaming or howling? Do their protests seem so silly to Time? If so, small wonder Time appears silly to them.”
Take that.
In 1936, the fourth installment of Macdonald’s book-length opus on the U.S. Steel Corporation for Fortune ran into heavy editing. His sardonic though sympathetic article on the Communist Party had not been bowdlerized before publication—which had surprised him. But for this concluding chapter on U.S. Steel, he’d gone for broke, opening the article with a long quotation from Imperialism, by V. I. Lenin.
U.S. Steel’s managers were described as too stupid and monopolistic to be good capitalists. Nor were they communists, for they were too vicious and cruelly indifferent to the fate of the worker. More like fascists, thought Macdonald, cycling through one ism after another, and definitely dictatorial.
This attempt to report on American business from the inchoate point of view of a recently converted Marxist was likely never intended to succeed. Predictably, long stretches of theoretical anticapitalist commentary were cut. Macdonald played the role of the offended party. “If stories are to be edited like this,” he asked, “why hire writers at all?”4
Afterward, he took six months off, confident of never returning. “A weight has been lifted off my shoulders. I feel free, free to see what life is all about, to really read scores of books that I’ve long wanted to. Especially I hope to orient myself politically. I know that I don’t believe in capitalism, but I’m still hazy as to what course to take from there.”5
Chapter 12
In the 1930s, linguistics was in some ways still a young discipline. But, after more than a century, it was possible to write a unifying treatise of its history, methods, and assumptions, as the American scholar Leonard Bloomfield did. He titled the work, very simply, Language.
Bloomfield taught at the University of Chicago, not the best home for a linguist at the time. He complained of “the snobberies and imbecilities which make a byword of the American college.”1 His supervisor, Dean of the Humanities Richard McKeon, an administrator working under President Robert Hutchins, once dismissed linguistic research as “counting prepositions.”2 McKeon later did his best to keep Bloomfield from going to Yale, but he could not convince Bloomfield, who had gotten stuck chairing his department and teaching introductory German year after year, that Chicago was truly interested in its linguistics program.
The center of gravity at Chicago in those days was to be found in its burgeoning great books curriculum, a descendant of the liberal education conceived by Charles William Eliot and marketed as the Harvard Classics. The handsome and young Hutchins, a media darling, was declaring war on specialization. “The gadgeteers and data collectors,” Hutchins wrote, “masquerading as scientists, have threatened to become the supreme chieftains of the scholarly world.”3
The goal of undergraduate education, thought Hutchins, should be loftier than the assimilation of facts; it should be the attainment of wisdom, discovered through personal engagement with timeless works of great literature.
Bloomfield, a quiet man, did not speak the same language. He had more in common with the scholars who had labored for years to work out a doctrine of the final e in Chaucer. The nephew of another prominent linguist, an expert in Sanskrit, he was born into a German-speaking household with two other intellectually gifted children. According to his biographer (also a linguist), he spoke with an “inimitable uvular trill” for certain r sounds.
A dull classroom instructor, he was a bit of a mystery to many who knew him. To some colleagues he seemed insufficiently anti-German during World War I as Americans took to renaming their sauerkraut “liberty cabbage.” More outspoken during the 1930s, he said that if a Hitler were to come to power here in the United States, he, Bloomfield, would apply to the University of Mexico for a job—and if he wasn’t offered one in two weeks’ time, he would promptly commit suicide.
He greatly admired the Menominee Indians, whose language he studied and recorded, lamenting the loss of their native eloquence and their entrapment in the substandard English of the American underclass. He also bore a marked sympathy for American blacks, saying once that the United States could not rightly be called a democracy until the day it elected a black president. But it was not for his political views—which were rarely on offer, even to his closer friends and colleagues—that Bloomfield became known.
A socially inept intellectual—his idea of humor, when friends came to stay, was to plant fake bedbugs in his guest room and wait for the inevitable screams—he would become one of the most celebrated linguists of his time, a key spokesman (at least in print) for his discipline and its contribution to the sum of human knowledge.4
Hoping to explain linguistics to the outside world, Bloomfield began his treatise with the conventional point of view of the educated layman. “Occasionally he”—the layman, that is—“debates questions of ‘correctness’—whether it is ‘better,’ for instance, to say It’s I or It’s me.” To settle such issues the layman might refer to a grammar or a dictionary, or, more likely, he’ll try to reason his way to an answer, using those grammatical terms he learned in school: subject, object, predicate, and so on.
“This is the common-sense way of dealing with linguistic matters. Like much else that masquerades as common sense, it is in fact highly sophisticated.” By sophisticated, Bloomfield did not mean intelligent so much as complicated and bearing assumptions the layman did not even recognize.
Bloomfield’s layman also believed in the authority of experts when it came to language. He showed a humble faith “that the grammarian or lexicographer, fortified by his powers of reasoning, can ascertain the logical basis of language and prescribe how people ought to speak.”
There was more than a little politics behind the beliefs of this layman. His ideas—that language could be rationally explained and that grammarians were rightful authorities—were traceable to the antidemocratic assumptions of the eighteenth century, when “the spread of education led many dialect-speakers to learn the upper-class forms of speech.”5
With an air of persecution, Bloomfield continued: “This gave the authoritarians their chance: they wrote normative grammars, in which they often ignored actual usage in favor of speculative notions. Both the belief in ‘authority’ and some of the fanciful rules (as, for instance, about the use of shall and will) still prevail in our schools.”
The normative grammars, Bloomfield argued, helped cement a mental block of class prejudice and intellectual failure that kept Europeans from faithfully recording the facts of language and appreciating its natural spoken form. But not forever. Around the turn of the nineteenth century, European scholars learned about the study of Sanskrit.
The Hindu tradition collected ancient hymns in the Rig-Veda, which dated to 1200 BC or perhaps even earlier. As the spoken language evolved, it became the work “of a special class of learned men” to read and interpret the sacred text, preserving knowledge of correct pronunciation and the like. “We find the Hindu grammarians,” said Bloomfield, “making rules and lists of forms descriptive of the correct type of speech, which they called Sanskrit.”
The oldest treatise of Sanskrit dated to the third or fourth century BC. Written by a man named Panini, “it describes, with the minutest detail, every inflection, derivation, and composition, and every syntactic usage of its author’s speech. No other language, to this day, has been so perfectly described.”
The difference between the Sanskrit grammar and modern attempts to catalog socially preferred English was one of method. “The Indian grammar presented to European eyes, for the first time, a complete and accurate description of a language based not upon theory but observation.”
After the European discovery of Sanskrit came the comparison of Indo-European languages. The scholar and fairy-tale collector Jacob Grimm discovered what came to be known as Grimm’s law: a system of corresponding sound shifts among Germanic and other Indo-European languages. These correspondences, said Bloomfield, “showed that human action, in the mass, is not altogether haphazard, but may proceed with regularity even in so unimportant a matter as the manner of pronouncing the individual sounds within the flow of speech.”
It also showed that language was properly, if not exclusively, an object of scientific investigation. But while fundamental laws of language change were being discovered in the nineteenth century, the layman continued to worry about secondary issues of correctness.
Of course, it was part of the linguist’s task, according to Bloomfield, “to find out under what circumstances the speakers label a form in one way or the other”—good or bad—“and, in the case of each particular form, why they label it as they do: why, for example, many people say that ain’t is ‘bad’ and am not is ‘good.’ ”
About what was truly good and truly bad, however, the linguist had to remain unprejudiced. And within large speech communities, the student of language should observe that standards for good and bad language change with class.6
In the United States, “children who are born into homes of privilege . . . become native speakers of what is popularly known as ‘good’ English; the linguist prefers to give it the noncommittal name of standard English. Less fortunate children become native speakers of ‘bad’ or ‘vulgar’ or, as the linguist prefers to call it, non-standard English.”
There were the literary standards of formal discourse, the colloquial standards of the privileged class, and middle-class public school standards. There was also “nonstandard” language. And there was, rather confusingly, something else called “substandard,” which Bloomfield described as used widely in the United States but not by adherents of privileged-class standards or middle-class standards. Bloomfield used the same example for nonstandard as he did for substandard: I ain’t got none.7
The double negative was, of course, verboten in standard English. And whatever ain’t was—substandard or nonstandard—it was not standard. It was not “good.” This did not mean that ain’t was not a word, only that it was out of favor in standard English.
Linguistics was also neutral among the different standards of different times. Speaking for his profession, Bloomfield accepted without qualification that language change is normal. Change was “unceasing,” and “constantly going on in every language.” It was not even to be lamented.
And not only did Bloomfield and the new linguistics put words like “good” and “bad” in quotes; they put the word “authority” in quotes. There was no authority, no wise man who knows all about this stuff; there was only the record of usage, partly known and widely misunderstood. Dictionaries might attempt to combat “personal deviations” in the language or slow the pace of linguistic change, but they could not hold back the tide.
The discovery that language was better understood when better observed led to a reevaluation of writing, which had dominated and, according to Bloomfield, distorted the European understanding of language. Writing, said Bloomfield, was “a relatively recent invention, and . . . its use has been confined, until quite recently, to a very few persons.”
To study writing was not to study language. “All languages were spoken through nearly all of their history by people who did not read or write; the languages of such people are just as stable, regular, and rich as the languages of literate nations.” This was not just Bloomfield’s view; it was a matter of doctrine among linguists. Writing was “merely a way of recording speech by visible marks.”
And just as usage and grammar were relative, so was pronunciation—more so, even. It was infinitely variable. Phoneticians needed to be warned not to record too much information about pronunciation. “Having learned to discriminate many kinds of sounds, the phonetician may turn to some language, new or familiar, and insist upon recording all the distinctions he has learned to discriminate,” even when they “have no bearing whatsoever.”