Macdonald had come to loathe all systems of government. He saw a keen resemblance between the modern capitalistic state and its Nazi and Soviet competitors. Each deprived the individual of the economic and moral power to define his own existence.
In a great passage, Macdonald described the crushing experience of reading about the horrors of the war, especially the concentration camps. “Let us not only accept these horrors; let us insist on them. Let us not turn aside from the greatest of them all: the execution of half the Jewish population of Europe.” One also had to reserve moral revulsion for the Soviet Union, where millions had been starved to death or left to die in state prisons. And still this would not be enough.
The moral conscience must also be stirred, Macdonald argued, by the American firebombing of Dresden and other German cities, and its atom-bombing of Japan, too. And yet it would not do to merely denounce such enormities, for they were as much the result of scientific progress as of modern social organization. In collective action—that of corporations, of armies, of the state—everyone was involved but no one was responsible. “If everyone is guilty, no one is guilty.”
It was an overwhelming, unforgiving vision of political fault, so idealistic, so insistent on denouncing the sins of the United States beside those of the Soviet Union and Nazi Germany that it diminished their moral differences and left the radical without a side of his own. In the modern bureaucratic world, the only morally acceptable role was that of the permanent dissident, for he alone is released from the compromises required by living and working with others.
Macdonald was serious about confronting evil but less serious about the other challenges of being in this world. He seemed to lack the humility necessary to reconcile himself to organizations small and large—magazines, marriages, countries—whose wrongdoings and rightdoings somehow become our own. Idealism is a one-man operation, and it was a useful creed for Macdonald, chosen, one suspects, for the support it gave to the pleasant and gratifying work of playing critic at large, a role he always played well.
As a theorist, Macdonald was unimpressive. Pages and pages he devoted to working through his Marxist education and applying it philosophically to the class struggle and the war. As much as he had tried since first laying eyes on The Communist Manifesto, he never showed a talent for dialectical work. And his grasp of American history was poor. Though he strove to prove himself as a thinker, he was more successful as a writer, a journalist, a magazine essayist—which this writer certainly does not mean as faint praise. A long, malicious attack he wrote on Henry Wallace, the Soviet apologist and Progressive Party candidate for president, did far more for Macdonald’s reputation than any of his more theoretical work.1
To the picky, discriminating writer, there is nothing so revealing of a person’s true character as the words they favor, and Wallace’s vocabulary revealed to Macdonald a dopey liberalism of high motives and low intelligence. He described Wallace as having his own idiom, “Wallese, a debased provincial dialect,” in which good people are always “ ‘forward-looking,’ ‘freedom-loving,’ ‘clear-thinking,’ and, of course, ‘democratic’ and ‘progressive.’ ” Bad people “were always ‘reactionaries’ or ‘Red-baiters.’ ”2
The major part of Macdonald’s still-developing literary gift consisted in finding bumptious nonsense that could be parlayed as journalistic material, at once comic and stimulating. He was a master of the telling detail that not only entertains a reader but silently congratulates him for being smarter than the idiot he is reading about. There is art to such writing, the art of verbal caricature, and there is sport: the fun of making fun and the pleasure of watching blowhards and know-it-alls get their due. But there is always the feeling that a minor injustice is being perpetrated; what saves the reader from being repelled is his confidence that this injustice is somehow deserved—in the end, more right than wrong.
Amid the riches of New York’s literary scene, there were other notable styles, also compelling, of intellectual writing about matters of public interest. Less entertaining, less stylish, but altogether more serious, for example, was Lionel Trilling’s essay in Partisan Review on the first Kinsey report. Sexual Behavior in the Human Male sold two hundred thousand copies in its first two months and suggested that a new consensus on sex was taking shape amid the ashes of nineteenth-century reticence.
Sex was above all natural, according to the report, and the more frequent the better. “American popular culture has surely been made richer by the Report’s gift of a new folk hero,” Trilling dryly commented, “the ‘scholarly and skilled lawyer’ who for thirty years has had an orgasmic frequency of thirty times a week.”3
Trilling astutely noted “an awkwardness in the handling of ideas” that required a careful rethinking not of the report’s scientific evidence, which he left for others to examine, but of its cultural and moral implications. The report’s findings on masturbation contradicted the traditional view codified in Webster’s Second, where masturbation was memorably defined as “onanism; self-pollution.” Kinsey and company sought to defend sex from such traditionalist moralizing and separate sex from character. But, Trilling noted, this did not stop the Kinsey Report from pursuing its own ideas about sex and character, sometimes at the cost of its principles.
According to the report, masturbation was common among upper-class males who found “an insufficient outlet through heterosexual coitus.” But it also represented an “escape from reality” and was therefore potentially harmful to the pent-up individual’s “personality.” What kept the report’s authors from entirely countenancing masturbation, Trilling ably showed, was their prior commitment to the sexually uninhibited individual.
“Much in the report is to be understood as having been dictated by a recoil from the crude and often brutal rejection which society has made of persons it calls sexually aberrant.” And this was a “good impulse,” said Trilling, but it was pursued thoughtlessly.
The report rejected the altogether sinister view that homosexuality was evidence of “psychopathic personality.” Fair enough, said Trilling, but the authors had not by this simple gracious act won social equality for homosexuality. In short, the fact of homosexuality did not save scientists and other thinkers from the work of thinking about how it might be accommodated.
Trilling had been educated at Columbia University in the 1920s, first in the General Honors course developed by John Erskine, who had studied under George Edward Woodberry, himself a student of Charles Eliot Norton. His essay showed how a classically trained mind was equipped to address questions, difficult moral and social questions, too subtle for scientific evidence-gathering. It was a style capable of humor and especially strong at making careful distinctions between fact and interpretation, between the is and the ought. It could be generous to other disciplines and views it opposed (speaking humanistically and quite ably to the authority of science), but it assumed a burden of seriousness, preferring classic categories to catchy handles while eschewing the pleasures of mockery and of making one’s opponents out to be fools.
The two writers—Trilling and Macdonald—took an interest in what was only recently being referred to as “popular culture.” In 1944, Dwight Macdonald had written, “surprisingly little attention has been paid to Popular Culture by American intellectuals,” capitalizing a phrase that he also used with quotation marks in the title of an article in Politics, indicating its novelty.4 A Google Ngram search shows popular culture to be exceedingly rare before the 1950s. If Clement Greenberg’s argument about kitsch showed the difference between cheap reproduction and high art, Macdonald’s essay began the work of cataloging examples of popular culture, from the popular science of Paul de Kruif and the popular philosophy of Will Durant to the literary criticism of Henry Seidel Canby and the journalism of Reader’s Digest, which typified for Macdonald all that was wrong with literary culture in America.
“Here is a magazine which in a few years has attracted an enormous circulation simply by reducing to even lower terms the already superficial formula of commercial periodicals. Where Harpers treats in six pages a theme requiring twelve, Reader’s Digest cuts the six pages to two, making it three times as ‘readable’ and three times as superficial.”
Success of this order practically required the diminution of smaller, nobler efforts such as Partisan Review and Kenyon Review. A cultural version of Gresham’s law took effect, and the bad drove out the good. Macdonald had long been a cultural pessimist, yet he paused to reflect on one especially novel example of popular culture: superheroes. Just a few years after the introduction of Superman, a striking response of American imagination to the despair of the 1930s, Macdonald was recommending that radicals pay attention to such “synthetic” folk heroes.
But as with Greenberg’s essay on kitsch, which mapped the cultural distance between the avant-garde and the rear guard, Macdonald looked to separate the high from the low. One sees the beginning of a recurring lament for Macdonald, that fairly serious culture that could be taken for granted in Paris or London was not available in America except on the margins. Typical of this period, however, Macdonald treats the question as a political one, asking about the exploitative aspects of American capitalism. But even here one sees the roots of a cultural critique that would soon be caricatured in popular discussions of highbrow versus middlebrow versus lowbrow culture, in Harper’s (though in fewer pages, of course) and other magazines.
By the end of the 1940s, Dwight Macdonald was depressed about politics and exhausted by Politics, which he not only wrote for, but edited, proofread, and laid out all by himself. And he began to part ways with his social conscience, Nancy. His approach to life changed. Political abstractions gave way to matters of the self. He started recording his dreams and liking his children and exploring a newfound sensuality in extramarital affairs.
All the while he kept a busy schedule of organization-joining and -supporting, lecturing, and correspondence. To George Orwell, he wrote, “I wonder if you share my private enthusiasm for Dr. Johnson?”5
The Macdonalds finally moved out of Greenwich Village to an apartment in Midtown. In 1949, Mary McCarthy published The Oasis, an anti-utopian satire of “the family,” the circle of anti-Stalinist New York intellectuals bound to each other through Partisan Review and other publications. Dwight Macdonald appeared as Macdougal McDermott, who had “sacrificed $20,000 a year and a secure career as a paid journalist. . . . He had moved down town into Bohemia, painted his walls indigo, dropped the use of capital letters and the practice of wearing a vest.”6
He is one of the leaders of a group of intellectuals who flee the city to start a utopian colony in the mountains. The first sign of trouble comes when the colonists have to decide what to do about Joe, a successful businessman, a “well-intentioned Babbitt” who has latched on to the scene in order to pursue his passion for painting but quickly runs afoul of the local gods.
For its transparent depictions of the family, The Oasis became the scandal of the year. “More rows, clashes, feuds, and factional conflicts in the NYC literary world this winter than at any time in the past—maybe it’s all breaking up—rather frightening really,” thought Macdonald.7
Chapter 21
In 1950, cold war and thirty-six pages of other new terms appeared in a special addendum, inserted into the latest reprinting of Webster’s Second just before the main vocabulary. The once-“fanciful” A-bomb was thus entered, along with a definition for Manhattan Project. Words to describe the challenge of the USSR gave rise to entries for commie, hammer and sickle, red, red-bait, NKVD, and soviet.
The Soviet threat colored other additions. A stooge was a dim-witted foil on the stage and screen. Larry, Moe, and Curly were not mentioned, but communism was. In the third sense a stooge was “a group outside the USSR that plays a completely obedient and obsequious role to Soviet dictation.” In defining this pointed term, the lexicographer adopted the point of view of the word itself. Similarly, communism was left out of the newly added definition of witch hunt: “A searching out of victims professedly for exposure on charges of subversion, disloyalty, or the like . . . esp. liberals.” This same year Alger Hiss was convicted of perjury. Communists in the State Department? “Fanciful,” the dictionary seemed to say.
War had contributed much new vocabulary that needed defining: beachhead, biological warfare, combat fatigue, flak, sack time, sad sack, snafu. And older words and phrases found new meanings and relevance: bombardier, dogfight, ground crew, scuttlebutt. Entries for Marshall Plan, National Security Act, the Social Security Act, and Truman Doctrine were added. Room was made for all the wartime abbreviations: V E Day, V J Day, WASP, WAAF, and so on. GI was defined to include both GI Joe and GI Jane. An entry for World War Two itself was added.
Keeping up with the language required keeping up with the culture, itself reshaped by newfound leisure time and the tastes of young Americans. Teenage and teenaged had been around since the 1920s, and other forms of the words much, much longer, but in recent years these adjectives grew into persons and the cultural figure known as the teenager had to be entered in the dictionary and defined.
What he or she chose from the jukebox might suggest a bobby-soxer. The jazz-inspired jive was finally entered, after decades of nonrecognition. Round the clock was added, two years before Max C. Freedman and Jimmy De Knight wrote the quintessential “Rock Around the Clock,” made into a resounding hit by Bill Haley and the Comets in 1954.
Dwight Macdonald’s favorite fabric, rayon, had made it into Webster’s Second in 1934; polyester was entered in 1950 in the new addendum, where the new look was described as “the 1947 mode in women’s clothing calling especially for much longer fuller skirts, and narrower waistlines, following relaxation of the material shortages of World War II.” Pinball was added, as were pizza and hamburger. Pizza had been around for a long, long time, of course, but now English-speaking people, Americans especially, were eating it, in pizzerias, some decades after waves of Italian immigrants had arrived. Webster’s Second had only an entry for Hamburg steak, sounding like some Teutonic pensioner who might have sworn fealty to the Kaiser. For drink service, jerk and soda jerk were now being yanked into the pages of Webster’s.
A new image of American domesticity was coming into view. With television ownership on the rise, the abbreviation TV needed consideration. With Levittown on New York’s Long Island still under construction, prefab showed up in the addendum. And what would suburbia be like without the station wagon? This car was driven on the expressway or the superhighway, either way likely to be one of those cloverleaf roads Dr. McClintock had discussed in Time magazine. What to do when a young couple wanted to go for a ride? Hire a babysitter, a term Merriam-Webster dates only to 1947. The “tender relation of one to one” was already netting a great many babies to sit, but baby boom, dated 1941 in the Merriam-Webster files, did not yet rate.
Chapter 22
In 1950, W. Freeman Twaddell, a professor of German on leave from Brown University, joined G. & C. Merriam Company as a resident editor responsible for Webster’s Third. He stayed long enough to determine that he was not needed.
J. P. Bethel continued as general editor. And after Twaddell had been on board for a year or so, Bethel asked him as a relative newcomer to write down his impressions of the staff. It was not a casual suggestion, but a request for a statement that could be circulated to the president and the Editorial Board. Nor was the exchange spontaneous; Bethel knew what Twaddell would say and wanted him to say it.
“My strongest impression,” wrote Twaddell, “is that the Editorial Department, thanks to the wise policies and careful administration of the past couple of decades, has come of age; it is a force of the first order—anonymous and somewhat awe-inspiring.”1
Twaddell compared himself to the swallow in a story told in the eighth century by the Venerable Bede, a church historian. The swallow flies through an open door and into a house where people are peaceably eating dinner, then immediately leaves by another door. After so brief a visit, Twaddell had not developed an insider’s view of Merriam, full of history dating back to Webster himself. He nevertheless felt confident in his observations and ready to argue for the reversal of a major long-standing policy.
“It is no longer necessary or even desirable to import or borrow the prestige of an eminent outsider for any product of the Editorial Department.” The reason was simple: “No one personage, in scholarship, combines high reputation and balanced versatility as the Merriam-Webster editorial staff does.”
The search for a “name,” a spokesman for the learned professions, was actually self-defeating, Twaddell argued. Should there be a controversy, the dictionary would too easily be cast as an extension of that one person’s predilections. “That one name is far more likely to arouse suspicions of one-sidedness, of this or that over-emphasis, than the reverse.”
It was an argument for actual qualifications over public réclame, questioning the power of figureheads to lend majesty to those humble clerks and middle managers who do the actual work. Those names that flit about from government commissions to the presidential offices of colleges and universities to the covers of major anthologies, the William Allan Neilsons of the world, they did not deserve what rightly belonged to the staff lexicographer, who, like the forgotten man of Rooseveltian rhetoric, possessed all the experience, all the skill, Merriam wanted.
Beyond this, the passing swallow saw a set of problems essentially practical in nature. In making Webster’s Third, he said, “there are two evils that the Editorial Department has to avoid.” The first was “lost motion, needless disturbance of the smoothly functioning machinery by the ignorance and inexperience of a newcomer.” This problem could be avoided “only by turning over administration to a Merriam editor who knows the ropes and knows the deficiencies of individual staff members.”