

  Founded in 1866 by former Confederate officers, the original Ku Klux Klan bathed the southern countryside in blood throughout 1869 and 1870 in a half-successful attempt to roll back the gains African Americans had achieved during the early years of Reconstruction.1 Under the determined leadership of President Ulysses S. Grant, however, the federal government cracked down hard and drove the Klan out of business. By 1872, it was a relic of history.

  Racial progress proved to be short-lived, and by the eve of World War I history was being written by the losers.2 For several decades, southern Democrats and their defenders had found strong allies in the northern intellectual community. The leading historian of the Civil War era, Columbia University’s William Dunning, was training a generation of scholars to write and teach about Reconstruction as a dark episode in American history, marked by the cruel attempts of northern radicals to impose a harsh and unforgiving peace on the white South and to foist black equality on a proud Confederate nation.

  Dunning’s interpretation of Reconstruction was dead wrong, but it was conventional wisdom in the first decades of the twentieth century. It fit well with the country’s diminishing goodwill toward African Americans and its steady embrace of social Darwinism and scientific racism.

  Thousands of Ku Klux Klan members take their protest against the modern age to Washington, D.C., in September 1926.

  Even more popular among mainstream readers was a fictional trilogy by the southern writer Thomas Dixon. In The Leopard’s Spots (1902), The Clansman (1905), and The Traitor (1907), Dixon held up the Ku Klux Klan as a savior of white society. The novels became instant best-sellers. Among their admirers was a young filmmaker, David Wark Griffith, who in the years leading up to World War I was laying plans to produce the industry’s first feature-length movie.

  Griffith needed new material—something stirring, something epic. The romance of the Ku Klux Klan was exactly the right fit. With the author’s blessing, he adapted Dixon’s trilogy into a new screenplay, The Birth of a Nation. Released in 1915, the film, which starred the Fitzgeralds’ good friend Lillian Gish, caused a nationwide stir. Millions of viewers lined up to watch almost three hours of melodrama, replete with blackfaced actors portraying black slaves as alternately dim-witted or predatory. In the climactic scene, white-robed Klansmen gallop into town and save white southern womanhood from the threat of miscegenation.

  The film was America’s first blockbuster success.

  Among its many admirers was a motley group of Georgians who convened on Thanksgiving eve 1915 inside the stately lobby of Atlanta’s Piedmont Hotel.3 They had been electrified by The Birth of a Nation and came with the express intention of reinaugurating America’s most notorious and violent fraternity. The group climbed aboard a chartered bus and drove sixteen miles north of the city limits to a magnificent granite peak known locally as Stone Mountain. There, overlooking the distant lights of Atlanta, they lit a tall wooden cross and set out to replicate the glory of the Invisible Empire. The Klan would ride again.

  Unlike the original Klan, the new organization sought a national profile and identified several groups—not just African Americans—as alien threats to family and nation: Catholics, Jews, immigrants, New Women, bootleggers, and criminals. Above all, the Klan’s five million members, including roughly five hundred thousand women, touted “one hundred percent Americanism” as an antidote to the social and cultural decay that seemed to be rotting away the core of American values.4

  In Texas, Klansmen beat a man from Timpson who had separated from his wife and a lawyer from Houston who “annoyed” local girls.5 In Grove Creek, near Dallas, Klan riders broke into the home of a recently divorced woman who was convalescing from an illness; they dragged her from bed, chopped off her hair, and beat her male visitor senseless with a flail.

  “Well,” explained a Klan woman, “wasn’t their original purpose to kind of straighten out people?”6

  Though the Klan particularly deplored “the revolting spectacle of a white woman clinging in the arms of a colored man,” more humdrum violations of Victorian propriety also vexed members of the Hooded Empire.7 In Evansville, Indiana, William Wilson, the teenage son of the local Democratic congressman, remembered that Klan riders ruthlessly patrolled back roads in search of teenagers embroiled in wild petting parties or improper embraces.8 “They entered homes without search warrants” and “flogged errant husbands and wives. They tarred and feathered drunks. They caught couples in parked cars.…” One night, when Wilson was driving with his girlfriend on a backcountry road, a local farmer overtook him and warned, “If you kids know what is good for you, you’ll move along. The Kluxers are patrolling this road tonight, and God knows what they’ll do to you if they catch you here.”

  The Klan monitored movie theaters in order to police “moral conditions” among loose high school students and burned down dance halls frequented by teenagers, since “most of the cases of assault between the sexes have followed dances where they got the inspiration for rash and immoral acts.”9 In an almost pornographic ceremony that was repeated dozens if not hundreds of times, Klan members hauled “fallen women” to remote locations, stripped them naked, and flogged them.

  The Ku Klux Klan drew many—perhaps most—of its members from cities and metropolitan areas.10 Its rosters included a fairly even mix of small-business men, professionals, and manual workers. Unlike the original Klan, which was a southern phenomenon, the new organization drew from a cross section of white Protestant America, and many of its members attended mainline churches. In effect, the Klan united otherwise dissimilar people who shared a profound sense of unease over social change and modernization.

  The Klan was only one manifestation of cultural reaction in the 1920s. Like no other event of the decade, the Scopes “Monkey Trial” burned itself into the national imagination.

  Throughout most of the nineteenth century, Americans remained firmly committed to evangelical Protestantism. They shared a general commitment to the doctrine of Christian salvation, personal conversion experience, and absolute biblical authority. Generations of public school pupils were raised on textbooks like McGuffey’s Readers, which drove home the interconnected virtues of Sabbath observance, frugality, hard work, and Bible reading.

  This evangelical consensus began to unravel in the late nineteenth century under the strains of scientific discovery. Scholars were simply discovering too much about matter, energy, and the cosmos to sanction literal readings of scripture. Such developments led Lyman Abbott, an outspoken modernist, to argue that “whether God made the animal man by a mechanical process in an hour or by a process of growth continuing through the centuries is quite immaterial to one who believes that into man God breathes a divine life.”11

  Theologians like Abbott were raising the stakes high. If the Old Testament story of creation—of Adam and Eve and the Garden of Eden—was more allegory than straight history, then the entire Bible might very well be open to individual interpretation.

  Who could say with certainty what was and wasn’t the truth?

  In the absence of absolute truth, how could humans avoid slipping into an endless cycle of moral relativism?

  If biblical wisdom was fair game for interpretation, wasn’t the same true of other time-tested values and social codes?

  In 1910, traditionalists in the General Assembly of the Presbyterian Church identified five theological “fundamentals” that scientists and religious modernists could not challenge: absolute scriptural inerrancy, the virgin birth of Christ, personal salvation in Christ, Christ’s resurrection, and the authenticity of Christ’s earthly miracles.12 Other defenders of evangelical orthodoxy turned to the same language when they published The Fundamentals, a series of twelve paperback volumes that answered Christian modernists with a ringing defense of biblical literalism. In 1919, some of the more conservative members of the traditional camp formed the World’s Christian Fundamentals Association; and in 1920, journalists began lumping most conservative Christians together as “fundamentalists.”

  People who were uneasy about the unraveling of Victorian culture—those who were unnerved by the religious and ethnic diversity that accompanied mass immigration, who feared the modern world’s celebration of personal choice and satisfaction, and who lamented the abandonment of old gender and sexual norms—tended to embrace fundamentalism as a bulwark against further social change.

  The liberal-fundamentalist war came to a head in the summer of 1925, when a group of local boosters in Dayton, Tennessee, persuaded a young high school science teacher, John Scopes, to violate the state’s antievolution law. They originally intended to draw attention to their economically depressed crossroads town. Instead, what followed was a sensational show trial that pitted the famous “lawyer for the damned,” Clarence Darrow, a committed civil libertarian and almost fanatical atheist, against William Jennings Bryan, the famously eloquent Nebraskan who had thrice failed to attain the presidency but who remained a hero to rural fundamentalists in the South and Midwest.

  The trial’s climax came when Darrow unexpectedly called Bryan to the stand as a biblical expert. Darrow posed a series of questions designed to trap a biblical literalist like Bryan. How did Jonah survive inside a whale for three days? How did Joshua lengthen the day by making the sun—and not the earth—stand in place? These were not original inquiries. But, as Darrow later boasted, they forced “Bryan to choose between his crude beliefs and the common intelligence of modern times.”13

  “You claim that everything in the Bible should be literally interpreted?” Darrow asked.

  “I believe everything in the Bible should be accepted as it is given there,” Bryan answered, though “some of the Bible is given illustratively.…”

  Already, he was on shaky ground. If some of the Bible was “illustrative,” could it be liberally construed? It was Bryan, after all, who had audaciously claimed that “one beauty about the Word of God is, it does not take an expert to understand it.” Now he was admitting otherwise.

  “But when you read that Jonah swallowed the whale, …” Darrow continued, “how do you literally interpret that?”

  “I believe in a God who can make a whale and can make a man and make both of them do what he pleases.”

  Darrow was having the time of his life.

  Did Bryan believe that in the book of Genesis “days” truly represented twenty-four-hour periods of time? “Have you any idea of the length of these periods?” Darrow asked.

  “No; I don’t.”

  “Do you think the sun was made on the fourth day?”

  “Yes.”

  “And they had an evening and morning without sun?”

  “I am simply saying it is a period.”

  Bryan had committed a fatal error. He had conceded the necessity of at least some interpretation in reading the Bible. It was a slight admission, and one that wouldn’t have bothered a religious moderate. But it unnerved Bryan, who lost his composure. “I am simply trying to protect the Word of God against the greatest atheist or agnostic in the United States,” he cried. “The only purpose Mr. Darrow has is to slur the Bible, but I will answer his questions.”

  Although Scopes was convicted and slapped with a small fine, liberals declared victory. Mark Sullivan, a popular journalist of the day, boldly concluded that the “Scopes trial marked the end of the age of Amen and the beginning of the age of Oh Yeah!”14

  In fact, the conservatives were far from licked. In the decades following the trial, they withdrew from the public eye and retrenched. Fundamentalists chartered missions, publishing houses, and radio stations; they founded seventy Bible institutes; and they strengthened existing fortresses of traditional evangelicalism like William Bell Riley’s Northwestern Bible Training School in Minnesota and the Moody Bible Institute in Illinois. In the 1940s, they began to reappear in public life, and by the 1980s, they had once again assumed a prominent place in political and cultural debates.

  Yet something important did change on the courthouse lawn in 1925. The contest over religion, much like the brief glory of the Ku Klux Klan, spoke to the profound sense of dislocation that accompanied the rise of modern America. By and by, Americans were becoming more comfortable with, or resigned to, modernity. But not without a struggle.

  8

  NEW YORK SOPHISTICATION

  IF THE CULTURE WARS of the 1920s often seemed to pit city against country, it wasn’t always the case. In 1922, Julia H. Kennedy, an official at the Illinois Department of Health, claimed that girls from small towns outside of Chicago and St. Louis were conducting themselves with even more reckless abandon than their big-city sisters.1 Among their other offenses, these small-town girls drank homemade concoctions like white mule and lemon extract from flasks that they tied around their necks. In Kearny, New Jersey, the local school board canceled all school dances after chaperones discovered cigarette butts, empty bottles, and semiclad teenagers in the nearby cloakrooms.2 Clearly, the flapper was every bit as much a small-town as a big-city phenomenon.

  For every small-town flapper critic—like the eccentric businessman from Geneva, Illinois, who bankrolled a laboratory dedicated to studying the “flapper slouch” and set out to “give the world a warning of the evil effects of … such incorrect posture”—more tolerant voices cut against the grain.3 In response to a local uproar over the youth problem, preachers in Michigan City, Indiana, and Evanston, Illinois, rushed to the flapper’s defense, arguing that “bobbed hair, short skirts and knickerbockers are not signs of sin, but a declaration of independence.”4 Their sermons belied the notion that all of Middle America despised the flapper.

  From left to right: Charlie Chaplin, Frank Crowninshield, Helen Sardeau, Lois Long, and Harry D’Arrast strike a pose at a Coney Island photo booth, 1924.

  In Philadelphia, by contrast, upstanding citizens were scandalized to learn that Mrs. Anna Mesime, a middle-aged mother from Allentown, Pennsylvania, had been arrested for standing watch while her twenty-three-year-old daughter stole $150 worth of dresses, silk hose, and lingerie from a Market Street clothing store.5 In tears, Mrs. Mesime explained to the judge, “I had no money to buy the clothes my daughter wanted. Ida got the craze to be a flapper, and to get her the necessary clothing we decided to steal. I was afraid she would adopt a worse method of getting her finery, so intent was she upon being able to dress as well as other girls in the neighborhood.”

  And no wonder, too. Experts agreed that it would cost the average working girl at least $117—more than $1,200 in today’s money—to affect the flapper look with passing success.6 Even then, “she must have good taste, practice self-denial and steer away from the impractical garments.”

  Yet if the revolution in morals and manners was sweeping the entire country, in many people’s minds the flapper herself was a product of the best neighborhoods in New York and Chicago. This popular image owed much to the new prominence that middle-class urbanites came to enjoy in the 1920s. Flush with money and able to dominate the national conversation through newspapers, magazines, and radio, this new urban elite assumed broad license to teach Americans what to buy and how to dress. They self-consciously developed ideas about style, poise, and humor. And the rest of the country often followed suit.

  As Zelda Fitzgerald remarked in her “Eulogy on the Flapper,” fashions and manners could be counted on to circulate in a predictable chain reaction.7 The moment an urban sophisticate bored of her flapper attire, she shed her “outer accoutrements … to several hundred girls’ schools throughout the country,” which in turn bequeathed their own discarded wares “to several thousand big-town shop girls, always imitative of the several hundred girls’ schools, and to several million small-town belles always imitative of the big-town shopgirls via the ‘novelty stores’ of their respective small towns.”

  The process took some time to run its course. In far-flung places like Butte, Montana, high school yearbooks didn’t record widespread popularity of signature flapper styles like bobbed hair until as late as 1924.8 The year before, seniors at Butte High School elected Mary Josephine McGrath as their prom queen. A “true Irish beauty,” McGrath was popular because she wasn’t “the flapper type.” Her long, curly locks were imitative of those of Mary Pickford, not Louise Brooks.

  The process of cultural transmission was subtle but continuous, and it often began in New York. There, in the long summer of 1925, when America’s Jazz Age culture wars were just reaching their apex, Herman Mankiewicz, assistant theater critic for The New York Times, strode into Harold Ross’s ragtag offices on West Forty-fifth Street. Mankiewicz got straight to the point: He wanted Ross to hire one of his girlfriends. She didn’t have much work experience, but she liked to drink, she loved to party, and she wasn’t too bad a writer.

  Mankiewicz had come to the right man. As founding editor of the upstart magazine The New Yorker, the Colorado-born Ross was quickly emerging as one of America’s most influential arbiters of style and taste. And an unlikely one at that. One of Ross’s more generous friends later remembered him as “a big-boned westerner … who talked in windy gusts that gave a sense of fresh weather to his conversation.9 His face was homely, with a pendant lower lip; his teeth were far apart.” Stiff in demeanor and painfully awkward around women, Ross “wore his butternut-colored thick hair in a high, stiff pompadour, like some wild gamecock’s crest [and] wore anachronistic, old-fashioned, high-laced shoes, because he thought Manhattan men dressed like what he called dudes.”

  Already in his early thirties by the time the twenties began to roar, Ross had spent most of his adult life as an editorial drifter. Boasting an education just shy of a high school diploma, he crisscrossed the continent before World War I and worked a series of dead-end writing jobs for second-rate newspapers in Brooklyn and New Jersey and in Salt Lake City, Atlanta, New Orleans, Sacramento, and Panama. When World War I broke out, he joined the Eighteenth Engineering Regiment, shipped off to France, walked almost one hundred miles to Paris, and somehow managed to talk himself into an editorial post at Stars and Stripes. On the plus side, he got to see Europe, and he never heard a shot fired in anger. But when the army mustered him out in 1919, Ross faced an almost certain return to obscurity and mediocrity.

 
