By 1900, the typical mother had two or three children, less than half as many as her counterpart one hundred years earlier. On average, she was 53 when her last child left home, and she died at 71. Adulthood was no longer an unvarying, seamless whole filled with farmwork and child-rearing until death. Middle-aged women were able for the first time to turn to other pursuits, like fashion, shopping, working, and volunteering. There was life after children.
The New Normal
For all the public discussion of these new stages, the label “middle age,” with its recently gathered retinue of associations, was initially meaningful to a relatively small segment of the population. As historians note, societies are governed by “laws of uneven development” that sabotage efforts to generalize. The number of urban residents did not inch past that of their rural neighbors until 1920, when they made up fifty-one percent of the country’s 106 million people. Divisions were not as stark between parents and children in rural areas, where fathers maintained power over land and inheritance and claimed greater authority. The fashions and amusements of middle-aged women and the factory life of middle-aged men detailed in the press primarily referred to white city dwellers.
Age consciousness was most keenly felt in urban hubs, “the storm centers of civilization,” where opinion makers lived and people were headed. The growing middle class—the white professionals, managers, and bureaucrats who administered the industrialized economy—formed a new tier beneath wealthy landowners and businessmen, and above unskilled laborers and subsistence farmers. Divisions between classes and between men and women hardened in the late 1870s and 1880s. Couples moved from the farm, where all work was shared, to the city, where men and women were separated into the domestic or the business realm. Dire economic conditions stoked class identities and resentments, further parting professionals and businessmen from manual laborers, whose numbers increased daily as shiploads of immigrants streamed into America’s ports.
In 1900, fewer than a fifth of employed men, about four million, held white-collar jobs. At the top of this stratum were lawyers, scientists, clergymen, doctors, and managers; at the bottom were clerks, teachers, and governesses who did not earn as much money but shared the same values of hard work, moderation, responsibility, and prudence. Two-thirds of Americans still lived in rural areas, and half of the population was poor at any given time.
Later in the twentieth century, the middle class dominated and defined the United States, but even at this early stage, this relatively small group, along with the nation’s wealthier class, controlled the levers of influence and power; their voices dominated the media. These cultural brokers helped fashion the images and explanations of America’s disorienting hurly-burly. They command special attention because they steered the conversation and shaped perceptions of their own middle age, as well as those of the ethnically varied working classes they viewed as their inferiors.
The cable lines and periodicals that connected the East of the country with the West, and the North with the South, disseminated their tastes and prejudices. In 1890, there were nearly 200,000 miles of railway track, 250,000 miles of telephone wire, and 1,610 daily newspapers in print. By 1900, automobiles had entered the landscape. Residents from coast to coast could share photographs, cartoons, news, jokes, styles, and ideas through advertisements, novels, journals, newspapers, the Sears Roebuck & Co. and Montgomery Ward catalogs, and later, movies in a fraction of the time it used to take for information to travel. The public drew on this store of unfamiliar pictures, gestures, attitudes, and behaviors to construct a generational identity. In many parts of the country, the media’s “Middle-Aged Man” (and Woman) came into being before the flesh-and-blood article.
Fashion manufacturers and editors were among the first to assign appropriate clothes to various age groups. In rural America in the early 1800s, a sweet 16 and a sedate 60-year-old could wear the same style without embarrassment. By the end of the century, expectations had altered as each age developed its own dress code. The Los Angeles Times in 1895 even offered fashion tips for “a little roly poly grandmamma,” so that she need not “look wider than she is long.” Mrs. Wilson Woodrow (married to James Wilson Woodrow, a cousin of the future president) sternly informed The Cosmopolitan’s readers in 1903, “Sweet simplicity at fifty is absurd.” Middle-aged women should indulge in elaborate design and adornments, “the splendor of jewels, forbidden to the debutante, is her privilege.” Harper’s Bazar advised that 25 was the upper limit for simple clothing, and told perplexed 45-year-olds how to dress: “Fine clothes and jaunty and piquant fashions are in fact the property of middle life; not of old age, indeed, any more than of early youth.”
Men’s fashions also distinguished among generations, with the width of a brim or the placement of a crease as reliable a guide to age as the number of rings on a tree. “The New Styles That Are Designed for Young, Old and Middle-Aged Men” that appeared in the newspaper the San Francisco Call in 1904 explained: “There are straw hats for the young man, straw hats for the middle-aged man, straw hats for the old man and straw hats for the boy. Each age has its own particular and exclusive style more pronouncedly this year than ever before.”
Etiquette manuals, reform pamphlets, academic lectures, government reports, medical conferences, journals, and advertisements laid out the appropriate age for everything from marriage to eating meat. They set the social clock, defining expectations for when someone should finish an education, live independently, or have children. These new norms informed middle-aged wives how to dress and how often to have sex (alas, for them, it tended to be never). James Foster Scott, a former obstetrician, prescribed in his 1898 text The Sexual Instinct what he considered appropriate sexual activity for every stage of life, declaring that when “the individual is in the afternoon of life” he and she are “again sexless from a physical standpoint.” He estimated that this turning point occurred in women between 42 and 50, and in men between 50 and 65. According to John Harvey Kellogg, the food reformer and health spa founder (who helped perfect the humble cornflake with his brother, Will Keith), men and women in different stages of life should not marry because older people didn’t have the energy to withstand the youthful demands of sex. Meanwhile, the forerunners of Miss Manners were policing the social corridors. “The haste and impetuosity so becoming to 18 are immoral at 50,” Celia Parker Woolley wrote in 1903. “It is neither pleasant nor edifying to see an aging man or woman aping the behavior of the young.”
Failure to conform to these widely proclaimed standards risked placing one outside the new “normal,” a word that passed into common use around the middle of the nineteenth century and then enjoyed a steady rise in popularity. “Deviation,” previously a neutral statistical term, took on a more negative connotation. Fitting in mattered. As assorted experts displaced religious authorities, being normal was emphasized over being moral.
Frederick Winslow Taylor had plenty of acolytes eager to take a scientific approach and spread standardization and segmentation to other spheres of human endeavor. The Progressive Era reformers who flourished between 1890 and 1920 were his social and political counterparts, and they sought to put his ideas into practice. With a base in the growing urban middle class, activists like Theodore Roosevelt, Woodrow Wilson, and Margaret Sanger believed that through the application of scientific expertise and efficiency they could find solutions to society’s ills. In the health field, Kellogg organized his popular wellness sanitarium in Battle Creek, Michigan, on a “rational and scientific basis under regular scientific management.” Frank and Lillian Gilbreth, whose lives were later portrayed in the book and film Cheaper by the Dozen, were probably the best known of the scientific management gurus. Also in the domestic arena was Ellen Richards, a chemist and sanitary engineer who founded the discipline of home economics. She was similarly inspired by scientific management and its promise to eliminate “wasted motions.” Science, Richards declared, compelled women to ask themselves, “Am I making the best use of my time?” Richards was counseled by her friend Melville Dewey, the meticulous founder of the library cataloging system and the president of the New York Efficiency Society. He economized the spelling of his own name to Melvil Dui.
After Principles of Scientific Management was published in 1911, Taylorization seemed to be everywhere. The following year, Christine Frederick, a magazine editor, consultant, and home economist, initiated a series of articles in Ladies’ Home Journal, each preceded by Taylor’s pig-iron story, instructing women on how to apply scientific principles to the drudgery of housekeeping. Wealthy women had maids; the poor, who often worked as servants, had less complex demands and appearances to keep up, Frederick wrote. “The problem, the real issue, confronts the middle-class woman of slight strength and still slighter means, and of whom society expects so much—the wives of ministers on small salary, wives of bank clerks, shoe salesmen, college professors, and young men in various businesses starting to make their way,” she explained. “They are refined, educated women, many with a college or business training. They have one or more babies to care for, and limited finances to meet the situation.” Frederick advised them to break each chore down into individual steps and then time each to find the most efficient way to peel a potato or iron a shirt. A few years later, she created a correspondence course titled “Household Engineering: Scientific Management in the Home.” In 1913, Life magazine published a cartoon in which an efficiency expert breaks up a workplace embrace, admonishing the young man for employing “fifteen unnecessary motions in delivering that kiss.”
There were opponents, fierce ones, in fact. To them, Taylor was a scourge, a soulless champion of the Machine Age, who valued efficiency above humanity. After Principles appeared, Samuel Gompers, president of the American Federation of Labor, wrote that Taylor’s system makes “every man merely a cog or a nut or a pin in a big machine, fixed in the position of a hundredth or a thousandth part of the machine.”
Taylor saw no conflict between efficiency and humanity, because he profoundly believed that his standardized system would rescue the workingman from the whims of exploitive managers and raise wages. With impartial science as the guide and arbiter, acrimony between labor and management would disappear. Progressives like Louis Brandeis, the future U.S. Supreme Court justice, championed scientific management because it promised “industrial utopia.” For Taylor, there was no contradiction between the definition of “standard” as a model of excellence and the definition of “standard” as a commonplace, identical and unvarying. Conformity was a virtue. If you found the best, why change it? That’s the way Babbitt, the middle-aged hero of Sinclair Lewis’s 1922 novel, sees it. “Here’s the specifications of the Standardized American Citizen!” Babbitt proudly declares. “Here’s the new generation of Americans: fellows with hair on their chests and smiles in their eyes and adding-machines in their offices.”
Taylorites believed that a right way, “the one best way,” and a wrong way existed in the personal sphere as well. Forget Calvinism’s predestination. Following the prescribed norms would enable each person to reach his optimum self. Pauline Manford, the wealthy modern matron in Edith Wharton’s 1927 novel Twilight Sleep, thinks science can help society’s elite turn out babies “in series like Fords.” She invokes scientific management in the fight against aging. “Nervousness, fatigue, brain-exhaustion . . . had her battle against them been vain? What was the use of all the months and years of patient Taylorized effort against the natural human fate: against anxiety, sorrow, old age—if their menace was to reappear whenever events slipped from her control?”
Poor Pauline. On the factory floor, Taylor—or the owner or manager—was the one to set the pace. Workers could not be relied on to determine the most efficient method on their own. But outside the industrial arena, men and women were expected to vigilantly monitor their own activities and implement improvements. In the home, Taylorization was adopted by the self-help movement as a method of exerting individual autonomy and achieving happiness in an era of mass production. Christine Frederick explained the goals in Ladies’ Home Journal: “The end aim of home efficiency is not a perfect system of work, or scientific scheduling, or ideal cleanliness and order; it is the personal happiness, health, and progress of the family in the home.”
The unclouded faith in science and reason to improve human existence may look naïve from our perch in the twenty-first century, where we have a view of environmental devastation and efficient killing technologies. But there was an almost childlike awe at what these amazing new tools could achieve. John D. Rockefeller established the Laura Spelman Rockefeller Memorial Foundation in the early 1920s with the express purpose of promoting “scientific” solutions to social problems. The belief was that war, poverty, and class conflict were, as the foundation put it in its final report, “irrationalities” that could eventually be remedied by reason and science. Scientific management was a way to tame “the natural human fate,” as Wharton’s Pauline would say.
The creation of rational standards was reassuring at a time of breakneck and puzzling transformation. Livelihoods, homes, and habits, previously passed from one generation to the next like family Bibles, disappeared. Men who grew up on farms lost sight of the world they knew after moving to cities for factory work. Women who entered the workplace, either out of necessity or desire, raised fears about the unraveling of family life and traditional values. Waves of exotic immigrants, ten million between 1860 and 1890, with strange habits and tongues crowded into the urban stew. If an inhabitant of 1890 were suddenly transported to his own home thirty years later, he would be dazed by its unfamiliar marvels: hot and cold running water, an indoor toilet, a toaster, a washing machine, a telephone, fresh fruit and vegetables in winter, cosmetics, a closet of differently styled clothes, a radio, and maybe even a driveway with a car in it. If he entered a grocery store or pharmacy in 1920, hundreds of new products, many for problems or maladies that he had never heard of—from halitosis (the obscure medical term for bad breath that Listerine made famous) to homotosis (the lack of attractive home furnishings)—would greet him like a roomful of strangers. On the same street he would find businesses that had not existed before—commercial laundries, beauty parlors, movie houses.
In this daunting environment, life stages offered a solid and logical framework for a society uncertain of what was coming from one day to the next. For someone who felt lost in the mammoth urban industrial machine, the advice on how to act and dress, when to marry or expect a promotion provided a detailed map. Here’s what is expected, what is standard or normal, for someone like you.
Middle age satisfied a profound need; it offered at least a partial identity to people in the midst of a historical identity crisis. Physicians and psychologists like G. Stanley Hall were convinced that a series of distinct phases was biologically determined, as natural as teething. But as Voltaire said of God, if middle age didn’t exist, we would have had to invent it.
4
The Renaissance of the Middle-Aged
[Photograph: The actress Lillie Langtry in middle age, 1900]
The woman of today “is just beginning to live at 30 . . . and she absolutely refuses to consider herself old until she is 70.”
—San Francisco Call (1910)
The single-minded focus on efficiency reveals how much thinking about the purpose and value of a human life had shifted by the end of the nineteenth century. Early Puritans were not unmindful of the need for economic production. In a 1701 sermon, the great American preacher Cotton Mather told his congregation that all Christians had two callings, “to serve the Lord and to pursue useful employment.” But the ultimate purpose of both was salvation. “Their main end must be, to acknowledge the great God, and His glorious Christ.” The accumulation of years brought one closer and closer to realizing that great project. On this spiritual journey, “every age has its joys and sorrows.”
The spread of Taylorization laid the groundwork for viewing midlife as the gateway to decline. It converted age into a proxy for efficiency. The notion of “saving time,” as if minutes were pennies in a bank, infiltrated everyday consciousness. An industrial worker was valued solely in terms of his output: how many loads of coal he could shovel in a minute, how many bricks he could lay in an hour. Neither seniority nor experience, judgment nor instinct mattered. Men who had seen their fathers till the land for as long as they could stand upright were themselves superfluous at 45 because they were not as quick or adaptable on the factory’s rumbling floors. Middle-aged and older workers had more trouble keeping up with the physical demands dictated by Taylorism, even when keeping pace meant earning more money. If they couldn’t produce, they were fired. If they gummed up the works, they were heavily fined. Scientific management stripped age of its other attributes and reduced it to a set of physical capabilities. Before the 1920s, among white-collar workers and women, midlife remained a period when talents and influence were recognized as reaching full flower. But among those who toiled in the guts of America’s industrial machine, where profit margins were calculated to the last cent, being middle-aged became a distinct disadvantage. In the factory, the biological clock was forced to synchronize with the time clock.
Taylor’s scientific management favored the young, but ironically, so did the backlash his clockwork efficiency provoked. Although industrialization brought many Americans great wealth, an assortment of radicals and traditionalists rejected the notion of technological salvation and feared industrialization was creating an insensate, alienating society, one that valued productivity above humanity. Hope came in the form of the next generation, unsullied and uncorrupted. G. Stanley Hall claimed youth symbolized purity and regenerative power; it was an “oracle” that “will never fail.” In 1911, the same year Taylor published Principles, the 25-year-old Randolph Bourne, a musician, intellectual, and vehement antiwar activist, started writing the essays that would constitute Youth and Life, an early countercultural manifesto. Bourne, his face scarred from a botched delivery and his back hunched by childhood spinal tuberculosis, became a spokesman for his generation, whom he urged to rebel against Victorian convention. Bourne maintained that one’s ideals were set by the age of 20. “It is a tarnished, travestied youth that is in the saddle in the person of middle age,” he wrote. “Middle age has the prestige and the power,” he added, “but too seldom the will to use it for the furtherance of its ideals.” For different reasons, both supporters and critics of the newly mechanized age honored youth and devalued middle age.