References proliferated. In 1893, Henry James, just 50, titled his short story about a writer “The Middle Years.” In 1898, Thomas Hardy included “Middle-Age Enthusiasms” in his collection Wessex Poems. Romantics who found James and Hardy a bit taxing after dinner could turn to tales of amour in Middle-Aged Love Stories (1903), a forerunner of Harlequin romances. That same year, The Cosmopolitan Magazine published an essay titled “The Woman of Fifty,” which noted: “The woman who today is celebrated for distinctive charm and beauty, ripe views, disciplined intellect, cultivated and manifold gifts would, two score years ago, have been relegated to the heavy ranks of the dowagers and grandmothers—forced by the stern conventions of prevailing opinion to confront the bitter knowledge that, just as she had gained a mastery of the rules, she was expected to retire from the game.”
Clearly, something happened between Bryant’s funeral oration for Cole in 1848 and the regular appearance of columns devoted to middle age three decades later. A wisp with a barely discernible outline, middle age condensed into a sturdy stage of life that took up space in people’s thoughts, discussions, and writings. Social change, of course, occurs gradually. Ideas spread in fits and starts, and cultural shifts occur in years, not months. Various currents of thought eventually converged in the second half of the nineteenth century and altered the stories people told about the ebb and flow of their lives. Between Cole’s death and his own, Bryant had seen the development of the germ theory of disease, anesthetics, and vaccines; the harnessing of electricity; the invention of the internal combustion engine, the telephone, the phonograph, practical refrigerators and typewriters, elevators, a faster process for making steel, and a dozen ingenious tools of measurement, including the radiometer (to gauge radiation), galvanometer (electricity), dynamometer (force), and the interferometer (distance); the spread of railroads, gas lighting, and pocket watches; the laying of the first transatlantic cable; the publication of Darwin’s On the Origin of Species, the introduction of new novelistic techniques to mark the passage of time, the Census Bureau’s categorization of residents in ten-year age groups, and the rapid falloff of the apprenticeship system. Bryant had witnessed the creation of a new industrialized, bureaucratic, and technological epoch.
Enter the Timekeeper
Frederick Winslow Taylor, known as “the father of scientific management,” was a child of this new science-centered world. His obsessions and interests mirrored those of the rapidly changing era. In 1878, the year Bryant died, Taylor got a job at Midvale Steel Works, where he later commenced his pioneering time-and-motion studies. He believed that, through careful observation and experiment, he could discover the optimal way to perform every task in the workplace to maximize productivity. He broke down each job into tiny pieces, meticulously analyzing the sequence, tools, and motions in order to put them all back together in the most efficient form. From the mundane to the complex, every job was explained in a set of minutely detailed instruction cards that workers were supposed to follow assiduously. Describing his later experience at Bethlehem Steel in 1898 with the hapless workman Schmidt, Taylor wrote in Principles of Scientific Management about calculating the best way to load a four-by-four bar of pig iron weighing ninety-two pounds: “Schmidt started to work, and all day long, and at regular intervals, was told by the man who stood over him with a watch, ‘Now pick up a pig and walk. Now sit down and rest. Now walk—now rest,’ etc. He worked when he was told to work, and rested when he was told to rest, and at half-past five in the afternoon had his 47½ tons loaded on the car.” Every motion and moment mattered.
Whether or not Taylor was a fraud who cooked his data and exaggerated his success, as some historians later claimed, there is no disputing his impact. Taylor’s ideas about efficiency and standardization eventually served as a strut for mass production and a mass market as homebuilders, homemakers, libraries, schools, and hospitals incorporated his time-and-motion studies and scientific management theories into their designs and practices. Principles, published in 1911, influenced Henry Ford’s assembly lines in Detroit. In 1913, Ford’s workers took twelve and a half hours to assemble a car. By 1914, it took just over an hour and a half. The abundance and affluence that industrialized countries enjoyed in the last century were partly founded on the tremendous gains in productivity that Taylor’s system extracted, enabling goods to be made faster and more cheaply. One management scholar judged that “the United States owes a large, if incalculable proportion of their immense productivity and high standard of living” to Taylor.
For a society that used roosters and sunsets to track the hours and kept a sloppy record of birth dates, Speedy Taylor, as he was known in the shop, preached that every second counted and demanded that every second be counted. His views resonated in an era when time and space seemed to shift and collapse as technological advances like the railroad, telegraph, and electric light altered the way people thought about and experienced the passing of the hours. Clocks began adorning walls in the 1830s; pocket watches became widely available in the 1860s. In the 1870s, phrases like “on time,” “behind time,” and “ahead of time” entered the English language. The first international conference to synchronize timekeeping took place in 1884, when Greenwich Mean Time was established as the standard and the globe was divided into time zones.
Between 1890 and 1920, Taylor’s approach revolutionized the way Americans thought as much as the way they worked. He roped modern man to the clock and helped bring an exacting awareness of time into every nook of the culture. Laying out his ideas with the diligence he advocated in the factory, he wrote that “the same principles . . . can be applied with equal force to all social activities; to the management of our homes; the management of our farms; the management of the business of our tradesmen, large and small; of our churches, our philanthropic institutions, our universities, and our governmental departments.”
Scientific management was to be the overseer of the entire range of human existence, including the very process of aging. Taylor’s attention to time, his valuation of efficiency, his belief in science, and his insistence on classifying and standardizing created the conditions that led to the invention of middle age.
Just as Taylor instructed managers to “break down each task into its component parts,” psychologists, educators, and doctors dissected a single life into separate phases: childhood, adolescence, middle age, old age.
Just as Taylor created a standardized set of step-by-step instructions for every job, the nascent class of experts established norms of behavior, dress, sexual practices, and attitudes that were deemed appropriate for each stage of life.
Just as Taylor valued a worker solely in terms of timed productivity, so was a human life, once conceived in spiritual and moral terms, reduced to its economic essentials. This method of accounting found age to be a handy yardstick for measuring a man’s potential worth on the factory floor and in the office; it ultimately led to the view that the young were much more valuable to society than the middle-aged and elderly.
And behind it all was the steady tick of the clock, piped into workplaces, homes, and schools. That tick, which Taylor tried to harness with his stopwatch, helped awaken society to ever-finer gradations of age.
A Generational Identity Emerges
Taylor’s childhood obsessions well positioned him to become the modern world’s timekeeper. He was born in 1856 to a wealthy Quaker family, whose ancestors came to America before the Revolution and settled in Germantown, near Philadelphia. His parents were moderate abolitionists and suffrage supporters. He grew up during the Civil War, when local factories were producing bayonets and army uniforms, and local hospitals cared for wounded Union soldiers.
When Frederick turned twelve, the family traveled to Europe for the grand tour and stayed for three years. During their sojourn, he meticulously copied the departure and arrival times of the horse-drawn carriages his family took in Norway. While hiking, he experimented to see which stride covered the most ground with the least effort. When the Taylors returned to Germantown, Fred made his friends wait as he spent an entire sunny morning measuring a playing field down to the last inch before agreeing to let the game commence.
In 1872, Taylor was sent off to Phillips Exeter Academy in New Hampshire. There he witnessed “the first piece of time study that I ever saw made by anyone.” He was studying mathematics with the legendary George Wentworth, a great bull of a teacher who had a tidy side income from thirty-four mathematics textbooks he had written. Wentworth sat behind a large wooden desk, his watch kept out of sight on a ledge while his fifty or so students figured out a handful of math problems. Each boy was supposed to raise his hand and snap his fingers when he had worked out the solution. When about half the class had finished, Wentworth, in a slipshod suit and sporting a long beard, called out: “That’s enough.” After some months, it dawned on Taylor that Wentworth was timing how long it took the boys to work through the equations and geometric drawings. He used the results to calculate how much homework to assign.
If Taylor had gone on to Harvard and become a lawyer as he and his family had planned, he might not have been watching the clock eight years later. Complaints of bad eyesight and pistonlike headaches kept him from moving south to Cambridge, however, and at his parents’ urging Taylor decided to become an engineer, an occupation that required less studying and eyestrain. He traded in his Exeter tie and jacket for overalls and a lunch pail and became an apprentice patternmaker in a large, dusty, and growling Philadelphia foundry. His next job was at Midvale Steel, where Taylor was soon made foreman and in 1881, like Wentworth, timed his boys. He was not simply counting how long it took a machinist or a patternmaker to complete a task but also calculating down to the hundredth of a minute every movement along the way.
Robert Kanigel, Taylor’s biographer, compares Taylor’s impact to that of Charles Darwin, Karl Marx, and Sigmund Freud, noting that “each brought a deeply analytical, ‘scientific’ cast of mind to an unruly, seemingly intractable problem.”
Most of the nineteenth century’s emerging class of experts and managers did not have such grand ambitions, but in pursuits both trivial and consequential, psychologists, physicians, biologists, managers, and civil servants shared a deep faith in scientific methods and a belief that human society could be organized rationally and controlled. Everything in nature and thought, from a single human life to the atom, was subject to subdivision and classification. Even modernist artists were influenced by science and sought to break experience down into its most elemental parts.
As people migrated to urban centers, the unself-conscious mixing of generations that naturally occurred in rural homes, farms, schools, social halls, and churches was replaced by age-related groupings. Growing government bureaucracies like the army used age to help identify, organize, and track the population. Age became the basis for education, statistical compilations, and military enlistment. The practice of channeling schoolchildren into separate grades was introduced in the 1850s and 1860s. Starting in the 1880s, private middle-class organizations like the YMCA, and later the Boy Scouts, Camp Fire Girls, and 4-H clubs (all founded in the early twentieth century), as well as civic associations, grouped members according to birth year. In 1900, the U.S. Census, which previously had grouped inhabitants in ten-year increments, added a question about one’s date of birth for the first time.
In cities, an array of leisurely pursuits developed as factory work replaced dawn-to-dusk farming and electricity lit up the night. Amusement parks, dance halls, social clubs, and fraternities and sororities were places where members of a single generation mingled. Women’s clubs became “schools for the middle-aged woman,” wrote Margaret Sanger, the birth control activist, where a 50-year-old could find “friends who like her are in the middle way of life.”
This separation simultaneously introduced a generational identity and reinforced it. The spread of public junior and senior high schools (which picked up speed after 1910) segregated teenagers and provided them with a unique shared experience. High school extended their education beyond the level most of their parents had achieved and delayed their entry into the adult world of work. The more people identified with one particular stage of life, the deeper the divisions between the stages became.
The health-care professions started to formally recognize and classify these groupings, so that by the century’s end, stages, as Howard Chudacoff concludes, “were being defined with near-clinical precision, and more definite norms were being assigned to each stage.” The ages that Shakespeare and Dante had written about were no longer invoked merely as literary metaphors but declared to be verifiable scientific fact applicable to everyone. Physicians and psychologists drew up schedules of biological, social, and mental development that turned the first few years of a child’s life into a set of monthly checkpoints: the expected age for the first step, the first word, the first bite of solid food; for toilet training, for school; and eventually, for knowing what happened to Bambi’s mother. These measured and sequenced phases were the medical counterpart of Taylor’s scientific management theories, with each moment in time corresponding to an appropriate behavior or task.
Doctors determined that the unique attributes and illnesses of childhood required specialized expertise, and in the 1880s created the field of pediatrics. In 1900, the Swedish writer and feminist Ellen Key published her influential book on education and parenting, The Century of the Child, noting that children had a nature singular and distinct from that of adults. Arguing that the aged should similarly be in a separate category, a 1904 article in the American Journal of Nursing stated: “We must adapt our practice to the age of the individual. You must not treat a young child as you would a grown person, nor must you treat an old person as you would one in the prime of life.”
Five years later, the physician Ignatz Leo Nascher identified senescence, or old age, as “a distinct period of life, a physiological entity as much so as the period of childhood,” and coined the term “geriatrics” for this “new special branch of medicine.”
Adolescence entered popular consciousness around the same period. The legendary psychologist G. Stanley Hall officially introduced the idea in 1904 in his massive tome Adolescence: Its Psychology and Its Relations to Physiology, Anthropology, Sociology, Sex, Crime, Religion, and Education. Hall defined the stage as running from ages 12 to 24—peaking between 14 and 16—and marked by a volatile mix of naïve optimism, burning sexuality, intense emotionalism, instability, self-absorption, and rashness. The adolescent’s literary precursor can be found in Goethe’s Young Werther, who exhibited the regenerative powers and romantic temperament that so captivated Hall, Key, and others.
Hall was a towering figure in his field. He founded the American Journal of Psychology in 1887, became the first president of the American Psychological Association, presided over Clark University for more than thirty years, and hosted Freud’s 1909 visit to the States. Deeply influenced by Darwin, he developed what he called “genetic psychology” and applied his ideas about the evolution of species to a single individual. Each person repeats or recapitulates the same developmental stages that the human species experienced as a whole, he argued. Babies corresponded to the pre-savage state, while adolescents manifested the characteristics of ancient and medieval societies, in which there is a “peculiar proneness to be either very good or very bad.”
Middle age lacked a grand chronicler and advocate like Hall. But as childhood, adolescence, and old age were more precisely defined and corralled, middle age stood out in sharper relief. The idea of a separate midlife period swept through the culture, pulled along by age-graded institutions, industrialization, and urbanization, as well as drops in birth and mortality rates.
A look at early life expectancy charts can give the faulty impression that middle age was simply the result of longer life spans. After all, if death comes at 40 (the average life expectancy in 1800), there isn’t much of a middle to enjoy. But that is just a tiny part of the story. Average life expectancy hadn’t increased that much; it was only 47 in 1900. In any case, neither the 1800 nor the 1900 figure reflects how long people actually lived. The all-too-frequent death of babies and children is what kept average life expectancy statistics so low. Even in colonial times, most people who made it past age 15 had a good chance of living to 60. Knowing that the Bible allotted a life span of “three score and ten years,” no one mistook 45 or 50 for old age.
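A simple illustrative calculation, using hypothetical round numbers rather than actual census data, shows how heavy childhood mortality drags the average down. Suppose a quarter of a birth cohort dies around age 2 and the survivors live to an average of 62. The cohort’s mean life span is then

\[
0.25 \times 2 + 0.75 \times 62 = 0.5 + 46.5 = 47 \text{ years,}
\]

matching the 1900 figure even though most adults in this sketch comfortably reach their sixties. The weights are assumptions chosen for arithmetic clarity, not a reconstruction of the historical distribution.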
More crucial to midlife’s invention was the fact that women had fewer children. Advances like antiseptics and the discovery of the microorganisms that caused cholera and diphtheria led to a plunge in childhood mortality and stanched the flood of women seriously disabled or killed in childbirth. Better diet and sanitation helped infants and children survive past adolescence, which contributed to the drop in birth rates: parents no longer needed extra births to offset expected losses. With fewer babies and more time and money, parents invested more in each child. Americans’ growing sense of autonomy also encouraged women to consider children a personal choice rather than solely a matter of God’s will or male authority, a view supported by the increasingly visible presence of feminists and birth control advocates, who counseled families on the logistics of vaginal jellies, douches, and withdrawal. “Always carry to bed a clean napkin,” Dr. James Ashton advised affianced men. Vulcanized rubber, the invention of Charles Goodyear (whose name a tire dynasty would later borrow), added to the array of methods by making reliable rubber condoms cheap to produce; they became popular in the 1870s.