At Home

by Bill Bryson


  The one place where there really was danger from tight corsets was in the development of babies. Many women wore corsets perilously deep into pregnancy, even pulling them tighter to hide for as long as possible the indelicate evidence that they had been party to an unseemly burst of voluptuous spasms.

  Victorian rigidities were such that ladies were not even allowed to blow out candles in mixed company, as that required them to pucker their lips suggestively. They could not say that they were going “to bed”—that planted too stimulating an image—but merely that they were “retiring.” It became effectively impossible to discuss clothing in even a clinical sense without resort to euphemisms. Trousers became “nether integuments” or simply “inexpressibles” and underwear was “linen.” Women could refer among themselves to petticoats or, in hushed tones, stockings, but could mention almost nothing else that brushed bare flesh.

  Behind the scenes, however, things were a little spicier than we are sometimes led to suppose. Chemical dyes—some of them quite rich and colorful—became available in midcentury and one of the first places they appeared was on underclothes, a matter that scandalized many since it raised the obvious question of for whose delight all that color was intended. The embroidery of underwear became similarly popular and identically scandalous. In the very year that it was praising an English girls’ school for keeping the young ladies murderously strapped into corsets for a week at a time, The Englishwoman’s Domestic Magazine was also railing that “the amount of embroidery put upon underclothing nowadays is sinful; a young lady spent a month in hemstitching and embroidering a garment which it was scarcely possible that any other human being, except her laundress, would ever see.”

  One thing Victorian women didn’t have was brassieres. Corsets pushed up from below, which held breasts in place, but for true comfort (I am told) breasts are better held up by slings. The first person to see this was a lingerie manufacturer named Luman Chapman, of Camden, New Jersey, who secured a patent in 1863 for “breast puffs”—a kind of early halter top. Between 1863 and 1969, exactly 1,230 patents on bras were taken out in the United States. The word brassière, from the French for “upper arm,” was first used in 1904 by the Charles R. DeBevoise Company.

  One small but tenacious myth may be demolished here. It has sometimes been written that the bra was the invention of one Otto Titzling. In fact, if such a person ever existed, he played no part in the invention of foundation garments. And on that slightly disappointing note, we may move on to the nursery.

  * Overcome with grief, her husband buried her with a sheaf of poems that he had failed to copy; seven years later he thought better of the gesture, had the grave dug up, and retrieved the poems, which were published the following year.

  • CHAPTER XVIII •

  THE NURSERY

  I

  In the early 1960s, in a hugely influential book called Centuries of Childhood, a French author named Philippe Ariès made a startling claim. He declared that before the sixteenth century, at the very earliest, there was no such thing as childhood. There were small human beings, of course, but nothing in their lives made them meaningfully distinguishable from adults. “The idea of childhood did not exist,” he pronounced with a certain finality. It was essentially a Victorian invention.

  Ariès was not a specialist in the field, and his ideas were based almost entirely on indirect evidence, much of it now held to be a little doubtful, but his views struck a chord and were widely taken up. Soon other historians were declaring that children before the modern period were not just ignored but actually weren’t much liked. “In traditional society, mothers viewed the development and happiness of infants younger than two with indifference,” declared Edward Shorter in The Making of the Modern Family (1976). The reason for this was high infant mortality. “You couldn’t permit yourself to become attached to an infant that you knew death might whisk away,” he explained. These views were almost exactly echoed by Barbara Tuchman in the best-selling A Distant Mirror two years later. “Of all the characteristics in which the medieval age differs from the modern,” she wrote, “none is so striking as the comparative absence of interest in children.” Investing love in young children was so risky—“so unrewarding” was her curious phrase—that everywhere it was suppressed as a pointless waste of energy. Emotion didn’t come into it at all. Children were merely “a product,” in her chilling view. “A child was born and died and another took its place.” Or as Ariès himself explained, “The general feeling was, and for a long time remained, that one had several children in order to keep just a few.” These views became so standard among historians of childhood that twenty years would pass before anyone questioned whether they might represent a serious misreading of human nature, not to mention the known facts of history.

  There is no doubt that children once died in great numbers and that parents had to adjust their expectations accordingly. The world before the modern era was overwhelmingly a place of tiny coffins. The figures usually cited are that one-third of children died in their first year of life and half failed to reach their fifth birthdays. Even in the best homes death was a regular visitor. Stephen Inwood notes in A History of London that the future historian Edward Gibbon, growing up rich in healthy Putney, lost all six of his siblings in early childhood. But that isn’t to say that parents were any less devastated by a loss than we would be today. The diarist John Evelyn and his wife had eight children and lost six of them in childhood, and were clearly heartbroken each time. “Here ends the joy of my life,” Evelyn wrote simply after his oldest child died three days after his fifth birthday in 1658. The writer William Brownlow lost a child each year for four years, a chain of misfortune that “hast broken me asunder and shaken me to pieces,” he wrote, but in fact, he and his wife had still more to endure: the tragic pattern of annual deaths continued for three years more until they had no children left to yield.

  No one expressed parental loss better (as no one expressed most things better) than William Shakespeare. These lines are from King John, written soon after Shakespeare’s son Hamnet died at the age of eleven in 1596:

  Grief fills the room up of my empty child,
  Lies in his bed, walks up and down with me,
  Puts on his pretty looks, repeats his words,
  Remembers me of all his gracious parts,
  Stuffs out his vacant garments with his form.

  These are not the words of someone for whom children are a product, and there is no reason to suppose—no evidence anywhere, including that of common sense—that parents were ever, at any point in the past, commonly indifferent to the happiness and well-being of their children. One clue lies in the name of the room in which we are now.* Nursery is first recorded in English in 1330 and has been in continuous use ever since. A room exclusively dedicated to the needs and comforts of children would hardly seem consistent with the belief that children were of no consequence within the household. No less significant is the word childhood itself. It has existed in English for over a thousand years (the first recorded use is in the Lindisfarne Gospels circa AD 950), so whatever it may have meant emotionally to people, as a state of being, a condition of separate existence, it is indubitably ancient. To suggest that children were objects of indifference or barely existed as separate beings would appear to be a simplification at best.

  That isn’t to say that childhood in the past was the long, carefree gambol we like to think it now. It was anything but. Life was full of perils from the moment of conception. For mother and child both, the most dangerous milestone was birth itself. When things went wrong, there was little any midwife or physician could do. Doctors, when called in at all, frequently resorted to treatments that only increased the distress and danger: draining the exhausted mother of blood (on the grounds that it would relax her, with loss of consciousness taken as proof of success), applying blistering poultices, or otherwise straining her dwindling reserves of energy and hope.

  Not infrequently babies became stuck. In such an eventuality, labor could go on for three weeks or more, until baby or mother or both were spent beyond recovery. If a baby died within the womb, the procedures for getting it out are really too horrible to describe. Suffice it to say that they involved hooks and bringing the baby out in pieces. Such procedures brought not only unspeakable suffering to the mother but also much risk of damage to her uterus and even graver risk of infection. Considering the conditions, it is amazing to report that only between one and two mothers in a hundred died in childbirth. However, because most women bore children repeatedly (seven to nine times on average), the odds of death at some point in a woman’s childbearing experience rose dramatically, to about one in eight.
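
  The arithmetic of that last figure is worth a moment: a per-birth risk of between one and two in a hundred really does compound to roughly one in eight across seven to nine deliveries. Here is a minimal back-of-envelope sketch in Python, assuming a per-birth rate of 1.65 percent and eight births (midpoints of the ranges above, chosen purely for illustration):

    # Compounding of per-birth mortality over a childbearing lifetime.
    # Both inputs are assumptions drawn from the ranges in the text,
    # not measured historical values.
    per_birth_risk = 0.0165            # between one and two deaths per hundred births
    births = 8                         # midpoint of the seven-to-nine average cited
    lifetime_risk = 1 - (1 - per_birth_risk) ** births
    print(f"lifetime risk: {lifetime_risk:.3f}")   # ~0.125, i.e. about one in eight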

  A woman giving birth in the eighteenth century (note the way modesty is preserved by the sheet pulled around the doctor’s neck)

  For children, birth was just the beginning. The first years of life weren’t so much a time of adventure as of misadventure, it seems. In addition to the endless waves of illness and epidemic that punctuated every existence, accidental death was far more common—breathtakingly so, in fact. Coroners’ rolls for London in the thirteenth and fourteenth centuries include such abrupt childhood terminations as “drowned in a pit,” “bitten by sow,” “fell into pan of hot water,” “hit by cart-wheel,” “fell into tin of hot mash,” and “trampled in crowd.” The historian Emily Cockayne relates the sad case of a little boy who lay down in the road and covered himself with straw to amuse his friends. A passing cart squashed him.

  Ariès and his adherents took such deaths as proof of parental carelessness and lack of interest in children’s well-being, but this is to impose modern standards on historic behavior. A more generous reading would bear in mind that every waking moment of a medieval mother’s life was full of distractions. She might have been nursing a sick or dying child, fighting off a fever herself, struggling to start a fire (or put one out), or doing any of a thousand other things. If children aren’t bitten by sows today, it is not because they are better supervised. It is because we don’t keep sows in the kitchen.

  A good many modern conclusions are based on mortality rates from the past that are not actually all that certain. The first person to look carefully into the matter was, a little unexpectedly, the astronomer Edmond Halley, who is of course principally remembered now for the comet named for him. A tireless investigator into scientific phenomena of all kinds, Halley produced papers on everything from magnetism to the soporific effects of opium. In 1693, he came across figures for annual births and deaths in Breslau, Silesia (now Wroclaw, Poland), which fascinated him because they were so unusually complete. He realized that from them he could construct charts from which it was possible to work out the life expectancy of any person at any point in his existence. He could say that for someone aged twenty-five the chances of dying in the next year were 80 to 1 against, that someone who reached thirty could reasonably expect to live another twenty-seven years, that the chances of a man of forty living another seven years were 5 ½ to 1 in favor, and so on. These were the first actuarial tables, and, apart from anything else, they made the life insurance industry possible.
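
  The mechanics behind such charts are simple enough to sketch. The chance of dying within the year is the fraction of those alive at a given age who fail to reach the next one, and the quoted odds follow directly from it. Here is a minimal illustration in Python, with survivor counts chosen to reproduce the 80-to-1 example above (illustrative figures, not a transcription of the full Breslau table):

    # Life-table arithmetic of the kind behind Halley's charts.
    # survivors[x] = members of an original cohort still alive at age x.
    # The two counts below are illustrative, picked to match the
    # 80-to-1 odds quoted in the text.
    survivors = {25: 567, 26: 560}

    deaths = survivors[25] - survivors[26]      # deaths during the year
    q = deaths / survivors[25]                  # chance of dying before age 26
    odds_against = (1 - q) / q                  # 560/7 = 80, i.e. 80 to 1 against
    print(f"odds against dying this year: {odds_against:.0f} to 1")

    # Remaining life expectancy follows the same logic: sum the ratios
    # survivors[x + t] / survivors[x] over all future ages t.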

  Halley’s findings were reported in the Philosophical Transactions of the Royal Society, a scientific journal, and for that reason seem to have escaped the full attention of social historians, which is unfortunate because there is much of interest in them. Halley’s figures showed, for instance, that Breslau contained seven thousand women of child-bearing age yet only twelve hundred gave birth each year—“little more than a sixth part,” as he noted. Clearly the great majority of women at any time were taking careful steps to avoid pregnancy. So childbirth, in Breslau anyway, wasn’t some inescapable burden to which women had to submit, but a largely voluntary act.

  Halley’s figures also showed that infant mortality was not quite as bad as the figures now generally cited would encourage us to suppose. In Breslau, slightly over a quarter of babies died in their first year, and 44 percent were dead by their seventh birthday. These are bad numbers, to be sure, but appreciably better than the comparable figures of one-third and one-half usually cited. Not until seventeen years had passed did the proportion of deaths among the young of Breslau reach 50 percent. That was actually worse than Halley had expected, and he used his report to make the point that people should not expect to live long lives, but rather should steel themselves for the possibility of dying before their time. “How unjustly we repine at the shortness of our Lives,” he wrote, “and think our selves wronged if we attain not Old Age; where it appears hereby, that the one half of those that are born are dead in Seventeen years.… [So] instead of murmuring at what we call an untimely Death, we ought with Patience and unconcern to submit to that Dissolution which is the necessary Condition of our perishable Materials.” Clearly expectations concerning death were much more complicated than a simple appraisal of the numbers might lead us to conclude.

  A further complication of the figures—and a sound reason for women limiting their pregnancies—was that just at this time women across Europe were dying in droves from a mysterious new disease that doctors were powerless to defeat or understand. Called puerperal fever (from the Latin word for child), it was first recorded in Leipzig in 1652. For the next 250 years doctors would be helpless in the face of it. Puerperal fever was particularly dreaded because it came on suddenly, often several days after a successful hospital birth when the mother was completely well and nearly ready to go home. Within hours the victim would be severely fevered and delirious, and would remain in that state for about a week until she either recovered or expired. More often than not she expired. In the worst outbreaks, 90 percent of victims died. Until late in the nineteenth century most doctors attributed puerperal fever either to bad air or lax morals, when in fact it was their own grubby fingers transferring microbes from one tender uterus to another. As early as 1847, a doctor in Vienna, Ignaz Semmelweis, realized that if hospital staff washed their hands in mildly chlorinated water deaths of all types declined sharply, but hardly anyone paid any attention to him, and decades more would pass before antiseptic practices became general.

  For a lucky few women, there was at least some promise of greater safety with the arrival of obstetrical forceps, which allowed babies to be repositioned mechanically. Unfortunately their inventor, Peter Chamberlen, chose not to share his invention with the world, but kept it secret for the sake of his own practice, and his heirs maintained this lamentable tradition for a hundred years more until forceps were independently devised by others. In the meantime, untold thousands of women died in unnecessary agony. Forceps were not without risks of their own, it must be said. Unsterilized and clearly invasive, they could easily damage both baby and mother if not wielded with the utmost delicacy. For this reason, many medical men were reluctant to deploy them. In the most celebrated case, Princess Charlotte, heir presumptive to the British throne, died giving birth to her first child in 1817 because the presiding physician, Sir Richard Croft, would not allow his colleagues to use forceps to try to relieve her suffering. In consequence, after more than fifty hours of exhausting and unproductive contractions, both baby and mother died. Charlotte’s death changed the course of British history. Had she lived, there would have been no Queen Victoria and thus no Victorian period. The nation was shocked and unforgiving. Stunned and despondent at finding himself the most despised man in Britain, Croft retired to his chambers and put a bullet through his head.

  For most human beings, children and adults both, the dominant consideration in life until modern times was purely, unrelievedly economic. In poorer households—and that is what most homes were, of course—every person was, from the earliest possible moment, a unit of production. John Locke, in a paper for the Board of Trade in 1697, suggested that the children of the poor should be put to work from the age of three, and no one thought that unrealistic or unkind. The Little Boy Blue of the nursery rhyme—the one who failed to keep the sheep from the meadow and the cows from the corn—is unlikely to have been more than about four years old; older hands were needed for more robust work.

  In the worst circumstances, children were sometimes given the most backbreaking of jobs. Those as young as six, of both sexes, were put to work in mines, where their small frames allowed them access to tight spaces. Because of the heat and to save their clothes, they often worked naked. (Grown men also traditionally worked naked; women usually worked naked to the waist.) For much of the year, those who worked in mines never saw sunlight, which left many stunted and weak from vitamin D deficiencies. Even comparatively light labor was often dangerous. Children in the ceramics factories of the Potteries in the Midlands cleaned out pots containing residues of lead and arsenic, inducing a slow poisoning that condemned many to eventual paralysis, palsies, and seizures.

 
