We were a team. After that first mastitis episode, I gave up pumping. My ducts seemed to clog up too easily, and my son would never take a bottle again anyway. Put in the cold terms of economics, pumping allows women and babies to decouple supply and demand. At the same time, they can decouple themselves, which can be very convenient. If you pump and freeze (as opposed to pump and dump), there’s enough for a dry day or an extended absence from your baby, or a chance for your partner to bottle-feed in the middle of the night. Since such arrangements were not to be in our household, we had to be in perfect sync. There was an urgent, immediate need for my breasts to be in good working order. What my breasts cooked up, my baby ate on the spot. When the baby had a growth spurt and fed more frequently, my breasts magically responded by stepping up production. My husband, well rested every night, was just fine with this arrangement.
“Lactation is a father’s best friend,” he said, heading happily off to work. It’s yet another reason for men to like breasts. It’s yet another reason for new mothers to want to throttle their husbands.
My pediatrician wore red Converse high-tops and a graying ponytail. When I complained to him in the early months that my son woke up every two or three hours for a feed, he looked at the baby and said, “You little shit.” Then he explained that’s the way it’s supposed to work. Too many parents expect their babies to sleep through the night; it’s probably one of the factors driving the use of formula, which takes the baby longer to digest.
I always knew I would breast-feed, and most of my peers would consider it a near felony not to, even though few of us were breast-fed as babies. My mother nursed me for all of four weeks. I know this because I’ve inherited her journals. There’s one labeled “F Feeds.” It was the late 1960s, when only 20 to 25 percent of all American women tried nursing. This era represented the formula companies’ strongest headlock on mothers and pediatricians, a time when the science (and profits) of nutritive molecules trumped the fumbling art of breast-feeding. My mother, normally an intuitive spirit with a healthy disregard for paperwork of any kind, must have been influenced by the men in lab coats. Her journal reads a bit like a high school science log: Mar 20, 1:15pm: L breast—15 mins. R breast—12 mins. No wonder she gave it up.
By then, breast-feeding rates had been declining steadily since the postwar baby boom, falling by half from 1946 to 1956 as mothers readily turned to the science of the bottle. Today, some breast-feeding advocates—the lactivists or nutricionistas, as they’re sometimes called—make you out to be a freak of nature if you don’t breast-feed. Historically, though, this is inaccurate. As “natural” as breast-feeding is, there has always been a cadre of women who couldn’t or wouldn’t do it, for either physiological or cultural reasons. Humans are the only mammal for whom not lactating is occasionally an option (although elephants, foxes, and primates have been known to nurse each other’s young). Archaeologists have found four-thousand-year-old graves of infants buried beside ancient feeding vessels lined with residue of cow’s milk. (It’s no wonder the infants died; for most of our history, not being nursed by someone usually meant a death sentence.) Sometimes mothers couldn’t nurse because they had died in childbirth, or their milk dried up after a breast infection, or they were ill. Syphilis, which could be transferred to the baby from the mother during breast-feeding, deterred many in Europe after the Middle Ages. Even fashion posed a problem; the tight corsets of the Restoration were known to flatten or even invert women’s nipples. And once the industrial revolution started, many working-class women took jobs away from their homes and babies.
Into these voids marched professional men with their half-witted and highly politicized ideas. Pliny and Plutarch were opposed to the practice of hiring wet nurses, but Plato loved the idea, “while taking every precaution that no mother shall know her own child.” Good thing he stuck to philosophy. Many ancient pundits and doctors offered suggestions for finding the best nurse: she should be cheerful and not deranged, and she should have a strong neck and moderately sized breasts, according to Avicenna in the eleventh century. Babylonia’s Code of Hammurabi, circa 1750 BC, was specific about laws and punishments for errant wet nurses: “If a man has given his son to a wet-nurse to suckle, and that son has died in the hands of the nurse, and the nurse, without consent of the child’s father or mother, has nursed another child, they shall prosecute her; because she has nursed another child, without consent of the father or mother, her breasts shall be cut off.”
Engaging the services of a wet nurse wasn’t always the result of dire necessity. The practice was very fashionable in the upper classes of many societies across much of recorded history, probably because it was known to increase the fertility rate of the mother. Prolactin, one of the key hormones triggered by breast-feeding, suppresses ovulation. Nature was wise in this birth-spacing design. Even today, a child born in a developing country less than two years after an older sibling is almost twice as likely to die as a child born after a longer spread. Wet-nursing not only allowed mothers to dodge nature’s duties but also resulted in some serious social engineering. While the rich procreated annually, poor women—the wet nurses—often “dry-fed” gruel to their own babies, resulting in huge mortality rates; meanwhile, their own fertility was squelched by their line of work. (This contraception, though, came in handy as some wet nurses moonlighted as prostitutes.)
The natural functioning of breasts has been upended by culture for a long time. While some mothers ducked out of breast-feeding, others were transformed into virtual dairy cows. In Dickensian foundling hospitals established for abandoned babies, wet nurses fed dozens of infants, nursing as often as thirty-four times a day. Sometimes the babies were fed a supplement of rancid cow’s milk and flour. The results were grim; death rates in foundling hospitals in the eighteenth and nineteenth centuries ran as high as 90 percent. Even for the urban middle and working classes, who farmed their babies out to wet nurses in the country, mortality rates reached 50 percent. Jane Austen’s story was typical. Three months after she was born in 1775, her parents sent her off to the nurse’s house, as they had her siblings before her. “When they approached the age of reason and became socially acceptable, they were moved again, back to their original home,” wrote Austen’s biographer Claire Tomalin. For harried parents, it was a rather appealing if appalling arrangement: pop the children out, pack them off, and bring them back when they’re big enough to do some chores. Some sources claim this is where the term farmed out comes from.
Dry-feeding, despite its disastrous results, was also rather common because it was cheaper than hiring a wet nurse. Many mothers throughout the ages suffered either real or perceived bouts of insufficient milk, and so they would resort to feeding babies some sort of mash as a milk supplement or total replacement. (Breast-feeding is an ingeniously calibrated feedback loop, and it’s not always forgiving. Once you embark on the slippery slope of supplemental feeding, milk production can taper precipitously.) Many recipes for the ideal food were handed down through the generations, usually involving some sort of milk, water, grain, and sugar. Occasionally wine or spirits, cod liver oil, and opium were added. Before the days of refrigeration and pasteurization, such cocktails were at best a dicey proposition for infants, who have immature immune systems. Of the dry-fed kids who survived, many had scurvy, rickets, and iron or other mineral deficiencies.
It’s not surprising that by the end of the nineteenth century, medical professionals, who had increasingly elbowed their way into the realm of midwives, would also turn their attention to infant food. Motivated by high mortality rates due to malnutrition and intestinal disease (and seeking to cement their own job security), leaders in the field actively recommended that doctors replace “old women” and “uneducated nurses” when it came to overseeing the infant diet. In fact, the growth of the pediatrics field at the turn of the twentieth century was predicated on infant nutrition. In 1893, the medical lecturer John Keating called it “the bread and butter” of the profession. These doctors tinkered with their own formulations for milk substitutes and readily experimented with offerings in the growing marketplace. Mothers had to visit the doctors regularly to get access to the food, which was available only by prescription. Together, the formula companies and the doctors reinforced each other’s businesses.
Two major developments around this time nudged the creep away from breast-feeding. One was the rise of manufacturing, which allowed formula companies to produce large amounts of relatively consistent product. Henri Nestlé was a young chemist and merchant surrounded by dairy cows in Vevey, Switzerland. In 1867, he concocted his Farine Lactée Nestlé. He described his Milk Food as “good Swiss milk and bread, cooked after a new method of my invention, mixed in proportion, scientifically correct, so as to form a food which leaves nothing to be desired.” By the 1870s, the product had gone global. (Now, 140-plus years later, formula has propelled Nestlé to being the largest food company in the world.)
The other major event was the rise of germ theory. It changed modern medicine forever, and much else along with it. In a nutshell, this was the recognition that microbes cause disease. Before that, people thought sickness was caused by swamp vapors or spontaneous eruptions or a combination of bad luck, bad behavior, and God. In 1876, the German physician Robert Koch proved that the bacterium Bacillus anthracis caused anthrax in livestock; in 1882, he discovered the bacterium that causes tuberculosis. Over the next twenty years, microbiologists isolated the organisms that cause pneumonia, diphtheria, typhoid, cholera, strep and staph infections, tetanus, meningitis, and gonorrhea. In one sense, an environmental understanding of disease was being replaced by a germ-based one. It represented medical progress, but it would ultimately provide a false sense of separation from nature.
The new discoveries led to many vaccines and antibiotics, as well as to vastly improved sanitation measures, quarantines, and food safety practices, such as pasteurization. Many, many lives were saved. You might think an understanding of deadly bacteria would bolster the argument for breast-feeding and against commercially processed infant foods, but the opposite happened. Medicine and science enjoyed a new prestige, and mothers grew more willing to surrender traditional knowledge and personal control over infant care. The midwives and grandmothers didn’t stand a chance. In 1920, only 20 percent of American women gave birth in hospitals; by 1950, over 80 percent did. There, scientific motherhood flourished.
Such science did not look favorably upon breast-feeding. Mid-century mothers were often totally anesthetized for childbirth, necessitating the use of forceps to pull the baby out. After birth, babies were typically given formula to wait out the period before the mother’s milk came in (around three days, though before that she makes immune-boosting colostrum). Babies and mothers were separated at birth, only to be reunited for short, closely regulated, and highly sanitized feedings. Talk about some weird amendments to our eons-old mammalian patterns: mothers wore masks, scrubbed their nipples with soap, nursed their babies, and then watched them get burped through Plexiglas windows. To allow the mothers to rest, nurses took over the night feeding with formula freely supplied by manufacturers. Only a few other feeds were allowed during the day, usually with one breast at each feed. It’s no wonder the mothers didn’t produce enough milk. The babies became hungry, and the bottle was presented as a perfectly acceptable alternative. The doctors and nurses had little to no training in lactation but much expertise in formula measurements. After a week of the Nurse Ratched approach to baby care, mother was sent home with a pat on the back and a free sample of Nestlé’s finest.
In 1956, a small backlash ensued. It started in Illinois at a church picnic with a couple of Catholic matrons, not normally known for upholding mammalian urges. “It just didn’t seem fair that mothers who bottle feed … were given all sorts of help … but … when a mother was breastfeeding, the only advice she was given was to give the baby cow’s milk,” said Marian Tompson. She and Mary White formed a small group to support other breast-feeding mothers. They called themselves the La Leche League. As one of the founding members told the New York Times, “You didn’t mention ‘breast’ in print unless you were talking about Jean Harlow.”
You probably know the tale from here. The suburban La Leche ladies went on to align with the hippies and the back-to-the-landers, and together they reformed hospital practices and reversed the sorry downward spiral of breast-feeding rates. Then they took on Nestlé through an epic boycott over its capitalist hegemony in developing countries, where infants were dying from formula made with contaminated water. The league is alternately inspirational and infuriatingly dogmatic. For a time, it advocated that mothers stay home and not work. This proscription did not sit well with the emerging feminist movement in the 1970s, and tensions over its take-no-prisoners attitude remain to this day. Still, the organization has effectively hammered home its central message that breast-feeding makes babies smarter, healthier (those ear infections!), less obese, more loved, and pretty much superior in every way.
The lactivists, both nationally and globally, have made their milky mark: the World Health Organization now recommends breast-feeding for two years; the American Academy of Pediatrics recommends it for one. Many hospitals still distribute free formula (mine did), but they also allow “rooming in” so new mothers and babies can spend all their time together; often, baby is put to breast within moments of delivery.
Even so, American mothers learning to breast-feed are uniquely beset by problems. It normally takes about three days after childbirth for a mother’s milk to “come in.” Any longer than that, and the docs are going to whisk Junior away and start formula. So the early days are critical for the long-term success of nursing. In Ghana, only 4 percent of women experience delayed lactation; in Sacramento, site of a recent study, 44 percent do. No one knows why. It might be because we’re more obese or older, or we use spinal anesthesia, or we have more C-sections or more environmental exposures, or tight bras have flattened our nipples. Stay tuned; it’s an active area of research.
Despite the lactivists’ best efforts and deepest intonations of “breast is best,” breast-feeding rates here and in much of the world remain middling. In the United States, about 70 percent of mothers initiate breast-feeding, but only 33 percent nurse longer than six months and only 13 percent fulfill the year-long recommendation. Australia and Sweden, with their generous maternal leave policies, have gone lactomanic, with initiation rates of over 90 percent. Canada’s is about 87 percent. Brazil has been a great public health success story: The average duration of breast-feeding has increased from two and a half months in the 1970s to over eleven months today. Over 95 percent of Brazilian women try breast-feeding, and for those who can’t swing it, the country has two hundred human milk banks and over 100,000 donors. Their milk is collected and stored by firefighters.
Has all the ruckus—all the maternal guilt, the physical and mental introspection, the madre-a-madre name-calling, the battles with the medical establishment—been worth it? Is milk, au naturel, really so superior to formula that we must make each other feel bad about our failures and choices? The honest answer to this question is yes and no. I don’t mean to be feckless. Breast milk confers many known benefits to infants, but for healthy babies in the developed world, those benefits are relatively small. They might actually be bigger than we think, but the truth is, we don’t really know. The activists asserting we’re in the midst of “a biocultural crisis” have perhaps overstated their case.
The journalist Hanna Rosin wrote a compelling essay challenging breast-feeding in a much talked-about Atlantic article in 2009: “And in any case, if a breast-feeding mother is miserable, or stressed out, or alienated by nursing, as many women are, if her marriage is under stress and breast-feeding is making things worse, surely that can have a greater effect on a kid’s future success than a few IQ points… So overall, yes, breast is probably best. But not so much better that formula deserves the label of ‘public health menace,’ alongside smoking.”
It was a strong argument delivered well for an Atlantic readership. Who needs a few extra IQ points or a few fewer visits to the doctor when our children will be high achieving and successful anyway? Given the context, it’s a reasonable stance. And refreshing, a welcome astringent to the sanctimony of the lechistas. I’m sure many mothers read it and gleefully ran out to the market—perhaps Whole Foods in their case—to buy Earth’s Best Soy Infant Formula. After all, most of us weren’t breast-fed and look at us: we’re healthy, long-lived, long-limbed, and terribly clever with iPhone apps.
Then again, once you look outside Rosin’s reader demographic, a lot of us are rather obese. And diabetic. On the road to heart disease? Check. And because of all this, some of us will be facing shorter lifespans than our parents. Allergies and asthma? Common. (And a word about the IQ scores: formula confers the same average loss in points—four—as high childhood lead levels. That dip was enough to trigger a public health outcry in the 1970s, and resulted in federal laws banning lead in gasoline and paint. Now kids’ scores are back up.) But can we attribute any of our metabolic malaise and chronic ill health to formula? Some people are trying, and they’re trying hard. So far, the data are intriguing, but underwhelming.
Many studies, for example, have compared the risk of obesity in formula-fed and breast-fed infants and children. The results of these studies, it must be said, are all over the map. Two major reviews of the literature found that in a majority of studies, children who were breast-fed faced a lower risk—ranging from about 10 to 25 percent—of obesity. Despite the fact that most of these studies rigorously controlled for factors such as the mother’s level of education, smoking, and so on, other confounding factors might exist, so it’s difficult to know for sure how much of the benefit to attribute to breast-feeding. It is known, however, that formula-fed babies consume about 70 percent more protein than their peers, and this may trigger higher levels of growth factors and insulin secretion, which in turn lead to increased deposition of fat.