
Elephants on Acid

by Alex Boese


  Harlow’s work then took a darker turn. Having determined the qualities that fortify the love between a baby and its mother, he set out to discover whether these bonds of love could easily be broken. He wanted to simulate in monkeys the experience of human children who have been exposed to poor parenting, to help unravel some of the resultant problems. So he created mothers that inflicted various forms of abuse on the babies. There was shaking mom (who at times shook so hard she flung her infant across the room), air-blast mom (who occasionally blasted her babies with violent jets of compressed air), and brass-spike mom (from whom blunt brass spikes periodically emerged). Whatever cruelties these mothers dealt out, their babies would simply pick themselves up and crawl back for more. All was forgiven. Their love could not be shaken, dented, or air-blasted away. Very few infant monkeys were involved in these tests, but the results were clear.

  It’s worth noting that although Harlow did identify some of the qualities a competent mother should have, his surrogates reliably failed the ultimate test. An infant monkey would never, ever have chosen one of them over a living, breathing female of its own species. Which shows that, when it comes to motherly love, there is no substitute for the real thing.

  Harlow, H. (1958). “The Nature of Love.” American Psychologist 13 (12): 673–85.

  Braking for Baby

  The squad car speeds down the street, sirens wailing, in hot pursuit of a criminal. It rounds a corner, and suddenly, just yards ahead, a person is crossing the street and—OH MY GOD!—she’s pushing a baby stroller! The cop jams on the brakes and twists the wheel sideways. Tires squeal. Rubber skids against the tarmac. The car veers up onto the curb, hits a fire hydrant, and slams into the side of a building. Water sprays everywhere. But the stroller and its occupant are untouched. Inside the car, the cops breathe a sigh of relief. The criminal, however, is long gone.

  Hollywood car chase scenes have taught us that people will go to great lengths to avoid hitting a baby stroller—far greater lengths, it seems, than they’ll go to avoid running over an adult. Is this true? Are drivers in real life more careful to avoid baby strollers than they are to avoid grown-ups?

  In 1978 researchers at UCLA put this idea to an experimental test. At a busy four-lane street in Los Angeles, a young female experimenter stepped out into the crosswalk and waited for cars to stop so she could proceed across. On different trials she stepped out alone, pushing a shopping cart, or pushing a baby stroller. An observer counted the number of cars that passed in each situation before a driver stopped for her.

  Thankfully, the researchers didn’t put a real baby in danger. The stroller was empty, and drivers could easily see this. But the researchers believed this wouldn’t matter. They hypothesized that the mere presence of a stroller would encourage drivers to stop more readily.

  They were right. When the experimenter stood alone, an average of almost five cars drove by before one stopped. In the shopping-cart condition, the average was three cars. But with a baby stroller in front of her, the number of passing cars dropped to one.

  The researchers theorized that drivers stopped sooner for the stroller because babies—and, by extension, all things baby related—act as anger inhibitors. They trigger nonviolent, courteous impulses in people. Strong taboos against harming small children exist in all cultures. Even monkeys share this sentiment. If male monkeys want to avoid being attacked, they often sidle up to infants.

  Intriguingly, the researchers stumbled upon a parallel phenomenon during the course of their experiment. They noticed that when the woman was standing alone, certain types of drivers stopped far more readily than others:

  Young male drivers, in particular, seemed more inclined to stop for the attractive female experimenter when she was without the baby stroller (and in a few instances to actually try and converse with her). It would be desirable in future research to replicate the present findings with other pedestrians such as older men.

  So perhaps we should add a corollary to the observation that people stop more readily for babies. They also stop frequently for babes.

  Malamuth, N. M., E. Shayne, & B. Pogue (1978). “Infant Cues and Stopping at the Crosswalk.” Personality and Social Psychology Bulletin 4 (2): 334–36.

  The Ultimate Baby Movie

  You see them at playgrounds, chasing after children, camera in hand. Or at restaurants, filming away as an infant hurls food on the floor. They’re proud parents, determined to preserve for posterity every moment of their kid’s childhood—his first step, his first bite of carrots, his first inarticulate gurgle. Later these archived memories will be sprung on guests who thought they had been invited over for a no-strings-attached dinner.

  Every year proud parents collectively shoot hundreds of thousands—perhaps millions—of hours of baby movies. But it would be hard for any of them to match the output of Deb Roy. By the time Roy’s son was three, Roy had amassed 400,000 hours of baby video, enough to make any dinner guest shudder.

  Roy managed this feat by installing eleven overhead omnidirectional megapixel fish-eye cameras in the rooms of his home, as well as fourteen microphones. Literally every move and utterance his baby made was recorded. The data, almost three hundred gigabytes’ worth of it a day, was continuously streamed to a five-terabyte disk cache in his basement. After installing the system, his electricity bill quadrupled.

  Roy didn’t do this for the sake of parental pride, though that certainly played a part. He was head of the MIT Media Lab’s Cognitive Machines Research Group. When his wife became pregnant he realized he had the perfect opportunity to study how children learn language. So Roy recorded almost everything his son heard and saw from birth until the age of three (in mid-2008). Powerful computers at the MIT Media Lab then began analyzing the footage, searching for the visual and verbal clues the child had used to construct his understanding of language. A one-million-gigabyte storage system—one of the largest storage arrays in the world—was built to hold all the data. The goal was to build a model of language acquisition out of the data. With luck, such a model could eventually be used to design a machine learning system capable of mimicking a human baby’s ability to learn language.

  Roy called his experiment the Human Speechome Project. Speechome stands for “Speech at home.” The media, however, dubbed it the Baby Brother Project. But, unlike the Big Brother contestants, Roy didn’t totally sacrifice his privacy. The cameras in every room had an ON/OFF switch, as well as an OOPS button that deleted the last few minutes of activity. Roy noted that the OOPS button was used 109 times during the first six months the system was in place, although he didn’t state why. If it was used because “Oops, I just said a bad word,” that omission could potentially undermine the purpose of the project. MIT analysts will be left scratching their heads, wondering, “How in the world did the child learn to say that?” Of course, they could always use the strategy employed by millions of parents around the world—blame it on the TV.

  Roy, D., et al. (2006). The Human Speechome Project. Presented at the 28th Annual Conference of the Cognitive Science Society. Available online: http://www.media.mit.edu/press/speechome/speechomecogsci.pdf.

  CHAPTER EIGHT

  Toilet Reading

  Back in the Middle Ages, toilets—the few that existed—were placed at the top of castle turrets. Waste products slid down a chute into the moat below. This was one reason swimming the moat was an unappealing prospect. The anonymous author of The Life of St. Gregory praised the solitude of this lofty perch for the “uninterrupted reading” it allowed. This admission made him one of the first bathroom readers in recorded history. Today many people do much of their best reading while on the loo. It is possible you are reading these very words in such a situation. Toilet readers can be divided into two types: relaxers and workers. For relaxers the toilet is a place of retreat and tranquillity. Dr. Harold Aaron, author of Our Common Ailment, Constipation, notes, “Reading gives the initial feeling of relaxation so useful for proper performance.” Workers, on the other hand, dare not waste even those few minutes spent attending to bodily functions. Lord Chesterfield tells of “a gentleman who was so good a manager of his time that he would not even lose that small portion of it which the call of nature obliged him to pass in the necessary-house; but gradually went through all the Latin poets, in those moments.” This chapter is dedicated to toilet readers of all persuasions. Gathered in the following pages are unusual experiments that speak, in some way, to loo-related themes. May they help set the mood for peak performance.

  The Doctor Who Drank Vomit

  The yellow-fever patient groaned as he lay in bed. His skin had a sickly lemon tinge, marred by red and brown spots. The smell of decay clung to him. Suddenly he jerked upward and leaned over the side of the bed. Black vomit, like thick coffee grounds, gushed from his mouth. A young doctor sitting by his side expertly caught the spew in a bucket and patted the patient on the back. “Get it all out,” he said. A few final mucus-laced black globs dribbled from the patient’s mouth before the man collapsed onto the bed. The doctor swirled the steaming liquid in the bucket a few times, examining it closely. The stench of it was overpowering, but barely a flicker of disapproval registered on the doctor’s face. Instead he calmly poured the vomit into a cup, lifted it to his lips, and slowly and deliberately drank it down.

  The vomit-imbibing doctor was Stubbins Ffirth. As successive yellow-fever epidemics devastated the population of Philadelphia during the early nineteenth century, Ffirth made a name for himself by courageously exposing himself to the disease to prove his firm belief that yellow fever was noncontagious.

  Ffirth confessed that when he first saw the ravages of yellow fever he, like everyone else, believed it to be contagious. But subsequent observation disabused him of this belief. The disease ran riot during the sweltering summer months, but disappeared as winter approached. Why, he wondered, would weather affect a contagious disease? And why didn’t he grow sick, despite his constant contact with patients? He concluded yellow fever was actually “a disease of increased excitement” brought on by an excess of stimulants such as heat, food, and noise. If only people would calm down, he theorized, they would not develop the disease.

  To prove his noncontagion hypothesis, Ffirth devised a series of tests. First he confined a dog to a room and fed it bread soaked in the characteristic black vomit regurgitated by yellow-fever victims. (The blackness is caused by blood hemorrhaging from the gastrointestinal tract.) The animal did not grow sick. In fact, “at the expiration of three days he became so fond of it, that he would eat the ejected matter without bread.” Pet-food manufacturers might want to take note.

  Emboldened by this success, Ffirth moved on to a human subject, himself:

  On the 4th of October, 1802, I made an incision in my left arm, mid way between the elbow and wrist, so as to draw a few drops of blood; into the incision I introduced some fresh black vomit; a slight degree of inflammation ensued, which entirely subsided in three days, and the wound healed up very readily.

  Ffirth’s experiments grew progressively bolder. He made deeper incisions in his arms, into which he poured black vomit. He dribbled the stuff into his eyes. He cooked some on a skillet and forced himself to inhale the fumes. He filled a room with heated regurgitation vapors—a vomit sauna—and remained there for two hours, breathing in the air. He experienced a “great pain in my head, some nausea, and perspired very freely,” but otherwise was okay.

  He then began ingesting the vomit. He fashioned some of the black matter into pills and swallowed them. Next, he mixed half an ounce of fresh vomit with water and drank it. “The taste was very slightly acid,” he wrote. “It is probable that if I had not, previous to the two last experiments, accustomed myself to tasting and smelling it, that emesis would have been the consequence.” Finally, he gathered his courage and quaffed pure, undiluted black vomit. Having apparently acquired a taste for the stuff, he even included in his treatise a recipe for black-vomit liqueur:

  If black vomit be strained through a rag, and the fluid thus obtained be put in a bottle or vial, leaving about one-third part of it empty, this being corked and sealed, if set by for one or two years, will assume a pale red colour, and taste as though it contained a portion of alkahol.

  Despite his Herculean efforts to infect himself, Ffirth had still not contracted yellow fever. He momentarily considered declaring his point proven, but more yellow-fever-tainted fluids remained to be tested: blood, saliva, perspiration, and urine. So he soldiered on, liberally rubbing all of these substances into incisions in his arms. The urine produced the greatest reaction, causing “some degree of inflammation.” But even this soon subsided. And he was still disease free.

  Ffirth now felt justified in declaring his hypothesis proven. Yellow fever had to be noncontagious. Unfortunately, he was wrong. We now know that yellow fever is caused by a tiny RNA virus spread by mosquitoes. This explains why Ffirth observed seasonal variations in the spread of the disease. The epidemic retreated in winter as the mosquito population lessened.

  How Ffirth failed to contract the disease is a bit of a mystery, considering he was rubbing infected blood into wounds on his arms. Christian Sandrock, a professor at UC Davis and an expert on infectious diseases, speculates that he simply got lucky. Yellow fever, much like other mosquito-borne diseases such as West Nile virus, requires direct transmission into the bloodstream to cause infection. So Ffirth happened to pick the right virus to smear all over himself. Had he done the same thing with smallpox, he would have been a goner.

  Although Ffirth made a bad guess about the cause of the disease, his experiments weren’t entirely in vain. He submitted his research to the University of Pennsylvania to satisfy the requirements for the degree of Doctor of Medicine, which was subsequently granted to him. Modern graduate students who complain about the excessive demands of their thesis committees might want to keep his example in mind. They don’t realize how easy they have it.

  Ffirth, S. (1804). A treatise on malignant fever; with an attempt to prove its non-contagious nature. Philadelphia: Graves.

  Magic Feces

  Imagine a dog turd. Some unknown pooch deposited it on a lawn weeks ago. Since then it’s been baking in the sun until it’s formed hard, crusty ridges. Would you want to pick this up and eat it? Of course not.

  Now imagine it’s 1986 and you’re an undergraduate at the University of Pennsylvania. You volunteered to participate in a food-preferences study, and you find yourself sitting in a small, square laboratory room. You’ve just been given a piece of fudge to eat, and it was very good. The researcher now presents you with two more pieces of fudge, but there’s obviously a trick. The fudge morsel on the left is molded into the form of a disk, but the one on the right is in the shape of a “surprisingly realistic piece of dog feces.”

  “Please indicate which piece you would prefer,” the researcher asks in a serious tone.

  This question was posed to people as part of a study designed by Paul Rozin, an expert in the psychology of disgust. The responses his team got were no surprise. Participants overwhelmingly preferred the disk-shaped fudge, rating it almost fifty points higher on a two-hundred-point preference scale.

  The researchers exposed subjects to a variety of gross-out choices. In each situation the options were equally hygienic—there was never any risk of bacterial infection—but one was always distinctly more stomach turning than the other.

  They offered volunteers a choice between a glass of apple juice into which a candleholder had been dipped, or one in which a dried sterilized cockroach had been dunked. “Which would you prefer?” the researcher asked as he dropped the cockroach into the glass and stirred it around. The roach juice scored one hundred points lower on the preference scale.

  Would volunteers prefer to hold a clean rubber sink stopper between their teeth, or a piece of rubber imitation vomit? The sink stopper won out.

  Would they be willing to eat a bowl of fresh soup stirred by an unused fly swatter, or poured into a brand-new bedpan? “No, thank you,” participants responded in both cases.

  The researchers offered the results of their study as evidence of the “laws of sympathetic magic” at work in American culture. These laws were named and first described by the nineteenth-century anthropologist Sir James Frazer in his classic work The Golden Bough, an encyclopedic survey of the belief systems of “primitive” cultures around the world.

  Frazer noticed two forms of belief showing up again and again. First, there was the law of contagion: “Once in contact, always in contact.” If an offensive object touches a neutral object, such as a cockroach touching some juice, the neutral object becomes tainted by the contact. Second, there was the law of similarity: “The image equals the object.” Two things that look alike are thought to share the same properties. A voodoo doll that looks like a person becomes equivalent to that person. Or fudge that looks like feces is as disgusting as feces.

  In the modern world we like to think of ourselves as being quite rational. We understand the principles of good hygiene, and how diseases are transmitted. We like to imagine we’re not ruled by simple superstitions. And yet, of course, we are. As the researchers put it, “We have called attention to some patterns of thinking that have generally been considered to be restricted to preliterate or Third World cultures.” The curious thing is that most of us will readily acknowledge the illogic of these beliefs. We know the laws of sympathetic magic aren’t real. But we’re still not about to eat that doggy-doo fudge.

  Rozin, P., L. Millman, & C. Nemeroff (1986). “Operation of the Laws of Sympathetic Magic in Disgust and Other Domains.” Journal of Personality and Social Psychology 50 (4): 703–12.

 
