Elephants on Acid


by Alex Boese


  “I don’t want to be responsible for killing a man.”

  “The responsibility is mine. Please go on.”

  The man shakes his head as though unsure. A hollow, fearful look flickers through his eyes. He shrugs his shoulders, turns back around, and mutters, “Well, that’s that.”

  He leans forward and speaks into the microphone, “Learner, your answer is wrong.” Then he presses the switch. A bloodcurdling scream shakes the walls.

  Would you torture or kill an innocent victim on the command of a stranger? When asked this question, almost everyone says no. But almost everyone is wrong. Stanley Milgram’s obedience experiment, conducted at Yale University in the early 1960s, demonstrated that the average person is capable of doing horrendous things, especially when told to do so by someone wearing a white lab coat.

  Milgram dreamed up his experiment while thinking about the Holocaust. Why was it, he wondered, that German citizens obeyed orders to send millions of Jews to death camps? Was there some quirk in the German character that made these citizens peculiarly obedient to authority? Or is such obedience a common feature of human psychology? If ordered, would Americans have done the same thing? To find out, Milgram decided to place randomly chosen subjects into a situation in which an authority figure would ask them to commit increasingly repellent acts of cruelty. The researcher would not coerce them. Subjects could stand up and leave at any time without consequence. Only a verbal command would be given: “Please go on . . . The experiment requires that you continue . . . You have no other choice, you must go on.” How would people respond to this request?

  Milgram’s subjects were utterly ordinary people—postal clerks, teachers, salesmen, factory workers. He recruited them by placing an ad in a newspaper, offering four dollars to anyone willing to participate in an hour-long “scientific study of memory and learning.”

  When a subject showed up at the Yale Interaction Laboratory, where the experiment was conducted, he was led through a series of elaborately staged events. First, a young, serious-looking researcher met him and introduced him to a man described as a second volunteer—a pleasant-looking, round-faced accountant in his late forties. Both the researcher and the second volunteer were actors who had carefully rehearsed the parts they would play during the next hour. Milgram was hidden behind a one-way window, observing everything that happened.

  The young researcher provided the subject with a false explanation of the experiment. He said it was designed to examine the effect of punishment on learning. One volunteer would serve as a “learner.” He would attempt to memorize a series of word pairs. The other would be a “teacher.” He would read the word pairs to the learner. The researcher stressed the next point—the teacher would operate a shock generator. Each time the learner gave a wrong answer, the teacher would administer punishment by flipping a switch on this machine and giving the learner a shock. The shocks would increase in intensity each time an incorrect answer was given.

  The two volunteers drew straws to determine who would be the learner and who the teacher. The fake volunteer always got to be the learner. The researcher then made a show of strapping the learner into an electric-chair apparatus—applying electrode gel to his wrists and tightening the restraints to prevent movement. Looking nervous, the learner asked whether the shocks could aggravate a heart condition he had. The researcher dismissed this concern: “Although the shocks can be extremely painful, they cause no permanent tissue damage.”

  Next, the researcher led the subject into an adjacent room where the voltage panel was housed and showed him how to operate the machine. The teacher settled himself in front of the panel. The researcher sat behind him, across the room, and the experiment began.

  It always started calmly. The teacher read out a series of word pairs: blue/box, nice/day, wild/duck. Then he read the first word of one of the pairs along with four other terms. Blue: sky, ink, box, lamp. He waited for the learner to identify the corresponding term.

  The learner aced the first few pairs. The subjects must have imagined there would be no need to explore the horrors waiting at the far right of the panel. But the word pairs became more challenging, and the learner began making mistakes. One error followed another. “Incorrect,” the teacher would say, and flip a switch on the voltage panel. Next time the shock was slightly stronger. The time after that it was stronger still.

  When the teacher pressed the 75-volt switch, the learner let out a distinct “Ugh” that could be heard through the wall. At 120 volts the learner’s reaction became more animated. “Hey, this really hurts,” he shouted. By 150 volts the learner was screaming to have the experiment stopped and to be let out. The cries of the learner came from a tape recorder. No one was actually being shocked. But the teachers didn’t know that. For them, the screams were terrifyingly real.

  Many of the teachers began to sweat and tremble. They bit their lips and dug their fingernails into their palms. Some of them laughed hysterically. All of them looked to the experimenter for guidance. What should I do now? The researcher offered calm reassurances and urged them to proceed. “Please go on,” he would say. “The experiment requires that you continue.”

  This was the moment of truth. How far up the panel would the teacher progress? Would he go to 200 volts? 300? 400? When would he push back his chair and say, “No more”? Or would he never do this? Would he press the switches all the way up to 450 volts?

  Before he conducted the experiment, Milgram anticipated that virtually no one would go all the way to the end of the panel. Psychiatrists he polled agreed with this prediction. They forecast that only one subject in a thousand would administer the highest shock. But the actual behavior of the subjects shattered these expectations. Almost two-thirds of the teachers never disobeyed the experimenter. They agonized and sweated and shook, but they kept pressing the switch. They pressed the switch after the learner started screaming, after he yelled out that his heart was weak, and after he screamed in agony to be let out. They kept pressing the switch after the learner received 330 volts and fell into an eerie silence, apparently unconscious or dead. They pressed the switch all the way up to 450 volts, and then they kept pressing it until, finally, the researcher told them to stop. These were not serial killers or sadists. These were just average Americans, who were apparently willing to kill an innocent person because a man in a white lab coat told them to. Years later, during a CBS 60 Minutes interview, Milgram glumly concluded:

  I would say, on the basis of having observed a thousand people in the experiment and having my own intuition shaped and informed by these experiments, that if a system of death camps were set up in the United States of the sort we had seen in Nazi Germany, one would be able to find sufficient personnel for those camps in any medium-sized American town.

  Milgram tried numerous variations of the experiment, searching for the limits of obedience. He discovered that the proximity of the victim had a powerful effect on compliance. If subjects could neither see nor hear feedback from the victim, obedience was almost total. If they could hear only a thumping on the walls, compliance was 65 percent. But if the two people were in the same room, and the subject had to physically press the victim’s hand onto a metal plate to give him a shock, compliance dropped to 30 percent. Of course, 30 percent is still dismayingly high. Other variables, such as gender, had little effect on the results. Women proved just as willing as men to shock the victim.

  Milgram’s obedience study offers a depressing view of human nature. The average person seems all too willing to follow orders, no matter how cruel or unjust. But humanity’s stock sinks even lower when you consider a similar experiment conducted in Chicago during the same period. The Chicago researchers locked rhesus monkeys in cages. To obtain food, the monkeys had to pull on a chain. But there was a catch. Pulling the chain also caused a monkey in a neighboring cage to receive a high-frequency shock. After witnessing the agony of their neighbors, the majority of the monkeys refused to pull the chain again. They starved, some for as long as twelve days, instead of inflicting pain on another. The monkeys, in other words, did something most humans could not: They said no. Apparently we still have much to learn from our primate cousins.

  Milgram, S. (1974). Obedience to Authority: An Experimental View. New York: Harper & Row.

  Shock the Puppy

  When Stanley Milgram published the results of his obedience experiments in 1963, they sent (figurative) shock waves through the scientific community. Other researchers found what he was reporting hard to believe. Could subjects really be so easily manipulated? They were sure Milgram must have made a mistake. Researchers conducted numerous follow-ups to his experiments, searching for ways to bring his results back in line with expectations. One experiment, carried out by Charles Sheridan and Richard King in 1972, easily stands out from this crowd.

  Sheridan and King theorized that Milgram’s subjects suspected the victim was fake. This would explain their remarkable obedience. They were just playing along with the game. To test this possibility, Sheridan and King decided to repeat Milgram’s experiment using an actual victim who would really get shocked. Obviously they couldn’t use a human for this purpose. So they used the next best thing—a cute, fluffy puppy.

  The experimenters placed the puppy inside a box that had a shock-grid floor. The interior of the box contained a signal light. Subjects—all volunteers from an undergraduate psychology course—were told the puppy was being trained to distinguish between a flickering and a steady light. The dog had to stand either to the right or the left depending on the cue from the light. If the animal failed to stand in the correct place, the subjects had to press a switch to shock it. As in the Milgram experiment, the shock level increased fifteen volts for every wrong answer.

  The human subjects could not see the light from where they stood. They could only see the position of the puppy. They judged its responses based on a chart they were given.

  Sheridan and King stressed the importance of this research, claiming they were attempting to measure “critical fusion frequency (CFF) in puppies,” but they also assured the volunteers that they would receive their compensation, which was course credit, simply for having shown up.

  The experiment began, and the puppy immediately got a lot of wrong answers. In fact, there was no right answer for the puppy to get. There was no correlation between the signal light and the answer sheet that had been provided to the students. From the puppy’s point of view, it was getting shocked randomly.

  As the voltage increased, the puppy first barked, then jumped up and down, and finally started howling with pain. The volunteers were horrified. They paced back and forth, hyperventilated, and gestured with their hands to show the puppy where to stand. Many openly wept. Yet the majority of them, twenty out of twenty-six, kept pushing the shock button right up to the maximum voltage. This finding validated Milgram’s results.

  In their write-up of the study, the experimenters noted that the shocks were amperage-limited and did not cause the puppy any permanent physical harm. However, they made no mention of psychological harm. If the poor creature later shook with terror whenever it came to a traffic light while out on its walkies, you could understand why.

  Sheridan, C. L., & King, R. G. (1972). “Obedience to Authority with an Authentic Victim.” Proceedings of the Annual Convention of the American Psychological Association 80: 165–66.

  Requiem for a Rat

  The young woman holds the white rat in her hand. It struggles to get free, so she grips it tighter. “Do you seriously want me to do this?” she asks. The researcher standing in front of her nods. “But why?” As she says this, the woman suddenly laughs. It’s a nervous, awkward laugh, as though she can’t believe the situation she’s found herself in. “It is important for the experiment that you proceed,” the researcher says. The woman’s laughter turns into tears that roll down her cheeks. “Please don’t make me do this,” she begs. “Please . . .” “The experiment requires that you do it,” the researcher states firmly.

  Decades before Stanley Milgram shocked the world by demonstrating how readily people will obey a repellent order, a young graduate student at the University of Minnesota witnessed a similar phenomenon in his lab.

  It was 1924, and the student was Carney Landis. As part of his doctoral research, he was studying facial expressions. He wondered whether every emotion produces a characteristic expression. Is there one expression used by everyone to show fear? Another to show disgust? Another for arousal? And so on.

  To find out, he brought subjects one at a time into his lab. He drew lines on their faces with burnt cork, to better observe which facial muscles they were using. The lines made them look a bit like painted tribal warriors. Then he contrived ways to make them experience emotions. As they expressed each emotion, he took their photos.

  The situations in which Landis placed his subjects began with the mundane. They listened to some jazz music. They read the Bible. They told a lie. They smelled ammonia.

  Gradually, the situations became more unusual. BANG! A firecracker went off behind a curtain, and the camera snapped as their faces registered shock. Landis brought out pictures of skin-disease patients, pornographic scenes, and artistic nudes. The camera clicked away as the subjects browsed the images.

  Next came the mystery bucket. “Reach into it,” Landis told them, “and tell me what you feel.” Carefully they placed their hands inside. Ewww. Their faces wrinkled with displeasure as they touched three slimy frogs sitting in a puddle of water. “Yes, but you have not felt everything yet,” Landis said. “Feel around again.” They did so and—ZAP!—they received a powerful electric shock from wires attached to the bucket.

  But all this was a mere prelude to what came next—the experimental coup de grâce. Landis carried out a live white rat on a tray. “Hold this rat with your left hand,” he told them, “and then cut off its head with the knife.”

  The subjects stared at him in disbelief. They hadn’t been expecting this. They questioned whether he was serious. When he assured them he was, they hesitantly picked the knife up and put it back down again. Many of the men swore. Some of the women started to cry. They pleaded with him to stop the experiment. Nevertheless, Landis urged them on. Hovering over the rat with their painted faces, knife in hand, they now looked even more like members of some strange tribe preparing to offer a sacrifice to the Great God of the Experiment.

  It took a lot of coaxing, but eventually 75 percent of his subjects—fifteen out of twenty—complied. They decapitated the rats while the animals were still alive and squirming in their hands. This percentage was similar to the obedience levels Milgram would later find in his electric-shock experiments at Yale.

  In general, the procedure went badly. Landis noted, “The effort and attempt to hurry usually resulted in a rather awkward and prolonged job of decapitation.” Nor did the rats get a reprieve if the subjects refused to obey. In the five cases of noncompliance, Landis simply picked up the knife and did the job himself. He was determined not to let those rats live.

  Most of Landis’s subjects were fellow graduate students at the University of Minnesota, but Landis also tested a thirteen-year-old boy suffering from high blood pressure. Doctors had referred the boy to the department of psychology because they suspected his symptoms were caused by emotional instability. You have to wonder whether being forced to decapitate a rat added to his issues.

  Landis stumbled upon the phenomenon of experimental obedience almost forty years before Milgram, but Landis never realized the significance of what he had found. It never occurred to him that the willingness of his subjects to obey bizarre commands was far more interesting than their facial expressions as they did so. As it turned out, their expressions varied so widely he failed to find any one look that typified a situation. For instance, expressions shown while decapitating a rat included pained smiling, crying, and what Landis called “fascinated attention,” produced by “a slight contraction of the risorius, medium contraction of the zygomatics and lowering of the upper eyelids.” Landis died in 1962, just as Milgram was conducting his more famous obedience studies.

  It is often this way with experiments. A scientist sets out to prove one thing, but stumbles upon something completely different, something far more intriguing. For this reason, good researchers know they should always pay close attention to strange events that occur during their experiments. A great discovery might be lurking right beneath their eyes—or beneath the blade of their knife.

  Landis, C. (1924). “Studies of Emotional Reactions: II. General Behavior and Facial Expression.” Journal of Comparative Psychology 4 (5): 447–509.

  What a Difference a Bag Makes

  Oregon State University, 1967. It is a cold winter day. A car pulls up to the curb. The passenger door opens, and a man enclosed in a black cotton bag stumbles out. Only his feet protrude from beneath the fabric. As he sways back and forth trying to gain his balance, the car pulls away with a screech of the tires. Having steadied himself, the man in the bag proceeds forward with an air of determination. He walks up the stairs of Shepard Hall, through the doors, and into the classroom of Charles Goetzinger. The students in the room turn to stare as he enters. Dr. Goetzinger looks up from his notes. “Ah, welcome back, Bag. Good to see you again.”

  Most of the students who attended Charles Goetzinger’s class, Speech 113: Basic Persuasion, in the winter quarter of 1967 wore normal clothes—shirts, shoes, slacks, or skirts. But one student opted to show up in a large black bag. He shuffled into class on the first day and took a seat at the back. He didn’t say a word.

  The Black Bag, as he came to be known, showed up for every class. At first he maintained his silence. When members of the class were each required to give a short speech, he stood before his fellow students for four minutes without saying a word, then returned to his seat. Eventually, as the quarter wore on, he loosened up and let fly with a few cryptic remarks such as, “I’m not Jesus Christ or anything. I’m just one of you in a bag.” Reportedly, the Black Bag spoke with a New England accent.

 
