The Undoing Project


by Michael Lewis


  The broad answer to that last question coming from the University of Michigan, Amos reported to Danny’s class, was that, yes, more or less, they do. Amos presented research done in Ward Edwards’s lab that showed that when people draw a red chip from the bag, they do indeed judge the bag to be more likely to contain mostly red chips. If the first three chips they withdrew from a bag were red, for instance, they put the odds at 3:1 that the bag contained a majority of red chips. The true, Bayesian odds were 27:1. People shifted the odds in the right direction, in other words; they just didn’t shift them dramatically enough. Ward Edwards had coined a phrase to describe how human beings responded to new information. They were “conservative Bayesians.” That is, they behaved more or less as if they knew Bayes’s rule. Of course, no one actually thought that Bayes’s formula was grinding away in people’s heads.
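  The arithmetic behind that 27:1 figure is worth spelling out. What follows is a reconstruction, not a detail Amos reported: it assumes equal prior odds and bags that are 75 percent red versus 25 percent red, the one split under which three red chips yield odds of exactly 27:1. Bayes’s rule says to multiply the prior odds by the likelihood ratio of the evidence:

$$
\underbrace{\frac{P(\text{mostly red})}{P(\text{mostly white})}}_{\text{prior odds}\,=\,1}
\times
\underbrace{\left(\frac{0.75}{0.25}\right)^{3}}_{\text{three red chips}}
= 3^{3} = 27.
$$

The subjects moved from even odds to 3:1; the rule says they should have moved all the way to 27:1.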

  What Edwards, along with a lot of other social scientists, believed (and seemed to want to believe) was that people behaved as if they had Bayes’s formula lodged in their minds. That view dovetailed with the story then winning the day in social science. It had been told best by the economist Milton Friedman. In a 1953 paper, Friedman wrote that a person shooting billiards does not calculate the angles on the table, the force imparted to the cue ball, or the reaction of one ball to another, in the way a physicist might. He just shoots the ball in the right direction with roughly the right amount of force, as if he knew the physics. His mind arrives at more or less the right answer. How that happens doesn’t matter. Similarly, when a person calculates the odds of some situation, he does not do advanced statistics. He just behaves as if he does.

  When Amos was done talking, Danny was baffled. Was that it? “Amos had described the research in the normal way that people describe research done by respected colleagues,” said Danny. “You assume it is okay, and you trust the people who did it. When we look at a paper that has been published in a refereed journal, we tend to take it at face value—we assume that what the authors say must make sense—otherwise it would not have been published.” And yet, to Danny, the experiment that Amos described sounded just incredibly stupid. After a person has pulled a red chip out of a bag, he is more likely than before to judge that the bag is the one whose chips are mostly red: well, duh. What else is he going to think? Danny had had no exposure to the new research into the way people thought when they made decisions. “I had never thought much about thinking,” he said. To the extent that Danny thought of thinking, he thought of it as seeing things. But this research into the human mind bore no relationship to what he knew about what people actually did in real life. The eye was often deceived, systematically. So was the ear.

  The Gestalt psychologists he loved so much made entire careers out of fooling people with optical illusions: Even people who knew of the illusion remained fooled by it. Danny didn’t see why thinking should be any more trustworthy. To see that people were not intuitive statisticians—that their minds did not naturally gravitate to the “right” answer—you needed only to sit in on any statistics class at Hebrew University. The students did not naturally internalize the importance of base rates, for instance. They were as likely to draw a big conclusion from a small sample as from a big sample. Danny himself—the best teacher of statistics at Hebrew University!—had figured out, long after the fact, that he had failed to replicate whatever it was that he had discovered about Israeli kids from their taste in tent sizes because he had relied on sample sizes that were too small. That is, he had tested too few kids to get an accurate picture of the population. He had assumed, in other words, that a few poker chips revealed the true contents of the book bag as clearly as a few big handfuls, and so he never fully determined what was in the bag.
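  A toy simulation makes the sample-size point concrete. This is purely illustrative, not anything Danny ran: it assumes a bag that is 60 percent red and asks how often the majority color of a sample points the wrong way.

```python
import random

# Illustrative sketch: how often does the majority color of a sample
# misrepresent the bag? Assume the bag is truly 60% red.
TRUE_RED = 0.60

def misleading_rate(sample_size: int, trials: int = 100_000) -> float:
    """Fraction of samples in which red chips are NOT the majority."""
    wrong = 0
    for _ in range(trials):
        reds = sum(random.random() < TRUE_RED for _ in range(sample_size))
        if 2 * reds < sample_size:  # non-red chips outnumber red ones
            wrong += 1
    return wrong / trials

for n in (5, 25, 100):
    print(f"sample of {n:>3} chips misleads ~{misleading_rate(n):.0%} of the time")
```

With five chips, the majority misleads roughly a third of the time; with a hundred, only about two percent of the time: exactly the trap of treating a few chips as if they were a few big handfuls.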

  In Danny’s view, people were not conservative Bayesians. They were not statisticians of any kind. They often leapt from little information to big conclusions. The theory of the mind as some kind of statistician was of course just a metaphor. But the metaphor, to Danny, felt wrong. “I knew I was a lousy intuitive statistician,” he said. “And I really didn’t think I was stupider than anyone else.”

  The psychologists in Ward Edwards’s lab were interesting to Danny in much the same way that the psychoanalysts at the Austen Riggs Center had been interesting to him after their patient had surprised them by killing herself. What interested him was their inability to face the evidence of their own folly. The experiment Amos had described was compelling only to someone already completely sold on the idea that people’s intuitive judgment approximated the correct answer—that they were, at least roughly, good Bayesian statisticians.

  Which was odd when you thought about it. Most real-life judgments did not offer probabilities as clean and knowable as the judgment of which book bag contained mostly red poker chips. The most you could hope to show with such experiments was that people were very poor intuitive statisticians—so poor they couldn’t even pick the book bag that offered them the most favorable odds. People who proved to be expert book bag pickers might still stumble when faced with judgments in which the probabilities were far more difficult to know—say, whether some foreign dictator did, or did not, possess weapons of mass destruction. Danny thought, This is what happens when people become attached to a theory. They fit the evidence to the theory rather than the theory to the evidence. They cease to see what’s right under their nose.

  Everywhere one turned, one found idiocies that were commonly accepted as truths only because they were embedded in a theory to which the scientists had yoked their careers. “Just think about it,” said Danny. “For decades psychologists thought that behavior is to be explained by learning, and they studied learning by looking at hungry rats learning to run to a goal box in a maze. That was the way it was done. Some people thought it was BS, but they were not smarter or more knowledgeable than the brilliant people who dedicated their career to what we now see as rubbish.”

  The people in this new field devoted to human decision making had become similarly blinded by their theory. Conservative Bayesians. The phrase was worse than meaningless. “It suggests people have the correct answer and they adulterate it—not any realistic psychological process that produces the judgments that people make,” said Danny. “What do people actually do in judging these probabilities?” Amos was a psychologist and yet the experiment he had just described, with apparent approval, or at least not obvious skepticism, had in it no psychology at all. “It felt like a math exercise,” said Danny. And so Danny did what every decent citizen of Hebrew University did when he heard something that sounded idiotic: He let Amos have it. “The phrase ‘I pushed him into the wall’ was often used, even for conversations among friends,” explained Danny later. “The idea that everyone is entitled to his/her opinion was a California thing—that’s not how we did things in Jerusalem.”

  By the end of the seminar, Danny must have sensed that Amos didn’t particularly want to argue with him anymore. Danny went home and boasted to his wife, Irah, that he had won an argument with a brash younger colleague. Or anyway, that’s how Irah remembered it. “This is, or was, an important aspect of Israeli discussions,” Danny said. “They were competitive.”

  In the History of Amos there aren’t a lot of examples of Amos losing an argument, and there are even fewer examples of Amos changing his mind. “You can never say he’s wrong, even if he’s wrong,” said his former student Zur Shapira. It wasn’t that Amos was rigid. In conversation he was freewheeling and fearless and open to new ideas—though perhaps more so if they did not openly conflict with his own. It was more that Amos had been right so often that, in any argument, “Amos is right” had become a useful assumption for all involved, Amos included. When asked for his memories of Amos, the first thing the Nobel Prize–winning Hebrew University economist Robert Aumann recalled was the one time he had surprised Amos with an idea. “I remember him saying, ‘I didn’t think of that,’” said Aumann. “And I remember it because there wasn’t much Amos hadn’t thought of.”
  Danny later suspected that Amos actually hadn’t given much thought to the idea of the human mind as some kind of Bayesian statistician—the stuff with the book bags and poker chips wasn’t his line of research. “Amos probably never had a serious discussion with anyone about that paper,” said Danny. “And if he had, no one would have raised deep objections.” People were Bayesian in the same way that people were mathematicians. Most people could work out that seven times eight equals fifty-six: so what if some could not? Whatever errors they made were random. It wasn’t as if the human mind had some other way of doing math that led it to systematic error. If someone had asked Amos, “Do you think people are conservative Bayesians?,” he might have said something like, “Certainly not every person, but as a description of the average person, it will do.”

  In the spring of 1969, at least, Amos wasn’t overtly hostile to the reigning theories in social science. Unlike Danny, he wasn’t dismissive of theory. Theories for Amos were like mental pockets or briefcases, places to put the ideas you wanted to keep. Until you could replace a theory with a better theory—a theory that better predicted what actually happened—you didn’t chuck a theory out. Theories ordered knowledge, and allowed for better prediction. The best working theory in social science just then was that people were rational—or, at the very least, decent intuitive statisticians. They were good at interpreting new information, and at judging probabilities. They of course made mistakes, but their mistakes were a product of emotions, and the emotions were random, and so could be safely ignored.

  But that day something shifted inside Amos. He left Danny’s seminar in a state of mind unusual for him: doubt. After the seminar, theories that he had more or less accepted as sound and plausible became objects of suspicion.

  His closest friends, who found the change in him shocking, assumed that Amos had always had his doubts. For instance, on occasion he spoke of a problem experienced by Israeli army officers when they led troops through the desert. He’d experienced the problem himself. In the desert, the human eye had trouble judging shapes and distances. It was difficult to navigate. “That was something that really troubled Amos,” said his friend Avishai Margalit. “In the army you had to navigate a lot. And he was very good at it. But it gave even him trouble. Traveling at night, you’d see a light in the distance: Was it close or far away? The water appeared as if it were a mile or less away—then it would take many hours to walk to it.” The Israeli soldier couldn’t protect his country if he didn’t know the country, but the country was difficult to know. The army gave him maps, but the maps were often useless. A sudden storm could drastically alter the desert landscape; one day the valley was here, the next day it was over there. Leading soldiers in the desert, Amos had become sensitive to the power of optical illusion: An optical illusion could kill. Israeli army commanders in the 1950s and 1960s who became disoriented or lost their way also lost the obedience of their soldiers, as the soldiers understood that there was a short step from being lost to being dead. Amos wondered: If human beings had been shaped so carefully for their environment, why was their perception of that environment still prone to error?

  There’d been other signs that Amos was less than wholly satisfied with the worldview of his fellow theorists in decision making. Just a few months before he’d spoken at Danny’s seminar, for instance, he had been called back into the army, on reserve duty, and sent to the Golan Heights. There was no fighting to be done just then. His job was simply to command a unit in the newly acquired territory, gaze down upon Syrian soldiers, and judge from their movements if they were planning to attack. Under his command was Izzy Katznelson, who would go on to become a professor of mathematics at Stanford University. Like Amos, Katznelson had been a boy in Jerusalem during the 1948 war of independence; scenes from that year were seared into his memory. He remembered Jews running into the houses of Arabs who had fled and stealing whatever they could. “I thought, those Arabs are people like me: They didn’t start the war and I didn’t start the war,” he said. He’d followed the noise inside one of the Arab houses and discovered yeshiva boys destroying the Arabs’ grand piano—for the wood. Katznelson and Amos didn’t talk about that; those were events best forgotten.

  What they talked about was Amos’s new curiosity about the way people judged the likelihood of uncertain events—for instance, the probability of an attack at that moment by the Syrian army. “We were standing looking at the Syrians,” recalled Katznelson. “He was talking about probabilities, and how do you assign probabilities. He was interested in how, in 1956 [moments before the Sinai campaign], the government had made some estimates that there wouldn’t be a war for five years, and other estimates that there wouldn’t be a war for at least ten years. What Amos was pushing is that probability was not a given. People do not know how to do it properly.”

  If, since his return to Israel, there had indeed been a growing pressure along some fault line inside Amos’s mind, the encounter with Danny had triggered the earthquake. Not long afterward, he bumped into Avishai Margalit. “I’m waiting in this corridor,” said Margalit. “And Amos comes to me, agitated, really. He started by dragging me into a room. He said, You won’t believe what happened to me. He tells me that he had given this talk and Danny had said, Brilliant talk, but I don’t believe a word of it. Something was really bothering him, and so I pressed him. He said, ‘It cannot be that judgment does not connect with perception. Thinking is not a separate act.’” The new studies being made about how people’s minds worked when rendering dispassionate judgments had ignored what was known about how the mind worked when it was doing other things. “What happened to Amos was serious,” said Danny. “He had a commitment to a view of the world in which Ward Edwards’s research made sense, and that afternoon he saw the appeal of another worldview in which that research looked silly.”

  After the seminar, Amos and Danny had a few lunches together but then headed off in separate directions. That summer Amos left for the United States, and Danny for England, to continue his studies of attention. He had all these ideas about the possible usefulness of his new work on attention. In tank warfare, for instance. In his research, Danny was now taking people and piping one stream of digits into their left ear and another stream of digits into their right ear, and testing how quickly they could switch their attention from one ear to the other, and also how well they blocked their minds to sounds they were meant to be ignoring. “In tank warfare, as in a Western shootout, the speed at which one can decide on a target and act on that decision makes the difference between life and death,” said Danny later. He might use his test to identify which tank commanders could best orient their senses at high speed—who among them might most quickly detect the relevance of a signal, and focus his attention upon it, before he got blown to bits.

  * * *

  By the fall of 1969 Amos and Danny had both returned to Hebrew University. During their joint waking hours, they could usually be found together. Danny was a morning person, and so anyone who wanted him alone could find him before lunch. Anyone who wanted time with Amos could secure it late at night. In the intervening time, they might be glimpsed disappearing behind the closed door of a seminar room they had commandeered. From the other side of the door you could sometimes hear them hollering at each other, but the most frequent sound to emerge was laughter. Whatever they were talking about, people deduced, must be extremely funny. And yet whatever they were talking about also felt intensely private: Other people were distinctly not invited into their conversation. If you put your ear to the door, you could just make out that the conversation was occurring in both Hebrew and English. They went back and forth—Amos, especially, always switched back to Hebrew when he became emotional.

  The students who once wondered why the two brightest stars of Hebrew University kept their distance from each other now wondered how two so radically different personalities could find common ground, much less become soul mates. “It was very difficult to imagine how this chemistry worked,” said Ditsa Kaffrey, a graduate student in psychology who studied with them both. Danny was a Holocaust kid; Amos was a swaggering Sabra—the slang term for a native Israeli. Danny was always sure he was wrong. Amos was always sure he was right. Amos was the life of every party; Danny didn’t go to the parties. Amos was loose and informal; even when he made a stab at informality, Danny felt as if he had descended from some formal place. With Amos you always just picked up where you left off, no matter how long it had been since you last saw him. With Danny there was always a sense you were starting over, even if you had been with him just yesterday. Amos was tone-deaf but would nevertheless sing Hebrew folk songs with great gusto. Danny was the sort of person who might be in possession of a lovely singing voice that he would never discover. Amos was a one-man wrecking ball for illogical arguments; when Danny heard an illogical argument, he asked, What might that be true of? Danny was a pessimist. Amos was not merely an optimist; Amos willed himself to be optimistic, because he had decided pessimism was stupid. When you are a pessimist and the bad thing happens, you live it twice, Amos liked to say. Once when you worry about it, and the second time when it happens. “They were very different people,” said a fellow Hebrew University professor. “Danny was always eager to please. He was irritable and short-tempered, but he wanted to please. Amos couldn’t understand why anyone would be eager to please. He understood courtesy, but eager to please—why??” Danny took everything so seriously; Amos turned much of life into a joke. When Hebrew University put Amos on its committee to evaluate all PhD candidates, Amos was appalled at what passed for a dissertation in the humanities. Instead of raising a formal objection, he merely said, “If this dissertation is good enough for its field, it’s good enough for me. Provided the student can divide fractions!”

 
