But did anyone raise those red flags? Latell says he never once suspected she was a spy. “There were CIA officers of my rank, or close to my rank, who thought she was the best Cuban analyst there was,” he said. So he rationalized away his uneasiness. “I never trusted her, but for the wrong reasons, and that’s one of my great regrets. I was convinced that she was a terrible analyst on Cuba. Well, she was. Because she wasn’t working for us. She was working for Fidel. But I never connected the dots.”
Nor did anyone else. Montes had a younger brother named Tito, who was an FBI agent. He had no idea. Her sister was also an FBI agent, who in fact played a key role in exposing a ring of Cuban spies in Miami. She had no idea. Montes’s boyfriend worked for the Pentagon as well. His specialty, believe it or not, was Latin American intelligence. His job was to go up against spies like his girlfriend. He had no idea. When Montes was finally arrested, the chief of her section called her coworkers together and told them the news. People started crying in disbelief. The DIA had psychologists lined up to provide on-site counseling services. Her supervisor was devastated. None of them had any idea. In her cubicle, she had a quotation from Shakespeare’s Henry V taped to her wall at eye level—for all the world to see.
The king hath note
of all that they intend,
By interception
Which they dream not of.
Or, to put it a bit more plainly: The Queen of Cuba takes note of all that the U.S. intends, by means that all around her do not dream of.
The issue with spies is not that there is something brilliant about them. It is that there is something wrong with us.
4.
Over the course of his career, the psychologist Tim Levine has conducted hundreds of versions of the same simple experiment. He invites students to his laboratory and gives them a trivia test. What is the highest mountain in Asia? That kind of thing. If they answer the questions correctly, they win a cash prize.
To help them out, they are given a partner. Someone they’ve never met before, who is, unknown to them, working for Levine. There’s an instructor in the room named Rachel. Midway through the test, Rachel suddenly gets called away. She leaves and goes upstairs. Then the carefully scripted performance begins. The partner says, “I don’t know about you, but I could use the money. I think the answers were left right there.” He points to an envelope lying in plain sight on the desk. “It’s up to them whether they cheat or not,” Levine explains. In about 30 percent of cases, they do. “Then,” Levine goes on, “we interview them, asking, ‘Did you cheat?’”
The number of scholars around the world who study human deception is vast. There are more theories about why we lie, and how to detect those lies, than there are about the Kennedy assassination. In that crowded field, Levine stands out. He has carefully constructed a unified theory about deception.3 And at the core of that theory are the insights he gained from that first trivia-quiz study.
I watched videotapes of a dozen or so of those post-experiment interviews with Levine in his office at the University of Alabama at Birmingham. Here’s a typical one, featuring a slightly spaced-out young man. Let’s call him Philip.
Interviewer: All right, so…have you played Trivial Pursuit games…before?
Philip: Not very much, but I think I have.
Interviewer: In the current game did you find the questions difficult?
Philip: Yes, some were. I was like, “Well, what is that?”
Interviewer: If you would scale them one to ten, if one was easy and ten was difficult, where do you think you would put them?
Philip: I would put them [at] an eight.
Interviewer: An eight. Yeah, they’re pretty tricky.
Philip is then told that he and his partner did very well on the test. The interviewer asks him why.
Philip: Teamwork.
Interviewer: Teamwork?
Philip: Yeah.
Interviewer: OK, all right. Now, I called Rachel out of the room briefly. When she was gone, did you cheat?
Philip: I guess. No.
Philip slightly mumbles his answer. Then looks away.
Interviewer: Are you telling the truth?
Philip: Yes.
Interviewer: Okay. When I interview your partner and I ask her, what is she going to say?
At this point in the tape, there’s an uncomfortable silence, as if the student is trying to get his story straight. “He’s obviously thinking very hard,” Levine said.
Philip: No.
Interviewer: No?
Philip: Yeah.
Interviewer: OK, all right. Well, that’s all I need from you.
Is Philip telling the truth? Levine has shown the Philip videotape to hundreds of people and nearly every viewer correctly pegs Philip as a cheater. As the “partner” confirmed to Levine, Philip looked inside the answer-filled envelope the minute Rachel left the room. In his exit interview, he lied. And it’s obvious. “He has no conviction,” Levine said.
I felt the same thing. In fact, when Philip is asked, “Did you cheat?” and answers, “I guess. No,” I couldn’t contain myself, and I cried out, “Oh, he’s terrible.” Philip was looking away. He was nervous. He couldn’t keep a straight face. When the interviewer followed up with, “Are you telling the truth?” Philip actually paused, as if he had to think about it first.
He was easy. But the more tapes we looked at, the harder it got. Here is a second case. Let’s call him Lucas. He was handsome, articulate, confident.
Interviewer: I have to ask, when Rachel left the room, did any cheating occur?
Lucas: No.
Interviewer: No? You telling me the truth?
Lucas: Yes, I am.
Interviewer: When I interview your partner and I ask her the same question, what do you think she’s going to say?
Lucas: Same thing.
“Everybody believes him,” Levine said. I believed him. Lucas was lying.
Levine and I spent the better part of a morning watching his trivia-quiz videotapes. By the end, I was ready to throw up my hands. I had no idea what to make of anyone.
The point of Levine’s research was to try to answer one of the biggest puzzles in human psychology: why are we so bad at detecting lies? You’d think we’d be good at it. Logic says that it would be very useful for human beings to know when they are being deceived. Evolution, over many millions of years, should have favored people with the ability to pick up the subtle signs of deception. But it hasn’t.
In one iteration of his experiment, Levine divided his tapes in half: twenty-two liars and twenty-two truth-tellers. On average, the people he had watch all forty-four videos correctly identified the liars 56 percent of the time. Other psychologists have tried similar versions of the same experiment. The average for all of them? 54 percent. Just about everyone is terrible: police officers, judges, therapists—even CIA officers running big spy networks overseas. Everyone. Why?4
Tim Levine’s answer is called the “Truth-Default Theory,” or TDT.
Levine’s argument started with an insight that came from one of his graduate students, Hee Sun Park. It was right at the beginning of Levine’s research, when he was as baffled as the rest of his profession about why we are all so bad at something that, by rights, we should be good at.
“Her big insight, the first one, was that the 54-percent deception-accuracy figure was averaging across truths and lies,” Levine said. “You come to a very different understanding if you break out…how much people are right on truths, and how much people are right on lies.”
What he meant was this. If I tell you that your accuracy rate on Levine’s videos is right around 50 percent, the natural assumption is to think that you are just randomly guessing—that you have no idea what you are doing. But Park’s observation was that that’s not true. We’re much better than chance at correctly identifying the students who are telling the truth. But we’re much worse than chance at correctly identifying the students who are lying. We go through all those videos, and we guess—“true, true, true”—which means we get most of the truthful interviews right, and most of the liars wrong. We have a default to truth: our operating assumption is that the people we are dealing with are honest.
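Park’s insight is easy to see with a little arithmetic. The numbers below are purely illustrative, not Levine’s actual data: they simply show how a strong truth-bias can average out to something that looks like coin-flipping.

```python
# Illustrative sketch of Park's point: a ~54% overall accuracy can hide
# a systematic bias toward judging statements true.
# The accuracy figures below are hypothetical, not Levine's actual results.

truths, lies = 22, 22        # half truth-tellers, half liars, as in Levine's tape set

truth_accuracy = 0.72        # truths correctly judged true: well above chance
lie_accuracy = 0.36          # lies correctly judged false: well below chance

# The headline number averages across both kinds of tape.
overall = (truth_accuracy * truths + lie_accuracy * lies) / (truths + lies)
print(f"Overall accuracy: {overall:.0%}")   # prints: Overall accuracy: 54%
```

The “random guessing” reading of 54 percent is wrong: viewers are not guessing, they are defaulting to “true,” which inflates one half of the average and deflates the other.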
Levine says his own experiment is an almost perfect illustration of this phenomenon. He invites people to play a trivia game for money. Suddenly the instructor is called out of the room. And she just happens to leave the answers to the test in plain view on her desk? Levine says that, logically, the subjects should roll their eyes at this point. These are college students. They’re not stupid. They’ve signed up for a psychological experiment. They’re given a “partner,” whom they’ve never met, who is egging them on to cheat. You would think that they might be even a little suspicious that things are not as they seem. But no!
“Sometimes, they catch that the instructor leaving the room might be a setup,” Levine says. “The thing they almost never catch is that their partners are fake.…So they think that there might be hidden agendas. They think it might be a setup because experiments are setups, right? But this nice person they are talking and chatting to? Oh no.” They never question it.
To snap out of truth-default mode requires what Levine calls a “trigger.” A trigger is not the same as a suspicion, or the first sliver of doubt. We fall out of truth-default mode only when the case against our initial assumption becomes definitive. We do not behave, in other words, like sober-minded scientists, slowly gathering evidence of the truth or falsity of something before reaching a conclusion. We do the opposite. We start by believing. And we stop believing only when our doubts and misgivings rise to the point where we can no longer explain them away.
This proposition sounds at first like the kind of hairsplitting that social scientists love to engage in. It is not. It’s a profound point that explains a lot of otherwise puzzling behavior.
Consider, for example, one of the most famous findings in all of psychology: Stanley Milgram’s obedience experiment. In 1961, Milgram recruited volunteers from New Haven to take part in what he said was a memory experiment. Each was met by a somber, imposing young man named John Williams, who explained that they were going to play the role of “teacher” in the experiment. Williams introduced them to another volunteer, a pleasant, middle-aged man named Mr. Wallace. Mr. Wallace, they were told, was to be the “learner.” He would sit in an adjoining room, wired to a complicated apparatus capable of delivering electrical shocks up to 450 volts. (If you’re curious about what 450 volts feels like, it’s just shy of the amount of electrical shock that leaves tissue damage.)
The teacher-volunteer was instructed to give the learner a series of memory tasks, and each time the learner failed, the volunteer was to punish him with an ever-greater electrical shock, in order to see whether the threat of punishment affected someone’s ability to perform memory tasks. As the shocks escalated, Wallace would cry out in pain, and ultimately he started hammering on the walls. But if the “teacher” wavered, the imposing instructor would urge them on:
“Please continue.”
“The experiment requires that you continue.”
“It is absolutely essential that you continue.”
“You have no other choice, you must go on.”
The reason the experiment is so famous is that virtually all of the volunteers complied. Sixty-five percent ended up administering the maximum dose to the hapless learner. In the wake of the Second World War—and the revelations about what German guards had been ordered to do in Nazi concentration camps—Milgram’s findings caused a sensation.
But to Levine, there’s a second lesson to the experiment. The volunteer shows up and meets the imposing young John Williams. He was actually a local high-school biology teacher, chosen, in Milgram’s words, because he was “technical-looking and dry, the type you would later see on television in connection with the space program.” Everything Williams said during the experiment had been memorized from a script written by Milgram himself.
“Mr. Wallace” was in fact a man named Jim McDonough. He worked for the railroad. Milgram liked him for the part of victim because he was “mild and submissive.” His cries of agony were taped and played over a loudspeaker. The experiment was a little amateur theatrical production. And the word amateur here is crucial. The Milgram experiment was not produced for a Broadway stage. Mr. Wallace, by Milgram’s own description, was a terrible actor. And everything about the experiment was, to put it mildly, more than a little far-fetched. The electric-shock machine didn’t actually give shocks. More than one participant saw the loudspeaker in the corner and wondered why Wallace’s cries were coming from there, not from behind the door to the room where Wallace was strapped in. And if the purpose of the experiment was to measure learning, why on earth did Williams spend the entire time with the teacher and not behind the door with the learner? Didn’t that make it obvious that what he really wanted to do was observe the person inflicting the pain, not the person receiving the pain? As hoaxes go, the Milgram experiment was pretty transparent. And just as with Levine’s trivia test, people fell for it. They defaulted to truth.
“I actually checked the death notices in the New Haven Register for at least two weeks after the experiment to see if I had been involved and a contributing factor in the death of the so-called learner—I was very relieved that his name did not appear,” one subject wrote to Milgram in a follow-up questionnaire. Another wrote, “Believe me, when no response came from Mr. Wallace with the stronger voltage I really believed the man was probably dead.” These are adults—not callow undergraduates—who were apparently convinced that a prestigious institution of higher learning would run a possibly lethal torture operation in one of its basements. “The experiment left such an effect on me,” another wrote, “that I spent the night in a cold sweat and nightmares because of the fear that I might have killed that man in the chair.”
But here’s the crucial detail. Milgram’s subjects weren’t hopelessly gullible. They had doubts—lots of doubts! In her fascinating history of the obedience experiments, Behind the Shock Machine, Gina Perry interviews a retired toolmaker named Joe Dimow, who was one of Milgram’s original subjects. “I thought, ‘This is bizarre,’” Dimow told Perry. Dimow became convinced that Wallace was faking it.
I said I didn’t know exactly what was going on, but I had my suspicions about it. I thought, “If I’m right in my suspicions, then he [the learner] is in collusion with them; he must be. And I’m not delivering shocks at all. He’s just hollering out every once in a while.”
But then Mr. Wallace came out of the locked room at the end of the experiment and put on a little act. He looked, Dimow remembers, “haggard” and emotional. “He came in with a handkerchief in his hand, wiping his face. He came up to me and he offered his hand to shake hands with me and he said, ‘I want to thank you for stopping it’.…When he came in, I thought, ‘Wow. Maybe it really was true.’” Dimow was pretty sure that he was being lied to. But all it took was for one of the liars to extend the pretense a little longer—look a little upset and mop his brow with a handkerchief—and Dimow folded his cards.
Just look at the full statistics from the Milgram experiment:
I fully believed the learner was getting painful shocks: 56.1 percent
Although I had some doubts, I believed the learner was probably getting the shocks: 24 percent
I just wasn’t sure whether the learner was getting the shocks or not: 6.1 percent
Although I had some doubts, I thought the learner was probably not getting the shocks: 11.4 percent
I was certain the learner was not getting the shocks: 2.4 percent
Over 40 percent of the volunteers picked up on something odd—something that suggested the experiment was not what it seemed. But those doubts just weren’t enough to trigger them out of truth-default. That is Levine’s point. You believe someone not because you have no doubts about them. Belief is not the absence of doubt. You believe someone because you don’t have enough doubts about them.
I’m going to come back to the distinction between some doubts and enough doubts, because I think it’s crucial. Just think about how many times you have criticized someone else, in hindsight, for their failure to spot a liar. You should have known. There were all kinds of red flags. You had doubts. Levine would say that’s the wrong way to think about the problem. The right question is: were there enough red flags to push you over the threshold of belief? If there weren’t, then by defaulting to truth you were only being human.
5.
Ana Belen Montes grew up in the affluent suburbs of Baltimore. Her father was a psychiatrist. She attended the University of Virginia, then received a master’s degree in foreign affairs from Johns Hopkins University. She was a passionate supporter of the Marxist Sandinista government in Nicaragua, which the U.S. government was then working to overthrow, and her activism attracted the attention of a recruiter for Cuban intelligence. In 1985 she made a secret visit to Havana. “Her handlers, with her unwitting assistance, assessed her vulnerabilities and exploited her psychological needs, ideology, and personal pathology to recruit her and keep her motivated to work for Havana,” the CIA concluded in a postmortem to her career. Her new compatriots encouraged her to apply for work in the U.S. intelligence community. That same year, she joined the DIA—and from there her ascent was swift.
Montes arrived at her office first thing in the morning, ate lunch at her desk, and kept to herself. She lived alone in a two-bedroom condo in the Cleveland Park neighborhood of Washington. She never married. In the course of his investigation, Scott Carmichael—the DIA counterintelligence officer—collected every adjective used by Montes’s coworkers to describe her. It is an impressive list: shy, quiet, aloof, cool, independent, self-reliant, standoffish, intelligent, serious, dedicated, focused, hardworking, sharp, quick, manipulative, venomous, unsociable, ambitious, charming, confident, businesslike, no-nonsense, assertive, deliberate, calm, mature, unflappable, capable, and competent.