Most failure is not like that. Most failure can be given a makeover. You can latch on to any number of justifications: “it was a one-off,” “it was a unique case,” “we did everything we could.” You can selectively cite statistics that justify your case, while ignoring the statistics that don’t. You can find new justifications that did not even occur to you at the time, and which you would probably have dismissed until they—thankfully, conveniently—came to your rescue.
Psychologists often point out that self-justification is not entirely without benefits. It stops us from agonizing over every decision, questioning every judgment, staying awake at night wondering if getting married/taking that job/going on that course was the right thing to do. The problem, however, is when this morphs into mindless self-justification: when we spin automatically; when we reframe wantonly; when failure is so threatening we can no longer learn from it.
And this takes us back to a question that has been lingering since the opening section of this book when we examined the scale of deaths from preventable medical error. How could doctors and nurses preside over such suffering? How could these honorable people cover up their mistakes in such a brazen way? How could they live with themselves?
Our exploration of cognitive dissonance finally provides us with the answer. It is precisely in order to live with themselves, and with the fact that they have harmed patients, that doctors and nurses reframe their errors in the first place. This protects their sense of professional self-worth and morally justifies the practice of nondisclosure. After all, why disclose an error if there wasn’t really an error?
And this pierces to the very heart of the distinction between internal and external deception. If nurses and doctors were fully aware of the fatal errors they were making, nondisclosure would add to their emotional anguish. They would know that they had harmed a patient, know that they had deliberately deceived their patients, and know that their concealment made future mistakes more likely.
It is hardly likely that health professionals would knowingly engage in deceit on such a large scale. The vast majority of doctors and nurses are committed and decent people. Indeed, many are heroic in their care for their patients. And therein lies the tragedy of cognitive dissonance. It allows good, motivated people to harm those they are working to protect, not just once, but again and again.
To put it a slightly different way, the most effective cover-ups are perpetrated not by those who are covering their backs, but by those who don’t even realize that they have anything to hide.
In his book Medical Errors and Medical Narcissism, John Banja, professor of medical ethics at Emory University, looked in detail at the reframing techniques used by clinicians.2 The words may be different, but the underlying semantics are uncannily similar to those used by prosecuting lawyers when faced with DNA exonerations. They are a way of taking the sting out of mistakes and of justifying nondisclosure:
“Well, we did our best. These things happen.”
“Why disclose the error? The patient was going to die anyway.”
“Telling the family about the error will only make them feel worse.”
“It was the patient’s fault. If he wasn’t so (obese, sick, etc.), this error wouldn’t have caused so much harm.”
“If we’re not totally and absolutely certain the error caused the harm, we don’t have to tell.”
Banja writes: “Health professionals are known to be immensely clever at covering up or drawing attention away from an error by the language they use. There is good reason to believe that their facility with linguistic subterfuge is cultivated during their residency years or in specialty training.”3
A landmark three-year investigation published in the journal Social Science & Medicine revealed similar findings, namely that physicians cope with their errors through a process of denial. They “block mistakes from entering conscious thought” and “narrow the definition of a mistake so that they effectively disappear, or are seen as inconsequential.”*
The same conclusion is also revealed in direct surveys of health professionals. A study in 2004, for example, polled medical practitioners at conferences in Dallas, Kansas City, Richmond, and Columbus. They were asked whether “rationalizations that excuse medical errors (and excuse the need to disclose and report those errors) are common in hospitals.” An astonishing 86 percent of respondents, who actually work within the health care system, either agreed or strongly agreed.4
Consider again the doctors who operated on Elaine Bromiley, the case explored at the start of the book. At the time their behavior may have seemed like a blatant attempt to avoid the external repercussions of their mistake, like a reprimand from management or legal action from the patient’s family. But we can now see that it also bears the classic hallmarks of dissonance-reduction. The doctors didn’t want to admit their mistake to themselves.
They had spent years training to reach high standards of performance. It was a tough initiation. As with most good doctors, health care was more than a job, it was a vocation. Their self-esteem was bound up with their clinical competence. They came into medicine to reduce suffering, not to increase it. And now they were confronted with having killed a healthy thirty-seven-year-old woman.
Just think of how desperate they would have been to reframe the fatality as a mere “complication.” Think, too, of researcher Nancy Berlinger’s investigation into the way doctors report errors. She wrote of “the depths of physicians’ resistance to disclosure and the lengths to which some will go to justify the habit of nondisclosure—it was only a technical error, things just happen . . .”
This research may have looked like an indictment of health care culture, but we can now see that this is a painfully accurate description of the effects of cognitive dissonance. Self-justification, the desire to protect one’s self-image, has the potential to afflict us all. The health care and criminal justice systems are but two strands in a wider story that represents a clear and present danger to our future progress.
II
Let us return briefly to the Iraq War, for it will allow us to drill deeper into the psychological mechanisms associated with cognitive dissonance. To avoid controversy, we will not take a stand on whether the invasion was right or wrong.* Instead, we will look at the intellectual contortions of the leaders who took us to war. This will provide a glimpse at how the reframing exercise can take on a life of its own.
Remember that for a man like Tony Blair, this was the biggest decision of his political life. He was not just a voter who supported the war, he was a prime minister who had gambled his career on the conflict, committing troops on the ground, of whom 179 would lose their lives. His political reputation, to a large extent, hinged on the decision. If anyone would be motivated to defend it, he would.
So, let us explore the contortions.
On September 24, 2002, before the conflict, Blair made a speech to the House of Commons about Saddam Hussein’s weapons of mass destruction: “His WMD program is active, detailed and growing,” he said. “Saddam has continued to produce them, . . . he has existing and active military plans for the use of chemical and biological weapons, which could be activated within 45 minutes . . .”
Of course, within months of the invasion the problem with these claims became clear. First of all Saddam’s troops did not use these supposedly devastating weapons to repel the advancing Western forces. Further, the search for WMD in the immediate aftermath of Saddam’s fall drew a rather conspicuous blank.
But as social psychologists Jeff Stone and Nicholas Fernandez of the University of Arizona detail in a powerful essay on the Iraq conflict,5 Blair parried. In a speech to the House of Commons, he said: “There are literally thousands of sites . . . but it is only now that the Iraq Survey Group has been put together that a dedicated team of people, which includes former UN inspectors, scientists and experts, will be able to go in and do the job properly . . . I have no doubt that they will find the clearest possible evidence of Saddam’s weapons of mass destruction.”
So, to Blair, the lack of WMD did not show that these weapons were not actually there, but rather provided evidence that inspectors hadn’t been looking hard enough. Note another thing, too. The absence of WMD had strengthened his conviction that they would be found.
This is a classic response predicted by cognitive dissonance: we tend to become more entrenched in our beliefs (like the subjects in the capital punishment experiment, whose views became more extreme after reading evidence that challenged them, and the members of the cult who became more convinced of the truth of their beliefs after the apocalyptic prophecy failed). “I have no doubt that they will find the clearest possible evidence of Saddam’s weapons of mass destruction [my italics],” Blair said.
Twelve months later, when the Iraq Survey Group, Blair’s inspectors of choice, couldn’t find the weapons either, he changed tack again. Speaking to the House of Commons Liaison Committee, he said: “I have to accept we haven’t found them and we may never find them, we don’t know what has happened to them . . . They could have been removed, they could have been hidden, they could have been destroyed.”
The evidential dance was now at full tilt. The lack of evidence for WMD in Iraq, according to Blair, was no longer because troops had not had enough time to find them, or because of the inadequacy of the inspectors: rather, it was because the Iraqi troops had spirited them out of existence.
But this stance, within a few months, became untenable too. As the search continued in a state of near desperation, it became crystal clear that not only were there no WMD, but there were no remnants of them, either. Iraqi troops could not have spirited them away. So Blair parried again. In a set-piece speech at the Labour Party Conference, he finally accepted that Saddam did not have chemical or biological weapons, but argued that the decision to go to war was right anyway.
“The problem is that I can apologize for the information that turned out to be wrong, but I can’t, sincerely at least, apologize for removing Saddam,” he said. “The world is a better place with Saddam in prison.”
These contortions continued for the next ten years. At times Blair struggled to remember their precise chronology, and appeared strained when trying to keep track of them under questioning. When the so-called Islamic State began a major offensive in Iraq in 2014, and the country was on the brink of civil war—which some commentators linked to the 2003 conflict—Blair found another avenue of justification.
He pointed to the policy of nonintervention in Syria, which had descended into its own bloody civil war. In an article written for his personal website, he said: “In Syria we called for the regime to change, took no action and it is in the worst state of all.”6 In other words, “if things look bad in Iraq now, they would have been even more awful if we had not invaded in 2003.”
The most important thing, for our purposes, is not whether Blair was right or wrong on this point. The vital thing to realize is that had nonintervention in Syria achieved the most heavenly outcome (peace, happiness, doves circling above), Blair would likely still have found a way to interpret that evidence through the lens of the rightness of his decision to invade Iraq. In fact, he would probably have become more convinced of its rightness, not less so. That is the domino effect of cognitive dissonance.
A similar domino effect can be seen in the behavior of George W. Bush. Almost all of Bush’s claims in the buildup to war and its aftermath turned out to be mistaken. He was wrong that Saddam had WMD and wrong that the Iraqi leader had links with Al Qaeda. When he stood under a banner proclaiming “Mission Accomplished” six weeks after the invasion began and stated that “major combat operations in Iraq have ended,” he was wrong about that, too.
But he seemed able to reframe any inconvenient evidence effortlessly. As Aronson and Tavris put it in their book Mistakes Were Made (but Not by Me):
Bush [responded by finding] new justifications for the war: getting rid of a “very bad guy,” fighting terrorists, promoting peace in the Middle East . . . increasing American security, and finishing the task [our troops] gave their lives for . . . In 2006, with Iraq sliding into civil war . . . Bush said to a delegation of conservative columnists: “I’ve never been more convinced that the decisions I made are the right decisions.”
If it is intolerable to change your mind, if no conceivable evidence will permit you to admit your mistake, if the threat to ego is so severe that the reframing process has taken on a life of its own, you are effectively in a closed loop. If there are lessons to be learned, it has become impossible to acknowledge them, let alone engage with them.
This is not intended as an argument against Blair or Bush or their followers. Issues of war and peace are complex and there are always arguments on both sides (we will look at how to learn in situations of complexity in Part 3). No political party has a monopoly on making mistakes, either. But what this does show is that intelligent people are not immune to the effects of cognitive dissonance.
This is important because we often suppose that bright people are the most likely to reach the soundest judgments. We regard intelligence, however defined, as the surest way of reaching truth. In reality, however, intelligence is often deployed in the service of dissonance-reduction. Indeed, sometimes the most prestigious thinkers are the most adept at deploying the techniques of reframing, often in such subtle ways that it is difficult for us, them, or anyone else to notice.
• • •
In December 2012 I briefly interviewed Tony Blair. Our paths had crossed a few times before, and for the first few minutes we chatted about what he had been doing since leaving Downing Street in 2007. He was talkative and, as always, courteous. He was also somewhat strained: public disapproval of the Iraq War had been steadily growing.
After a minute or two I asked the question I was most keen to ask. Given what he now knew, with the thousands of deaths that had occurred, the absence of WMD, and the huge upheaval, did he still think that his decision over Iraq was the right one? “Decisions of war and peace are controversial, and I would be lying if I said the decision was easy,” he said. “But do I think I made the right decision? Yes, I am more sure than I have ever been.”
A few months later, I met with Alastair Campbell, Blair’s former head of communications and one of his most trusted lieutenants. We talked at length about the phenomenon of cognitive dissonance. Campbell was characteristically thoughtful, talking about the buildup to war and the pressure-cooker atmosphere in Downing Street.
I asked him if he still backed the decision to go to war. “There are times when I wonder about it, particularly when news comes through of more deaths,” he said. “But on balance, I think we were right to get rid of Saddam.” Do you think it is possible that you could ever change your mind? I asked. “It would be difficult, given what we have been through, but it’s not impossible,” he said.
And what about Tony? I asked. “Think about what it would mean if he admitted he was wrong,” Campbell replied. “It would overshadow everything he had ever worked for. It would taint his achievements. Tony is a rational and strong-minded guy, but I don’t think he would be able to admit that Iraq was a mistake. It would be too devastating, even for him.”
III
In November 2010, a group of renowned economists, high-profile intellectuals, and business leaders wrote an open letter to Ben Bernanke, then chairman of the Federal Reserve.7 The Fed had just announced its second tranche of so-called quantitative easing: it proposed to purchase bonds with newly printed money, injecting, over time, an additional $600 billion into the U.S. economy.
The signatories were worried about this policy. In fact, they thought it might prove disastrous. In the letter, which was published in the Wall Street Journal, they argued that the plan was not “necessary or advisable under current circumstances” and that it would not “achieve the Fed’s objective of promoting employment.” They concluded that it should be “reconsidered and discontinued.”
The signatories included some of the most celebrated individuals in their fields, including Michael J. Boskin, the former chairman of the president’s Council of Economic Advisers; Seth Klarman, the billionaire founder of the Baupost Group, an investment company; John Taylor, professor of economics at Stanford University; Paul Singer, the billionaire founder of Elliott Management Corporation; and Niall Ferguson, the renowned professor of history at Harvard University.
Perhaps their greatest concern was over inflation, the fear that printing money would lead to runaway price increases. This is a worry often associated with economists within the “monetarist” school of policymaking. The signatories warned that quantitative easing would risk “currency debasement and inflation” and “distort financial markets.”
The letter, which was also published as a full-page ad in the New York Times, made headlines around the world. The fears were well expressed, well argued, and the prediction of trouble ahead for the U.S. economy caused a minor tremor in financial markets.
But what actually happened? Did the prediction turn out to be accurate? Did inflation soar out of control?
At the time the letter was published the inflation rate was 1.5 percent. Four years later, in December 2014, inflation had not merely remained at historically low levels, it had actually fallen. According to the Consumer Price Index published monthly by the Bureau of Labor Statistics, inflation was at 0.8 percent. By January 2015, just before these words were written, it had fallen into negative territory. Inflation had become deflation. The headline rate in the United States was minus 0.1 percent.
It is probably fair to say, then, that the predictions did not materialize quite as expected. In fact, the U.S. economy seemed to go in a different direction altogether. It was not just inflation that failed to balloon out of control: employment was also growing, despite the signatories’ warning that the policy would not “promote employment.” By autumn 2014 the U.S. economy was creating jobs at the fastest pace since 2005 and unemployment had dropped from 9.8 percent to 6.1 percent. American companies were also faring well, reporting low debts, high levels of cash, and record profits.8