Born Liars

by Ian Leslie


  Admiral John Godfrey, Britain’s head of Naval Intelligence during World War II, noted that when presented with two items of contradictory information, Nazi leaders were always ‘inclined to believe the one that fits in best with their own previously formed conceptions’. Hitler’s officers would deliberately distort and even invent evidence to confirm what he already believed. Stalin famously refused to believe that the Nazis were about to invade in 1941, despite the massive build-up of German troops on his border. Saddam’s regime was beset by the weaknesses that Godfrey identified as ‘wishfulness’ and ‘yesmanship’. Everyone knew the legendary story of Riyadh Ibrahim, the former health minister. During a low point in the Iraq–Iran war Saddam asked his ministers for candid advice. Ibrahim had the temerity to suggest that Saddam might consider stepping down temporarily, and resume his presidency once a peace deal was brokered. Saddam had him carted away immediately. The next day, pieces of the minister’s chopped-up body were delivered to his wife. In the delicate words of one of Ibrahim’s former colleagues, ‘this powerfully concentrated the minds of ministers’. Twenty years after it happened this story continued to haunt those officials who considered telling Saddam anything he might not want to hear.

  Nevertheless, at least one of them did so. Four years after the first Gulf War, a senior Republican Guard officer dared to challenge the regime’s military orthodoxy. This is how he described the moment to Woods:

  There was a big military science lecture and conference. Saddam attended along with most of the military leadership. Three of us were scheduled to make presentations. The central idea of my presentation was simple . . . our capabilities were weakening. The Americans’ technological capabilities were growing . . . By 1995 we knew we were moving towards conflict and lacked the capability. I said we should change the picture of the whole Iraqi military. We need to change from a heavy mechanised force to a light infantry force. We should make simple light infantry formations and start fighting right away in a guerrilla war. Like in Vietnam – fight and withdraw. I was the first presenter and Saddam became very angry at my thesis. I was singled out as being a mental hostage of American thinking . . . Saddam was so mad at my presentation that the other presenters who were going to say something similar became too scared and changed their reports . . . It was around this time that everyone started lying.

  Saddam had created a punitive environment; one in which the rational survival strategy for everyone was to lie about everything. In the end he and his country became victims of the insidious symbiosis between deception and self-deception.

  * * *

  Robert Trivers alighted on evolutionary biology after studying, successively, mathematics, law and American history. Upon falling in love with Darwinian theory, he realised that nobody had yet arrived at a convincing evolutionary explanation of human social behaviour and set about remedying the oversight. In the early 1970s, while still a graduate student, he published a series of five brilliant papers that included radical new theories of altruism, parenting, and self-deception.

  Trivers’s theory was that human beings developed the capacity to deceive themselves in order to become better deceivers, and thus better competitors, in the Machiavellian arms-race of deception and counter-deception. Those who wanted to persuade a potential mate or ally of their good intentions would be better at it if they could deceive themselves without ‘leakage’ of knowledge or intent; and the most efficient way to simulate truth-telling would be to erase internal awareness of the deception. The best liars would be those who were better at lying to themselves because they would actually believe their own deceptions when they made them. They would be more likely to survive and pass on their genes; hence our gift for self-deception.

  Trivers’s theory is very plausible, and if it seems a little pat – it is, after all, a truism that the best liars are people who believe their lies – it offers a useful way of thinking about the relationship between deception and self-deception in organisations. Each begets the other. People tell their superiors what they think they want to hear, so as to avoid the perception of hesitancy, a lack of conviction, or disloyalty. The ones that are best at doing this get promoted. Their superiors become even more confident in their own convictions. As everyone in the organisation cleaves to the same script, everyone starts to believe it, even when it contradicts the facts. In organisations dominated by an over-confident and over-powerful leader, a virtually seamless alternative reality is formed.

  There is an important distinction between illusions and delusions. The normal person might be an unrealistic optimist, but he or she is far from oblivious to reality. Most of us are capable of responding to new facts that dispel our illusions – even if it takes us longer to do so than it should. A person, or an organisation, can be said to be deluded when they manage to exclude all new information, however pertinent or dramatic, that threatens to undermine a cherished illusion. In Shelley Taylor’s succinct summary, ‘Delusions are false beliefs that persist despite the facts. Illusions accommodate them, though perhaps reluctantly.’

  The historian Christopher Andrew has argued that one of the main purposes of an intelligence service in a one-party state is to reinforce the regime’s misperceptions of the outside world. In Saddam’s Iraq, every part of the national security apparatus conspired to shut out information that conflicted with Saddam’s version of reality, thus creating a vortex of deception and self-deception. The tragedy is that the West became sucked into it too.

  * * *

  When it came to WMD, Saddam was attempting to juggle two conflicting objectives. On the one hand, he wanted potential rivals within his country and from the region to think of him as the wrong guy to tangle with. He often reminded his advisers that Iraq lived in a dangerous neighbourhood where even the perception of vulnerability drew predators; chemical and nuclear weapons were the equivalent of a BEWARE THE DOG sticker on Iraq’s front door. On the other hand, he knew that as long as the West believed he had WMD, he would endure crippling sanctions and face the threat of American military action. So he allowed in the UN inspectors while signalling to his neighbours that he wasn’t really doing as the Americans wished. But of course, foreign leaders noticed him make the same hints and boasts, which strengthened their suspicion that Saddam was hiding something.

  Saddam Hussein was far from stupid or hot-headed. He was a talented reader of other people’s intentions and emotions; as Kevin Woods points out, he had to be in order to survive for so long as head of a country riven by tribal, familial and sectarian rivalries, and constantly threatened by external enemies. Saddam’s ability to read other people, however, deteriorated precipitously the further his interlocutors were from home. By necessity, he was a pronounced Cynic (rather than a Truster), and not nearly as good as he thought at divining the motivations of foreign leaders, especially those outside of the Middle East. Like the participants in Ellen Langer’s card game, he too readily allowed his confidence in an area of competence to flow into one of incompetence.

  By the time Saddam realised that the Americans were serious about invading, it was too late. As 2002 drew to a close, he accepted that UN inspectors should be given full access and ordered his officers to remove all traces of previous WMD programmes. When the Americans tapped into this frantic clearing-up activity, they viewed it through the prism of a decade of deceit and assumed it was an effort to hide an ongoing programme. When they arrived in Iraq and turned up nothing, they were amazed. (So were Iraqi officials, who had assumed that if Bush didn’t find WMD, he would plant them.)

  Perhaps part of the reason that Western intelligence agencies were fooled is that they picked up on the genuine beliefs of many in Saddam’s government that the programme persisted. Woods asked Iraq’s head of research into WMD what he thought was a straightforward question: did he ever think it possible there was a secret supply of WMD he didn’t know about? To Woods’s surprise, the man nodded. He explained that the regime was too compartmentalised and secretive for any one person – apart from Saddam, perhaps – to know everything. But the main reason he thought that Iraq might possess WMD was because ‘Your president said it was so.’ He and other senior Iraqi officials couldn’t fathom that America would bring itself to the brink of war unless it had very good reasons to believe in the existence of Iraq’s illegal weapons. So they came to believe it themselves. After all, the CIA wouldn’t get something like this wrong.

  Saddam had constructed a hall of mirrors into which all parties were drawn, and within which each saw only what they wanted to see. In the year leading up to war, Saddam loudly declared that he had no WMD, and that he would never back down in the face of American threats. In a collective and calamitous failure of mind-reading, Western leaders and diplomats assumed that the first was a lie and the second a bluff. But, for once, Saddam was telling the truth. As Woods says, Saddam may have been deluded, but ‘so too was the United States in thinking he was not deluded.’

  Keeping Politicians Honest

  One of the underrated virtues of a liberal democracy is that it militates against delusion in its leaders. Democratic leaders are subject to critical opposition, and to a free press, which together keep their feet on the ground and puncture information bubbles. Dictators, who have no such checks and hear only good things about themselves, are much more likely to cross the frontier that separates a healthy margin of self-deception from dangerous delusion. Of course, democratic administrations are capable of collective acts of self-deception too; some argue that this is what the US and UK governments succumbed to during the run-up to the Iraq war. But the chance of this happening in free societies is lower than in authoritarian regimes. Perhaps the greater problem for mature democracies is that their electorates and media have developed a distorted ideal of honesty. By striving to eradicate all forms of deception from our public life, we may have only ended up fooling ourselves.

  For most of the twentieth century, the relationship between the press and the ruling classes was governed by a discreet decorum. It was accepted that politicians had private lives that might contrast with their public image; the contrast was understood as a kind of benign deceit. Then, partly because of political scandals like Watergate and partly because of our growing thirst for, and access to, information, a culture of radical honesty developed. We demanded, quite rightly, to know more about the activities of our elected rulers; more dubiously, we started to insist that a politician’s public mask be ripped off so that we might see the ‘real’ person underneath and ensure that their every thought and action was consistent, and seen to be consistent. The end result isn’t, as one might have hoped, a better class of political rulers and a healthier polity; we are more dissatisfied with our politicians than ever before, and more likely than ever to think of them as deceitful.

  Keeping our politicians honest seems to have become confused with demanding honest politicians. Throughout this book I’ve tried to show that deceitfulness is a natural part of being human, and that facile distinctions between ‘honest’ people and ‘liars’ merely obscure subtler truths about our conduct in different environments. The mistake we make too often is in viewing honesty solely as a trait – something that individuals have or don’t have – rather than as a state: something that people adopt under conducive conditions. Thus we routinely denounce all politicians as exceptionally dishonest, but it takes only a moment’s thought to realise how unlikely it is that, by coincidence or otherwise, only sociopathic liars get elected to public office.

  If we want a more honest politics, we will have to create conditions that steer our politicians towards honesty. First, we need to be more honest about ourselves. As we’ve seen, most of us like to think of ourselves as a little more unselfish, virtuous and honest than we actually are. This is as true of us as voters as it is of us as colleagues or friends; polls find time and again that people say they are willing to pay more taxes to improve public services, yet people usually vote for the party that will lower their tax bill. Politicians understand that we are inveterate self-deceivers. They deliberately appeal to the instincts that voters don’t admit to, as well as those to which they do – and we hate them for it. It’s a society-wide example of what Freud called ‘displacement’. We unload uncertainties about our own probity on to a group that everyone feels comfortable in pillorying.

  Second, we need to get used to accepting unsettling political truths. At the moment, it’s as if we don’t actually want our politicians to be honest. We get uncomfortable or angry when they change their minds, or confess that some problems are insoluble, or exhibit anything other than a pretence of superhuman command. Neither do we like it when they say anything that isn’t expected. Sometimes, when presidential candidates or British cabinet ministers find themselves in hot water over an unscripted remark, it’s because the remark is stupid or offensive; more often, it’s because it’s true. In fact, the very definition of a gaffe, according to the American journalist Michael Kinsley, is a politician telling the truth in public. Like the children in School B, our politicians have learned that they will get flayed whatever they say – so they may as well lie, or at least avoid giving an honest answer. Perhaps we don’t need to give our politicians a moral education so much as a better set of choices.

  Lies We Live By: Part One

  The Medicine of Deceit

  The last two decades have seen the emergence of an innovative treatment for heart disease. Laser surgery has been carried out across the world on thousands of patients suffering from severe angina and related conditions. The NHS doesn’t offer it, and even in America, where it originated, it is considered a treatment of last resort. But the doctors who practise laser heart surgery eulogise it, telling stories of patients who have gained relief from what seemed to be incurable suffering.

  Dr William O’Neill, of William Beaumont Hospital in Michigan, told a reporter from the Associated Press that ‘in twenty years of medicine, I have never seen anything that gives as much symptomatic benefit for patients.’ One of O’Neill’s patients was Frank Warren, an auto-worker in his forties. Warren had suffered from heart problems for years. He always felt short of energy, and the slightest exercise brought on a burning sensation. Sometimes the pain came when he was resting. Over the years he had had eight operations; none of them helped. After undergoing laser surgery, however, he experienced immediate results: ‘I felt a warmth in my face. My colour seemed to change.’ One year later, Warren finished a marathon in a very respectable four hours, twenty-nine minutes.

  When somebody suffers from angina it’s because their arteries are clogged, stymieing the flow of blood and therefore oxygen to the heart. As a result they get short of breath very quickly, find it hard to exercise, suffer from debilitating pain, and live with the constant risk of a fatal heart attack. In a routine coronary bypass, the surgeon takes healthy arteries from elsewhere in the patient’s body and weaves them into the heart, so that blood can be diverted around the blocked vessels.

  During laser treatment, an incision is made in the side of the chest between two ribs. The outer layer of the heart (the pericardium) is pulled back to expose the heart muscle itself (the myocardium). But instead of grafting on a new artery, the surgeon pierces the heart muscle. She takes aim at the patient’s heart with a laser gun, which is attached to an expensive, impressive-looking machine. A dot of red light tells her where the laser will hit. She pulls the trigger (actually, a foot-pedal), fires the laser, and blasts a tiny, pinhead-sized hole in the myocardium. This is repeated twenty or more times. Drilling holes in the heart might seem like an odd thing to do when you’re trying to save somebody’s life, but the idea is that by opening up new channels the surgeon creates the equivalent of a new artery, allowing blood to flow to the heart’s oxygen-starved flesh.

  Surgeons who practise laser surgery don’t have to rely on their own experiences to support their confidence in it. By the late 1990s a number of large-scale trials had been carried out involving patients with serious heart conditions (often termed ‘end-stage’ conditions). The results were remarkable, with success rates in the range of seventy-five to ninety per cent, comparing very favourably with more established heart surgery procedures.

  There is only one problem with this wonder treatment: no-one quite knows how it works. The theory behind it is plausible, and the results are undeniable. But the channels close up within hours of being opened, and there is no evidence that the blood flow to the heart muscle actually increases.

  Dr Martin Leon, a professor of medicine at Columbia University in New York, is one of the world’s leading cardiologists. He knew that laser heart surgery had performed well in tests versus other treatments. But it hadn’t, he noted, been tested against no treatment at all. In 2005, Leon oversaw a study of three hundred patients, in their fifties and sixties, with heart conditions. They were very sick: most had previously undergone heart surgery at least once, and all were suffering from continuing problems. The patients were divided into three groups: high-dose (twenty to twenty-five laser punctures), low-dose (ten to fifteen), and a mock procedure that merely simulated laser treatment; the patients were shown the machine and had its workings explained. Then they were heavily sedated, blindfolded, and played music to create an effect of ‘sensory isolation’. When they awoke, those who had had the fictional surgery were told it had gone well.

  Twelve months later, most of the patients who had undergone actual laser surgery were in much better shape. They revelled in a rediscovered capacity for physical exercise. They reported that their heart pains had receded and that they were feeling healthier and fitter than they had done in years. A battery of objective tests showed they weren’t making this up. But the strange thing was that patients in the third group – the sham group – were also rejuvenated. Despite not having had any surgery, or any physical treatment at all, they too felt years younger and full of beans, and the frequency of their angina pains declined. In fact, in terms of the effects on patients, there was no significant difference among the three treatments.

 
