
Hidden Depths: The Story of Hypnosis


by Robin Waterfield


  One of the clever aspects of Broch's portrait of Marius Ratti is the idea that he allows people to think he is only expressing their own convictions. Similarly, as we have just seen in the section on hypnotic sales techniques, a clever salesman makes it seem as though his customers are doing the choosing, with no compulsion from him. If, as I suggested in Chapter 1, all hypnosis can really be seen as self-hypnosis, because you choose at some point to go along with it, then all entrancement is self-entrancement. As well as being one of the outstanding orators of the twentieth century, Hitler was a cynical exploiter of the power of rhetoric. He once said to one of his sidekicks: ‘What you say to the people collectively in that receptive state of fanatical abandonment remains in their mind like an order given to someone under hypnosis, which cannot be wiped out and resists all logical argument.’ Since rhetoric is not irresistible, even when combined with the suggestibility of crowds, it is clear that Hitler received the consent of the mass of the German people. Even crowds and nations cannot be hypnotized against their will.

  Hitler is an interesting case in relation to hypnosis. Many of the people who actually met him face to face, when asked for their chief memory of the man, mention the power of his eyes. He seemed to look right inside them, to be focused entirely on them to the exclusion of everyone else. This is certainly part of the technique of hypnotism. In his classic book The Mass Psychology of Fascism, maverick psychologist Wilhelm Reich attributes the attraction of fascism for middle-class Germans to the sexual repression found in their authoritarian, patriarchal family structure. Hitler's rhetoric appealed to their sense of sin and salvation, with his pseudo-mystical emphases on the holy trinity of blood, soil and state, on the racial soul, mother Germany, the will of the people, national history and honour, discipline and so on. Such slogans bypass the rational mind and stir the emotions. As we have seen in the context of advertising, this is exactly the way to get people to do something you want them to do.

  Part of Hitler's charisma was undoubtedly his identification of himself with the German nation. For instance, in his famous speech delivered at the Berlin Sportpalast on 26 September 1938, in which he attempted to justify on historical grounds the imminent invasion of Czechoslovakia, he portrayed the coming conflict as a personal confrontation between himself and Edvard Beneš, the Czech leader, and famously said: ‘My patience is now at an end!’ This was more than a monstrous piece of personal vanity; it was as if Hitler felt that he had transcended his individuality – that his soul was the soul of the nation. This seeming transcendence, this identification with something larger than themselves, is an important element in the attraction of leaders, both secular and religious.

  The Charisma of Gurus

  I once saw a TV documentary about the notorious guru Bhagwan Rajneesh. He had a marvellous speaking voice, and was incredibly erudite, but that was only part of his charm. He would hold out a piece of fruit and say: ‘You see, truth is like an orange.’ What on earth is that supposed to mean? If Rajneesh spelled it out at all, it was only in the vaguest terms. In other words, he gave his followers a lot of space for personal commitment or surrender to himself and his views – a lot of rope to hang themselves with. As we have seen in other instances of mind control, from selling to political tyranny, you have to allow the ‘prospect’ to convince himself.

  Paradoxically, though, at other times Rajneesh and all other gurus and tyrants come up with well-defined views. If a tyrant is by definition someone who thinks he knows better than you what is good for you, and seeks to impose this good on you, to convince you of the correctness of his views, then this definition also fits gurus. Are all gurus tyrants, then? There must be a borderline, which some cross and deserve to be called tyrants, while others remain safely on the near side. Perhaps the border is the degree to which they try to impose their views on you, and the amount of repayment they require in terms of either money or devotion. A tyrant in any sphere – political, religious or domestic – manipulates a system of rewards and punishments, dispensing guilt, shame and forgiveness. By these means, he demands and gains surrender of the will. The deeper the surrender, the greater the emotional investment in seeing the guru or tyrant as perfect, whatever the indications to the contrary.

  Surrender of the will is another way in which the profile of a typical guru coincides with that of a hypnotist, since there must be what I have called ‘inequality of will’ in the hypnotic relationship. Here are some of psychologist Anthony Storr's conclusions about gurus:

  Gurus tend to be élitist and anti-democratic, even if they pay lip-service to democracy. How could it be otherwise? Conviction of a special revelation must imply that the guru is a superior person who is not as other men are. Gurus attract disciples without acquiring friends. Once established, gurus must exercise authority, which again precludes making friends on equal terms … A guru's conviction of his own worth depends upon impressing people rather than on being loved. Gurus seldom discuss their ideas; they only impose them.

  A range of similar ideas is presented in novelistic fashion in John Updike's marvellous satirical novel S., published in 1988 and clearly based on Rajneesh.

  It so happens that in my life I have met many charismatic people, and it is clear to me that, as Storr says, charisma is, at least in part, a product of conviction (and note that again Broch got his portrait of Ratti right). It is only when someone is absolutely certain that he knows, that he has or is conveying the truth, that he can be charismatic. Also, what that person knows or conveys has to be something larger than himself. Thus a politician or a religious fanatic is more likely to be charismatic than the world's greatest expert in earthworms, because the politician or the religious fanatic is or appears to be acting as a channel for a message of broad, even universal importance. Compared with conviction, eloquence pales into relative insignificance. Gurdjieff was never fluent in either Russian or English, but he succeeded in charming followers in both languages; I have met Japanese and Tibetan religious teachers whose attraction was enormous, despite the fact that some of them could hardly string two words together in English (apart from, perhaps, ‘Come: follow me!’). For all their linguistic difficulties, they exuded authority.

  Charisma is not only a characteristic of tyrants and fringe religious leaders. Some people are charismatic because of their personal integrity. Basil Hume (1923–99), leader of the Roman Catholic Church in England and Wales, was such a man. What made him charismatic was not just his conviction, his faith, but also his personal humility and honesty. But anyone who watches footage of one of Hitler's speeches in front of a mass audience, even if he knows little or no German, can feel the charisma exuding from the man, a charisma that enabled him to hold in his hand not just the immediate audience, but an entire nation. The difference is that Hume was not attempting to impose himself on anyone else; he was charismatic in spite of himself, so to speak. But any leader who uses his charisma to acquire personal power, and to bolster his self-assurance by the adulation of disciples, is a person to avoid.

  In the fields of entertainment and politics charisma is channelled into legitimate paths. The same may happen in religion too, as in the case of Basil Hume, but if the person's thinking is at all unusual or unorthodox, if he challenges the existing order, then he gathers followers and forms a sect. The motto of the establishment is: Trust in the order, not the individual. This is why charismatics often pose a challenge, and they and their followers become marginalized as a sect or cult. A good definition of a cult is given by authors Joel Kramer and Diana Alstad: cults are ‘groups with an authoritarian structure where the leader's power is not constrained by scripture, tradition or any “higher” authority … In a cult, absolute authority lies in a leader who has few if any external constraints.’ But it is important to note that all or most of the world's major religions began as sects.

  There is a great deal of fear about the religious cults that proliferated in the West especially in the 1970s, and there are organizations all over the world to provide information and even ‘deprogramming’ services to anxious parents. People enter cults of their own choice: they are not turned into zombies. But the presence of apparently free choice is not enough in itself to disprove the charge of mind control. After all, we choose to buy the advertised brand of coffee over the unadvertised one. And it is quite clear that in the early days of a person's membership of a cult his sense of self is broken down, and his fear of isolation played on, in ways that constitute a form of reprogramming. The fear of isolation or exile is very potent in all of us, and plays a part not just in minor sects, but in all mass meetings, religious and political, mainstream or marginal. Experiments have shown that individuals in crowds enter a kind of hypnoidal state, with increased and contagious suggestibility. Again, this falls short of (or, in a sense, exceeds) hypnotism proper – but again it is a culturally acceptable form of similar techniques. The state of an individual in a mob has best been described in a classic book by the nineteenth-century French social psychologist Gustave Le Bon:

  He is no longer conscious of his acts. In his case, as in the case of the hypnotized subject, at the same time that certain faculties are destroyed, others may be brought to a high degree of exaltation. Under the influence of a suggestion, he will undertake the accomplishment of certain acts with irresistible impetuosity. This impetuosity is the more irresistible in the case of crowds than in that of the hypnotized subject, from the fact that, the suggestion being the same for all the individuals of the crowd, it gains in strength by reciprocity.

  So precisely the same features that make tyrants attractive also constitute the charisma of gurus. They have conviction, they allow their followers to entrance themselves, they channel something larger than themselves, they are sure they know what's best for us, they are committed to the cause themselves, but if they lapse at all that only adds to their charm, making them suddenly human. They have sexual magnetism. It helps if they are good speakers; it helps if they are different from their followers (as many gurus in the West are Asian, and as Marius Ratti was an Italian in Austria). They manipulate the emotions, and perhaps especially the desire to belong, to be part of the crowd, and the wish to escape a humdrum existence. These are the elements of mind control, as employed by advertisers, salespeople, tyrants and gurus.

  The CIA and Brainwashing

  The Background

  In 1959 Richard Condon published his famous thriller The Manchurian Candidate. The plot of the book has an American soldier, Raymond Shaw, hypnotized by a Chinese psychologist during the Korean War so that he becomes a communist assassin. His reconditioning goes so deep that US psychologists are unable to unlock it. The trigger for him to become a covert assassin is that he is told to play a game of solitaire; when the queen of diamonds comes up during the game, he goes into a passive state and receives his instructions. Needless to say, afterwards he remembers nothing about his conditioning, and nothing about the murders he carries out. His chief controller, the communist agent in charge of him in the States, is said at one point to know that ‘Raymond had to do what he was told to do, that he could have no sense of right or wrong about it, nor suspect any possibility of the consciousness of guilt.’ That would indeed be powerful control.

  Actually (and not surprisingly, since it is a good plot), Condon's was not the first thriller along these lines. In 1945 Colgate University professor of psychology George Estabrooks, perhaps irritated by the military's refusal to allow him to help them create a hypnotized super-spy or assassin, created one in fiction instead, in his co-authored book Death in the Mind. In this novel the cunning Germans have hypnotized American servicemen to turn against their side, so that, for instance, a submarine commander sinks an Allied battleship. The suave, gung-ho hero, Johnny Evans, realizes that this is a powerful weapon, and decides to turn it against the Germans themselves. The book is appallingly dated now, but it does raise a halfway interesting dilemma: since you can't hypnotize someone without their consent, how can you hypnotize someone to commit treason? They must already have given their consent to you, which is to say that they believe your ideology, and there is therefore no need for them to be hypnotized to commit treason. This dilemma flummoxes Johnny Evans for a good half of the book, which is surprising, given his apparent intelligence on other occasions, because the solution is obvious: they can be hypnotized by you provided you convince them that you are on their side. But in real life, as I've said before, that would be extremely difficult.

  Outside the pages of fiction, the Germans did experiment during the Second World War with a combination of hypnotism and drugs (especially mescalin), but found it of limited use in controlling someone's mind. When the same idea occurred to Stanley Lovell, the head of the Office of Strategic Services (OSS) Research and Development department during the war, he approached several US psychologists, but they told him that hypnotism could not make people do things they would not normally do, and so he dropped the idea. Estabrooks disagreed. He had notoriously claimed in his 1943 non-fiction book Hypnotism that in all probability hypnotism had a number of military uses, including getting someone to commit treason against his country. In this book he paints fanciful scenarios of the damage a team of hypnotized spies, or a single highly placed individual, could wreak on US defence systems – precisely the scenario he could later create only in fiction. Less melodramatically, he claims that by the use of a ‘disguised technique’ you can hypnotize someone – say, an enemy agent you have captured – against his will. You pretend you are taking his blood pressure, perhaps, and tell him that as part of the procedure you need to get him to relax – and then you take him through a standard induction until he is hypnotized. I cannot see that this would work. Nothing blocks induction better than suspicion, and a captured agent would naturally be highly suspicious of anyone attempting to take his blood pressure or whatever.

  Still, during and shortly after the Second World War, there was a certain amount of official interest in the potential of hypnotism. There were tests to see if soldiers could remember complex secret codes better when hypnotized than in a normal waking state, and in 1947 US army psychologist John Watkins got a soldier to hallucinate that a US officer was Japanese, and then persuaded the soldier to attack the officer (see p. 228). The soldier would presumably not have attacked a superior officer under ordinary circumstances, so Watkins was attempting to demonstrate what Estabrooks believed: that the idea that no one will do anything under hypnosis that he would not do when awake is rubbish. On another occasion Watkins got a WAC to believe that she had learnt a number of military secrets, all of which she was persuaded to reveal under hypnosis. But when other psychologists attempted to replicate these and similar scenarios, they failed. It is quite likely that factors other than hypnosis were at work in Watkins's experiments. The subjects might well have been willing to comply with Watkins's suggestions, knowing that it was just an experiment. Moreover, they had been selected specifically for their ability to enter a deep trance state; the chances of getting just anyone to perform these acts of betrayal would be considerably less.

  Nevertheless, these army experiments were found impressive. A government-sponsored think tank, the Rand Corporation, concluded in 1949 that the possibility of communist governments employing hypnotism against the United States was a real threat. The report pointed in particular to experiments in inducing antisocial or abnormal behaviour in hypnotized subjects by Professor Alexander Luria in Russia, and stressed that to outside observation a hypnotized person may well behave no differently from an unhypnotized person. It also suggested that the use of drugs such as sodium pentothal could speed up the process or deepen the trance. The seeds of paranoia were sown in the fertile soil of US government agencies, and they were reinforced by the show trial in Hungary, later that same year, of Cardinal József Mindszenty, and then by the behaviour of US prisoners during the Korean War (1950–53). Mindszenty was shown dazedly confessing to crimes he obviously did not commit; American servicemen were shown admitting US aggression and praising communism. Was it drugs? Was it hypnosis? Just as in the later ‘space race’, US officials felt that the commies were ahead of the game, and they were eager to catch up and overtake. It was all part of what American journalist and author John Marks has called ‘the ancient desire to control enemies through magical spells and potions’.

  The CIA Gets Involved

  A senior CIA official, Morse Allen, was the first to take up the challenge. In 1951 he spent a total of four days in New York studying hypnotism with a stage hypnotist, who had impressed him with his boasts that he could use hypnotism to get women into bed. Allen put his new skills to use by persuading CIA secretaries to steal classified files and hand them on to total strangers, or to fall asleep in a stranger's bedroom. By early 1954 he was ready for the ultimate experiment, in which he persuaded one hypnotized secretary to fire a gun at another, who was asleep. The gun, of course, was not loaded. The ‘assassin’ claimed amnesia afterwards, and protested that she would never shoot anyone. Had Allen succeeded in creating a hypno-assassin? As he himself was the first to admit, these experiments were more or less worthless as analogies for what might be possible in the field. ‘All he felt he had proved was that an impressionable young volunteer would accept a command from a legitimate authority figure to take an action she may have sensed would not end in tragedy. She presumably trusted the CIA enough as an institution, and Morse Allen as an individual, to believe he would not let her do anything wrong.’ Allen knew that it would take months of preparation and careful work to get an unwilling and hostile subject into any such condition. Later in 1954, this kind of research was moved away from Allen and his ‘Operation Artichoke’ team to Sidney Gottlieb of MK-Ultra, the well-funded programme within the CIA which remained for many years central to research into brainwashing and mind control in general.

 
