Cults Inside Out: How People Get in and Can Get Out

by Rick Alan Ross


  Influence Techniques

  Robert Cialdini, professor of psychology at Arizona State University, lists “six principles of influence” within social environments.585 These basic principles are inherently neither good nor bad; rather they exist as the foundational building blocks of influence.

  “The rule of reciprocity,” Cialdini says, is a “sense of future obligation according to the rule [that] makes possible the development of various kinds of continuing relationships, transactions, and exchanges that are beneficial to society. Consequently, virtually all members of society are trained from childhood to abide by this rule or suffer serious social disapproval.” Expounding on this principle, Singer says cults can twist this basic rule of behavior. That is, the cult provides a sense of security, salvation, well-being, and love but implicitly expects its devotees to repay these benefits through obedience and behavioral compliance. Singer also points out that this rule can be used to induce feelings of guilt and shame. She says, “If you have made a commitment to the group and then break it, you can be made to feel guilty” and by extension ashamed.586

  “Commitment and consistency”—Cialdini points out that “people have a desire to look consistent through their words, beliefs, attitudes, and deeds,” which are “highly valued by society.” Singer explains that a destructive cult can use this rule to make its members feel guilty whenever they fall short in their performance or are inconsistent in the duties and obligations implied by the commitments they have made to the group.

  “Social proof”—Cialdini says that the “means used to determine what is correct is to find out what others believe is correct. People often view a behavior as more correct in a given situation, to the degree that we see others performing it.” Within a cult environment, Singer sees this as yet another means for achieving compliance. She explains, “If you look around in the group, you will see people behaving in particular ways. You imitate what you see and assume that such behavior is proper, good, and expected.”587

  “Liking,” Cialdini says, is the principle that “people prefer to say yes to individuals they know and like.” Singer elaborates that new initiates within a cult group may be the target of seemingly unconditional love, frequently called “love bombing.” This may make new members feel wanted and loved and therefore pushes them to reciprocate by liking the people in the group who “love” them. Once they like the members of the group, it follows that they should comply with those members’ wishes. Singer says, “You feel you ought to obey these people.”588

  “Authority,” Cialdini says, is characterized by “the strong pressure within our society for compliance when requested by an authority figure” and the belief “that such obedience constitutes correct conduct.” Singer explains that one can easily apply this ubiquitous tendency to respect authority to a cult leader who “claims superior knowledge, power, and a special mission.”589 In fact, members of destructive cults frequently see their leader as the ultimate authority, one who may eventually come to supersede all other authorities, such as the law, law enforcement, and government.

  “Scarcity,” Cialdini says, is the simple principle that “people assign more value to opportunities when they are less available.” Singer sees this principle reflected in the way cult members are told that “without the group they will miss out on living life without stress; miss out on attaining cosmic awareness and bliss; miss out on changing the world, gaining the ability to travel back in time; or whatever the group offers that is tailored to seem essential.”590 The group may also exemplify this rule through a claim of exclusivity, asserting that no other group can offer the same path of attainment and fulfillment.

  Authority

  There seems to be an innate human reliance on, and compliance with, authority. Yale University psychologist Stanley Milgram laid a foundation for this understanding in his seminal research, which later researchers built upon in their own work.591 The Milgram experiment, conducted in 1961, examined the human predisposition to obey authority figures, even when such obedience violates ethical concerns. This proclivity partly explains the subservience and submission of cult members under the influence of a perceived authority figure they have been persuaded to accept.

  Milgram’s experiment included volunteers functioning in the roles of “teachers” or “learners,” who were separated from each other in individual rooms. Teachers would question learners, and when an answer was wrong, teachers administered an electric shock to the respondents. The shocks were supposedly increased incrementally until they became quite severe. What teachers didn’t know was that the learners were actors pretending; no electrical shock was actually administered. The focus of the experiment was to see how compliant teachers would remain to authority, even as they witnessed increasing distress and pain from the learners. Whenever teachers hesitated, verbal prodding encouraged them to continue. In the end, despite the learners’ feigned screams of agony, 65 percent (twenty-six out of forty) of those functioning as teachers willingly administered shocks of up to 450 volts.592

  Verbal prodding consisted of a succession of prompts from the authority figure:

  “Please continue.”

  “The experiment requires that you continue.”

  “It is absolutely essential that you continue.”

  “You have no other choice; you must go on.”

  Teachers were also told that they had no responsibility regarding the shocks administered.

  Milgram summarized his experiment in an article titled “The Perils of Obedience”593 as follows:

  The legal and philosophic aspects of obedience are of enormous importance, but they say very little about how most people behave in concrete situations. I set up a simple experiment at Yale University to test how much pain an ordinary citizen would inflict on another person simply because he was ordered to by an experimental scientist. Stark authority was pitted against the subjects’ strongest moral imperatives against hurting others, and, with the subjects’ ears ringing with the screams of the victims, authority won more often than not. The extreme willingness of adults to go to almost any lengths on the command of an authority constitutes the chief finding of the study and the fact most urgently demanding explanation.

  Milgram concluded,

  Ordinary people, simply doing their jobs, and without any particular hostility on their part, can become agents in a terrible destructive process. Moreover, even when the destructive effects of their work become patently clear and they are asked to carry out actions incompatible with fundamental standards of morality, relatively few people have the resources needed to resist authority.

  Milgram proposed two theories to interpret the ultimate meaning of his experimental results.

  The theory of conformism is based on the Solomon Asch conformity experiments, or what has been called the “Asch Paradigm.”594 That is, when people feel unable or unqualified to make decisions, they are likely to defer to the group and its hierarchy, especially in a crisis situation.

  The agentic state theory, Milgram says, holds that “the essence of obedience consists in the fact that [people come] to view themselves as the instrument for carrying out another person’s wishes, and they therefore no longer see themselves as responsible for their actions. Once this critical shift of viewpoint has occurred in the person, all of the essential features of obedience follow.”

  Milgram’s explanations of behavior can also be useful in developing an understanding of the obedience cult members display in their responses to the often-extreme and destructive demands their leaders make. Cult members in such situations see themselves as the agents of some higher power or authority and are therefore exonerated of any personal responsibility for their actions. Moreover, if any doubts arise, they feel compelled to marginalize such concerns in deference to a higher authority and conform to whatever behavior their leader prescribes. Cult leaders may also create the perception of a crisis or impending
catastrophe as a means of leveraging greater submission. Cult members must obey and conform to the leader’s demands to gain a sense of assurance and safety. This is commonly expressed when leaders of many “doomsday cults” claim only their faithful flock will be protected from some final judgment or dispensation.

  Philip Zimbardo, Stanford University professor of psychology, conducted additional research on the influence of authority and the dynamics of group persuasion. In 1971 he carried out what became known as the “Stanford Prison Experiment.” Zimbardo chose a group of students to play the roles of prisoners and guards in a simulated prison environment set up in the basement of the Stanford University psychology building. Zimbardo led a team of researchers and played the pivotal role of “prison superintendent.”595

  What the prison experiment demonstrated, largely through anecdotal evidence, was that a controlled authoritarian environment can have a profound effect on the behavior of those involved. Zimbardo’s prisoners became increasingly docile and obedient, while those playing guards became harsh and rigid. The Stanford Prison Experiment, much like the research of Stanley Milgram a decade earlier, offers compelling evidence that environment and authority can affect the way a person thinks and feels and thereby influence his or her behavior. Zimbardo’s research strongly suggests that in certain situations and environments, ordinary, ostensibly normal people can be influenced to do harmful things. This harmful behavior, which would otherwise seem objectionable, is nevertheless carried out under the influence of a perceived authority.

  Similar patterns of behavior have been observed in destructive cults, often to a greater degree in the more tightly controlled environment of a cult compound. Certain members may be assigned roles of subordinate leadership or enforcement, while the general membership frequently functions in varying degrees of subservience. Those in the role of enforcers often become increasingly harsh as they reinforce the rules or implement the edicts of leadership, even though their previous personal history may not reflect this type of behavior. In this sense the group dynamic of the cult has a profound effect on the personalities and behavior of those involved; the result is much like that of the Stanford Prison Experiment but over a much more prolonged period of time.

  Personality Change

  Flavil Yeakley, a researcher with a BA degree in psychology and MA and PhD degrees in speech communications, studied the effects of group influence on individual personality traits in a controversial church, which had been called a cult.596 Yeakley writes from a distinctly evangelical Christian perspective. His concern was that the group he examined was engaged in a kind of cloning process through its indoctrination and corresponding pressure tactics. Yeakley suspected that there was an effort to influence members of the group to think, feel, and behave the same way. He wrote, “A central element in the criticism that has been directed against the Boston Church of Christ [BCC], other discipling churches, and the discipling movement generally has been the charge that these churches employ methods that produce unnatural and unhealthy personality changes. Critics charge that discipling churches tend to make the members over after the image of the group leader, the group norm, or the group ideal.”597

  The “discipling” methodology Yeakley examined was based on a system of training through an ascending hierarchical structure of authority. Every member of the group was assigned a discipling partner, who became an advisor with implicit authority. The chain of command continued upward until it culminated in a single supreme leader at the top, Kip McKean. Yeakley’s research suggests that the group essentially developed and preferred a prototype largely based on the personality of Kip McKean. Yeakley came to this conclusion by testing nine hundred members of the group.598 He used the Myers-Briggs Type Indicator (MBTI) as his testing instrument.599 The MBTI is a descriptive test sometimes used in psychological contexts as a tool to describe personality differences. Sixteen outcomes are possible in the MBTI, reflecting various combinations of personality traits.

  Yeakley explained that the creators of the MBTI surmised, from their study of Swiss psychotherapist and psychiatrist Carl Gustav Jung’s writings, “that some people prefer to deal with the world through a judging process (either thinking or feeling), while others prefer to deal with the world through a perception process (either sensing or intuition)… Those who prefer to extravert a judging process tend to be highly organized while those who prefer to extravert a perception process tend to be adaptable.”600

  Critics of the MBTI have noted, “Although we do not conclude that the absence of bimodality [scores clustering into two distinct groups] necessarily proves that the MBTI developers’ theory-based assumption of categorical ‘types’ of personality is invalid, the absence of empirical bimodality in IRT [item response theory] based MBTI scores does indeed remove a potentially powerful line of evidence that was previously available to ‘type’ advocates to cite in defense of their position.”601 In fact, the accuracy of the MBTI depends on the honesty of the person being tested, since his or her chosen responses to questions determine the results. In 1991 a committee of the National Academy of Sciences concluded that there was “not sufficient, well-designed research to justify the use of the MBTI in career counseling programs.”602 Self-report instruments and psychometrically weak tests like the MBTI can yield misleading results when participants answer in socially desirable ways. These limitations of the MBTI should be taken into account when its results are interpreted.

  Tested members of the BCC were asked to respond to the MBTI three times. Yeakley relied on their honesty in responding to questions. He explains, “One time the members were told to answer the questions the way they think they would have before their conversion [entrance into the BCC] or five years ago for the few who had been members that long. The members were also told to answer the questions the way they would at that present time. Finally, they were told to answer the questions the way they think they will answer them after they have been discipled [by the BCC] for five more years.”

  Yeakley noted “the degree of change in psychological type scores” and saw a “pattern of convergence in a single type.” The MBTI personality profile ESFJ (i.e., extroverted, sensing, feeling, judging) was the preferred and sought-after point of convergence. Yeakley observed that “the past distribution [before entering the BCC] was the closest to population norms while the present and future distributions increasingly deviated from those norms.”603 Yeakley also tested other churches and organizations, which served as control groups for comparison.604 He explains, “What was being investigated in this research was simply the overall group pattern. The focus was not on any individual, but on the dynamics of the group.”605

  We should note that Yeakley’s testing results might also be attributed to the so-called halo effect, or halo bias.606 That is, the BCC members he tested may have tried to emulate qualities they attributed to their leader, Kip McKean, because he was perceived as an ideal person. Statements by Marty Wooten, a prominent group leader who worked closely with McKean, illustrate this perception. Wooten said, “I cannot think of any virtue that Kip is not known for. There is no greater discipler, disciple, brother, husband, father, leader, and friend than Kip McKean. Some say it is dangerous to respect any one man that much. I believe it is more dangerous not to.”607 Another ranking leader in the BCC, Jim Blough, said, “Let me tell you my attitude towards Kip. Let me explain to you. You know, I may have a good quality here and there, occasionally. If you look hard enough, you can find one in almost everyone, you know. But I believe this. I believe if I could become exactly like Kip, I’ll be a whole lot more useful to God than I am by myself.”608

  Based on his study, Yeakley offered the following interpretations and hypotheses:

  “Most of the members of the Boston Church of Christ showed a high level of change in psychological type scores.”

  “There was a strong tendency for introverts to become extroverts, for [intuitive people] to become [sensing people], for thinkers to become feelers, and for perceivers to become judgers.”

  “This kind of pattern was not found among other churches of Christ or among members of five mainline denominations, but it was found in studies of six manipulative sects.”609

  Interestingly, the “manipulative sects” Yeakley found to show the same pattern of results as the BCC included the Unification Church, Scientology, the Hare Krishnas (ISKCON), and the Children of God.610

  What Yeakley hoped to demonstrate through his research was the personal manipulation he believed occurred through discipleship training within the BCC. His research results can be seen as a possible confirmation of the process of coercive persuasion Edgar Schein identified.611 That is, the discipleship training appeared to employ a process of “unfreezing,” “changing,” and then “refreezing” people. Yeakley says, “To the extent that the members respond to that group pressure, the observed changes in psychological type scores are likely to become (or have already become) actual changes in the personality that is manifested.”612

  We should note that the results Yeakley obtained were descriptive and do not precisely explain why these changes occurred or whether they can be considered permanent. Most probably, without continuing group pressure and influence, the observed changes would erode and ultimately dissipate. Yeakley’s study can be helpful, though, in understanding how malleable people can be within a group environment that uses extreme pressure to manipulate and influence behavior. His study demonstrates that personality characteristics can be affected and shaped through high-pressure tactics and then perhaps hardened in place.

 
