“The Sacred Science,” or what Ofshe explains as “agreement that the group ideology is faultless” or that it is essentially perfect. Lifton describes this as an “ultimate vision for the ordering of all human existence.” Whatever the leader or group determines is right must be right, and whatever is labeled wrong is wrong. People influenced by such a “Sacred Science” will routinely subject their value judgments to its narrow rules. Within the world of Sacred Science, as in the Demand for Purity, there is no space for ambiguity or shades of gray.
“Loading the Language” or what Ofshe describes as the “manipulation of language in which clichés substitute for analytic thought.” Lifton has characterized this as “thought-terminating clichés,” which are “brief, highly reductive, definitive-sounding phrases” that become insider verbiage the group frequently uses. Many groups or subcultures have their own insider jargon, but Lifton draws distinctions between that verbiage and what he describes as “totalist language,” which “is repetitiously centered on all-encompassing jargon, prematurely abstract, highly categorical [and] relentlessly judging.” And rather than being used as a means of communicating individual ideas and personal opinions, Lifton sees it as “the language of nonthought.”
“Doctrine Over Person,” as Ofshe explains, is the “reinterpretation of human experience and emotion in terms of doctrine” or as seen through the lens of the group mind-set and its world view. Everyone and everything must be subjected to, and fitted into, this doctrinal framework, including those who use it.
“The Dispensing of Existence,” which Ofshe interprets as the “classification of those not sharing the ideology as inferior and not worthy of respect.” These criteria can create a sense of elitism and social isolation. Lifton explains, “The totalist environment draws a sharp line between those whose right to existence can be recognized and those who possess no such right.” The only means of validation is acceptance of the group and its beliefs. Lifton says, “Existence comes to depend upon creed (I believe, therefore I am), upon submission (I obey, therefore I am) and beyond these, upon a sense of total merger with the ideological movement.” Under such influence cult members may dispense with the existence of family, old friends, previous goals, and aspirations. Nothing has a right to exist unless it fits within the framework of the group and the Sacred Science, as its leadership has dictated.
Lifton notes, “The more clearly an environment expresses these eight psychological themes, the greater its resemblance to ideological totalism; and the more it utilizes such totalist devices to change people, the greater its resemblance to thought reform.”559 Sociologist Benjamin Zablocki observed, “It is probably not necessary to have every one of Lifton’s eight structural characteristics of ideological totalism in place for [thought reform] to occur.”560
These criteria may be expressed in varying degrees of intensity from group to group. Typically, the more control a particular group seeks to exert over its members, the more it may intensify or express these eight criteria. For example, not all destructive cults maintain group compounds, but we may see the maintenance of such compounds as a more intensified means of exerting control over the environment, or what Lifton cites as “Milieu Control.” Put another way, the more extreme the demands of the group, the more intensely these controlling criteria may be expressed to meet those demands.
Not all individuals will respond in exactly the same way to such group controls. Some may become unhappy and leave due to conditions in the group environment, or leadership may reject them due to their lack of compliance. Likewise, each individual initially brings his or her own unique history and personality to the group experience, which to varying degrees provides the basis for his or her response to the leadership. Some may be more vulnerable than others due to their circumstances or individual history. The deceptive manner in which a destructive cult presents itself can also potentially affect people differently. For example, some may respond more readily to a group using a facade of religion as opposed to one of philosophy, politics, or pseudoscience, based on their personal interests and background. And individual interest or deference can account for an initial or ongoing acceptance of the group’s imperatives, which are given in the guise of some religious, political, or otherwise relatable context.
This cult process of manipulation doesn’t require overt physical coercion but may instead in part rely on sleep deprivation, dietary controls, intimidation, implied threats, or inducement of unreasonable fears. Sociologist Benjamin Zablocki notes, “Cult movements rarely retain their members by the use of physical force or constraint. But is the necessity of force or the threat of force required for true brainwashing? This widespread belief is based upon a misreading of Lifton and Schein. This misreading came about because, in fact, many (although by no means all) of the cases they studied were brought to a state of agency by real or threatened force.”561
Zablocki, like Schein, sees this process in three phases, the third drawn from Lifton.
“The stripping phase: The cognitive goal of the stripping phase is to destroy prior convictions and prior relationships of belonging. The emotional goal of the stripping phase is to create the need for attachments. Overall, at the completion of the stripping phase, the situation is such that the individual is hungry for convictions and attachments and dependent upon the collectivity to supply them.”
“The identification phase: The cognitive goal of the identification phase is to establish imitative search for conviction and bring about the erosion of the habit of incredulity. The emotional goal of the identification phase is to instill the habit of acting out through attachment. Overall, at the completion of the identification phase, the individual has begun the practice of relying on the collectivity for beliefs and for a cyclic emotional pattern of arousal and comfort.”
“The symbolic death and rebirth phase: In the rebirth phase, the cognitive and the emotional tracks come together and mutually support each other. This often gives the individual a sense of having emerged from a tunnel and an experience of spiritual rebirth. The cognitive goal of the rebirth phase is to establish a sense of ownership of (and pride of ownership in) the new convictions.” 562
“Information Disease”
The long-term impact of the control and manipulation of information can be seen in a collection of symptoms Flo Conway and Jim Siegelman call “information disease.”563 They define this as “an alteration through experience of a person’s everyday information processing capacities—his [or her] everyday powers of thinking, feeling, perception, memory, imagination and conscious choice,” one that “marks a lasting change of awareness at the most fundamental level of personality.” We can see this in the suicides of Solar Temple members. Initially, there was a reported “mass suicide” in 1994, which included leader Luc Jouret and more than seventy of his followers. But the lasting impact of Jouret’s influence and the group experience was evident in the continuing suicides of surviving Solar Temple members over the following three years. The last suicides linked to the group occurred in 1997.564 The cult-related changes in thinking continued to dominate and animate the lives of surviving Solar Temple members until they took their own lives.
According to Conway and Siegelman, information disease can also result in part from physical causes such as “poor diet and lack of sleep,”565 which have been reported in many cults, including the Unification Church, founded by the Reverend Moon.566 Former members of that group allege that they were fed a low-protein diet and often slept only four to five hours a night. But the authors say information disease may also occur solely as the “result of information alone, especially from intense experiences that abuse an individual’s natural capacities for thought and feeling.” In the carefully managed confines of a Unification Church training retreat, potential recruits can see and experience only what the group wants according to its planned program. All access to information and personal associations is under tight group control. In this environment participants experience what has been called “love bombing,” which is a term used to describe the seemingly unconditional affection church members direct toward them. However, this “love” is actually highly conditional and based on their growing acceptance of Unification Church principles and corresponding progress in the group. This contrived but intense experience in the context of a controlled group environment can produce the desired commitment.
After recovering from her experience under the control of the political cult called the Symbionese Liberation Army, the heiress Patty Hearst commented that she had been told how to think. Upon reflection Hearst compared the process she endured to something like “the disciplining of your mind.”567
Conway and Siegelman notably include such practices as “group encounter, guided fantasy [and] meditation” as a means of implementation. The authors conclude, “By tampering with basic distinctions between reality and fantasy, right and wrong, past, present and future, or simply by stilling the workings of the mind over time, these intense communication practices may break down vital faculties of mind.” The authors also point out that there is growing evidence that such abuses may ultimately “impair crucial working connections in the brain’s underlying synaptic networks and neurochemical channels,” which may potentially “destroy long-standing information processing pathways in the brain.” 568
Such changes in the brain were the focus of the book Craving for Ecstasy by professors Harvey Milkman and Stanley Sunderwirth, which examines how addiction and behavior affect the brain. Milkman and Sunderwirth, who specialize in brain chemistry, reinforce Conway and Siegelman’s observations. They write, “Individuals can change their brain chemistry through immersion in salient mood-altering activities as well as through ingesting intoxicating substances.” The researchers add, “If our synaptic chemistry changes dramatically we seem to possess altogether different personalities.”569 The authors specifically cite the power of “cults,” which they say “may be used to short-circuit the usual course of an addictive process.” They then offer “the tragic example of Jonestown, blind devotion to a religious cult,” which “burned a path straight to the suicidal vortex.”570 The comparison of cults to chemical addiction may explain the seemingly addictive pattern of behavior often evident in cultic involvement. This analogy may also explain why discontinuing that involvement, especially after years of reinforcement, is frequently difficult.
Conway and Siegelman have identified “four distinct varieties of information disease” we can see by observing an affected individual.
“Ongoing altered state of awareness”—characterized as a “state of narrowed or reduced awareness.” This can be brought on by an encapsulated environment controlled by a group and/or leader that virtually excludes any other focus or outside frame of reference.
“Delusional phase”—“vivid delusions [and] hallucinations” that lead to “irrational, violent and self-destructive behavior,” which can be brought on through techniques of sensory deprivation and/or overload
“Not thinking”—“literally shutting off the mind”
“Not feeling”—“actively suppressing one’s emotional responses” that may “ultimately numb a person’s capacity for human feeling”571
Marshall Applewhite, the leader of the Heaven’s Gate cult, prohibited his followers from watching television and strictly regulated their reading. Each member of the group had an assigned partner and was told never to be alone. These measures were taken “to keep [members] in the mindset.” Communication was often limited to simply saying “Yes,” “No,” or “I don’t know.”572
“Emotional Control”
Conway and Siegelman succinctly explain in their first book, Snapping: America’s Epidemic of Sudden Personality Change, how the mind can be stymied, sidetracked, and potentially subjugated by what they call “information disease.” In their second book, Holy Terror: The Fundamentalist War on America’s Freedoms in Religion, Politics, and Our Private Lives, they discuss the interlocking emotional control that controlling groups and leaders often use.
Conway and Siegelman write, “Because as human beings, beyond all differences of faith and culture, our feelings are our most important resource, our most complex and fully integrated and universal communication capacity. They may also be our most accurate monitor of personal morality—of what is right and wrong for each of us as individuals—and of the fairness of our conduct in relation to one another. When at that intimate level the wisdom of our feelings is stilled, distorted or thrown into confusion, our greatest strength may quickly be turned into our greatest vulnerability.”573
The authors explain that such emotional control is achieved through “the reduction of individual response to basic emotions such as love, guilt, fear, anger, hatred, etc.” This is accomplished by “means of suggestion” through “the indirect use of cues, code words, symbols, images and myths.” For example, Bible-based groups may use the images of Jesus and Satan to emotionally manipulate members. In his book Thought Reform and the Psychology of Totalism, Robert Jay Lifton correlates the use of such imagery to the category of “ultimate terms” or “God terms” and “devil terms.”574
This means of manipulation allows its practitioners to consign any action or feeling they perceive as negative or as challenging their authority to the category of “satanic” or “demonic,” while simultaneously using the image of Jesus or God as a facade for their own authority. Within this box, whenever disobedience occurs or doubts surface, they are attributed to the devil or dark forces. Obedience to the leadership is correspondingly characterized as compliance with the will of God and heavenly authority.
In the family cult Marcus Wesson led, his children were taught that he was “God’s messenger” and that the “end times” were “close at hand.” One son said, “He was God. That’s just the way it was.”575 Within the confines of this construct to obey Wesson was to obey God. Disobedience was, therefore, defiance of God. Those who opposed Wesson, such as the authorities, were characterized as “Satan.”576 Caught within this world of polarized imagery, one of Wesson’s daughters said she felt “trapped.”
But religious imagery isn’t the only way such ultimate terms can be used. Secular symbols can easily be used, such as the popular principles and corresponding icons of business, art, philosophy, nationalism, political theory, psychology, or virtually any field of interest.
Conway and Siegelman summarize that ultimately “the secret is surrender.”577 That is, surrender is presented not as giving in to the authority of a group or leader but rather as subordinating oneself to a supposedly higher cosmic power, honored purpose, or principle. It is by posing behind a carefully constructed facade of myths, symbols, and/or images that the practitioners of such indirect manipulation can effectively garner obedience, engender dependence, and ultimately solidify their control.
Powerful Suggestions
Certain suggestible states induced through hypnosis, trance induction, meditation, yoga, chanting, and various repeated physical exercises may also serve as a means of manipulation. These altered states of consciousness can potentially make people malleable. In such an altered state, techniques such as guided imagery, indirect directives, and peer pressure may be applied, and modeling of behavior can be used more effectively. If a licensed mental health professional were to use such techniques, ethical concerns would arise, since they are most often applied without informed consent.
The process of guided imagery has been used in hypnotherapy.578 The subject is first placed in an induced state of deep relaxation or trance. This can be brought about through hypnosis, certain forms of meditation, yoga, chanting, and breathing exercises. In this state of mind, a leader or therapist can then guide an individual or group to experience various feelings, sensations, or an imagined environment. Psychologist Margaret Singer observes, “A considerable number of different guided imagery techniques are used by cult leaders and trainers to remove followers from their normal frame of reference.”579
Wrapped within the mystique of a cult, such guided imagery can become a facet of what Lifton calls “Mystical Manipulation,” used to mold a mind-set and shape perceptions in order to build an alternate cult reality.
The Indian professor of philosophy and self-styled guru known as Bhagwan Shree Rajneesh conducted what he called “Dynamic” and “Kundalini” meditation camps, in which his followers were transformed through the suggestible states of the experience.580 One former devotee remarked, “I think I brainwashed myself” but observed, “Bhagwan had one line: the good disciple follows what the master says, the good disciple doesn’t think.”581
Indirect directives can also be an effective means of persuasion. Singer says cult leaders “have become skillful at getting acts carried out through indirection and implication,” rather than by directly ordering something to be done. That is, they “imply that something should happen, and it does.”582 This can also be achieved through tone of voice, by asking a question that prompts an action, and by offering other indirect suggestions that solicit compliance. Cult members often attempt to deflect accusations of mindless obedience by offering the defense that no one directly orders them to do anything. Submission to authority and compliance, however, can be achieved indirectly.
Peer pressure and modeling, according to Singer, “[are] an effective means to get people to fit their behavior to group norms.” This goal is achieved by providing “models around to imitate.”583 For example, those in a new environment created by a cult may observe the behavior, dress, and speech of others around them to understand the preferred way of living within that environment. Change largely occurs without direct demands being made; it occurs through the influence of the group environment. Singer explains, “The clever cult leader or mind manipulator manages to use the innate tendencies toward group conformity that we bring with us as a powerful tool for change.”584
Cults Inside Out: How People Get in and Can Get Out Page 15