Honoring the Self
Symbiotic dependency is not a foundation for powerful, passionate love between a man and a woman. When such children marry and fantasy collapses to reality, disillusionment and mutual blame are commonly the result. Passionate love, on the other hand, ignites in a context of separation and individuation.
We now turn to the cognitive realm.
Maturity, in its broadest sense, is the state of being fully grown or developed. Cognitive or intellectual maturity is successful development of a human consciousness. The most famous and influential student of cognitive development is Jean Piaget.61 No brief summary can do justice to the richness of his observations and researches in this area. Here I offer only the briefest of sketches.
Piaget divides cognitive development into four major periods; these are subdivided into a number of stages and sub-stages. Lasting from birth through the first eighteen to twenty months, the sensorimotor period concerns preverbal intellectual development. This is the very period in which the infant passes through the early stages of the transition from submergent consciousness to explicit consciousness, from non-ego to ego, from nonself to self. Here Piaget’s studies track the child’s way of interacting with the world, from the primitive reflex sucking, hand movements, and random eye movements of the neonate to the stage at which the child uses internalized visual and motor symbols to invent new means of solving problems, at a still fairly primitive level.
During the preoperational period, which extends approximately until the child enters school, children acquire the ability to use symbols and language. They are not yet able to construct chains of reasoning, note contradictions, or adapt what they say in ways appropriate to the needs of the listener.
The years between the start of schooling and the onset of puberty are the period of concrete operations, during which children acquire a coherent cognitive system with which they are able to understand their world and effectively interact with it, and into which they can fit unfamiliar experiences.
Adolescence marks the start of the period of formal operations, when boys and girls become capable of thinking propositionally, of conceptualizing, and of using hypotheses. Full mastery of the level of formal operations entails the ability not only to conceptualize but to think about thinking, to reflect on one’s own reasoning processes, which, according to Piaget, represents the highest level of intellectual development. It is also the level that is least often mastered completely. Rarely do individuals reach it on their own; in most cases a good deal of training and education is needed for the cultivation of the cognitive self-examining and self-critical faculty.
Once again we can note a theme mentioned earlier: progressive movement to higher and higher levels of abstraction, culminating in the stage in which abstract thought becomes its own matrix.
In the early stages of development, then, a child knows only perceptual concretes; he or she does not know abstractions or principles. A child’s world is only the immediate now; the ability to think, plan, or act on a long-range basis is not yet developed; the future is largely unreal. At this stage, a high level of dependency is naturally inevitable, even though great differences are already observable among children with regard to independence and self-assertiveness.
As the child grows, the intellectual field widens. The child learns language, begins to grasp abstractions, begins to generalize, learns to make increasingly subtle discriminations, learns to look for principles, progressively acquires the ability to project a more and more distant future; the child rises from the sensory-perceptual level of consciousness to the conceptual level. Entailed in this progression is a growing capacity for objectivity—an ability to perceive persons, situations, and facts apart from the individual’s own desires, fears, and needs. This capacity for objectivity is an essential of independence, of autonomy.
Intellectual maturity is the ability to think in principles; it presupposes both the conceptual function and the capacity for objectivity. Of course, cognitive or intellectual development is not entirely separable from moral and emotional development. There are persons who are brilliant at conceptualizing principles in higher mathematics or the stock market but who become helplessly blind to abstractions and principles when their focus is on personal problems with their spouses, children, friends, or associates at work. Perhaps I should say that an ultimate manifestation of maturity is the ability to think in principles about oneself.
I need hardly stress how rare is the ability to have the same perspective on one’s own behavior that one has on the behavior of another person.
In the absence of this level of maturity, there is always some dependency, no matter how responsible the person may be in other aspects of his or her life. It is our spouses, our children, our friends, our work associates who are left to absorb the consequences of our underdeveloped consciousness.
Maturity entails the ability to perceive the fairness or unfairness, the right or wrong, the justice or injustice, of our own behavior and the behavior of others with equal clarity. This is one of the meanings of objectivity. It is also one of the manifestations of well-realized autonomy. It is not “unselfishness” but a triumph of selfhood—a triumph of individuation.
This leads us directly to the fourth aspect of self-development that I would like to consider: the progression toward moral autonomy.
While the ability to make appropriate moral or ethical discriminations is obviously tied in the most intimate way to the ability to make cognitive discriminations, it does not follow that a high level of cognitive development automatically guarantees a correspondingly high level of moral development. Moral development is a separate track in an individual’s evolution.
Contrary to the teachings of behaviorists and social-learning theorists, whose accounts of morality and moral behavior are put forth exclusively in terms of social conditioning, positive and negative reinforcement, and the like, there exists within the growing human organism the need to generate moral choices, decisions, and discriminations; this is intrinsic to the developmental process. And it is easy enough to perceive why it would have to be. Reality continually confronts us with a wide variety of alternatives; we require a code of values to guide our actions. Moral values are not an arbitrary invention of society or of religion; they are biological, in that they are requirements of survival and effective functioning. I shall elaborate on this point when I discuss the relationship of morality and self-esteem.
Just as a newborn infant has no sense of ego or self, so it has no awareness of morality. A capacity for ethical judgment slowly evolves and goes through various stages in the course of an individual’s growth. While psychologists specializing in the process of development are by no means in full agreement concerning how the self as moral agent evolves, there is fairly strong agreement that successful culmination of this process is a condition of moral autonomy, in which the individual behaves in ways he or she judges to be moral, not because of fear of punishment or social disapproval, and not because of blind, conformist rule following, but rather because of an authentic, firsthand assessment of the right and wrong of given situations.41,46
I am not concerned, in the immediate context, with the important question of how we justify our assessments; that will have to await the discussion of ethics in a later chapter. Here I am concerned only to observe that autonomy in the field of ethics represents a very advanced level in the maturational process and is not widely attained.
We have already seen, in our discussion of guilt, how commonly people reproach themselves for violating standards not of their own choosing: the values and expectations of significant others, of parents and other authority figures. Often, when persons are wrestling with moral dilemmas, they are unaware of the different voices debating within them, the voice of mother pointing to one path, the voice of father to another, the voice of a teacher, spiritual guru, or psychotherapist to a third—and underneath all of this, the often faint voice of the individual involved, weakly struggling to have a say in the decision-making process.
One of the characteristics of a high level of autonomy is respect for inner signals, the voice of the authentic self, which often contradicts the teachings of conventional morality. This self might tell one to be compassionate when conventional morality says to be stern, or to be angry when conventional morality says to be humble, or to be proud when conventional morality says to be self-disparaging, or to be challenging when conventional morality says to be compliant.
While psychologists generally agree that imitative rule following represents a fairly early stage of a child’s moral development, a stage to be outgrown and transcended with subsequent knowledge and maturity, it is difficult to escape the conclusion that imitativeness and conformity to authority are more the norm than the exception among most adults. Lest this judgment seem too severe, let us shift our perspective from the intimately personal to the sociological.
Every twenty seconds on this planet, one human being kills another human being.
To interpret the above statistic as evidence of humanity’s innate cruelty or “selfishness” is to miss the point of the horror completely. Viewed globally, the overwhelming majority of these killings were not committed for personal gain. They do not fall into the category of individual crime. Most of the persons who did the killing were obeying authority, fighting for a cause, submerging self and personal judgment in the service of something allegedly greater than themselves, more important than their “private egos” or “individual consciences.”
Recall, in this context, the famous experiment conducted at Yale University by Stanley Milgram and reported in his book Obedience to Authority. Since the experiment is so well known, my summary will be brief.
In a brilliantly executed research study, Milgram arranged that a group of experimental subjects, drawn from the general population, would be led to believe that they were serving the goals of science by administering increasingly severe and painful electric shocks to other volunteer subjects who failed to answer certain questions correctly. They were told that they were taking part in a study of the effects of punishment on learning.
Unaware that this latter group of subjects were, in effect, playacting, that the screams and cries to be released were merely a performance, and that the electric shocks were not real, the “aggressor” subjects were being tested, in a daringly imaginative way, on the limits of their willingness to surrender moral autonomy to the voice of authority.
Numerous controls were built into the design of the experiment to rule out any element of personal aggressiveness. The presiding experimenter had absolutely no power over the volunteer subjects and no financial reward to offer for compliance. Every factor was effectively eliminated except one: the disposition to obey perceived authority.
In advance of conducting the experiment, Milgram invited a group of psychiatrists to predict the outcome. “With remarkable similarity they predicted that virtually all the subjects would refuse to obey the experimenter,” he reports. The thirty-nine psychiatrists who answered Milgram’s questionnaire shared the view that “most people would not go beyond 150 volts (i.e., when the victim asked the first time to be released). They expected that only 4 percent would reach 300 volts, and that only a pathological fringe of about one in a thousand would administer the highest shock on the board.”53
Under the instructions of the presiding scientist/authority figure, ignoring the cries and screams of the “victims,” more than 60 percent of the Yale subjects kept pressing the dummy buttons up to the limit of 450 volts, even though this voltage was clearly marked “Danger—severe shock.”
This experiment has been repeated in a number of universities throughout the world, with essentially the same results. In other countries, the percentage of persons who obeyed to the upper limit of voltage was generally higher than at Yale. In Munich it was 85 percent.
Milgram writes:
For a man to feel responsible for his actions, he must sense that the behavior has flowed from “the self.” In the situation we have studied, subjects have precisely the opposite view of their actions—namely, they see them as originating in the motives of some other person. Subjects in the experiment frequently said, “If it were up to me, I would not have administered shocks to the learner.”53
As suggested by the psychiatrists’ predictions, most people are astonished when they learn the results of this experiment. They are certain that in the same circumstances they would act differently. They profess not to understand how such cruelty is possible among civilized human beings, especially fellow Americans.
The experiment is well known; I submit that its meaning is not. One of the theses of this book is that most of us have been trained to push those buttons since the day we were born. This training is not the result of some person’s or group’s malevolence but is inherent in our methods of child-rearing and education. We deal here with the whole process by which a new human being is prepared for life in society, a process that throws countless obstacles in the path of developing moral autonomy.
We are taught very early to respect external signals above internal signals, to respect the voice of others above the voice of self. A “good” child is one who “minds” his or her elders, who “behaves.” We are taught to identify virtue with compliance with the wants and expectations of others. We are taught conformity as the ultimate civic good. We are taught obedience as the price of love and acceptance. We are taught, sometimes explicitly, sometimes implicitly, and from the widest possible variety of sources, that the self is evil, or unimportant, or petty, or something to be tamed and suppressed, or negligible in the vast scheme of things, or merely an illusion—and that to honor the self in the sense I have been developing is to alienate the individual from family or community or society or God or the Universe or the Whole.
* * *
Very few forces within our culture actively encourage intellectual or moral autonomy. The more common goal of parents and teachers is social adaptation.
Generally, schools are places where children learn not to think, but to follow the rules. That favorite word of psychologists and sociologists, socialization, which describes this process of learning the rules, is also used in a political context to signify “given over to public ownership.”
I vividly recall my own experiences in grade school and high school. I quickly learned the two most important values in that world: the ability to remain silent and motionless for long periods of time and the ability to march with my fellow students in a neat row from one classroom to another. In other words, don’t cause the teacher trouble. School was not a place to learn independent thinking, to have one’s self-assertiveness encouraged, to have one’s autonomy nourished; it was a place to learn how to fit into some nameless system created by some nameless others and called “the world” or “society” or “the way life is.” And “the way life is” was not to be questioned.
Many brilliant minds have commented on their dismal experiences in school, their boredom, their lack of appropriate intellectual stimulation and nourishment, their sense that the last thing the educational system was designed for was the cultivation of minds. Schools are interested, not in autonomy, but in the manufacture of someone’s notion of “good citizens.”
“In education,” writes Carl Rogers in On Becoming a Person, “we tend to turn out conformists, stereotypes, individuals whose education is ‘completed,’ rather than freely creative and original thinkers.”
What makes this state of affairs particularly unfortunate is that schools represent priceless opportunities to undo or at least counteract a child’s negative experiences at home. Teachers have a unique opportunity to offer the child an alternative view of self and the world, to give a child the experience of having his or her feelings, dignity, and mind respected, and thereby to provide a powerfully healing transition to adolescence and adulthood. And this sometimes does happen—but when it does it is the exception, not the norm.
Commenting on the disposition of parents and teachers to demand obedience and conformity as primary values, to discourage normal and healthy progress toward autonomy, Piaget writes in The Moral Judgment of the Child, “If one thinks of the systematic resistance offered by pupils to the authoritarian method, and the admirable ingenuity employed by children the world over to evade disciplinarian constraint, one cannot help regarding as defective a system which allows so much effort to be wasted instead of using it in cooperation.”
None of the foregoing is offered as an argument for giving a child unrestricted freedom. Children need limits. They need guidelines. They need them for their security, and they need them for their survival. Teachers and parents who refuse to take a stand on anything with children, who refuse to uphold any values, or who convey the notion that all moral principles are old-fashioned or irrelevant, do not do their children a service. Adults do possess greater knowledge than children. The question is, How is this knowledge to be transmitted? One can teach with respect, or one can teach with intimidation. One can speak to a child’s intelligence or to his or her fear of punishment. One can offer a child reasonable choices within sane and comprehensible ground rules, or one can lay down the law, as is done in the army. One can accept a child’s making mistakes as a natural part of the growth process, or one can inculcate a terror of mistakes by reacting with ridicule or harsh punitiveness.
Let us pause for a moment on this issue of the right to make mistakes, because it is of supreme importance to the art of teaching and to the art of parenting. Many a client in therapy, when given the sentence stem “If I had been given permission to make mistakes—,” has responded with such endings as:
I wouldn’t be such a procrastinator.
I’d be willing to tell people what I think.