by Neil Postman
It would not be easy to educate teachers to approach matters in this way. Unlike the study of sickness and injustice, the study of error has rarely been pursued in a systematic way. But this does not mean that the subject has no history. There are many honorable books that take human error as a theme. The early dialogues of Plato are little else but meditations on error. Acknowledging that he did not know what truth is, Socrates spent his time exposing the false beliefs of those who thought they did. Erasmus’s In Praise of Folly also comes to mind, as does Jonathan Swift’s Gulliver’s Travels. In a more modern vein, one thinks of Jacques Ellul’s A Critique of the New Commonplaces, Stephen Jay Gould’s The Mismeasure of Man, I. A. Richards’s Practical Criticism, Mina Shaughnessy’s Errors and Expectations, and S. I. Hayakawa’s Language in Thought and Action.
Such books are not normally included as part of the education of teachers. Were they to be used, teachers would be likely to come to three powerful conclusions. The first is that everyone makes errors, including those who write about error. None of us is ever free of it, and we are most seriously endangered when we think we are. That there is an almost infinite supply of error, including our own, should provide teachers with a sense of humility and, incidentally, assurance that they will never become obsolete.
The second conclusion is that error is reducible. At present, teachers consume valuable time in pointless debates over whether or not intelligence is fixed, whether it is mostly genetic or environmental, what kinds of intelligences exist, and even how much intelligence one or another race has. No such debates are necessary in the case of error. Error is a form of behavior. It is not something we have; it is something we do. Unlike intelligence, it is neither a metaphor nor a hypothetical construct whose presence is inferred by a score on a test. We can see error, read it, hear it. And it is possible to reduce its presence.
The third conclusion is that error is mostly committed with the larynx, tongue, lips, and teeth—which is to say, error is chiefly embodied in talk. It is true enough that our ways of talking are controlled by the ways we manage our minds, and no one is quite sure what “mind” is. But we are sure that the main expression of mind is sentences. When we are thinking, we are mostly arranging sentences in our heads. When we are making errors, we are arranging erroneous sentences. Even when we make a nonverbal error, we have preceded the action by talking to ourselves in such a way as to make us think the act is correct. The word, in a word, brings forth the act. This fact provides teachers with a specific subject matter in which they may become “experts”: Their expertise would reside in their knowledge of those ways of talking that lead to unnecessary mischief, failure, misunderstanding, and even pain.
I believe Bertrand Russell had something like this in mind when he said that the purpose of education is to help students defend themselves against “the seductions of eloquence,” their own “eloquence” as well as that of others. As I have previously mentioned, the ancient Greeks—that is, the Sophists—believed that the study of grammar, logic, and rhetoric would provide an adequate defense. These arts of language were assumed to be what may be called “meta-subjects,” subjects about subjects. Their rules, guidelines, principles, and insights were thought to be useful in thinking about anything.
In poking fun at those who saw no purpose in learning about language, Erasmus (in his In Praise of Folly) wrote with sharp irony: “… what use of grammar, where every man spoke the same language and had no further design than to understand one another? What use of logic, where there was no bickering about the double-meaning words? What need of rhetoric, where there were no lawsuits?”
He meant to say that as humans we will always have difficulty understanding one another, will always bicker about the meaning of words, always claim we have been injured by another. There is nothing that happens among humans that is not instigated, negotiated, clarified, or mystified by language, including our attempts to acquire knowledge. The Greeks, and, indeed, the medieval Schoolmen, understood well something we seem to have forgotten—namely, that all subjects are forms of discourse and that therefore almost all education is a form of language education. Knowledge of a subject mostly means knowledge of the language of that subject. Biology, after all, is not plants and animals; it is a special language employed to speak about plants and animals. History is not events that once occurred; it is language describing and interpreting events, according to rules established by historians. Astronomy is not planets and stars but a special way of talking about planets and stars, quite different from the language poets use to talk about them.
And so a student must know the language of a subject, but that is only the beginning. For it is not sufficient to know the definition of a noun, or a gene, or a molecule. One must also know what a definition is. It is not sufficient to know the right answers. One must also know the questions that produced them. Indeed, one must also know what a question is, for not every sentence that ends with a rising intonation or begins with an interrogative is necessarily a question. There are sentences that look like questions but cannot generate any meaningful answers, and, as Francis Bacon said, if they linger in our minds, they become obstructions to clear thinking. One must also know what a metaphor is, and what is the relationship between words and the things they describe. In short, one must have some knowledge of a metalanguage—a language about language—to recognize error, to defend oneself against the seductions of eloquence.
In a later chapter, I will offer a more detailed description of what such a metalanguage might consist of for modern students. Here, I should like to suggest some other means of educating students to be error detectors—for example, that all subjects be taught from an historical perspective. I can think of no better way to demonstrate that knowledge is not a fixed thing but a continuous struggle to overcome prejudice, authoritarianism, and even “common sense.” Every subject, of course, has a history, including physics, mathematics, biology, and history itself. I have previously quoted William James to the effect that any subject becomes “humanistic” when taught historically. His point almost certainly was that there is nothing more human than the stories of our errors and how we have managed to overcome them, and then fallen into error again, and continued our efforts to make corrections—stories without end. Robert Maynard Hutchins referred to these stories as the Great Conversation, a dynamic and accurate metaphor, since it suggests not only that knowledge is passed down from one thinker to another but modified, refined, and corrected as the “conversation” goes on.
To teach about the atom without including Democritus in the conversation, electricity without Faraday, political science without Aristotle or Machiavelli, astronomy without Ptolemy, is to deny our students access to the Great Conversation. “To remain ignorant of things that happened before you were born is to remain a child,” Cicero said. He then added, “What is a human life worth unless it is incorporated into the lives of one’s ancestors and set in an historical context?” When we incorporate the lives of our ancestors in our education, we discover that some of them were great error-makers, some great error-correctors, some both. And in discovering this, we accomplish three things. First, we help students to see that knowledge is a stage in human development, with a past and a future. Second (this would surely please Professor E. D. Hirsch, Jr.), we acquaint students with the people and ideas that comprise “cultural literacy”—that is to say, give them some understanding of where their ideas come from and how we came by them. And third, we show them that error is no disgrace, that it is the agency through which we increase understanding.
Of course, to ensure that the last of these lessons be believed, we would have to make changes in what is called “the classroom environment.” At present, there is very little tolerance for error in the classroom. That is one of the reasons students cheat. It is one of the reasons students are nervous. It is one of the reasons many students are reluctant to speak. It is certainly the reason why students (and the rest of us) fight so hard to justify what they think they know. In varying degrees, being wrong is a disgrace; one pays a heavy price for it. But suppose students found themselves in a place where this was not the case? In his book Mindstorms, Seymour Papert contends that one of the best reasons for using computers in the classroom is that computers force the environment to be more tolerant of error. Students move toward the right answer (at least in mathematics) by making mistakes and then correcting them. The computer does not humiliate students for being wrong, and it encourages them to try again. If Papert is right, then we do, indeed, have a good reason for having students use computers. Of course, if he is right, it is also an insult to teachers. Is it only through the introduction of a machine that the classroom can become a place where trial and error is an acceptable mode of learning, where being wrong is not a punishable offense?
Suppose teachers made it clear that all the materials introduced in class were not to be regarded as authoritative and final but, in fact, as problematic—textbooks, for example. (And here is my more serious answer to the teacher who wondered what we would do without them.) It is best, of course, to eliminate them altogether, replacing them with documents and other materials carefully selected by the individual teacher (what else is the Xerox machine for?). But if elimination is too traumatic, then we would not have to do without them, only without their customary purpose. We would start with the premise that a textbook is a particular person’s attempt to explain something to us, and thereby tell us the truth of some matter. But we would know that this person could not be telling us the whole truth. Because no one can. We would know that this person has certain prejudices and biases. Because everyone has. We would know that this person must have included some disputable facts, shaky opinions, and faulty conclusions. Thus, we have good reason to use this person’s textbook as an object of inquiry. What might have been left out? What are the prejudices? What are the disputable facts, opinions, and conclusions? How would we proceed to make such an inquiry? Where would we go to check facts? What is a “fact,” anyway? How would we proceed in uncovering prejudice? On what basis would we judge a conclusion unjustifiable?
Professor Hirsch worries about such an approach, indeed, condemns it, because he believes that by learning about learning, students are deflected from getting the facts that “educated” people must have. But to proceed in this way permits students to learn “facts” and “truths” in the text as one hopes they will, and it also permits them to learn how to defend themselves against “facts” and “truths.” Do we want our students to know what a noun is? The text will tell them, but that is the beginning of learning, not the end. Is the definition clear? Does it cover all cases? Who made it up? Has anyone come up with a different definition?
Do we want students to know what a molecule is? The text will tell them. But then the questions begin. Has anyone ever seen a molecule? Did the ancients believe in them? Was a molecule discovered or invented? Who did it? Suppose someone disbelieved in molecules, what then?
Do we want students to know about the causes of the Revolutionary War? A text will give some. But from whose point of view? And what sort of evidence is provided? What does objectivity mean in history? Is there no way to find out the “real” truth?
If students were occupied with such inquiries, they would inevitably discover the extent to which facts and truth have changed, depending upon the circumstances in which the facts were described and the truths formulated. They will discover how often humans were wrong, how dogmatically they defended their errors, how difficult it was and is to make corrections. Do we believe that our blood circulates through the body? In studying the history of biology, students will discover that 150 years after Harvey proved blood does circulate, some of the best physicians still didn’t believe it. What will students make of the fact that Galileo, under threat of torture, was forced to deny that the Earth moves? What will students think if they acquaint themselves with the arguments for slavery in the United States?
Will our students become cynical? I think not—at least not if their education tells the following story: Because we are imperfect souls, our knowledge is imperfect. The history of learning is an adventure in overcoming our errors. There is no sin in being wrong. The sin is in our unwillingness to examine our own beliefs, and in believing that our authorities cannot be wrong.
Far from creating cynics, such a story is likely to foster a healthy and creative skepticism, which is something quite different from cynicism. It refutes the story of the student learner as the dummy in a ventriloquism act. It holds out the hope for students to discover a sense of excitement and purpose in being part of the Great Conversation.
Since I began this chapter with three ideas that were not taken as seriously as they were intended, I will end it with another one that is likely to have the same fate. I suggest the following test be given in each subject in the curriculum. We might think of it as the “final” exam:
Describe five of the most significant errors scholars have made in (biology, physics, history, etc.). Indicate why they are errors, who made them, and what persons are mainly responsible for correcting them. You may receive extra credit if you can describe an error that was made by the error corrector. You will receive extra extra credit if you can suggest a possible error in our current thinking about (biology, physics, history, etc.). And you will receive extra extra extra credit if you can indicate a possible error in some strongly held belief that currently resides in your mind.
Can you imagine this question being given on the SATs?
7 • The American Experiment
I have before me the Report of the New York State Curriculum and Assessment Council (dated April 1994). Its name will tell you, straight off, that reading it is likely to be a painful experience, since reports produced by councils are not known for easy comprehensibility, let alone literary style. Nonetheless, I have read it, and find in it the standard-brand stuff, meaning that it is filled with clichés, and thoroughly exhausted ones at that. The report is intended to bring to life (so it says) the recommendations “pursuant to A New Compact for Learning which was adopted by the Board of Regents in 1991.” Among the “key principles” of this New Compact is the doctrine that “all children can learn,” which leads one to suppose that the Old Compact assumed that only some children can learn, or maybe even only a few. Another of its key principles is that education should “aim at mastery,” an idea that probably has never occurred to teachers before, and explains why thirty distinguished educators were needed to contribute their collective originality to the council. A third principle (there are only six) is that education should reward success and remedy failure, which is actually a fairly interesting principle if one is allowed to discuss it. For example, an argument can be made (didn’t I come close to making it in the last chapter?) that we would do much better if we rewarded failure and remedied success.
I could go on in this petulant way, but it is not my intention to analyze this report. There can’t be many people in the world, in the United States, maybe even New York State, who could care very much about what ideas the New York Board of Regents experts have come up with. But there is one idea that they didn’t come up with that is worth the notice of many people. As an appendix to the report, there is included a list of forty-one goals directed toward “what children should be, know, and be able to do.” The goals are stated in a form that specifies what students in elementary, middle, and secondary school must understand, acquire, develop, apply, respect, and practice. As you can imagine, there is plenty to be done here, including a few things that probably aren’t any of the school’s business. But putting that last phrase aside, students are expected, for example, to develop self-esteem, understand people of different cultural heritages, and develop knowledge and appreciation of the arts. There are many other goals along these lines with which no one could disagree. But there is one that is, as we say, conspicuous by its absence, at least to me. I refer to the goal of “acquiring and/or deepening a love of one’s country.” One would have thought that among forty-one goals designed for students going to school in America, and going to school free of charge, and pretty close to as long as they wish, at least one of them would concern promoting an affection, even if a muted one, for their country. There is, I must acknowledge, a goal that says students should acquire knowledge of the “political, economic, and social processes and policies in the United States at the national, state, and local levels.” This strikes me as rather cold and distant language, especially when it is followed by a suggestion that students learn the same sort of thing about other countries. There is, also, something about students’ learning to respect civic values and acquiring attitudes necessary to participate in democratic self-government. I assume the authors of this goal would accept America as an example of democratic self-government, although they do not explicitly say so. And as for the attitudes necessary to participate in democratic self-government, they do not include respect, let alone affection, for America’s traditions and contributions to world civilization. They do include a list of “values” students should accept, such as justice, honesty, self-discipline, due process, equality, and majority rule with respect for minority rights, each of which is not so much a “value” as it is a focal point of a great and continuous American argument about the meanings of such abstract terms. That students may not have an opportunity to learn about these arguments is suggested by the fact that nowhere included in the list of goals is that of acquiring knowledge of the history of America.