Teaching What Really Happened: How to Avoid the Tyranny of Textbooks and Get Students Excited About Doing History

by James W. Loewen


  Absent solid evidence as to whether people and ideas from China or Egypt reached the Americas before 1492, for instance, ideas from archaeologists’ own times have unduly influenced their conclusions. The belief that pyramids had to have come from Egypt, for example, fit with the Nadir of race relations, that terrible period from 1890 to 1940 when the United States grew more and more racist. The possibility that Native Americans might have built the pyramids all by themselves seemed unlikely to archaeologists at that time; they assumed that such an “advanced civilization” required “white” intelligence.15

  As the Nadir waned, archaeologists and anthropologists came to feel that minimizing diffusion gave Native Americans full credit. Doing so also helped the new generation of social scientists look more professional, since they sought more rigorous proof than their amateur predecessors had required. Laudable as these motivations were, presentist bias is still bias, and it tended toward the opposite extreme. During and after the Civil Rights Movement, for example, archaeologists were skeptical even that the Norse had reached Canada, despite detailed evidence from Icelandic sagas. Applying harsher standards to claims of cultural diffusion than to claims of independent American development is not defensible, even when done for the finest reasons.

  TODAY’S RELIGIONS AND YESTERDAY’S HISTORY

  One particular form of presentism—projecting ideas from present-day religions onto the distant past—can make it particularly hard for teachers to connect with some students and their parents. Consider how the questions posed in the title of this chapter get answered in Utah. In my many workshops there, I have encountered seven commonly held local answers to the question of how Native Americans came to the Americas. They are:

  the common textbook answer, stemming from the old archaeological consensus: people walked across the Bering Strait from Siberia about 13,000 years ago

  newer answers from archaeology, DNA research, and other sciences: by boat or perhaps on foot much earlier, around 35,000 or even 70,000 years ago

  one Mormon answer: ancient seafaring Israelites, the so-called lost tribes of Israel, came about 2,600 years ago. (Most Mormons still believe this. DNA research has convinced many Mormon leaders, however, that most people came to the Americas from Siberia; the Israelites were only a small later addition.)16

  a Navajo answer: Long ago, after being rejected by three former worlds owing to quarreling, the first people came up into this world but found Pueblo people already here

  the most common Fundamentalist Christian answer: from Adam and Eve, who were formed in about 4004 BC. Fundamentalists are not sure just how Native Americans arrived, but it must have been no more than 6,000 years ago

  a Shoshoni answer: Long ago, from a water jug carried by a mythical being (Coyote)

  a Ute answer: we have always been here

  How is a teacher to cope with that panoply of preexisting views?

  Teachers should not shy away from discussing religion at any point in U.S. history, including here at the beginning. Avoiding a topic only disempowers teachers. But teachers should not impose their religious preference or point of view in the classroom. Indeed, it’s usually a poor idea even to share one’s religious beliefs with students. I learned this early in my career when, in response to a student’s question, I replied honestly that I (then) was an atheist.17 A curtain descended in front of his eyes, and I was unable to reach him for the rest of the semester.

  The way out is to teach students the word “compartmentalize” and what it means. Compartmentalizing is not mere hypocrisy, or, if it is, everyone does it at times. I know I do. Years ago I contemplated the northern lights, for example, in northern Wisconsin. As a college student, I worked summers at a canoe base for Explorer Scouts just east of Boulder Junction. For three straight nights one summer in the early 1960s, the northern lights covered almost the whole sky, to within 30° of the southern horizon. This display went far beyond the usual pale yellow-green bands. We saw curtains of red, of white, of yellow-green with vertical folds, and other formations as well. They swayed and soared. Sometimes the whole sky throbbed. For two nights, the camp staff stayed out in a field in our sleeping bags, eyes on the sky, speaking in whispers, not sleeping until well past midnight, marveling at what we were witnessing.

  Astronomy had been a major interest of mine in high school, so I understood the scientific explanation. I knew that prior sunspot activity had thrown off electrons and protons. Some got captured by Earth’s magnetic field, which drew them toward the north and south magnetic poles. They then collided with molecules in the upper atmosphere, causing them to glow. At the time, I think I even knew how that glow was created, though I don’t now. Nor do I understand what caused the folded drapery effect, the rapid movements, or the throbbing.

  They were so astounding that they communicated a spiritual energy as well. Even decades later, typing these words, I get stirred again, recalling the power that they had over all of us. Did this display prove the existence of a higher power? Well, no, it did not. It did show that science does not offer the only way to fathom the northern lights.

  It is not only the magnificent sights of nature that inspire awe. Walt Whitman instructs us that nature’s mundane aspects are equally miraculous:

  I believe a leaf of grass is no less than the journey-work of the stars….

  And a mouse is miracle enough to stagger sextillions of infidels.18

  When a person dies, one perfectly reasonable response is, “God have mercy on her soul.” Another is, “What did she die of?”

  Likewise, there is more than one way to understand the distant past. It will not do, however, to let a student claim that God formed the world a few thousand years ago and formed today’s animals fully modern within six days.19 As for the question posed in this chapter, teachers want students to know and understand the first and second commonly held answers listed above. Teachers need not argue with answers three through seven, which are in essence religious. They can say frankly that those answers are appropriate in some circumstances, but not in school. After all, professors of epidemiology at Wake Forest Medical School, affiliated with North Carolina Baptist Hospital, believe in evolution, at least while teaching epidemiology or treating tuberculosis. Evolution is not compatible with the notion that God formed people whole a mere 6,000 years ago. What these professors believe in church on Sunday morning is another matter. Answers one and two to the question of how and when people first reached the Americas do not fit in some religious settings, just as the scientific explanation for the northern lights did not seem appropriate to me lying in a Wisconsin meadow at night.

  No longer do Americans pay attention to medieval religious views of the solar system, owing to advances in the science of astronomy. No longer do people pay much attention to religious views of the human body, owing to the sciences of anatomy and physiology. Religious views about our earliest history still have some influence, precisely because knowing the ancient past is so hard. Nevertheless, they need to be labeled religious presentism and kept separate from the historical and scientific approach.

  CONCLUSIONS ABOUT PRESENTISM

  Presentism is evident in archaeology about the distant past, especially in earlier archaeology that is now regarded as manifestly preposterous. That’s why it’s good to bring it up in the first unit of a U.S. history course. Presentism is a complicated charge, however, not to be tossed around lightly. Therefore, teachers need to complexify the discussion before moving on. After all, we always look at the past from where we stand. We have no alternative. Students especially have little interest in learning about the past for its own sake. Teachers always need to “make it relevant” to the present. Moreover, simply lobbing the charge of “presentism” does not prove that the history being challenged is biased. Evidence is still required.

  Presentism has often been invoked inappropriately about moral issues. For example, some people charge that criticizing Columbus for enslaving American Indians is presentist, because people did not consider slavery wrong in 1493. This charge is unconvincing. First, it implies a thorough-going moral relativism: if a culture deems a practice OK, who are we to say otherwise? But there are moral absolutes. The “golden rule,” found in many religious and philosophical systems, is one. As we (or Columbus, for that matter) would not want to be chained and confined to a ship’s hold and forced across an ocean to work for others for nothing, neither is it right to so chain and confine others.

  As well, in 1493 many people opposed Columbus’s plan to enslave Arawaks and send them across the Atlantic to Spain and the Canaries. Most of these opponents were Arawaks, to be sure. To leave them out of the moral calculus nonetheless condemns us to write white history, not history. Furthermore, when historians look closely, they usually find that even among the oppressors, some at the time stood for justice and told the truth about what was going on. Among the Spaniards, these included Antonio de Montesinos, the Dominican priest who in 1511 denounced the Europeans on Haiti for their treatment of American Indians, and Bartolomé de las Casas, the first great historian of the Americas.20

  Related to moral presentism is the error historians call “Whig history.” Herbert Butterfield coined the term in his 1931 book, The Whig Interpretation of History. He used it to criticize historians associated with the Whig Party in Britain around the middle of the 1800s, like Thomas Macaulay. They wrote about the past in a way that made Britain’s dominion over other countries and the Whig Party’s dominion over Britain seem altogether right and natural.

  “Whig history” has come to mean any portrayal of the past that makes the present seem foreordained and natural. Because the term derives from specific circumstances in Great Britain, however, rather than teach students “Whig history,” I prefer a more general sociological term: chronological ethnocentrism.

  CHRONOLOGICAL ETHNOCENTRISM

  Chronological ethnocentrism makes explicit the assumption that was implicit in Whig history: that our society and culture are better than past ones. That’s why we developed whatever institution we are now trying to explain, and that’s why we don’t have to try very hard to explain it. It’s just “natural” that people would do things better, as time passed and they and society evolved. A corollary of chronological ethnocentrism is that we are smarter and more knowledgeable than past peoples.21

  In the nineteenth century, chronological ethnocentrism provided a splendid rationale for imperialism. Whites, allegedly more advanced, saw themselves as performing a necessary service by exploiting natural resources that natives were too backward to develop. It was Manifest Destiny, Americans said, for the U.S. to expand to the Pacific, maybe to Panama and Cuba as well. At its worst, this kind of thinking repeatedly led to genocides. Spaniards wiped out most Canary Islanders in the fifteenth century, Anglo Americans killed most Pequot Indians in 1637 or dispersed the survivors into the West Indies, the French shipped virtually the entire Natchez nation in chains to the West Indies in 1731, British policies and diseases ended Tasmanian culture between 1803 and 1833, Germans slaughtered the Hereros in Namibia in 1904, and the list goes on and on. It didn’t seem to matter, because history had already consigned these peoples to its backwaters. Their annihilations, while tragic, amounted to mere potholes on our otherwise smooth highway to the present, in this view. After all, things have gotten better since. Thus, despite such horrors, chronological ethnocentrism lets textbooks present our American journey—and Western civilization in general—as a morality play, but without suspense, for all along students know that good triumphs in the end.

  Chronological ethnocentrism is a distinctive form of presentism, with distinctive and deleterious effects. For one, it can keep students from grasping that individuals—sometimes just a handful—caused things to turn out as they did. The godlike monotone used by textbook authors also has this effect—one reason why textbooks are so dull. By removing contingency, chronological ethnocentrism removes tension from the narrative. Things were bound to turn out as they did. Things were bound to turn out well. Why, then, should students concern themselves?

  PRIMITIVE TO CIVILIZED

  We have seen how chronological ethnocentrism prompts archaeologists and textbook writers to give inadequate attention to the possibility that the earliest settlers came to the Americas by boat. Those “primitives” must have walked. The same thinking imposes the idea of automatic progress as the underlying storyline of U.S. history textbooks from start to finish. Asking how people first got to the Americas provides a splendid moment to point out the inadequacies in this storyline.

  As historians project today’s social relations onto the distant past, they know without even thinking about it that human societies have much more capability now than they did way back then. Consider medicine. In earlier times, societies blamed various factors when a person became ill. She ate too much “hot food” (not always referring to temperature), someone “sang her” (put a hex on her), she did something bad, and so on. People were ignorant of the germ theory of disease. Ignorant of anatomy and physiology too, many societies did not understand how problems with organs like the heart, kidneys, and even gums could cause symptoms elsewhere in the body. Since we moderns know more, we assume we must be smarter.

  Of course, if we thought about the matter directly, we might realize that people probably aren’t getting smarter. On the contrary, we may be growing stupider. I know a man whose job for years was to drill seven holes in a road grader frame as it passed him on an assembly line. I know a woman whose job—also for years—was refolding and stacking clothing after shoppers had unfolded it at K-mart. For them and many others, our modern division of labor does not provide experience in solving varied problems, and hence may not build mental capacity. Place an eighteen-year-old in the “wilderness,” as we civilized folk have learned to call it, and watch (or imagine) what happens. Most likely, he does not know how to find food, shelter, or even north. He cannot tell a poisonous from an edible fungus, snare a rabbit, or start a fire without a match, and maybe not even with one. Not to put down our eighteen-year-old—he can identify thumbnail photos of Paris Hilton and Britney Spears at a glance, from the back. He can take a photo with his cell phone, add commentary, upload it, and post the result on a website. Perhaps the soundest conclusion is that the knowledge and intelligence of modern people may be different—neither more nor less—from that of our primitive counterparts.

  At this point, students need to become sophisticated in their use of two terms: “primitive” and “civilized.” “Civilized” is a problematic word, because it has at least two very different meanings. Anthropologists use it in contrast to “primitive” to denote societies with a complex division of labor. All societies show some division of labor, to be sure. Women do somewhat different tasks from men, and children engage in different activities from adults. In primitive societies, however, the division of labor does not go much further than that. The principal chief of a village does not “chief” for a living—he hunts, fishes, perhaps farms, just like other men. The same holds for the shaman or religious leader. Traditionally, every Inuit woman knew how to soften seal skin and combine it with caribou fur to make boots. Every man knew how to harpoon a seal.

  In our society, few mayors know how to hunt and fish.22 Few women can make footwear. Our jobs have become so specific that the Los Angeles metropolitan area boasts more than 100 businesses whose work is poodle clipping. Such specialization makes our society very capable—not only at clipping poodles well, but also at open-heart surgery, computer design, and a million other tasks. That capability is what distinguishes civilized from primitive societies.

  Unfortunately, “civilized” has quite a different meaning in common discourse: polite, refined, even humane. Students can see the disjoint by considering this question: Did Nazi Germany show a high degree of civilization? An anthropologist would reply, “Of course,” compared to such primitive societies as the Shoshoni Indians of pre-1492 Utah. Which society was more civilized in the sense of polite and humane? The Shoshonis, of course.

  Our society labels some ancient cultures—Iraq, Egypt, China, Rome—“higher civilizations,” in contrast to “more primitive” peoples like Shoshonis or Australian Aborigines. These very terms—“higher,” “primitive,” “civilized”—are suffused by the notion of progress. Many societies apply the same labels to religions, viewing the “animist” or “pagan” beliefs of American Indians as less advanced than “religions of the book” like Judaism, Christianity, and Islam. Earlier, mankind believed in many gods, goes this line of thought; then in fewer and more abstract deities, like the Trinity; and finally, in just one god.

  Anthropologists are uncomfortable ranking anything other than science and technology as higher or lower. Most elements of culture—religion, family structure, language, law, and the like—cannot be graded higher or lower, only different. When we look carefully at schemes that purport to rank religions rationally, for example, we find weaknesses. In a sense, like Christians, Native Americans believed in one god, a “Great Spirit” that infused itself into sacred places like a meadow, a beautiful lake, or an islet. In a sense, Christians believe in several gods—Father, Son, and Holy Ghost for starters, and perhaps Satan, the Virgin Mary, and a whole realm of saints and angels.

  U.S. history textbooks do recognize that some Native American cultures—usually the Incas, Mayas, and Aztecs—were civilized. But not cultures in the United States. “Throughout the continent to the north and east of the … Pueblos,” says The American Pageant, “social life was less elaborately developed—indeed, ‘societies’ in the modern sense of the word scarcely existed.” Such “analysis” exemplifies ethnocentrism. The authors of Pageant admire hierarchical Indian societies—more like our society today—and think little of those American Indian societies whose citizens were more equal. They are still in thrall to the outmoded primitive-to-civilized continuum.

 
