Pandora's Seed

by Spencer Wells


  Religion, too, has provided guidance on these same ethical questions, attempting to define morality partly in terms of an absence of prohibited behaviors, or sins. In addition to murder, most religions have prohibitions on adultery, greed, and so on. The seven deadly sins defined by the Catholic Church are lust, gluttony, greed, sloth, wrath, envy, and pride. Boil them down and they basically fall into four categories: laziness, lust, rage, and wanting more than you have.

  The first three, while considered important, clearly weren’t emphasized as much as the fourth, which gets four sins devoted to it, including what was considered the most serious, pride. Think about it, though: wanting more than you have is actually central to modern life. Without it we would still be sitting around a campfire somewhere on the savanna, as the saying goes. Gordon Gekko’s infamous creed “Greed is good” in the film Wall Street could only have been uttered in a society in which it is possible to accumulate excess—in other words, an agricultural society of the sort that we’ve had for the past 10,000 years. In the world of the hunter-gatherer there can be no excess, since everyone has what he or she needs and accumulating more would make no sense. Why kill more animals than you can eat when the meat will only spoil? But accumulating surpluses is desirable, even necessary, in a society where resources are limited, since by doing so you assure yourself and your family of access to those resources in the future. Wandering from one minimum-wage job to another, living paycheck to paycheck, is not a lifestyle most of us in the modern world would want for ourselves or our children. Essentially, though, this is the life led by hunter-gatherers, and by our ancestors; the difference is that they were always reasonably sure they would find enough of the resources they needed to survive.

  Jean-Jacques Rousseau, the eighteenth-century Genevan philosopher and social critic, is perhaps most famous for his depiction of primitive peoples (as well as humanity in the distant past) as living in a state of grace—a view that has come to be encapsulated by the term “noble savage.” Rousseau felt that humans are inherently good, and that it is society that has corrupted us. Friedrich Engels espoused essentially the same position, as have other writers. The idea is that if we were only able to return to a state of nature (whatever form that might take), all of our problems would be washed away and there would be no need for law enforcement. The opposite view was voiced by Thomas Hobbes in Leviathan. According to Hobbes, the “nasty, brutish, and short” lives suffered by savages could be mitigated only through the apparatus of the state.

  These two opposing views of humanity have been debated for over two centuries, with no clear winner. In the 1960s, as the anti-establishment counterculture gained influence in intellectual circles, Rousseau’s view enjoyed a resurgence. At the influential “Man the Hunter” anthropology conference held in Chicago in 1966, much of the discussion centered on the lifestyle of hunter-gatherers. It was there that Marshall Sahlins coined the phrase “the original affluent society,” referring to the fact that hunter-gatherers seemed to have had everything they needed—though not in excess, of course—and worked far less than most Westerners, allowing them the ultimate affluence: free time.

  Much criticism of Sahlins’s view has focused on the relatively small data set on which it was based and on the fact that the anthropological observations behind it were minimal and potentially biased. Today we seem to have swung in the opposite direction, and Rousseau’s views are often portrayed as naïve. Anthropologists have attempted to show that “primitive” peoples have at least as much capacity for immorality (for lack of a better term) as modern civilizations. Lawrence Keeley, in his book War Before Civilization, marshals an impressive amount of statistical data on violence in such societies, demonstrating that warfare is common and that a large fraction of the people living in such groups die violent deaths in intergroup clashes. The Dani of the New Guinea highlands, for instance, lose one man in three to murder. Even the bloodiest wars of the twentieth century rarely killed more than 10 percent of the adult male population of any particular country (Nazi Germany and the Soviet Union in the Second World War being notable exceptions), and for the average American man today, the lifetime chance of being murdered is roughly one in one hundred. Clearly, Keeley argues, primitive peoples are not living in a state of grace.

  Keeley’s analysis often groups all preindustrial societies together, regardless of their mode of subsistence. When we examine the data on warfare among hunter-gatherers alone, though, his argument becomes a bit more tenuous. As detailed by Keeley in his book, the anthropologist Keith Otterbein compiled data on fifty societies around the world and compared them on the basis of their way of life. Those practicing agriculture and animal husbandry had the highest frequencies of warfare, with more than 90 percent fighting continuously or frequently. Of the societies that “rarely or never” engaged in warfare, most were hunter-gatherers—30 percent of the hunter-gatherer groups studied fell into this category. While the small number of groups surveyed makes the significance of this difference debatable, it is suggestive. The comparison is further complicated by the fact that the world’s remaining hunter-gatherers are typically marginalized peoples living in territories reduced by the surrounding agricultural populations, and thus hardly exist in a “natural” state.

  One potential insight into the issue of hunter-gatherer morality comes from the Moriori, who live in the Chatham Islands of the South Pacific and provide an interesting case study of the effects of subsistence mode on violence. An offshoot of the better-known Maori living farther west, in New Zealand (a Polynesian people renowned for their violent intergroup warfare), the Moriori are thought to have migrated to their homeland around the year 1500. The Chathams were not suited to the typical Polynesian crops grown by the agricultural Maori, however, so the Moriori had to return to hunting and gathering, living off the abundant coastal resources of the islands. Their culture, presumably once as warlike as that of the Maori, changed over time: they became dedicated pacifists, settling arguments with ritualized battles and discussion, since warfare made no sense in their new homeland and with their new way of life. The islands were claimed for Britain in 1791, and in 1835 nearly a thousand Maori carried there by British ships massacred the estimated two thousand Moriori living there. Conquered by their far more violent agricultural cousins, their demise hastened by European diseases, the Moriori were essentially extinguished as a people; within a generation only around one hundred survived.

  Perhaps the Moriori are a fluke, a local oddity. But as we saw in Chapter 2, societies typically engage in warfare when resources are limited and there is a high degree of competition. For hunter-gatherers this is generally not the case, unless they are living in an unusual situation, such as the inhabitants of Jebel Sahaba, in Nubia, who relied on fishing in an increasingly arid environment. During most of human history resources were not limited, and the human population expanded throughout the world. When we developed agriculture, of course, the increase in population density meant that the days of easy abundance were gone; if you wanted or needed more food, a great deal of investment was required to grow it. Tending fields, maintaining irrigation systems, building permanent settlements—all of it meant that there was too much at stake for anyone to wander off and find another place to live and, therefore, that one’s investment was worth fighting for. While we likely entered into our new lifestyle in response to climatic challenges that made hunting and gathering more difficult, once we had done so we created a new set of competitive challenges.

  I’m not advocating a return to a hunter-gatherer lifestyle, of course—merely pointing out that we can learn something about the state of modern society from those ancestors. Some people, wary of the constant struggle to work and accumulate inherent in modern life, have chosen to live outside the system, or even to rebel actively against it. The “turn on, tune in, drop out” attitude espoused by Timothy Leary in the 1960s is simply a recent example of such a rebellion. The Luddites of the early 1800s, English cottage weavers who smashed the factory machinery that threatened their jobs and their traditional way of life, were early opponents of modernization and the industrial era. Less violent was the nineteenth-century Romantic movement, steeped in Rousseau, its proponents eager to return to nature and simpler times. Marxists and Leninists embraced modernity but not the capitalist economic system that had spawned it. The Amish and other religious groups who choose to live a life peripheral to the mainstream of Western society are consciously rebelling against what they see as the corrupting influences of technology and other trappings of modernity. Anti-globalization protestors decry the loss of local industries to an American-led cultural juggernaut, and the Slow Food movement urges us to eat simple, local foods free of the corrupting effects of McDonald’s and KFC. All of these movements, regardless of their particular focus, formed in response to what their members see as the dangers of modernity.

  Over the past half century another anti-progress trend has emerged, one more widespread and potentially dangerous than the more limited movements of the past: fundamentalism. Coalescing in the mid-twentieth century in both the Islamic and the Christian faiths, fundamentalism has increasingly dominated political debates around the world. Born of desperation and anger, and driven forward by charismatic leaders, fundamentalist views provide a focus for people who feel left out of the modern world, offering an alternative vision of how life should be lived. Its rise, and what it might mean for the future, is where we’re headed next.

  FUNDAMENTALISM

  In the 1950s, Egypt was going through an unprecedented transformation. Having been part of the British Empire for a short period during and after the First World War, and a virtual client state of Britain during the three decades after it was declared “independent” in 1922, Egypt finally took its political future into its own hands in 1952, when a group of army officers led by Gamal Abdel Nasser overthrew the government in a coup. Nasser was inspired by what he saw as the success of the Soviet Union, and he immediately set about remaking Egypt on a socialist model. Part of his plan included the secularization of Egyptian society. Though the new regime was still technically Muslim, Nasser and his government believed strongly in the separation of mullah and state: Islam was to be sidelined in pursuit of the primary goal of development along secular, socialist lines.

  All of this was happening at a time when Arabs felt weakened by the creation of Israel in 1948. A Jewish state occupying one of Islam’s holiest cities was something that many could not forgive, and throughout the Arab world the events of 1948 became known as “the catastrophe” (al-Nakba). The Arab-Israeli War, which followed Israel’s declaration of independence at the end of the British Mandate, was won decisively by the Israelis, further deflating the Arab world’s collective ego. It was against this backdrop that some Muslims began to explore alternatives.

  The Muslim Brotherhood, an Islamic cultural group, was founded in the 1920s as Egypt’s fortunes were in decline. By the late 1940s it had millions of members and was a powerful combination of trade union, cultural organization, and political party whose influence reached into every aspect of Egyptian life. The Brotherhood espoused a philosophy that demanded a strict, all-encompassing adherence to Islam—not simply observing its tenets, like praying five times daily or undertaking the hajj, but living Islam. Its insistence on implementing sharia, or religious law, earned it many enemies in the Egyptian government. Though founded not on violent principles but as an educational organization meant to spread the word about Islam, it became increasingly militant over the years. In late 1948 one of its members assassinated the Egyptian prime minister, which led to the retaliatory assassination of the Brotherhood’s founder, Hassan al-Banna, by government agents early the following year.

  Sayyid Qutb (pronounced SIGH-eed KOO-tub) was an Egyptian author and literary critic who became a member of the Brotherhood in 1953, at the age of forty-seven. By this point the organization had been outlawed by the Egyptian government as part of Nasser’s rush to secularize the country. Qutb was among the many Brotherhood members arrested in the fifties, and he was sentenced to fifteen years of hard labor in 1954. His time in prison, understandably, made a strong impression on him, but instead of rebelling openly or giving up and becoming a broken man, he fought back using that most powerful of prisoners’ weapons: the pen. During his years in jail he wrote two books that would turn out to be hugely influential in Islam: a commentary on the Koran, In the Shade of the Koran, and Ma’alim fi al-Tariq (Milestones). While the former was a scholarly, multivolume treatise, the latter was a manifesto.

  In Milestones, published in 1964, Qutb argued that it was time for Islam to reclaim its supremacy as a world religion. The reason for its decline, he maintained, was that Muslims had surrendered to jahiliyyah, or ignorance of God; in other words, secularism was to blame. The only way to reverse course, he said, was to surrender completely to an Islamic way of life: the adoption of sharia and a rejection of the state of jahiliyyah. The latter would be accomplished in part through jihad (struggle), a holy war against corrupt secularism.

  Qutb’s focus on jihad was new, as were the methods he advocated in order to achieve the liberation of Islam. Since the time of Muhammad, Muslims had engaged in struggle in the name of religion; this was the engine behind the spectacular territorial gains made by Muhammad’s followers across the Middle East and North Africa in the decades following his death. Struggle was never seen as a primary goal, or tenet, of the religion, however. Qutb reinterpreted Islam within the scope of the modern world, giving it a new focus. In the same way that Lenin’s interpretation of Marxism was used to justify the Russian Revolution, Qutb hoped that his work would lead to an Islamic revolution.

  In her book on the history of fundamentalism, The Battle for God, Karen Armstrong outlines the tension between two opposing realms of thought that have dominated humanity’s worldview, probably since the origin of our species tens of thousands of years ago. Mythos is a mystical way of viewing the world, one preoccupied with the received meanings of significant events. Logos—the Greek root of the word “logic”—is the realm of rationalism, science, and Enlightenment thought. Mythos is about accepting the spiritual aspects of the world, while logos is concerned with questioning and understanding. For thousands of years human societies have incorporated aspects of both, but in the past few centuries logos has come to the fore. It underpins scientific thought, has provided us with the wonders of modern technology, and has led to unprecedented levels of wealth. In its ascendancy it has also, many would argue, destroyed the old certainties that so many relied on to give their lives meaning. It is this latter view that Qutb was espousing, but with a novel twist: he wanted to apply a logos approach to achieving mythos ends. In other words, the decline of Islam and the rise of global jahiliyyah constituted a problem to be solved, and the solution lay in jihad.

  At the same time that Qutb’s views were receiving widespread attention in the Islamic world, many Americans were feeling the loss of mythos in their own lives. The social upheavals of the 1960s, with their rejection of traditional Christian family values, were a shock to many. The rise of Christian fundamentalism in the United States was, even more than the spread of Qutb’s manifesto, an application of logos thinking to a mythos problem. At the forefront of the movement was Jerry Falwell, a southern preacher who founded his congregation—the Thomas Road Baptist Church—in Lynchburg, Virginia, in 1956. Using the power of the new mass communications media of radio and television, he soon became known to millions around the country. According to Karen Armstrong, in the 1960s and 1970s, 40 percent of American households tuned in to his broadcasts. The logos of modern technology was being used in a novel way to mobilize a conservative social movement, its goal being to return religion—mythos—to its rightful place at the center of society.

  In the 1970s, Pat Robertson and Jim and Tammy Faye Bakker would follow Falwell’s lead. Watergate, a demoralizing withdrawal from Vietnam, the oil crisis of the 1970s, and the Iranian hostage crisis would all combine to challenge America’s sense that it was on the right track. By the time of the 1980 presidential election, many people sensed that the time had come for more religion in the political process. Falwell founded the Moral Majority in 1979, and Ronald Reagan successfully courted the group on his way to winning the election in a landslide, with 489 electoral college votes to Jimmy Carter’s 49. From that point onward, the Republican Party—traditionally the party of the industrial Northeast and big business—would be realigned with the rural South, family values, and Falwell’s Moral Majority. Religion would take its place on the American political stage in a way it never had before.

  The 1980s would see the concepts of jihad and “family values” consolidated and formalized in the Islamic and American spheres of influence. The fundamentalists on the Islamic side shifted to battle mode, inspired in part by Ayatollah Khomeini and the 1979 Islamic Revolution in Iran. While various Arab organizations—including the Muslim Brotherhood—had always had radical factions that carried out political assassinations, they’d typically fought for political power or territorial gains, and their targets had often been highly visible people in the opposing camp. The new interpretation of jihad was much broader. The group Islamic Jihad arrived on the scene in 1983 when it claimed responsibility for the bombings of the U.S. Marine Corps barracks and the American embassy in Beirut, and many elements of the organization were incorporated into the Lebanese group Hezbollah later in the decade. Hamas, a radical Palestinian spin-off of the Muslim Brotherhood intent on establishing a separate Palestinian state, was founded in 1987. Al-Qaeda, certainly the most notorious Islamic terrorist organization of the past decade, was founded in 1988; it emerged in the wake of the covert, CIA-funded war in Afghanistan, born of the aspirations of many mujahideen to continue the battle against non-Muslims elsewhere in the Islamic world, notably Israel.

 
