Peak Everything

by Richard Heinberg


  Human societies are dynamic, complex systems, and most of their signal features are understandable as emergent phenomena. It is a fascinating thought exercise (I’ve been at it for two decades now) to attempt to trace events in the past in order to identify the most decisive developments that enabled the emergence of industrial civilization. Of course, societal complexity (defined by the variety of tools, artifacts, and social roles) depends on humans’ ability to capture increasing amounts of energy from their environment, and so the genetic and social attributes that facilitate energy capture are crucial. Which of those attributes are keys to understanding the entire process?

  Clearly, most of the emergent features of complex societies (their economies, technologies, and governments) depend on language. Now, language itself is an emergent phenomenon, a link in a long chain of them; however, it was a profoundly consequential one. In the grand edifice of human society, language should be considered a foundation stone.

  The questions of how and when language evolved are hotly debated. Some archaeologists argue that the relatively sudden appearance, roughly 40,000 years ago, of counting sticks and new kinds of hunting tools suggests that language arose then. However, humans — including Neanderthals — were anatomically capable of speech much earlier; indeed, there is fossil evidence that the main areas of the brain associated with language (Broca’s area and Wernicke’s area) started to enlarge up to 1.5 million years ago. Moreover, humans’ ability to spread to regions outside of Africa, and especially to islands, may have depended upon their use of language to convey information and intention and to coordinate tasks. It may be that we have been using language so long that our brains, throats, and chests have all evolved in tandem. The situation is likely similar to what has happened in the computer industry over the past few decades: just as hardware and software developers work cooperatively, one designing according to the needs and capacities of the other, our own internal hardware (brain and speech faculties) and software (language) have become, in a sense, made for one another.

  Part of the problem in determining when and how language arose may lie in definitions. The term language can refer in a vague or general sense to any sort of communication; but this usage is not always helpful. All animals communicate using sound, color, scent, or gesture. Even plants and fungi communicate with one another using chemicals and gene packets transmitted via soil or air. Human language differs from these kinds of information transfer in its level of abstraction, its multiplicity of symbols, and the complexity of its grammar (or system of rules for the manipulation of symbols). It is one thing to signal a somatic or emotional state or a general intention, but quite another to discuss events, including hypothetical ones, in the future or the past, or in distant places.

  Language made these things possible, but much more as well. Language generated our peculiarly human form of self-awareness: we can talk about ourselves, talk about talking, and think about thinking. Our relationship with our environment also changed, as language enabled us to coordinate our thinking and behavior across time and distance in a way that was unprecedented, making us a far more formidable species (compare the population size and environmental impacts of humans today with those of chimpanzees or gorillas). Writing only exacerbated these trends, heightening the level of abstraction in language and widening our ability to convey thoughts and align collective action. If talking helped organize effective hunting bands, writing enabled the formation of nation states. Add the printing press, radio, television, and fossil fuels, and here we are today.

  But with language came an array of unintended consequences — which, of course, is just another name for emergent phenomena.

  Language and Religion

  “In the beginning was the Word,” proclaims the Gospel according to John. In Genesis, creation commences with a series of spoken commands, starting with “Let there be light.” The creation stories of the ancient Egyptians, Celts, and Mayans likewise emphasized the generative potency of language.

  This striking coincidence, noted by many scholars of world mythology, cloaks a supreme irony: while religion ascribes magical power to words, there are reasons to think that religion itself may be an inevitable though accidental outgrowth of language.

  It is interesting to speculate whether non-human animals have awareness of something that humans might recognize as a spiritual dimension of existence. Do dogs and cats have near-death or out-of-body experiences? Do birds experience awe and wonder when watching the sunrise? There is no way to know for sure. In any case, it is fairly clear that no non-human species has developed a religion — if we mean by this term an organized set of beliefs about the supernatural, and a set of practices oriented to the service or worship of a divine being or beings.

  Why not? What is unique about humans that would lead us to construct religions? Are we set apart because we alone possess souls? Or do our brains contain some unusual structure shared by no other animal? Research into neurotheology, while controversial, offers some clues: religious or spiritual experiences seem primarily to be associated with the right temporal lobe of the neocortex, implying that feelings associated with such experiences are normal features of brain function under extreme circumstances. Nevertheless, it is likely that the problem of religion is as much an issue of “software” (language) as it is one of “hardware” (brain structure).

  Let us suppose that language was initially used only for practical purposes such as coordinating hunting efforts. Slowly, haphazardly, people must have developed rudimentary elements of vocabulary and grammar, often in order to aid with planning — an activity inherently implying the senses of location, time, cause, effect, and intention. Women, men, and children began to make simple sentences to ask and explain — who, what, where, when, and why? Once the ability to pose and answer such questions was in place, it inevitably began to be applied to less immediately pressing concerns. The Pleistocene hunter went from asking, “Where did these bison come from?” to “Where did stars, the Moon, the Sun, and people come from?” Hence the mythologies of aboriginal peoples everywhere are rich in origin stories. Language was seductive in its power: once a tiny morsel of reality had been verbally nibbled off, its incomplete digestion provoked a recurring hunger to take another and yet another bite, and eventually to swallow the world whole.

  As power over the environment grew, as society became more complex and formidable, religion mutated accordingly. Hunter-gatherers saw nature as alive and filled with spiritual presences that could directly be engaged by way of shamanic practices. Such beliefs and behaviors grew out of these people’s direct interaction with their environment, and fit their needs for social cohesion within an egalitarian context. With division of labor and thus a hierarchical organization of society came full-time specialists who got their food not directly from nature but from other humans; some of these specialists were spiritual intermediaries (priests) who appealed to sky gods detached from nature and the lives of commoners. With writing, myths about the gods could be codified and carried to distant lands (this story is told in fascinating detail in Bruce Lerro’s From Earth Spirits to Sky Gods).

  German orientalist Max Müller (1823-1900), who virtually created the discipline of comparative religion, put the matter succinctly by asserting that mythology is a “disease of language.”

  Perhaps the word disease seems too harsh. After all, mythology has its uses as well: as Joseph Campbell never tired of saying, myth gives us meaning. And surely meaning is a good thing. Nevertheless, the human need for meaning again highlights our obsessive and dependent relationship with language. Meaning is always attached to symbols: we invest a symbol with meaning, and that meaning is conveyed to whoever correctly interprets the symbol. We see a sentence written in an unfamiliar language and we wonder, “What does it mean?” As we have become ever more hooked on linguistic symbols, we have come to see nearly everything as if it were a sign for something else. We look to stars, tea leaves, and coincidences for meaning. The universe is talking to us! Myths are verbal narratives that seek to unpack the meaning of existence. We seldom wonder why it is that life must have meaning in order to be satisfying. Is it possible that existence could be sufficient unto itself, with no need for an embedded message?

  Religion consists of more than just mythology, though. Surely religion evolved at least partly to coordinate and moderate collective behavior via systems of morality and ethics which, in their most basic forms, appear to be genetically coded. The senses of good and evil, of honor and shame, have become such powerful internal motivators for humans that even most atheists are continually compelled by them. There is nothing quite like this among other species, whose behavior tends to be less learned and more genetically coded, and who therefore do not engage in the practices of rewarding or punishing one another’s behavior nearly to the same degree we do. Ironically, morality often contributes to humans’ most brutal acts, which have little precedent in other animals (witch burnings, as just one example, were morally motivated).

  Nevertheless, the development of complex societies would surely have been difficult if not impossible without morality — which had previously often been turned toward ecological ends, as early societies codified their needs to moderate reproduction, avoid incest, and protect natural resources via their taboos (“Do not kill the red kangaroo during its mating season!”). But then, once religion and society had mutually mutated in the direction of abstraction and complexity, morality became at least partly unhinged from environmental and genetic necessity and began increasingly to adhere to written myths about the verbally hallucinated sky gods.

  From an ecological point of view, the results were sometimes inadvertently salutary: religious wars (such as the Crusades) helped temporarily to moderate human population levels — though comparable results had been achieved by hunter-gatherer societies using gentler methods such as herbal contraception. Some religions also promoted celibacy among priests, monks, and nuns, again helping to stem population growth. But as people’s verbal obsessions began to be taken up with myths that had more to do with consolidating the power of religious elites than with regulating people’s relations with the natural world, religion served increasingly as an instrument of social and ecological conquest.

  Nevertheless, if language muddied humans’ connections with nature by way of verbal speculation, regimentation, and hallucination, it also fostered a countervailing tendency.

  Grammar, Reason, Logic, and Evidence

  Other animals observe, plan, draw conclusions from experience, and continually revise their mental pictures of reality. These capacities, the foundations of reason, are not uniquely human. Logic, which is the study of reasoning, is uniquely human, however, because it requires language.

  Logic is inherent in grammar, which people developed and used long before there were grammar schools, or schools of any sort, and young children still absorb the basic rules of grammar intuitively without having to be drilled in them. In language, each coherent packet of meaning (such as a sentence) must adhere to some agreed-upon standards if it is to be useful. In this regard a sentence is like a mathematical equation (mathematics, after all, is itself a language): before an equation can be correct or incorrect, it must conform to basic rules. Unlike the statements “2+6=8” and “3+4=9” (one of which we would recognize as being true, the other false), the statement “=5+7 -” cannot be said to be true or false; it is simply unintelligible because it is not organized as a complete equation according to the rules of arithmetic. (Quantum physicist Wolfgang Pauli, who was known for his abhorrence of sloppy thinking, once famously commented that another scientist’s work was “not even wrong.”)

  Grammar and logic give us the basis for making comprehensible statements about the world; linking logic with empirical evidence helps us formulate true statements and recognize when statements are false. This, again, is a long-standing practice: millennia before the scientific method was codified, people relied on feedback between language and sensory data to develop an accurate understanding of the world. Are the salmon running yet? Let’s go look.

  However, not all possible statements could be checked empirically. If someone said, “These berries taste good,” that was at least a matter for investigation, even if not everyone agreed. But the situation was more complicated if someone said, “The volcano smokes — that must be because the gods are angry; and if the gods are angry it must be because we haven’t provided enough sacrifices.” Unlike the observation that the volcano was smoking, the latter two claims and the reasoning behind them had no verifiable basis — unless the gods could be called into the village commons and publicly queried about their moods and motives (the attempt to do so may have led to the origin of shamanic trance mediumship). This was magical thinking — reasoning based on mere correlation rather than an empirically, publicly verifiable chain of causation.

  It was inevitable that magical thinking would flourish given that there were so many subjects of interest for which empirical investigation was impractical or impossible. That situation continues: there is still no empirical basis for answering, once and for all and to everyone’s satisfaction, questions like, “Does God exist?”, “Who am I?”, “What happens to us when we die?”, or “What is the greatest good?”

  Yet however strong the temptation to engage in it, magical thinking when tied to religion failed to provide much practical help in industry or commerce. As these limits came to be appreciated, and as industry and commerce expanded, philosophers and students of nature began to construct the formalized system of inquiry known as the scientific method. Here was a way to obtain verifiable knowledge of the physical world; better still, it was knowledge that could often be used to practical effect. The method came to hand at a propitious time: wealth was flowing to Europe from the rest of the world due to colonization and slavery; meanwhile the development of metallurgy and simple heat engines had proceeded to the point where the energy of fossil fuels could be put to widespread use. When coupled with the project of technological invention, science and mathematics yielded undreamt-of power over the environment. When further coupled with capitalism (corporations, banking, and investment) and fossil fuels, the result was the industrial growth machine.

  All of this would have been fine if we lived in an infinite sea of resources, but instead we inhabit a bounded, finite planet. Humanity had set a course toward disaster.

  Language and the Ecological Dilemma

  The ecological dilemma (which consists of the mutually rebounding impacts of population pressure, resource depletion, and habitat destruction) is certainly not unique to the modern industrial era; indeed, it is not unique even to humans. However, modern humans have created a dilemma for themselves of unprecedented scope and scale.

  The dilemma, whether encountered by people or pigeons, is often a matter of the failure of success: the genetically engrained aims of the organism are to reproduce and to increase its energy capture, but its environment always has limited resources. Thus temporary population blooms (which are, in their way, evidence of biological success) are usually followed by a crash and die-off. In humans, the powers conferred by language, tools, and social organization have enabled many boom-and-bust cycles over the millennia. But the recent fossil fuel era has seen so much growth of population and consumption that there is an overwhelming likelihood of a crash of titanic proportions.

  This should be glaringly obvious to everyone. Our ecologists have studied population blooms and crashes in other species. Our soil scientists appreciate the limits of modern agriculture. Our geologists understand perfectly well that fossil fuels are finite in quantity. And our mathematicians can easily calculate exponential growth rates to show how quickly population increase and resource depletion will outstrip our ability to satisfy even the most basic human needs. Verbal and mathematical logic, joined with empirical evidence, make an airtight case: we’re headed toward a cliff.
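
  (The arithmetic Heinberg gestures at here is easy to sketch. The short Python snippet below is an illustrative sketch only; the growth rate, reserve size, and consumption figures are placeholder numbers chosen for demonstration, not data from the book. It shows two things: a constant growth rate implies a fixed doubling time, and a finite reserve is used up far sooner than a “years left at today’s rate” estimate suggests once annual consumption itself keeps growing.)

```python
# Illustrative arithmetic of exponential growth and depletion.
# All numbers are placeholders for demonstration, not figures from the book.

import math


def doubling_time(growth_rate: float) -> float:
    """Years for a quantity to double at a constant annual growth rate (0.02 = 2%)."""
    return math.log(2) / math.log(1 + growth_rate)


def years_until_exhausted(reserve: float, annual_use: float, growth_rate: float) -> float:
    """Years until a fixed reserve is gone when annual use grows exponentially.

    Solves reserve = annual_use * ((1 + g)**t - 1) / g for t.
    """
    g = growth_rate
    return math.log(1 + reserve * g / annual_use) / math.log(1 + g)


if __name__ == "__main__":
    print(f"2% annual growth doubles a quantity in about {doubling_time(0.02):.1f} years")
    # A reserve that would last 100 years at today's rate of use...
    # ...is exhausted much sooner if consumption grows 2% per year.
    print(f"That same reserve lasts about {years_until_exhausted(100.0, 1.0, 0.02):.1f} years with 2% growth")
```

  With these placeholder numbers, 2 percent annual growth doubles consumption roughly every 35 years, and a reserve that would last a century at today’s rate is exhausted in about 55 years.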

  But language also keeps most of us in the dark. This is partly because magical thinking is alive and well — and not just in churches and New Age seminars.

  In the last couple of centuries, the magical thinking associated with religion, under assault from science, has found a new home in political and economic ideologies. Economics, which masquerades as a science, began as a branch of moral philosophy — which it still is in fact. For free-market ideologues, the market is God and profit is the ultimate good. We have used language to talk ourselves into the myth of progress — the belief that growth is always beneficial, and that there are no practical limits to the size of the human population or to the extent of renewable or even non-renewable natural resources we can use. This particular myth was an easy sell: it is an inherently welcome message (a version of “you can eat your cake and have it too”) and it seemed to be confirmed by experience during a multi-generational period of unprecedented expansion based on the one-time-only consumption of Earth’s hydrocarbon stores.

  Meanwhile, at the business end of economic theory, masters of advertising, marketing, and public relations have learned deftly to manipulate symbols and images for emotional effect, sculpting the public’s aspirations for comfort and prestige. This new kind of magical thinking did contribute to commerce and industry — and spectacularly so! (For historical details on this, see the BBC television documentary series “The Century of the Self” by Adam Curtis, and the books of Stuart Ewen.)

  In politics, the 20th century saw battles between the quasi-religious ideologies of the left and right — Leninism, Stalinism, Fascism, Nazism, and Maoism, along with British “it’s-for-your-own-good” colonialism and equally benevolent Yankee imperialism. In recent years, the political philosophy of Leo Strauss and his followers has come to the fore via the neoconservative members of the current Bush administration. Strauss taught a doctrine that is really just the explicit utterance of an implicit belief common among ruling elites — it is the duty of wise leaders to cloak their policies in potent patriotic and religious symbols and myths in order to galvanize the internal ethical imperatives of the masses. In other words, lies (if told by the right people for the right reasons) are not only good and necessary; they are the very foundation of responsible statecraft. On this basis, however, language ceases to provide a toolset for accurately mapping the world and instead becomes a mental haze enveloping society, preventing us collectively from grasping our situation. Only the rulers are expected (or allowed) to know the true score; but all too often they come to believe their own myths.

 
