by Henry Gee
The current consensus is that bipedality was the first distinctive feature to have evolved in the human lineage, long before the expansion of the brain. Before many fossils had been discovered, of course, the view was that the large brain of humans evolved before the upright, bipedal stance: this conceit explains why the Piltdown forgery was so effective.
Bipedality might be a distinctively human feature—but is it “special”? Not really—apes have a variety of peculiar modes of locomotion, from quadrupedal knuckle walking (gorillas and chimps) to movement with all four limbs as hands (orangutan) or the forelimbs alone (gibbons). The evidence from Ardipithecus ramidus suggests that the distinctive modes of locomotion in each modern ape species are products of their own very special evolutionary circumstances, and not some relics of ancient times. Some extinct apes, not directly related to hominins, were even bipedal.
As a final note in this chapter, I refer you to the strange case of Oreopithecus. This ape lived in the Late Miocene (7–9 million years ago) and was endemic to a Mediterranean island whose fabric now forms parts of the Italian region of Tuscany. Oreopithecus was, in its own way, a biped, so much so that its hands were sufficiently free to allow for a precision grip, in which the tips of the fingers and thumb can be pressed together, permitting fine manipulation—something often assumed to be exclusive to toolmaking humans.32 But Oreopithecus was a very distant cousin of hominins, not an ancestor.33 Its bipedality was not a harbinger of technology, of holding babies close to the chest, or of anything else. The free hands of Oreopithecus were not, as far as we know, employed in making tools, thereby refining “planning depth,” swiftly followed by the conquest of the earth. Whether bipedality in the species was connected with sexual display will probably remain forever unknown. We do know that bipedality did not save it. So far as we can tell, Oreopithecus remained confined to its island home, where it quietly became extinct. For Oreopithecus, bipedality was a trait as individual as any other variety of ape locomotion, not the first step in some progressive path of transformation between Ape and Angel.
And the same is true for us.
8: The Dog and the Atlatl
One of my favorite items of technology has an ancient pedigree. It’s a springy, flexible rod about fifty centimeters long, with a handle at one end and a cup at the other to hold a tennis ball. I use this to throw, with ease, tennis balls for my dogs to chase and fetch, much farther than I could throw them unaided, even with great effort. If on any given day you can’t find me at home or at the office, try the beach: there you’ll find me using the ball thrower to throw balls for my dogs to retrieve.
The principle is simple: the device lengthens my arm, increasing its leverage. For the same expenditure of force, the ball leaves the end of the atlatl with greater velocity than it would from my unaugmented arm. Devices like this have been used to throw darts and spears for tens of thousands of years. Mine gets a modern makeover in that it’s made of plastic rather than the wood, bone, or antler of the originals. The material aside, the atlatl or “spear thrower” is one of human technology’s more enduring design classics.
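To put rough numbers on that leverage, here is a back-of-the-envelope sketch (the arm length of about 0.7 meters, and the assumption that arm and thrower swing together as one rigid lever at a single angular speed ω, are mine for illustration only; the 0.5-meter thrower length is from above). The release speed of the ball is simply the angular speed multiplied by its distance from the shoulder, so

\[
\frac{v_{\text{with thrower}}}{v_{\text{arm alone}}}
\;=\;
\frac{\omega\,(r_{\text{arm}} + r_{\text{thrower}})}{\omega\,r_{\text{arm}}}
\;\approx\;
\frac{0.7\ \text{m} + 0.5\ \text{m}}{0.7\ \text{m}}
\;\approx\; 1.7.
\]

On these assumptions the same swing releases the ball roughly 70 percent faster; and since the range of a thrown ball (air resistance aside) grows with the square of its release speed, that works out to nearly three times the distance.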
Technology needs a definition. To most people, I suspect, the word “technology” conjures images of complicated machinery or modern electronic hardware. But such modern technology, even if apparently changed beyond recognition, is really a compressed combination of many much simpler technologies.
Take, for example, the iPad on which I wrote much of the draft for this book. It is made of plastic and metal. The plastic comes from the chemical processing of petroleum, reliant, at root, on nineteenth-century chemistry based on eighteenth-century engineering. The metals are occasionally exotic, but the basics of mining and metalwork go back to antiquity. The electronics in my iPad are based on VLSI (very-large-scale integration) chips, descendants of what twenty years ago were called microprocessors, themselves condensed arrays of transistors (invented in the late 1940s), and before that, vacuum tubes that go back to a nineteenth-century fusion of once-separate technologies such as metalwork and glassblowing. The programming that allows me to write on the iPad has a distant ancestor in the punched-paper tape used to program vast computers made of arrays of vacuum tubes: and at the business end I use writing, a system of symbols for the recording of ideas whose concept is (by definition) as old as history. When looked at critically, even the most advanced technologies used today by human beings are variations and combinations of earlier, simpler ones.1 In any case, the kind of technology in use for most of human existence has been of the order of the atlatl: simple machines that allow one person to exert greater force than he might have achieved unaided. Give me an atlatl long enough, as Archimedes might have said, and I could throw a tennis ball from here to next Tuesday.
How does one define technology? One definition might indeed encompass all those many objects that people create to do things they might not be able to achieve (or might achieve less well) unaided—things like the atlatl. Such a definition extends well beyond objects we recognize as tools or weapons, to the clothes and dwellings that allow us to live in places that humans might not otherwise have penetrated, to the pottery used since antiquity to transport or contain such items as nuts and grain, fire and water, and in which food might be cooked.
Cookery in particular is believed to have had a profound influence on human anatomy, physiology, and social structure.2 Cooking food breaks down hard or woody tissues, neutralizes toxins, kills potentially harmful bacteria and parasites, and makes more nutrients available to the diner. This increase in efficiency meant less time spent foraging and digesting, allowing more time for social interaction—aside from the fact that cookery tends to be a social and sociable activity in itself.3 Some scientists think that cookery was followed by a reduction in the mass of the jaws, teeth, muscles, and digestive tract, and perhaps an increase in the mass and capabilities of the brain. It is not trite to suggest that humans have been modified by their own technology.4
This definition of technology—those things we create outside our own bodies that allow us to do things we could not have done unaided—also encompasses things that we might not describe as technological at all. Technology might be said to include such imponderables as legal codes and financial structures. Laws make it easier for people to live together harmoniously. Financial structures—from coins and notes to credit default swaps and futures contracts—allow us to exchange goods without having to physically carry them around ourselves. Laws and money are not technology in the sense of tools you hold in your hand. Rather, they represent social conventions. The twenty-pound note in my pocket doesn’t actually do anything—it is merely a promise by the Bank of England to underwrite any transaction made with it to that value. Once upon a time such transactions were backed by a real commodity (gold), but no longer. My twenty-pound note is itself a real thing created by the technology of printing, but represents another sort of technology based on social contract. Technology, therefore, includes things that we might otherwise regard as social conventions rather than physical objects.5 “Money” only has “value” inasmuch as we all agree that it does. Therefore, the importance of such things as money—and with that, wills, contracts, and treaties—lies not in their physical form but in the ideas they represent. If this seems too rarefied to count as technology, consider that people are given life or condemned to death as surely by abstractions such as treaties and the passage of money as they are by more concrete examples of technology such as antibiotics or nuclear weapons.
If such intangibles as money can be regarded as technology—spoken of in the same breath as, say, swords and plowshares—then perhaps the earliest and most enduring example of technology is something one might not regard as technological at all: the domestic dog.
Not for nothing does the dog bear the soubriquet of Man’s Best Friend. For tens of thousands of years, dogs have made human beings safer and helped them herd other domestic animals and hunt wild ones.6 In the old days, when, to paraphrase Jared Diamond,7 we didn’t do all our foraging in supermarkets, I’d have gone hunting with my bone or antler atlatl and some spears, and a fast-running dog to chase down the prey and retrieve the kills. These days, my dogs and I reprise the same activity, purely for pleasure and exercise, with a plastic atlatl and some tennis balls.
Modern dogs have been bred for a variety of purposes, some quite remote from what we assume to have been their original uses, such as hunting or retrieving game (Heidi, my golden retriever), ridding campsites of rodents (Saffron, my Jack Russell terrier), guarding against intruders (both), and herding livestock (neither). Today, dogs are used for such sophisticated tasks as helping blind and deaf people get around busy cities, rescuing people washed into the sea or buried under rubble, and sniffing out explosives and contraband; but perhaps most of all, they provide companionship.
My friend Brian Clegg, a full-time author, says that the most important piece of equipment a writer can own is not a computer, nor even a pen, but a dog. Writing is a solitary business, and a dog provides company without being intrusive. One can always postpone a trip to the gym, but those appealing doggy eyes and waggy tail have a way of persuading the writer to take regular breaks for exercise, whatever the weather, during which the writer can think about what he has just written and plan the next bit.
How can a dog count as technology? It fulfills my definition in that it allows humans to do things that they might not have been able to achieve unaided (hunting, herding, sniffing out drugs, acting as eyes for the blind, and so on). My definition also specifies that technology is created for its purpose. The domestic dog is definitely a creation that would not have evolved naturally, having been modified quite extensively in both behavior and appearance from its ancestor, the wolf. Most cases of domestication involve humans breeding and selecting animals and plants in true Darwinian style to optimize them as producers of food. Dogs, however, are a special case of one social carnivore being domesticated by another social carnivore for mutual benefit—a kind of symbiosis.
How far back does human technology go? We know for certain that technology antedates modern Homo sapiens. The earliest tools that can be recognized as such are chipped pebbles from Ethiopia that date back 2.5 million years, although it is possible that hominins were using sharp stones to butcher the carcasses of other animals (such as antelopes) as long as 3.4 million years ago.8
The earliest stone tools don’t look like much, and it takes an expert eye to tell them apart from pebbles broken by natural causes or by accident. However, some stone tools, notably the hand ax—that canonical item in the Stone Age toolbox—are objects of remarkable beauty, and show evidence of extraordinary craftsmanship to rival anything one might find in, say, the workshop of a trained cabinetmaker or stonemason today.
But human beings are not unique in their manufacture of objects that are both useful and beautiful. In fact, living things have been creating such objects for almost as long as life itself has existed. Stromatolites—cushion-shaped domes that survive in salty lagoons—are the sculpture-like objects created by simple, single-celled cyanobacteria (blue-green algae). Around 3 billion years ago, when such creatures were the most complex forms of life on the planet, stromatolites were common and formed the first reefs. Today, reef-forming creatures such as corals (simple animals, related to sea anemones) create complex and beautiful structures, external to their own bodies.
On land, social insects create complex and well-engineered structures in which they live—one thinks of honeycombs with their regular, hexagonal cells made of wax. The towering nests of termites dwarf their creators in the same way that skyscrapers dwarf their human architects—and have air-conditioning systems to rival any created by human designers. Birds, too, make nests, some highly elaborate: think of the nests of weaverbirds, hanging from the branches of African trees, or the elaborate stages created by male bowerbirds on which they display to prospective mates. At least one bird species—the New Caledonian crow—makes and uses tools that conform to my definition of technology, modified leaves for use as probes.9 Anthropologists are even beginning to recognize that apes, chimpanzees in particular, have rudimentary technologies, in which they use stones to crack nuts, or strip leaves from stalks to allow them to probe anthills. Some of these technologies even have a “cultural” dimension.10 That is, they vary from one group of primates to another according to the learned traditions of each.
Such discoveries have led to the emerging discipline of “primate archaeology”—the excavation of occupation sites used by primates other than humans, in order to learn about the history of their technological and cultural traditions.11 This discipline is beginning to offer a much-needed comparative perspective on the relationship between technology and human history, by first admitting that technology is not unique to humans. After all, the first hominins to make tools were not humans in any sense we’d allow today. The first tools used by hominins would have been neither more nor less sophisticated than those used by modern chimpanzees. All this leads to a perhaps obvious question—why should beautiful, useful technology created by humans imply a maker any more intelligent or deliberate than (say) a crow or a weaverbird? A termite or a stromatolite? All such creatures fulfill my definition of technology in that they create things for use outside their own bodies that allow them to do things that they might not have done unaided, irrespective of their cognitive abilities.
To reserve technology for humans requires the inclusion of such imponderables as “planning depth,” assumed to be an exclusively human attribute, which can be defined as the ability to plan for future eventualities. For example, before I can paint this wall, I shall need to open this tin of paint, but before I can do that, I shall need to go to my shed and find a screwdriver to use as a lever for prizing open the lid.12 Should “planning depth” be included in any definition of technology? I suggest not, because the concept creates more problems than it solves. We know, for example, from behavior experiments, that some animals that use technology—notably crows—do indeed plan ahead in a way that is indistinguishable from human behavior in similar circumstances.13 In which case the attribute of planning depth is not unique to humans.
For other animals, the creation of structures outside the body is presumably instinctive, that is, hardwired, rather than learned by observation—or even a completely incidental by-product of metabolism. This is probably true for organisms such as corals, or the cyanobacteria that build stromatolites, which lack what we would recognize as a brain. However, this might be a distinction without a difference, because of the a priori assumption, not stated in my definition above, that brains are an absolute prerequisite for technology. But if a creature, whether or not it has a brain, modifies its environment to allow it to live where it otherwise might not, can that not be regarded as technology? Does the hidden assumption that brains are necessary prejudice one toward the view that technology is something we humans award ourselves, exclusively, because of our privileged view, and our tendency to think that we are the culmination of all organic achievement? Were the myriad polyps responsible for the Great Barrier Reef to be polled on the issue, they might with justification say that theirs is a more magnificent achievement than anything made by Man. Polyps, though, have no brains—but does it matter? Is it not easier to judge what is and what is not technology by easily observable results rather than by the motivational states of the makers, which are much harder to fathom? Because it is easier to infer motivation in fellow humans than in nonhumans, our view of technology will quite naturally be prejudiced toward seeing it as a humans-only activity—but only so long as concepts such as “planning depth” are considered definitive.
Brains are, however, definitely required for “planning depth,”14 whether or not this planning is applied to the manufacture of objects outside the body. But such a concept runs into a mire of philosophical problems, eloquently discussed by Daniel Dennett in his book Consciousness Explained. Briefly, the idea that we imagine the world outside our heads as a dramatic performance that we are somehow observing through the windows of our senses—what philosophers call the “Cartesian theater”—is an illusion easily bruised by any number of experiments. So, whereas I might imagine a little picture of myself going to my shed to get the screwdriver I need to lever open the lid of that tin of paint I mentioned earlier, such a drama is a rationalization after the fact of a host of disparate thoughts and impulses. If there is no such thing as the Cartesian theater, there can be no such thing as planning depth. If there is no planning depth, there is no necessity to infer that brains are required for technology at all—in which case one is left with the definition of technology with which I started, which therefore must include, along with my iPad and the probes created from leaves by those clever crows from New Caledonia, the insensate creations of bees, corals, termites, and even cyanobacteria, without reference to the internal motivational states (if any) of the creators.
This problem of planning, or intention, crops up whenever we think of the hand ax, that quintessential example of Stone Age artistry. Although it’s always hard to be sure, hand axes are generally associated with a particular species of hominin, Homo erectus, and are examples of a toolmaking tradition or style known as “Acheulian” or “Acheulean,” after the French site of Saint-Acheul, where such tools were first described. The earliest known tools in the Acheulian style come from Kenya and are almost 1.8 million years old.15 As Homo erectus spread around the Old World, hand axes went too. They have been found from Spain to China, England to Indonesia. Although some later hominins such as Homo heidelbergensis and Neanderthals adopted and modified the hand ax, the basic plan remained more or less the same for 1.5 million years,16 with variations dictated more by the different materials used in hand-ax manufacture than by any change in tradition or culture.