The Shallows
It’s easy, today, to chuckle at Aristotle’s error. But it’s also easy to understand how the great philosopher was led so far astray. The brain, packed neatly into the bone-crate of the skull, gives us no sensory signal of its existence. We feel our heart beat, our lungs expand, our stomach churn—but our brain, lacking motility and having no sensory nerve endings, remains imperceptible to us. The source of consciousness lies beyond the grasp of consciousness. Physicians and philosophers, from classical times through the Enlightenment, had to deduce the brain’s function by examining and dissecting the clumps of grayish tissue they lifted from the skulls of human corpses and dead animals. What they saw usually reflected their assumptions about human nature or, more generally, the nature of the cosmos. They would, as Robert Martensen describes in The Brain Takes Shape, fit the visible structure of the brain into their preferred metaphysical metaphor, arranging the organ’s physical parts “so as to portray likeness in their own terms.”2
Writing nearly two thousand years after Aristotle, Descartes conjured up another watery metaphor to explain the brain’s function. To him, the brain was a component in an elaborate hydraulic “machine” whose workings resembled those of “fountains in the royal gardens.” The heart would pump blood to the brain, where, in the pineal gland, it would be transformed, by means of pressure and heat, into “animal spirits,” which then would travel through “the pipes” of the nerves. The brain’s “cavities and pores” served as “apertures” regulating the flow of the animal spirits throughout the rest of the body.3 Descartes’ explanation of the brain’s role fit neatly into his mechanistic cosmology, in which, as Martensen writes, “all bodies operated dynamically according to optical and geometric properties” within self-contained systems.4
Our modern microscopes, scanners, and sensors have disabused us of most of the old fanciful notions about the brain’s function. But the brain’s strangely remote quality—the way it seems both part of us and apart from us—still influences our perceptions in subtle ways. We have a sense that our brain exists in a state of splendid isolation, that its fundamental nature is impervious to the vagaries of our day-to-day lives. While we know that our brain is an exquisitely sensitive monitor of experience, we want to believe that it lies beyond the influence of experience. We want to believe that the impressions our brain records as sensations and stores as memories leave no physical imprint on its own structure. To believe otherwise would, we feel, call into question the integrity of the self.
That was certainly how I felt when I began to worry that my use of the Internet might be changing the way my brain was processing information. I resisted the idea at first. It seemed ludicrous to think that fiddling with a computer, a mere tool, could alter in any deep or lasting way what was going on inside my head. But I was wrong. As neuroscientists have discovered, the brain—and the mind to which it gives rise—is forever a work in progress. That’s true not just for each of us as individuals. It’s true for all of us as a species.
Tools of the Mind
A child takes a crayon from a box and scribbles a yellow circle in the corner of a sheet of paper: this is the sun. She takes another crayon and draws a green squiggle through the center of the page: this is the horizon. Cutting through the horizon she draws two brown lines that come together in a jagged peak: this is a mountain. Next to the mountain, she draws a lopsided black rectangle topped by a red triangle: this is her house. The child gets older, goes to school, and in her classroom she traces on a page, from memory, an outline of the shape of her country. She divides it, roughly, into a set of shapes that represent the states. And inside one of the states she draws a five-pointed star to mark the town she lives in. The child grows up. She trains to be a surveyor. She buys a set of fine instruments and uses them to measure the boundaries and contours of a property. With the information, she draws a precise plot of the land, which is then made into a blueprint for others to use.
Our intellectual maturation as individuals can be traced through the way we draw pictures, or maps, of our surroundings. We begin with primitive, literal renderings of the features of the land we see around us, and we advance to ever more accurate, and more abstract, representations of geographic and topographic space. We progress, in other words, from drawing what we see to drawing what we know. Vincent Virga, an expert on cartography affiliated with the Library of Congress, has observed that the stages in the development of our mapmaking skills closely parallel the general stages of childhood cognitive development delineated by the twentieth-century Swiss psychologist Jean Piaget. We progress from the infant’s egocentric, purely sensory perception of the world to the young adult’s more abstract and objective analysis of experience. “First,” writes Virga, in describing how children’s drawings of maps advance, “perceptions and representational abilities are not matched; only the simplest topographical relationships are presented, without regard for perspective or distances. Then an intellectual ‘realism’ evolves, one that depicts everything known with burgeoning proportional relationships. And finally, a visual ‘realism’ appears, [employing] scientific calculations to achieve it.”1
As we go through this process of intellectual maturation, we are also acting out the entire history of mapmaking. Mankind’s first maps, scratched in the dirt with a stick or carved into a stone with another stone, were as rudimentary as the scribbles of toddlers. Eventually the drawings became more realistic, outlining the actual proportions of a space, a space that often extended well beyond what could be seen with the eye. As more time passed, the realism became scientific in both its precision and its abstraction. The mapmaker began to use sophisticated tools like the direction-finding compass and the angle-measuring theodolite and to rely on mathematical reckonings and formulas. Eventually, in a further intellectual leap, maps came to be used not only to represent vast regions of the earth or heavens in minute detail, but to express ideas—a plan of battle, an analysis of the spread of an epidemic, a forecast of population growth. “The intellectual process of transforming experience in space to abstraction of space is a revolution in modes of thinking,” writes Virga.2
The historical advances in cartography didn’t simply mirror the development of the human mind. They helped propel and guide the very intellectual advances that they documented. The map is a medium that not only stores and transmits information but also embodies a particular mode of seeing and thinking. As mapmaking progressed, the spread of maps also disseminated the mapmaker’s distinctive way of perceiving and making sense of the world. The more frequently and intensively people used maps, the more their minds came to understand reality in the maps’ terms. The influence of maps went far beyond their practical employment in establishing property boundaries and charting routes. “The use of a reduced, substitute space for that of reality,” explains the cartographic historian Arthur Robinson, “is an impressive act in itself.” But what’s even more impressive is how the map “advanced the evolution of abstract thinking” throughout society. “The combination of the reduction of reality and the construct of an analogical space is an attainment in abstract thinking of a very high order indeed,” writes Robinson, “for it enables one to discover structures that would remain unknown if not mapped.”3 The technology of the map gave to man a new and more comprehending mind, better able to understand the unseen forces that shape his surroundings and his existence.
What the map did for space—translate a natural phenomenon into an artificial and intellectual conception of that phenomenon—another technology, the mechanical clock, did for time. For most of human history, people experienced time as a continuous, cyclical flow. To the extent that time was “kept,” the keeping was done by instruments that emphasized this natural process: sundials around which shadows would move, hourglasses down which sand would pour, clepsydras through which water would stream. There was no particular need to measure time with precision or to break a day up into little pieces. For most people, the movements of the sun, the moon, and the stars provided the only clocks they needed. Life was, in the words of the French medievalist Jacques Le Goff, “dominated by agrarian rhythms, free of haste, careless of exactitude, unconcerned by productivity.”4
That began to change in the latter half of the Middle Ages. The first people to demand a more precise measurement of time were Christian monks, whose lives revolved around a rigorous schedule of prayer. In the sixth century, Saint Benedict had ordered his followers to hold seven prayer services at specified times during the day. Six hundred years later, the Cistercians gave new emphasis to punctuality, dividing the day into a regimented sequence of activities and viewing any tardiness or other waste of time as an affront to God. Spurred by the need for temporal exactitude, monks took the lead in pushing forward the technologies of timekeeping. It was in the monastery that the first mechanical clocks were assembled, their movements governed by the swinging of weights, and it was the bells in the church tower that first sounded the hours by which people would come to parcel out their lives.
The desire for accurate timekeeping spread outward from the monastery. The royal and princely courts of Europe, brimming with riches and prizing the latest and most ingenious devices, began to covet clocks and invest in their refinement and manufacture. As people moved from the countryside to the town and started working in markets, mills, and factories rather than fields, their days came to be carved into ever more finely sliced segments, each announced by the tolling of a bell. As David Landes describes it in Revolution in Time, his history of timekeeping, “Bells sounded for start of work, meal breaks, end of work, closing of gates, start of market, close of market, assembly, emergencies, council meetings, end of drink service, time for street cleaning, curfew, and so on through an extraordinary variety of special peals in individual towns and cities.”5
The need for tighter scheduling and synchronization of work, transport, devotion, and even leisure provided the impetus for rapid progress in clock technology. It was no longer enough for every town or parish to follow its own clock. Now, time had to be the same everywhere—or else commerce and industry would falter. Units of time became standardized—seconds, minutes, hours—and clock mechanisms were fine-tuned to measure those units with much greater accuracy. By the fourteenth century, the mechanical clock had become commonplace, a near-universal tool for coordinating the intricate workings of the new urban society. Cities vied with one another to install the most elaborate clocks in the towers of their town halls, churches, or palaces. “No European community,” the historian Lynn White has observed, “felt able to hold up its head unless in its midst the planets wheeled in cycles and epicycles, while angels trumpeted, cocks crew, and apostles, kings and prophets marched and countermarched at the booming of the hours.”6
Clocks didn’t just become more accurate and more ornate. They got smaller and cheaper. Advances in miniaturization led to the development of affordable timepieces that could fit into the rooms of people’s houses or even be carried on their person. If the proliferation of public clocks changed the way people worked, shopped, played, and otherwise behaved as members of an ever more regulated society, the spread of more personal tools for tracking time—chamber clocks, pocket watches, and, a little later, wristwatches—had more intimate consequences. The personal clock became, as Landes writes, “an ever-visible, ever-audible companion and monitor.” By continually reminding its owner of “time used, time spent, time wasted, time lost,” it became both “prod and key to personal achievement and productivity.” The “personalization” of precisely measured time “was a major stimulus to the individualism that was an ever more salient aspect of Western civilization.”7
The mechanical clock changed the way we saw ourselves. And like the map, it changed the way we thought. Once the clock had redefined time as a series of units of equal duration, our minds began to stress the methodical mental work of division and measurement. We began to see, in all things and phenomena, the pieces that composed the whole, and then we began to see the pieces of which the pieces were made. Our thinking became Aristotelian in its emphasis on discerning abstract patterns behind the visible surfaces of the material world. The clock played a crucial role in propelling us out of the Middle Ages and into the Renaissance and then the Enlightenment. In Technics and Civilization, his 1934 meditation on the human consequences of technology, Lewis Mumford described how the clock “helped create the belief in an independent world of mathematically measurable sequences.” The “abstract framework of divided time” became “the point of reference for both action and thought.”8 Independent of the practical concerns that inspired the timekeeping machine’s creation and governed its day-to-day use, the clock’s methodical ticking helped bring into being the scientific mind and the scientific man.
EVERY TECHNOLOGY IS an expression of human will. Through our tools, we seek to expand our power and control over our circumstances—over nature, over time and distance, over one another. Our technologies can be divided, roughly, into four categories, according to the way they supplement or amplify our native capacities. One set, which encompasses the plow, the darning needle, and the fighter jet, extends our physical strength, dexterity, or resilience. A second set, which includes the microscope, the amplifier, and the Geiger counter, extends the range or sensitivity of our senses. A third group, spanning such technologies as the reservoir, the birth control pill, and the genetically modified corn plant, enables us to reshape nature to better serve our needs or desires.
The map and the clock belong to the fourth category, which might best be called, to borrow a term used in slightly different senses by the social anthropologist Jack Goody and the sociologist Daniel Bell, “intellectual technologies.” These include all the tools we use to extend or support our mental powers—to find and classify information, to formulate and articulate ideas, to share know-how and knowledge, to take measurements and perform calculations, to expand the capacity of our memory. The typewriter is an intellectual technology. So are the abacus and the slide rule, the sextant and the globe, the book and the newspaper, the school and the library, the computer and the Internet. Although the use of any kind of tool can influence our thoughts and perspectives—the plow changed the outlook of the farmer, the microscope opened new worlds of mental exploration for the scientist—it is our intellectual technologies that have the greatest and most lasting power over what and how we think. They are our most intimate tools, the ones we use for self-expression, for shaping personal and public identity, and for cultivating relations with others.
What Nietzsche sensed as he typed his words onto the paper clamped in his writing ball—that the tools we use to write, read, and otherwise manipulate information work on our minds even as our minds work with them—is a central theme of intellectual and cultural history. As the stories of the map and the mechanical clock illustrate, intellectual technologies, when they come into popular use, often promote new ways of thinking or extend to the general population established ways of thinking that had been limited to a small, elite group. Every intellectual technology, to put it another way, embodies an intellectual ethic, a set of assumptions about how the human mind works or should work. The map and the clock shared a similar ethic. Both placed a new stress on measurement and abstraction, on perceiving and defining forms and processes beyond those apparent to the senses.
The intellectual ethic of a technology is rarely recognized by its inventors. They are usually so intent on solving a particular problem or untangling some thorny scientific or engineering dilemma that they don’t see the broader implications of their work. The users of the technology are also usually oblivious to its ethic. They, too, are concerned with the practical benefits they gain from employing the tool. Our ancestors didn’t develop or use maps in order to enhance their capacity for conceptual thinking or to bring the world’s hidden structures to light. Nor did they manufacture mechanical clocks to spur the adoption of a more scientific mode of thinking. Those were by-products of the technologies. But what by-products! Ultimately, it’s an invention’s intellectual ethic that has the most profound effect on us. The intellectual ethic is the message that a medium or other tool transmits into the minds and culture of its users.
For centuries, historians and philosophers have traced, and debated, technology’s role in shaping civilization. Some have made the case for what the sociologist Thorstein Veblen dubbed “technological determinism”; they’ve argued that technological progress, which they see as an autonomous force outside man’s control, has been the primary factor influencing the course of human history. Karl Marx gave voice to this view when he wrote, “The windmill gives you society with the feudal lord; the steam-mill, society with the industrial capitalist.”9 Ralph Waldo Emerson put it more crisply: “Things are in the saddle / And ride mankind.”10 In the most extreme expression of the determinist view, human beings become little more than “the sex organs of the machine world,” as McLuhan memorably wrote in the “Gadget Lover” chapter of Understanding Media.11 Our essential role is to produce ever more sophisticated tools—to “fecundate” machines as bees fecundate plants—until technology has developed the capacity to reproduce itself on its own. At that point, we become dispensable.
At the other end of the spectrum are the instrumentalists—the people who, like David Sarnoff, downplay the power of technology, believing tools to be neutral artifacts, entirely subservient to the conscious wishes of their users. Our instruments are the means we use to achieve our ends; they have no ends of their own. Instrumentalism is the most widely held view of technology, not least because it’s the view we would prefer to be true. The idea that we’re somehow controlled by our tools is anathema to most people. “Technology is technology,” declared the media critic James Carey; “it is a means for communication and transportation over space, and nothing more.”12