The Internet Archive is in a very different business, one that is always at risk of failure, but can never be superseded. For archives to succeed at digital preservation, the web and the Internet must be open, not closed. Rather than competing against each other, archives can increase their chance of succeeding as they multiply and cooperate. An archive’s assets are never deleted or expunged for political, economic, or even privacy reasons. Access to the data can be controlled for any number of compelling reasons, from protecting an individual’s privacy to embargoing the location of endangered species or archaeological sites to protect them from predation. Paper-based archives routinely observe embargoes for limited times—say, until after the death of those mentioned in the archives.
Digital memory will determine how we know our own past, are exposed to the diversity of other human pasts, and how people remember us when we are gone. A long-lasting conversation among generations separated by hundreds and thousands of years was interrupted and almost totally silenced between the fall of the Roman Empire and the Renaissance. Since its revival and conversion to print, its influence has extended far beyond its origins in the Mediterranean basin. Now, with the Internet, we can not only continue that conversation as we digitize materials from the past, but also broaden it across multiple languages and civilizations with different historical experiences and expectations of the future. Though the dream of archiving Internet content was born of idealistic and utopian ambitions, in reality, it is evolving into one of the most efficient and economical ways to ensure the continuation of that conversation. We can leverage the proliferation of copies of things on the web to increase their chance of survival. True, with so much redundancy there will be a problem of version control. Solving search across so much information will also be nontrivial, as computer engineers are fond of saying. But these are technical matters, no matter how complicated.
The real challenge lies in mobilizing people to set aside their temporal chauvinism and focus on the long term of the past and the future. Despite the futurist fervor endemic in the early digital decades, we now see the past is more valuable to us than ever. As Jefferson understood, it is because we care about the future that we have to know our past—all of it. That might be why the Long Now Foundation, located in the heart of the Bay Area’s technology industry, was formed in 1996 to encourage long-term thinking. (Anticipating another Y2K challenge, the foundation always uses five digits for years, so it was, more accurately, founded in 01996.) The mission of Long Now is to encourage our species to think about our ambitions and challenges in increments not of decades, centuries, or even a few millennia, but of ten thousand years. Their goal is to remind us that our actions have consequences. Their programs foster experiments in thinking through future scenarios and anticipating the responsibilities we must assume for the consequences of our actions over long periods of time. Among their projects are the documentation of endangered languages (the Rosetta Project), efforts to preserve and restore endangered or extinct genetic codes, including those of the passenger pigeon, the black-footed ferret, and the woolly mammoth (the Revive & Restore project), and the Manual for Civilization library.
MORE THAN MEETS THE EYE
Sherlock Holmes, the patron saint of detection, warned Watson that “there is nothing more deceptive than an obvious fact.” Our new understanding of memory says facts, especially lots of them, can mislead, distract, and impede comprehension if used in the wrong context (or, as in the case of S., if you accumulate lots of them and never align them into coherent patterns or narratives). The point of gathering data is to paint a picture of the world as accurately as possible, with a precision that requires thinking slow, not fast. Once so assembled, that picture of the world can be used by the mind quickly and safely in the daily business of life that relies on fast thinking, whether it is crossing the street or getting a first impression of someone at a reception. Facts find value and meaning only in the context of complex and dynamic systems.
Scientific disciplines from ecology to economics focus intently now on the study of interactions—chaos, complexity, and emergence. The neuroscience of memory has made great strides by applying reductionist techniques—investigating the smallest components of a system—to understand the basic elements of the brain such as dendrites, axons, and myelin. Given this rapid progress, as one practitioner said, the time has come “to link the molecular and cellular level with the systems level of analysis. This integration is the major challenge facing the science of memory, and might require, in addition to new methodologies, a change of zeitgeist or an amalgamation of approaches.”
The physicist Robert Laughlin claims that the physical sciences have also stepped firmly from reductionist techniques to systems analysis. “This shift is usually described in the popular press as the transition from the age of physics to the age of biology, but that is not quite right. What we are seeing is a transformational world view in which the objective of understanding nature by breaking it down into ever smaller parts is supplanted by the objective of understanding how nature organizes itself.”
The next advance in knowledge, in other words, will not just study the elements comprising the agents of change, but change itself, the processes that combine to create complex behaviors and phenomena. To study climate change, for example, many fields of expertise come together to understand the interactions among atmosphere, ocean, fire, ice, and the biological cycles of energy production and consumption. The study of how Nature and its creatures organize themselves and change over time demands historical data, and lots of it. Like Sumerian scribes of yore, researchers are gathering voluminous amounts of documentation in the course of doing business. Much of its value lies in its reuse over time. That is why the major funders of science in the United States, the National Science Foundation and the National Institutes of Health, now mandate that the researchers they support curate, preserve, and share their data with others.
Search engines, whose business is built from the ground up on the reuse of other people’s data, also stake their future on managing and preserving the data they harvest. Google manages a lot of data; how much data it manages is proprietary information. But it is public knowledge that as of 2013 the company spent twenty-one billion dollars on data centers that process as well as store data for use. The cost of keeping data in storage is dependent on the cost of energy. It takes a lot of machines, generating a lot of heat and requiring industrial-strength air-conditioning systems, to keep our data planet spinning. The technology companies that monopolize search, social networks, and shopping are best positioned to develop the technologies that we need for long-term storage not because they are the first to understand the value of reusing data, but because they are best capitalized to work on new technologies. The collective memory systems that depend on machines for recording, preserving, and making sense of digital data are being developed at centers like these, under conditions of commercial secrecy. Together with the national security industry, these private enterprises are best positioned to sustain a massive if unknown percentage of our collective memory. They are also the ones developing technologies to extract more information from existing sources.
In the digital era, the key economic asset companies will compete for is not our labor, but our data. For both data-intensive industries and national defense, the primary assets are the data they get from us. Commercial companies gather this information more or less with our consent, although the average user clicking through license agreements cannot be said to be giving truly “informed consent.” As for national security, Edward Snowden’s revelations have put citizens across the globe on notice that our data are collected and used without our knowledge or consent. Both control vast amounts of intellectual and financial capital critical to solving a wired world’s long-term information challenges. The fact that neither private commerce nor national security operates in ways transparent or directly accountable to the public is itself a significant risk factor for the future of our collective digital memory. That said, a growing number of critics question the propriety of companies’ use of our data, and even more decry governments’ abuse of our privacy and trust in the pursuit of national defense. In the coming decade, these two issues will be increasingly contested in the public policy arena, becoming key issues in elections, with some moving on to the courts for final decisions about how to interpret new and existing laws that touch on the definition of privacy and the ownership of data (including copyright and licensing issues).
Then, when the critical operating rules for the rights surrounding data have been defined and normalized, we will see another wave of social and technical innovation around the use of data. (We will probably be using the word “sociotechnical” by that point, as we will be used to seeing technical innovations such as mobile computing change behavior, and behavioral demands spurring technical responses.) These innovations will offer new value for old data (old in the digital realm being as young as two software versions or operating systems ago). Then demand—public and private—for access to secure and reliable digital content will balloon. Sustained, major long-term investment in digital infrastructure will have proven itself the critical difference between societies that flourish in this world and those that flounder, fall behind, and fail to protect their citizens during cyber warfare.
WHY OUR MACHINES NEED US
Now that we have discovered through empirical science that memory is a dynamic process, strongly inflected by emotion and spatialized in the brain, we have almost caught up with the ancient Greeks.
But we still lag in one area: the cultivation of knowledge for its own sake, above and beyond its usefulness in the manipulation of Nature or ability to return financial rewards on investment. The Greeks saw imagination as a divine dispensation from ignorance, a gift from the goddess of memory. Western science is the offspring of the marriage between the Greek love of knowledge for its own sake and the Christian view that knowledge of creation brings us knowledge of the creator. The intersection between the two is beauty and goodness. What is beautiful and good can be seen both as an end in itself and as a revelation of divine providence.
Over time, as we cracked the code of Nature, the providential usefulness of knowledge began to eclipse its beauty. Take Jefferson, quintessential man of science and technophile. In his view, the imaginative arts were to provide pleasure—though not too much. He complained to a friend that “the great obstacle to good education is the inordinate passion prevalent for novels, and the time lost in that reading which should be instructively employed … the result is a bloated imagination, sickly judgment, and disgust towards all the real business of life.” The classification of Imagination or the Fine Arts was the smallest of the three categories in his library, a mere 20 percent of his books. A cultural bias toward instrumental knowledge continued to be a hallowed tradition in the United States, almost bred in the bone of the nation’s political leaders. Even Abraham Lincoln, whose moral imagination was critical to the emancipation of slaves during the Civil War, was a technophile. As he told the Wisconsin State Agricultural Society in 1859, “I know of nothing so pleasant to the mind, as the discovery of anything which is at once new and valuable.”
The Internet is extremely useful. At the same time it encourages the pursuit of curiosity for its own sake and democratizes it. Rather than condemning time spent pursuing our curiosity—online and elsewhere—as hours of lost productivity, we should encourage curiosity-driven questions with a clear conscience. Cultivating activities that are rewarding in and of themselves actually deepens our imagination. And—at the risk of making an instrumental argument—piquing our curiosity slows us down and allows for different modes of thinking and of processing information. Creativity flows from the suspended state of mind in which knowledge is nothing and attention is everything.
So how are we to think about thinking with our machines? We are advancing rapidly into a world where artificial intelligence creates, manages, and uses digital data for many tasks that we are used to thinking only humans can do. Machines can replicate the logical functions of human intelligence, and do so faster and better than we can. With good programming, they extrapolate patterns from masses of seemingly inchoate information, make astute predictions about our preferences, find the shortest route from A to Z, figure out betting odds more accurately than bookies, keep better time, and know our schedules better than we do.
The realm of emotional intelligence, empathy, and imagination—all necessary for judgment in the context of incomplete information or conflicting aims—is beyond the reach of our machines. In the coming decades, we will take advantage of outsourcing logical tasks to our machines to free up time for more imaginative pursuits. As we teach our children how to use digital machines, we need simultaneously to cultivate their capacity for empathy and emotional intelligence by integrating the arts—the making of beautiful things and the telling of meaningful stories—into the new curriculum for digital literacy. Our machines will not grow a moral imagination anytime soon. They must rely on ours.
THE PERILS OF PERFECTIBILITY
The expansion of collective memory benefits everyone. The most readily adaptable animal is the one with the largest repertoire of stored experience to call upon. The smaller our repertoire of experience, the more vulnerable we are. Any society that periodically purges its collective memory of old, obsolete, or unorthodox views puts itself directly in harm’s way. In the last century we were plagued by totalitarian regimes using massive distortion and erasure of people’s histories in their audaciously criminal efforts to steer the future their way. This century is witness to the tyranny of theological terrorists who impose collective amnesia on their subjects by destroying historical and religious sites as ostentatiously as they kill their enemies.
But liberal democracies also imperil collective memory, though in more subtle ways. They tend toward an unnuanced faith in progress through science, “reason in action,” which is a nearly invisible but still potent inheritance from the Enlightenment. Thomas Jefferson thought the human mind “is perfectible to a degree of which we cannot as yet form any conception.” Since Jefferson’s death, his beloved science made spectacular progress in decoding the language of Nature. Technology kept pace, applying the knowledge with ingenuity and alacrity, just as Jefferson hoped it would. But the impressive successes in these arenas have resulted in an overreliance on science as a model of knowledge not only for scientific matters, but also for countless things that science does not claim to address. As Sheila Jasanoff, an expert on scientific and technological expertise, notes, “Science fixes our attention on the knowable, leading to an over-dependence on fact-finding. Even when scientists recognize the limits of their own inquiries, as they often do, the policy world, implicitly encouraged by scientists, asks for more research.” The pursuit of perfect knowledge to solve complex problems, she goes on, is pointless. “Uncertainty, ignorance and indeterminacy are always present.” Not every social problem has a technological solution.
This secular faith in progress has its origins in a Christian conception of history—the cosmic drama of fall from grace and salvation through divine intervention. Though this faith is now wholly divorced from any theological underpinnings, a residual evangelical energy still emanates from its warm body, promising miraculous technological interventions that can cheat death, if not someday even defeat it. But as the physicist Steven Weinberg notes, “Science addresses what is true, not what makes us happy or good.” In the wake of Darwin’s discoveries of evolution and its processes, the physical and life sciences moved decisively away from a model of human perfectibility. In Nature, there is no end, no consummation, no point of rest. On the contrary, the work of life is endless adaptation, not incremental perfection. As the political philosopher Isaiah Berlin said, “The historical process has no ‘culmination.’ Human beings have invented this notion only because they cannot face the possibility of an endless conflict.”
Biology offers a word of caution about putting faith in models of perfect knowledge. Long ago, some creatures inherited mutations that made appendages suitable not just for navigating water, but also, potentially, as limbs that could propel a creature on land or wings that could keep it aloft in the air. Birds adapted to the air by flying, but they were able to fly because they carried a useless feature that under the right circumstances became useful: an appendage that could become a winglike thing, a proto-wing. Likewise, mammals did not evolve legs so that they could walk. Mammals can walk because they inherited some preadapted appendages that could become leglike things. Who is to say which biological or cultural maladaptation will emerge in the future as a key feature of success? In recent times, for example, new economies have shifted so radically from physical to intellectual productivity that people who were previously advantaged because of their physical prowess are now routinely sidelined by those with mental prowess—the triumph of the nerds.
Keeping evidence from past paradigms of knowledge, even those long since discredited, is the cultural equivalent of carrying maladaptations in the genetic code. Years, centuries, millennia can pass. But when our environment suddenly changes, as it is changing now, it may be the odd trait, tradition, or idea we carry by accident that helps us to adapt. Then those maladaptations may turn out to be useful. They become exaptations, inherited features recruited for purposes other than those for which they were originally selected. The logic of adaptability means that the greater the diversity of traits a creature carries, the greater its chance of adapting to changing circumstances. As the rate of extinction of biological species accelerates, we are learning just how important diversity is in the biological world.