Bacon’s categorization of learning, expounded in Of the Advancement and Proficience of Learning (1605), describes “the Emanations of Sciences, from the Intellectual Faculties of Memory, Imagination, Reason.” Bacon maps the internal landscape of knowledge according to these three mental primitives, the irreducible features of the mind from which our intelligence emanates. (We have replaced this understanding with the notion that our minds emanate from physiological features of the brain, of which memory, reason, and imagination are emergent phenomena.) When Jefferson compiled a catalog of the books he was shipping to Congress in 1815, he began by explaining his cataloging scheme.
Books may be classed according to the faculties of the mind employed on them: these are—I. Memory; II. Reason; III. Imagination.
Which are applied respectively to—I. History; II. Philosophy; III. Fine Arts.
His books were shelved in this order at Monticello, and the order was maintained at the congressional library for decades.
But long after he shipped his books off and had begun his third and smaller library, Jefferson continued to research and refine his categorizations of knowledge. In 1824 he writes to a friend:
Lord Bacon founded his first great division on the faculties of the mind which have cognizance of the sciences. It does not seem to have been observed by anyone that the origination of this division was not with him. It had been proposed by Charron, more than 20 years before, in his book de la Sagesse … and an important ascription of the sciences to these respective faculties was there attempted. This excellent moral work was published in 1600. Lord Bacon is said not to have entered on his great works until his retirement from public office in 1621.
Note how precise Jefferson is in dating which author published first: his letter bears the enthusiastic pedantry of the graduate student. Pierre Charron (1541–1603), a contemporary and admirer of Montaigne, drew these categories from what he conceived to be the anatomy of the brain and especially its ventricles, where he believed the soul dwells. Thus a moist temperament produces memory, a dry temperament intelligence, and a hot temperament the imagination.
As Jefferson aged, he came to view humans as naturalized beings, no different in essence from other higher animals. In an early manuscript catalog of his library, compiled in 1783 when he was forty, he wrote a note under moral philosophy (part of ethics): “In classing a small library one may throw under this head books which attempt what may be called the Natural history of the mind or an Analysis of its operations. The term and division of Metaphysics is rejected as meaning nothing or something beyond our reach, or which should be called by another name.”
Over time, the category of Reason or Law in his library thinned out as he reassigned more titles to Memory or History to accord with his changing views. Shortly before his eighty-first birthday, Jefferson sent a letter to A. B. Woodward to thank him for sending his own classification scheme. He states that he now prefers Woodward’s method of ascribing categories to types of science, not the faculties of the mind. This is because the scheme has a firmer basis in Matter, not in Mind, in the exterior rather than interior world of humans. (This is how the two most widely used schemes in the United States, the Dewey Decimal and the Library of Congress Classification systems, arrange collections.) For a long time Jefferson had viewed Jesus as a great moral teacher, but not God or the Son of God. Now he realizes the great moral teacher was a materialist as well, evidenced by the fact that Jesus set great store by the resurrection of the physical body.
Were I to recompose my tabular view of the sciences, I should certainly transpose a particular branch. The Naturalists, you know, distribute the history of Nature into 3 kingdoms or departments, Zoology, botany, mineralogy. Ideology or Mind however occupies so much space in the field of science, that we might perhaps erect it into a 4th kingdom or department. But inasmuch as it makes a part of the animal construction only, it would be more proper to subdivide zoology into physical and moral. The latter including ideology, ethics, and mental sciences generally, in my Catalogue, considering Ethics, as well as Religion, as supplements to law, in the government of man, I had placed them in that sequence. But certainly the faculty of thought belongs to animal history, is an important portion of it, and should there find it’s [sic] place.
Jefferson saw the category of Memory or History as encompassing all matters intrinsic to the natural world and the result of natural processes. Rocks, rivers, religions, and the use of the ablative case in Latin are understood alike as historical phenomena. Well before Darwin advanced his theory of how all life has a common ancestry and shared Nature, Jefferson, among many others, proposed that the category of Memory or History comprehends everything contingent, everything conditioned by time and place, everything construed as the product of historical forces. Philosophy itself was recast in Jefferson’s conceptual catalog as simply another product of the human mind: contingent, historical, not universal and lawlike in the manner of mathematics and natural law.
In Jefferson’s day, the word “materialism” was not yet tainted by fateful encounters with Karl Marx and the politics of dialectical materialism. Nor was it blighted by connotations of the banal love of luxury or nonessential material objects for their power to please, pamper, lend status, or provide psychological comfort. Identifying the world completely with Nature deepened Jefferson’s sense that the world is comprehensible through reason, even as he lamented that our own rationality—itself a by-product of history—could not be scientifically studied.
Metaphysics have been incorporated with Ethics, and a little extension given to them. For, while some attention may be usefully bestowed on the operations of thought, prolonged investigations of a faculty unamenable to the test of our senses, is an expense of time too unprofitable to be worthy of indulgence.
Jefferson and his peers believed curiosity about the natural world could only lead to a comprehensive and coherent view of the planet, its inhabitants, systems, and principles, an idea developed by Jefferson’s friend and proto-ecologist Alexander von Humboldt in his book Kosmos. They were confident that science would lead to a greater unity of knowledge, not to its splintering. It would foster a greater sense of well-being and integrity of spirit, not a sense of confusion, disorientation, distraction, and most certainly not a greater devotion to luxury.
“WHEN REASON FAILS, WE USE EXPERIENCE”
For Montaigne, experience was helpful if our reason could not find the answer. Today, in our empirical world, if we want to understand a problem, we look to experience—to evidence—first and then use reason to make sense of it. During Jefferson’s lifetime the empirical perspective was embraced by naturalists who were digging up rocks studded with fossilized remnants of unrecognizable creatures. It was difficult to make sense of this hard and fast evidence without calling into doubt the commonly assumed cosmological time frame. Understanding that rocks are clocks, geologists used fossils and rocks as demarcations of historical eras that could be dated relative to each other, if not absolutely.
Questions about the true antiquity of the Earth had been raised since the seventeenth century. But things came to a head when the true antiquity of humankind itself was debated in the context of a growing consensus among geologists that time extended back hundreds of millions of years. Scottish geologist Charles Lyell (1797–1875) said as much in his Principles of Geology, published in the 1830s. In 1831, the young Charles Darwin took a volume of Lyell’s book on his five-year voyage on the HMS Beagle. In his account of his travels, The Voyage of the Beagle, Darwin makes frequent reference to Lyell and other geologists as he tries to sort through various enigmas he encounters. What caused the extinction of native horses in South America? Why are there fossilized seashells in the Andean mountains? Everywhere he saw evidence piling up, slowly but inexorably, that the world came into being in a deep abyss of time, changes continuously, and that the laws that brought the world into being are active today, shaping the future. What is past has now become prologue. And if we learn how to read the past as it is written in stone, tree rings, ice cores, and fossils, we will learn about the future. It was the power of prediction inherent in materialism that captured the public and scientific minds alike.
That the past is prologue was a very old idea. Many Greeks had viewed matter as a form of memory encoding information not only about the past, but also about the future. The gods Proteus and Mnemosyne embodied the dynamic forces of matter and memory as divine beings, inhabiting both the mundane world of existence, accessible to our senses, and the extramundane world, indisputably real but beyond direct sensory perception. Among the first Europeans to recover the Greek view of matter as a dynamic elemental force that stores information was Francis Bacon, the powerful and influential advocate for empirical science. He was captivated by the story of Proteus, the god of the sea, whom Bacon writes about in The Wisdom of the Ancients as “the Old Man of the Sea who never lies.” He “plumbs the depths of the seas,” and knows the truth of what is, what has been, and what is to come. But the Old Man of the Sea who knows everything communicates nothing—nothing, that is, unless one is able to capture him and hold him long enough to force him to speak. Here is the trick: Proteus is a shapeshifter and mutable, like water that can change from ice to vapor. He is as elusive to capture as the wind and as hard to hold as fire.
“The addition in the fable that makes Proteus the prophet, who had the knowledge of things past, present, and future, excellently agrees with the nature of matter; as he who knows the properties, the changes, and the processes of matter, must, of necessity, understand the effects and sum of what it does, has done, or can do, though his knowledge extends not to all the parts and particulars thereof.”
THE DISCOVERY OF DEEP TIME
While Bacon was interested in the implications of understanding matter’s mutability, he did not pursue the question of the world’s age. The geologists of the late eighteenth and early nineteenth centuries, though, found themselves pushing the age of the Earth ever further back in order to accommodate the evidence they uncovered. The lasting impact of their discovery of deep time—the millions and billions of years that elapsed before humankind and even life itself came along—is not a final and definitive birth date of the universe. That is still a matter of debate, under constant revision as more evidence about the infancy of the universe comes to light. No, the point is that it unsettled the very idea of a fixed and sacred chronology and changed the way we understand the fundamental processes of creation. Deep time is deeply generous to scientists. It allows all sorts of small random alterations to accrete into something altogether different—for a change in quantity to scale up to a change in quality.
In hindsight we see Darwin’s proposal that human beings descended from earlier primates as singularly traumatizing. Troubling as these ideas were (and continue to be for some), the truly momentous impact of materialism on the human psyche is what we have learned about the size, scale, and complexity of the material universe. As the universe got bigger, we got smaller.
William James pointed out that “the God whom science recognizes must be a God of universal laws exclusively, a God who does a wholesale, not a retail business. He cannot accommodate his processes to the convenience of individuals.” In exchange for the eminent powers over Nature we gained with the new science, we lost more than a god who knows our name and cares about our individual fate. We lost anyone to hold accountable for the ills of the world. We cannot blame our creator for the existence of evil. Nor do we have an all-powerful being to appeal to for help against our misfortunes. We do not blame God’s wrath for plagues and droughts. Nor do we thank his benevolence for our health and good harvests. Materialism also has serious implications for our collective memory, because it robs us of a supernatural source of knowledge. We do not believe in revealed truths and supernatural inspirations. This means that the cumulative knowledge and know-how of humanity, the collective memory that constitutes the entire repertoire of knowledge we need to understand ourselves and the world, depends on our careful stewardship. If we lose it, we cannot regain it through divine revelation. As we think about the future of memory in the digital age, it becomes clear that the stakes are very high indeed. Even though we may subjectively experience the world as overloaded with information, until we build the memory systems that will ensure future access to digital information, it is all potentially at risk.
THE ORIGINS OF MATERIALISM
Where does this materialist view come from? Although materialism has had its proponents from earliest recorded times, the materialism that became the bedrock of contemporary science is the offspring of Christian theology, a direct descendant of the belief that God is immanent in the world and that man, using the God-given faculty of reason, can read his presence in all of creation. Becoming adept in the language in which this knowledge is encoded is a sanctified endeavor. This was a view shared by many pioneers in the history of science—Roger Bacon, Galileo Galilei, Francis Bacon, Isaac Newton. As a consequence, the awe and wonder inspired by creation that belonged to the realm of religion was transferred to science—though God has been cut out of the picture.
Many of the scientists who established the principles and methods of materialist science were believers. They saw no conflict between religion and what they learned about the age of the Earth, because their interest was in how the Earth arose, in the processes of creation and history. The questions of who and why were outside the scope of their inquiry. They followed the Galilean precept that scripture imparts moral law, but the Earth itself and all creation are the bible of God’s natural laws. They saw no reason to put moral and natural law into conflict or competition for authority. It may seem a bit incongruous that the ability to build a case that this or that event happened in the past by assembling physical evidence could have the power to disarm religion and steal its magic—and even stranger that the church sanctioned and even encouraged this. But that is what happened.
It is certainly one of history’s great ironies that religion—above all Roman Catholicism, but joined in time by Protestantism—marginalized itself by encouraging the study of Nature as an act of religious devotion. The founders of the American Republic sought to protect the authority of religion by officially separating church and state. This was not intended to remove religion from public life, which is why we see the routine inclusion of prayers in Congress, the mention of God in federal and state oaths of office, and so forth. On the contrary, it was intended to allow a diversity of creeds to flourish and to reduce the general undermining of religion by sectarian infighting. The Eastern Seaboard had, after all, been colonized by waves of religious dissidents from England and continental Europe. The disestablishment of the church meant that the pursuit of science and learning was protected behind a cordon sanitaire from sectarian battles. In a nation where there were many different and often competing religious persuasions, liberating the pursuit of knowledge from religious oversight seemed eminently practical as well as self-evidently moral. The particular role that learning played in the Western imagination, first as a religious and then as a civic virtue, strengthened by the protections created in the United States to keep learning out of the crosshairs of sectarian battles, set the stage for the rise of science, engineering, and information technologies in the century after Jefferson’s death.
EVIDENCE AND THE FORENSIC IMAGINATION
The discovery of deep time created a forensic imagination, first and foremost among scientists, but soon among the general population as well. While forensics is associated in the public mind with legal cases and crime scenes, materialist science says that it is not just a crime scene that begs interpretation. It is the entire world. In this view, Nature is the ultimate archive, the most complete set of records about the past, the Universal Library itself. And science becomes the ultimate library card, granting unlimited access to the curious. Scientific investigation becomes a form of detection, and the key to solving the puzzle of “who, what, when, where, and how” is material evidence.
From the perspective of memory, the most consequential effect of embracing materialism is its unquenchable appetite for information in its smallest increments—single data points—and as many of them as possible. This was new. Certainly, procuring evidence had long played a crucial role in adjudicating disputes among people. The original cuneiform token was invented to be evidence, to bear witness as a visible and outward sign of a contract or agreement entered into by two or more parties and witnessed by a third. Evidence was used in Roman courts of law, where people were called together to judge a case based on information made public to one and all. The term “forensic” is used in reference to law courts and public testimony, and derives from forensis: of or before the forum, where Romans appeared publicly for the disposition of criminal accusations. Evidence, in other words, is information available to all without prejudice. It is not esoteric, subjective, or privileged information. Even if not literally to be introduced at court, information has forensic value when it is reliable, publicly available, and authentic—that is, being what it purports to be, not a false representation.
The notion of judging truth by outward and visible signs did not die out with the Roman Empire only to be revived in the nineteenth century by scientists. Most trials in the intervening centuries would introduce some form of evidence, such as the testimony of an individual with knowledge of the matter. Written records were deemed more reliable, because they were fixed and less likely to change than an individual’s memories. But objective signs are not always written on paper. They can be written on the body as well. The seventeenth century, the time of the glorious scientific revolution waged by Newton, Leibniz, Boyle, and Descartes, was also the time of witch trials throughout Europe and its colonies. And both pursuits of truth—of Nature and of the soul—placed great store in procuring evidence. During witch trials people would search for marks on a person’s body to judge whether they were a witch, or they would thrust them underwater and wait for them to drown to prove their innocence. The marks on a body or the lack of susceptibility to drowning were outward and visible signs of witchery. The point is not whether we would accept these signs as evidence today, but that evidence is always something to which a given group of people have equal access in order to render judgment. And the criteria for judgment are known in advance and reflect a social consensus about truth. What counts as evidence is always culturally determined, as is its interpretation. This is as true today as it was in third-century Rome or seventeenth-century Salem. What distinguishes the scientific use of evidence from other interpretive systems is the rejection of any and all supernatural and extramundane factors in either cause or effect.