When relativity theory and quantum mechanics undid the absolute certainty of the Newtonian paradigm, science demonstrated, in a way that Kant as a convinced Newtonian could never have anticipated, the validity of Kant’s skepticism concerning the human mind’s capacity for certain knowledge of the world in itself. Because he was certain of the truth of Newtonian science, Kant had argued that the categories of human cognition congruent with that science were themselves absolute, and these alone provided a basis for the Newtonian achievement, as well as for man’s epistemological competence in general. But with twentieth-century physics, the bottom fell out of Kant’s last certainty. The fundamental Kantian a prioris—space, time, substance, causality—were no longer applicable to all phenomena. The scientific knowledge that had seemed after Newton to be universal and absolute had to be recognized after Einstein, Bohr, and Heisenberg as limited and provisional. So too did quantum mechanics reveal in unexpected fashion the radical validity of Kant’s thesis that the nature described by physics was not nature in itself but man’s relation to nature—i.e., nature as exposed to man’s form of questioning.
What had been implicit in Kant’s critique, but obscured by the apparent certainty of Newtonian physics, now became explicit: Because induction can never render certain general laws, and because scientific knowledge is a product of human interpretive structures that are themselves relative, variable, and creatively employed, and finally because the act of observation in some sense produces the objective reality science attempts to explicate, the truths of science are neither absolute nor unequivocally objective. In the combined wake of eighteenth-century philosophy and twentieth-century science, the modern mind was left free of absolutes, but also disconcertingly free of any solid ground.
This problematic conclusion was reinforced by a newly critical approach to the philosophy and history of science, influenced above all by the work of Karl Popper and Thomas Kuhn. Drawing on the insights of Hume and Kant, Popper noted that science can never produce knowledge that is certain, nor even probable. Man observes the universe as a stranger, making imaginative guesses about its structure and workings. He cannot approach the world without such bold conjectures in the background, for every observed fact presupposes an interpretive focus. In science, these conjectures must be continually and systematically tested; yet however many tests are successfully passed, no theory can ever be viewed as more than an imperfectly corroborated conjecture. At any time, a new test could falsify it. No scientific truth is immune to such a possibility. Even the basic facts are relative, always potentially subject to a radical reinterpretation in a new framework. Man can never claim to know the real essences of things. Before the virtual infinitude of the world’s phenomena, human ignorance itself is infinite. The wisest strategy is to learn from one’s inevitable mistakes.
But while Popper maintained the rationality of science by upholding its fundamental commitment to rigorous testing of theories, its fearless neutrality in the quest for truth, Kuhn’s analysis of the history of science tended to undercut even that security. Kuhn agreed that all scientific knowledge required interpretive structures based on fundamental paradigms or conceptual models that allowed researchers to isolate data, elaborate theories, and solve problems. But citing many examples in the history of science, he pointed out that the actual practice of scientists seldom conformed to Popper’s ideal of systematic self-criticism by means of attempted falsification of existing theories. Instead, science typically proceeded by seeking confirmations of the prevailing paradigm—gathering facts in the light of that theory, performing experiments on its basis, extending its range of applicability, further articulating its structure, attempting to clarify residual problems. Far from subjecting the paradigm itself to constant testing, normal science avoided contradicting it by routinely reinterpreting conflicting data in ways that would support the paradigm, or by neglecting such awkward data altogether. To an extent never consciously recognized by scientists, the nature of scientific practice makes its governing paradigm self-validating. The paradigm acts as a lens through which every observation is filtered, and is maintained as an authoritative bulwark by common convention. Through teachers and texts, scientific pedagogy sustains the inherited paradigm and ratifies its credibility, tending to produce a firmness of conviction and theoretical rigidity not unlike an education in systematic theology.
Kuhn further argued that when the gradual accumulation of conflicting data finally produces a paradigm crisis and a new imaginative synthesis eventually wins scientific favor, the process by which that revolution takes place is far from rational. It depends as much on the established customs of the scientific community, on aesthetic, psychological, and sociological factors, on the presence of contemporary root metaphors and popular analogies, on unpredictable imaginative leaps and “gestalt switches,” even on the aging and dying of conservative scientists, as on disinterested tests and arguments. For in fact the rival paradigms are seldom genuinely comparable; they are selectively based on differing modes of interpretation and hence different sets of data. Each paradigm creates its own gestalt, so comprehensive that scientists working within different paradigms seem to be living in different worlds. Nor is there any common measure, such as problem-solving ability or theoretical coherence or resistance to falsification, that all scientists agree upon as a standard for comparison. What is an important problem for one group of scientists is not for another. Thus the history of science is not one of linear rational progress moving toward ever more accurate and complete knowledge of an objective truth, but is one of radical shifts of vision in which a multitude of nonrational and nonempirical factors play crucial roles. Whereas Popper had attempted to temper Hume’s skepticism by demonstrating the rationality of choosing the most rigorously tested conjecture, Kuhn’s analysis served to restore that skepticism.4
With these philosophical and historical critiques and with the revolution in physics, a more tentative view of science became widespread in intellectual circles. Science was still patently effective and powerful in its knowledge, but scientific knowledge was now regarded as, in several senses, a relative matter. The knowledge science rendered was relative to the observer, to his physical context, to his science’s prevailing paradigm and his own theoretical assumptions. It was relative to his culture’s prevailing belief system, to his social context and psychological predispositions, to his very act of observation. And science’s first principles might be overturned at any point in the face of new evidence. Moreover, by the later twentieth century, the conventional paradigm structures of other sciences, including the Darwinian theory of evolution, were coming under increasing pressure from conflicting data and alternative theories. Above all, the bedrock certainty of the Cartesian-Newtonian world view, for centuries the acknowledged epitome and model of human knowledge and still pervasively influential in the cultural psyche, had been shattered. And the post-Newtonian world order was neither intuitively accessible nor internally coherent—indeed, scarcely an order at all.
Yet for all this, science’s cognitive status could well have retained its unquestioned preeminence for the modern mind. Scientific truth might be increasingly esoteric and only provisional, but it was a testable truth, continually being improved and more accurately formulated, and its practical effects in the form of technological progress—in industry, agriculture, medicine, energy production, communication and transportation—provided tangible public evidence for science’s claims to render viable knowledge of the world. But it was, paradoxically, this same tangible evidence that was to prove crucial in an antithetical development; for it was when the practical consequences of scientific knowledge could no longer be judged exclusively positive that the modern mind was forced to reevaluate its previously wholehearted trust in science.
As early as the nineteenth century, Emerson had warned that man’s technical achievements might not be unequivocally in his own best interests: “Things are in the saddle and ride mankind.” By the turn of the century, just as technology was producing new wonders like the automobile and the widespread application of electricity, a few observers began to sense that such developments might signal an ominous reversal of human values. By the mid-twentieth century, modern science’s brave new world had started to become subject to wide and vigorous criticism: Technology was taking over and dehumanizing man, placing him in a context of artificial substances and gadgets rather than live nature, in an unaesthetically standardized environment where means had subsumed ends, where industrial labor requirements entailed the mechanization of human beings, where all problems were perceived as soluble by technical research at the expense of genuine existential responses. The self-propelling and self-augmenting imperatives of technical functioning were dislodging man and uprooting him from his fundamental relation to the Earth. Human individuality seemed increasingly tenuous, disappearing under the impact of mass production, the mass media, and the spread of a bleak and problem-ridden urbanization. Traditional structures and values were crumbling. With an unending stream of technological innovations, modern life was subject to an unprecedentedly disorienting rapidity of change. Gigantism and turmoil, excessive noise, speed, and complexity dominated the human environment. The world in which man lived was becoming as impersonal as the cosmos of his science. With the pervasive anonymity, hollowness, and materialism of modern life, man’s capacity to retain his humanity in an environment determined by technology seemed increasingly in doubt. For many, the question of human freedom, of mankind’s ability to maintain mastery over its own creation, had become acute.
But compounding these humanistic critiques were more disturbingly concrete signs of science’s untoward consequences. The critical contamination of the planet’s water, air, and soil, the manifold harmful effects on animal and plant life, the extinction of innumerable species, the deforestation of the globe, the erosion of topsoil, the depletion of groundwater, the vast accumulation of toxic wastes, the apparent exacerbation of the greenhouse effect, the breakdown of the ozone layer in the atmosphere, the radical disruption of the entire planetary ecosystem—all these emerged as direly serious problems with increasing force and complexity. From even a short-term human perspective, the accelerating depletion of irreplaceable natural resources had become an alarming phenomenon. Dependence on foreign supplies of vital resources brought a new precariousness into global political and economic life. New banes and stresses to the social fabric continued to appear, directly or indirectly tied to the advance of a scientific civilization—urban overdevelopment and overcrowding, cultural and social rootlessness, numbingly mechanical labor, increasingly disastrous industrial accidents, automobile and air travel fatalities, cancer and heart disease, alcoholism and drug addiction, mind-dulling and culture-impoverishing television, growing levels of crime, violence, and psychopathology. Even science’s most cherished successes paradoxically entailed new and pressing problems, as when the medical relief of human illness and lowering of mortality rates, combined with technological strides in food production and transportation, in turn exacerbated the threat of global overpopulation. In other cases, the advance of science presented new Faustian dilemmas, as in those surrounding the unforeseeable future uses of genetic engineering. More generally, the scientifically unfathomed complexity of all relevant variables—whether in global or local environments, in social systems, or in the human body—made the consequences of technological manipulation of those variables unpredictable and often pernicious.
All these developments had reached an early and ominous proleptic climax when natural science and political history conspired to produce the atomic bomb. It seemed supremely, if tragically, ironic that the Einsteinian discovery of the equivalence of mass and energy, by which a particle of matter could be converted into an immense quantity of energy—a discovery by a dedicated pacifist reflecting a certain apex of human intellectual brilliance and creativity—precipitated for the first time in history the prospect of humanity’s self-extinction. With the dropping of atomic bombs on the civilians of Hiroshima and Nagasaki, faith in science’s intrinsic moral neutrality, not to say its unlimited powers of benign progress, could no longer be upheld. During the protracted and tense global schism of the Cold War that followed, the numbers of unprecedentedly destructive nuclear missiles relentlessly multiplied until the entire planet could be devastated many times over. Civilization itself was now brought into peril by virtue of its own genius. The same science that had dramatically lessened the hazards and burdens of human survival now presented to human survival its gravest menace.
The great succession of science’s triumphs and cumulative progress was now shadowed by a new sense of science’s limits, its dangers, and its culpability. The modern scientific mind found itself beleaguered on several fronts at once: by the epistemological critiques, by its own theoretical problems arising in a growing number of fields, by the increasingly urgent psychological necessity of integrating the modern outlook’s human-world divide, and above all by its adverse consequences and intimate involvement in the planetary crisis. The close association of scientific research with the political, military, and corporate establishments continued to belie science’s traditional self-image of detached purity. The very concept of “pure science” was now criticized by many as entirely illusory. The belief that the scientific mind had unique access to the truth of the world, that it could register nature like a perfect mirror reflecting an extrahistorical, universal objective reality, was seen not only as epistemologically naive, but also as serving, either consciously or unconsciously, specific political and economic agendas, often allowing vast resources and intelligence to be commandeered for programs of social and ecological domination. The aggressive exploitation of the natural environment, the proliferation of nuclear weaponry, the threat of global catastrophe—all pointed to an indictment of science, of human reason itself, now seemingly in thrall to man’s own self-destructive irrationality.
If all scientific hypotheses were to be rigorously and disinterestedly tested, then it seemed that the “scientific world view” itself, the governing metahypothesis of the modern era, was being decisively falsified by its deleterious and counterproductive consequences in the empirical world. The scientific enterprise, which in its earlier stages had presented a cultural predicament—philosophical, religious, social, psychological—had now provoked a biological emergency. The optimistic belief that the world’s dilemmas could be solved simply by scientific advance and social engineering had been confounded. The West was again losing its faith, this time not in religion but in science and in the autonomous human reason.
Science was still valued, in many respects still revered. But it had lost its untainted image as humanity’s liberator. It had also lost its long-secure claims to virtually absolute cognitive reliability. With its productions no longer exclusively benign, with its reductionist understanding of the natural environment apparently deficient, with its evident susceptibility to political and economic bias, the previously unqualified trustworthiness of scientific knowledge could no longer be affirmed. On the basis of these several interacting factors, something like Hume’s radical epistemological skepticism—mixed with a relativized Kantian sense of a priori cognitive structures—seemed publicly vindicated. After modern philosophy’s acute epistemological critique, the principal remaining foundation for reason’s validity had been its empirical support by science. The philosophical critique alone had been in effect an abstract exercise, without definite influence on the larger culture or on science, and would have so continued if the scientific enterprise had itself continued to be so unequivocally positive in its practical and cognitive progress. But with science’s concrete consequences so problematic, reason’s last foundation was no longer firm.
Many thoughtful observers, not just professional philosophers, were forced to reevaluate the status of human knowledge. Man might think he knows things, scientifically or otherwise, but there was clearly no guarantee for this: he had no a priori rational access to universal truths; empirical data were always theory-soaked and relative to the observer; and the previously reliable scientific world view was open to fundamental question, for that conceptual framework was evidently both creating and exacerbating problems for humanity on a global scale. Scientific knowledge was stupendously effective, but those effects suggested that much knowledge from a limited perspective could be a very dangerous thing.
Romanticism and Its Fate
The Two Cultures
From the complex matrix of the Renaissance had issued forth two distinct streams of culture, two temperaments or general approaches to human existence characteristic of the Western mind. One emerged in the Scientific Revolution and Enlightenment and stressed rationality, empirical science, and a skeptical secularism. The other was its polar complement, sharing common roots in the Renaissance and classical Greco-Roman culture (and in the Reformation as well), but tending to express just those aspects of human experience suppressed by the Enlightenment’s overriding spirit of rationalism. First conspicuously present in Rousseau, then in Goethe, Schiller, Herder, and German Romanticism, this side of the Western sensibility fully emerged in the late eighteenth and early nineteenth centuries, and has not since ceased to be a potent force in Western culture and consciousness—from Blake, Wordsworth, Coleridge, Hölderlin, Schelling, Schleiermacher, the Schlegel brothers, Madame de Staël, Shelley, Keats, Byron, Hugo, Pushkin, Carlyle, Emerson, Thoreau, Whitman, and onward in its diverse forms to their many descendants, countercultural and otherwise, of the present era.