Serving the Reich


by Philip Ball


  In 2011 the Debye Prize was awarded in Maastricht for the first time since 2004. It was given to the director of the reinstated Debye Institute in Utrecht, and was presented in the town hall where the bronze bust of Debye still presides. This seems a defensible outcome, since the decision to remove Debye’s name from these institutions was an ill-considered and reactionary political gesture that did no one any credit. But this does not mean that the question of Peter Debye has been resolved in his favour. Nor should it be.

  Indeed, one minor consequence of the Debye affair is that it should prompt some reconsideration of the practice of naming institutions after ‘great scientists’.*3 The motivation is questionable at best. In response to the Debye affair, historian Leen Dorsman of Utrecht University lamented the ‘American habit’ of naming institutes for individuals: ‘The motive is not to honor great men, it is a sales argument. The name on the facade of the institute shouts: Look at us, look how important we are, we are affiliated with a genuine Nobel laureate. It is now clear from this sequence of events that this raises problems. The stakes are so high that panic reactions (on both sides) are the logical consequence.’ The practice is widespread in academia, but science seems peculiarly keen to canonize its ‘greats’ in this way. The intent is undoubtedly not always as ignoble as Dorsman implies, but evidently scientific eminence alone is not the determining factor: there is no longer a Philipp Lenard Institute at Heidelberg. If that is the case, then such accolades impose an unrealistic expectation of probity on the part of those so honoured. Scientists rightly insist on a distinction between the quality of one’s science and the quality of one’s character. If so, why create a situation where the two must necessarily be conflated?

  The sequence of removing Debye’s name from an institute and then reinstating it implies a process of disgrace and rehabilitation—of verdicts of guilt and then innocence. That is precisely what, in the cases of Debye, Heisenberg, Planck, and many others in Nazi Germany, we must seek to avoid. For by simplistically condemning or absolving them, we abrogate responsibility for the dilemmas that science and scientists face, always and everywhere.

  Epilogue:

  ‘We did not speak the same language’

  A Nobel laureate recently suggested to me that the Nazi atrocities were a consequence of ‘religiosity’, which all good scientists—as torch-bearers of the tradition of Enlightenment rationalism—should reject. While both the historical validity and the logic of this claim are as warped as those of Pope Benedict XVI’s suggestion that the Nazi tyranny was a consequence of ‘atheist extremism’ (there is no surer or more facile way to win an argument than by placing Hitler in the opposition), nevertheless it expresses a common notion among scientists that their calling should insulate them from the excesses of ideologies of all sorts. Scientists, Otto Hahn claimed with breathtaking arrogance in 1947 as he railed against the iniquities of denazification, are ‘accustomed to regarding matters perhaps a little more calmly and rationally than other professions’. Science historian Joseph Haberer concluded in 1969 that ‘an idealization of science as a superior form of activity remains deeply entrenched in the contemporary scientific consciousness’. One can safely make the same statement today.

  The dangerous complacency of this assumption is laid bare by the history of German science under National Socialism. It should be obvious from even a cursory consideration of the matter that the rational and impersonal viewpoint required in science here conferred absolutely no advantage in matters of morality. Indeed, the behaviour of German physicists in the 1930s shows that the situation is potentially worse still. While several German religious leaders, writers and artists, industrialists and politicians mounted strong opposition to Nazi rule at great personal cost, sometimes of their lives, there was nothing comparable to be found in German science.

  This was not, on the whole, because the scientists sympathized with the regime, even if like many middle-class liberals they might initially have agreed with some of its general principles, such as nationalism, robust leadership and foreign policy, and a reduction in the influence of Jews in public life. All the same, the scientists’ later insistence that colleagues who were ardent supporters of the Third Reich were aberrations, mediocrities or lunatics must be seen as an attempt to ‘cleanse’ the scientific profession of ideological taint. Yet as Haberer has said,

  The real issue involves how it was possible for men trained in the sciences, like Lenard and Stark, to become fanatical National Socialists. If Nobel laureates can be so infected, what protection does scientific training and practice provide against the excesses of irrational personal, economic, social or political conduct? Most scientists have tended to assume that they (more than any other professional type) follow the paths of rational, disinterested, and even humane conduct. The evidence increasingly demonstrates that scientists as a whole are no more immune to the ailments of political man than other men.

  This is more even than a matter of saying that scientists are no better, morally speaking, than the rest of us—a conclusion that should surprise no one, despite the delusion of some scientists that reason and moral virtue go hand in hand. For while Stark and Lenard were indeed in a minority, many scientists found in their profession a justification for avoiding questions of social justice and probity: their duty was only to science. Thus, Haberer implies, while there is no reason to expect science to be any more principled than other areas of human activity, it’s possible that it may be less so. ‘Whether they support the regime or not’, a group of science historians has recently written, ‘most scientists, or perhaps better put, scientific communities, will do what they have to in order to be able to do science.’

  All the same, science does have a tradition of liberalism. Today scientists of almost any nationality tend to be more internationalist, more tolerant, more left-leaning and progressive, than a cross-section of the population from which they are drawn. But this is probably more to do with the culture that has evolved in post-war science, and among the educated intelligentsia generally, than with scientific training per se. In similar fashion, it was the background and professional development of the German physicists, not their science, that dictated their responses to Nazi rule: their conservatism, patriotism and sense of duty. There have been and continue to be among scientists some individuals with a strong commitment to global peace, such as Joseph Rotblat and Linus Pauling, as well as prominent political dissidents such as Fang Lizhi and Andrei Sakharov—but these are generally brave, principled people who just happen to be scientists (and who owe their political voice to that fact).

  We must also distinguish between opposition to state interference mounted in order to protect the scientific profession and expressions of broader social conscience. Many scientists are frequently and rightly outspoken today about infringements of the freedom of speech, and will steadfastly support oppressed colleagues working in authoritarian regimes. But defending the rights of one’s peers doesn’t always entail an acknowledgement of the wider moral issues. I once attended a session on human rights during an international physics conference in Paris—itself a highly commendable rarity at such an event—at which the panellists spoke eloquently and passionately on behalf of scientists imprisoned for challenging their political leaders, but fell silent when asked about the legitimacy of weapons research in the light of the clear link between arms trading and human-rights violations. To address that matter would mean infringing on colleagues’ freedom to choose the direction of their research.

  Moreover, championing ‘free speech’—in principle an asset to the scientific enterprise that is rightly treasured—may become a reflex formula that trumps any other moral judgement. When a lecture at London’s Science Museum by the Nobel laureate biologist James Watson was cancelled in 2007 after Watson made racist remarks about intelligence in a newspaper interview (he claimed that ‘people who have to deal with black employees’ know the assumption of equal intelligence among races to be untrue), biologist Richard Dawkins protested at ‘the hounding, by what can only be described as an illiberal and intolerant “thought police”, of one of the most distinguished scientists of our time’. Not only did this fail to recognize that Watson was using his privileged platform to voice anecdotal prejudice rather than a scientific hypothesis, but it implied that his professional standing as a scientist should in itself offer some protection against censure. Without wishing to draw too lurid a parallel, one can’t help being reminded of Arnold Sommerfeld arguing that the denazification court’s judgement on Johannes Stark be mitigated by his ‘scientific importance’.

  Scientists often say that they cannot be expected to be proficient in making moral and ethical judgements as well as technical ones. This position was adduced by the American physicist Percy Bridgman in an article on ‘Scientists and Social Responsibility’ in the March 1948 issue of the Bulletin of the Atomic Scientists. Bridgman argued that the social consequences of research must lie outside the scientist’s domain. After all, how can scientists possibly expect to foresee the ways in which their work will be applied, let alone then ensure that only beneficial uses are pursued? Either they would be regulated and constrained beyond measure, not to mention legally vulnerable, or they would be paralysed by bureaucracy. They are not, in any case, trained to be competent in areas of ethics or public policy.

  In fact Bridgman’s view was rather more extreme. He considered that the demands of science make it necessary for scientists to be freed from the shackles of moral or social constraints altogether, so that they have no obligation to consider what consequences their work might have:

  The challenge to the understanding of nature is a challenge to the utmost capacity in us. In accepting the challenge, man can dare to accept no handicaps. That is the reason that scientific freedom is essential and that the artificial limitations of tools or subject matter are unthinkable.

  Most scientists today might be hesitant to express such a forthright view, although I have no doubt that some would defend it with passion, and others would secretly find it alluring. Certainly, many are happy to proclaim the simplistic notion that ‘there are no questions that should not be asked’—forgetting what politicians and the media show us every day, which is that the mere framing of a question can be a politically freighted act.*1

  Although Bridgman is right to say that scientists have no special moral competency, the statement is somewhat self-fulfilling. Scientific training rarely incorporates an ethical dimension. Even when it does, the emphasis tends to be solely on codes of professional conduct: issues such as intellectual property, citation, treatment of staff, conflicts of interest and whistle-blowing. One might also ask whether an essentially technical vocation should incur any more moral responsibility than that expected of the average citizen: car mechanics and chefs, say, are not troubled by such demands. But it seems proper that one’s obligations in this regard should follow in proportion to the potential impact of one’s actions. The development of nuclear weapons during the Second World War brought this issue to a head by revealing how socially and politically transformative, not to mention how destructive, a new technology can be.

  In the light of developments such as genetic engineering and nanotechnology, there is far greater awareness today that new technologies raise important societal and ethical questions that should be debated within and beyond the scientific community in parallel with their technical development. It is also generally recognized that scientists themselves cannot be expected to anticipate all such problems and dilemmas, or to adjudicate them alone. Yet this has not necessarily bred a readiness in scientists to engage with these matters beyond the role of offering technical advice. A common response is to acknowledge that these are important questions but to insist that they must be left for ‘others’, or for ‘society’, to decide—that the accountability of scientists extends only to matters of technical judgement and the objective presentation of data and evidence.

  The limited ethical horizons of much of the scientific community go hand in hand with a conscious disengagement from politics. The belief that science should somehow be ‘above’ politics has been evident at least since the inception of modern science in the seventeenth century. Yet at the same time, that historical perspective also shows how ineluctably science has been bound to politics, not least in terms of the scientific community’s need for state sanction and support. The practice of science, says Haberer, ‘is infused with problems which require political modes of thought and political instrumentalities’.

  A reluctance to embrace this aspect of science has meant that its community has generally not distinguished itself in the political arena. Compared with the clashes that have arisen between governments and some artistic and religious movements, Haberer claims that ‘scientific leadership has tended, almost without exception, to acquiesce in any fundamental confrontation with the state, especially when opposition was likely to evoke serious sanctions’. And individual scientists have often displayed a misplaced conviction that they can manipulate state leaders for their own ends, only to find that it is they who are used and then discarded. Even scientists who do show moral courage are prone to this mistake. There can be few more poignant scenes in the history of nuclear proliferation than Niels Bohr’s disastrous audience with Winston Churchill in which he hoped to convince the British prime minister of the need to engage in frank dialogue with the Soviets about atomic weapons. C. P. Snow described that meeting as ‘one of the blackest comedies of the war’, in which Bohr and his son Aage were sent away with a flea in their ear. ‘He scolded us like schoolboys’, Bohr said afterwards. ‘We did not speak the same language.’

  While one can’t expect scientists to be braver or more morally astute than any other section of the population, science as a community can and should organize itself to maximize its ability to act collectively, ethically and—when necessary—politically. Achieving that would require a more explicit recognition of the political nature of science itself, and a willingness to relinquish the reliance on unexamined myths about ‘scientific martyrs’ to ideology such as Galileo or (so the conventional story goes) Giordano Bruno.

  When science does confront politics, it has often been apt to do so with a kind of naïve, Platonic view in which political action is conducted in some abstract sphere where questions of right or wrong hardly exist. The German psychiatrist and philosopher Karl Jaspers detects this baleful tendency in Robert Oppenheimer’s pronouncements on the social roles of science, full of imagery of the statesman practising his skill of ‘statecraft’ upon the body politic while failing to locate any particular nexus of moral choice. The scientist, meanwhile, wanders in innocent awe among nature’s marvels, detached from consequences. As Oppenheimer put it:

  We regard it as proper and just that the patronage of science by society is in large measure based on the increased power which knowledge gives. If we are anxious that the power so given and so obtained be used with wisdom and with love of humanity, that is an anxiety we share with almost everyone. But we also know how little of the deep knowledge which has altered the face of the world, which has changed—and increasingly and ever more profoundly must change—man’s views of the world, resulted from a quest for practical ends or an interest in exercising the power that knowledge gives. For most of us, in most of those moments when we were most free of corruption, it has been the beauty of the world of nature and the strange and compelling harmony of its order, that has sustained, inspirited, and led us. That also is as it should be.

  While Oppenheimer’s statement does speak to the honourably idealistic impulse that motivates many scientists, it is at the same time a sweetly worded diversion from the issues and a misrepresentation of the daily business of science—another myth of its apolitical character. In contrast to Oppenheimer’s rose-tinted view, scientists in fact rarely miss an opportunity to point out the possible applications of their discoveries. If we now deplore Paul Harteck’s and Heisenberg’s efforts to win military funding or political prestige by parading the possible uses of nuclear physics, it is not because of those appeals in themselves but because they were directed to the Nazis. Oppenheimer’s comments on the alleged moral neutrality of science—words strikingly similar to those voiced by Peter Debye—take on a very different complexion when read against the context of German physics in the 1930s, as he more than anyone should surely have known:

  In most scientific study, questions of good and evil, or right and wrong, play at most a minor and secondary part . . . The true responsibility of a scientist, as we all know, is to the integrity and vigor of his science. And because most scientists, like all men of learning, tend in part also to be teachers, they have a responsibility for the communication of the truths they have found.

  Well might we then understand Jaspers’ complaint:

  We hear different language from a scientist like Oppenheimer . . . talking of ‘beauty’, or our faculty of seeing it in remote, strange, unfamiliar places or paths that maintain existence in a great, open, windy world . . . This is the premise of man, and on those terms we can help, because we love one another. In such sentences I can see only an escape into sophisticated aestheticism, into phrases that are existentially confusing, seductive, and soporific in relation to reality.*2

  Seen against the wider historical backdrop, the behaviour of German physicists under the Nazis was evidently not an aberration under extreme circumstances but rather, a fairly typical example of how science and politics interact. As Mark Walker says, ‘It must be possible both to respect the unique, terrible nature of National Socialism [in Germany] and compare it with other periods in history.’ Historian Kristie Macrakis is surely right to claim that ‘Many of the ways in which the social order influences science in turbulent times are present in dormant forms in science organizations, science policy, and the practice of scientific research in normal times, or in a democracy.’ And while of course this particular episode cannot illuminate or exemplify all aspects of how scientists operate morally and politically, nevertheless such case studies are a more trustworthy gauge of how science functions within society than general assertions about the ‘scientific attitude’. Robert Oppenheimer’s vague ruminations about ‘statecraft’ tell us far less about how scientists and politicians interact than the McCarthyite realpolitik that stripped him of security clearance and authority in the 1950s.

 
