Homage to Gaia

by James Lovelock


  Computers

  The most important single item that has sustained me as an independent scientist is the computer. So important has it been that I would like to digress and tell you how these wonderful devices entered my life and then changed it.

  The plane from Los Angeles arrived on time and, with my small carry-on bag, I was the first of the passengers to arrive at the Customs counter. In those days, the largest plane was a Boeing 707, and Heathrow was comparatively peaceful and quiet. When asked if I had anything to declare, I said, ‘Yes, I have two small electronic items.’ I showed the Customs officer two of the early integrated circuit amplifiers. They looked like a pair of frozen black beetles, with their stiff wire legs hanging beneath. When he asked me what they were and what they were worth, I replied, ‘They are small amplifiers. They are research items given to me in America.’ He then said, ‘This is a matter for the supervisor. Hang on, I’ll give him a call.’ Within minutes, they escorted me to an office occupied by an amiable middle-aged civil servant. I showed him the chips and he asked, ‘Do you mean to tell me that these things are amplifiers? You know, like I have with my hi-fi?’ When I nodded, yes, he snorted and said, ‘Their Lordships are pissing against the wind if they think I’ll be able to stop people bringing these things in with them.’ He then helped me to fill in some forms that gave what he called Treasury direction, so that there was no Customs duty payable. It was a pleasant encounter to start the day. The chips that I had imported were early operational amplifiers made by the Fairchild Company and called µA709. I needed them to control a palladium separator, which I was building for the Mars mission.

  Without realising it at the time, JPL was also providing me with a priceless education in the new electronics, and not only amplifiers. Soon I was familiar with, and bringing in, silicon-chip electronics: logic gates, timers, and inverters. By 1970, I started one of the extravagances of my scientific life. I bought an early HP computer, a 9100B calculator. Although it was heavy, it sat easily enough on my desk. It cost over £2,000 with its peripherals, enough then to buy a small house. Its memory was made of ferrite beads strung out on a grid of wires, and it held less than one kilobyte. The programming language was only slightly removed from the fundamental binary talk of computers, but thanks to HP’s well-composed manuals, I rapidly became fluent in it. Soon I was able to answer the question that had plagued me for the past ten years: how does the electron capture detector work? The differential equations that describe electron capture and the behaviour of the detector were easy enough for me to write down, but they were far beyond my ability to solve analytically by ordinary mathematics. I have since childhood suffered a peculiar form of dyslexia. I cannot distinguish immediately between left and right and I reverse the order of letters in words I write. Worst of all, I have great difficulties with the manipulation of algebraic equations. I cannot instinctively tell which side is which. In spite of this, I love mathematics and have little difficulty with its concepts, only with the practical execution of them. Even this primitive computer, or calculator as they called it, was my liberation. Now I could leave it to the device to solve my equations numerically. Soon it was plotting graphs to show how and why the detector behaved so strangely. I was hooked, and as any teenager would do, I spent a sizeable proportion of what HP paid me for my advice on buying their computers. Moore’s Law meant that every two years the computers evolved to a new desirable level and I bought a new one. It still goes on, thirty years later; soon I will be buying yet another.
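
  The flavour of that numerical approach can be sketched in a few lines of modern code. The rate equation below is only a schematic electron balance for a detector of this kind (electrons produced at a steady rate, lost to background processes and to attachment on a passing sample peak); the constants, function names, and the Gaussian peak are invented for illustration and are not the detector equations themselves.

```python
import math

# Schematic electron balance: dn/dt = R - K_LOSS*n - K_ATTACH*n*c(t).
# All numbers here are illustrative, not measured detector values.
R = 1.0e7          # electron production rate (arbitrary units per second)
K_LOSS = 50.0      # background electron loss rate, 1/s
K_ATTACH = 1.0e-9  # attachment rate coefficient for the capturing sample

def analyte(t):
    """A Gaussian chromatographic peak centred at t = 5 s (invented)."""
    return 5.0e10 * math.exp(-((t - 5.0) / 0.5) ** 2)

def simulate(t_end=10.0, dt=1.0e-4):
    """Integrate the electron balance by simple Euler steps."""
    n = R / K_LOSS            # start at the steady 'standing current' level
    t, trace = 0.0, []
    while t < t_end:
        dn = R - K_LOSS * n - K_ATTACH * n * analyte(t)
        n += dn * dt
        t += dt
        trace.append((t, n))
    return trace

for t, n in simulate()[::10000]:   # report roughly once per second
    print(f"t={t:5.2f} s   electron population={n:10.0f}")
```

  The dip in the electron population as the invented sample peak passes is the kind of behaviour a desktop machine could compute and plot when the equations were too awkward to solve by hand.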

  It would be no exaggeration to say that computers have transformed my life, and I do not mean that I would be a lot richer without them. They are the mental equivalent of a bicycle, something that lets me travel much faster using my own mental muscle. Without Hewlett Packard’s wonderful but expensive devices, Gaia would never have risen beyond the level of debate. I would never have written the program for the definitive model, Daisyworld. It showed that the self-regulation of climate could emerge on a simple planet populated by organisms living under the rules of natural selection. Having a personal computer on my desk enabled me to interact directly with my problems. This was a privilege available to few scientists anywhere in the 1970s and 1980s. In those decades universities and scientific institutions tended to frown on personal computers; they preferred a single large computer with terminals connected to it in every laboratory.
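
  The core of Daisyworld fits in a few lines. The sketch below loosely follows the published equations (two daisy species whose albedos feed back on the planetary temperature as the sun brightens); the parameter values and the crude Euler integration are illustrative choices, not the original program.

```python
# A minimal Daisyworld sketch, loosely after Watson and Lovelock (1983).
# Parameters and the integration scheme are illustrative, not the original code.
SIGMA = 5.67e-8                                   # Stefan-Boltzmann constant
S = 917.0                                         # solar flux, W m^-2
Q = 2.06e9                                        # local heat-transfer term, K^4
ALBEDO = {"bare": 0.50, "white": 0.75, "black": 0.25}
DEATH = 0.3                                       # daisy death rate
T_OPT = 295.5                                     # optimal growth temperature, K

def growth(T):
    """Parabolic growth response, zero outside roughly 5-40 degrees C."""
    return max(1.0 - 0.003265 * (T_OPT - T) ** 2, 0.0)

def step(white, black, lum, dt=0.05):
    """One Euler step of daisy cover at relative solar luminosity `lum`."""
    bare = max(1.0 - white - black, 0.0)
    albedo = (bare * ALBEDO["bare"] + white * ALBEDO["white"]
              + black * ALBEDO["black"])
    T_eff4 = S * lum * (1.0 - albedo) / SIGMA     # planetary temperature^4
    T_white = (Q * (albedo - ALBEDO["white"]) + T_eff4) ** 0.25
    T_black = (Q * (albedo - ALBEDO["black"]) + T_eff4) ** 0.25
    d_white = white * (bare * growth(T_white) - DEATH)
    d_black = black * (bare * growth(T_black) - DEATH)
    # keep a small seed population so each species can re-establish
    return (max(white + d_white * dt, 0.01),
            max(black + d_black * dt, 0.01),
            T_eff4 ** 0.25)

white, black = 0.01, 0.01
for i in range(61):                               # brighten the sun slowly
    lum = 0.6 + i / 60.0
    for _ in range(2000):                         # relax to a steady state
        white, black, T_planet = step(white, black, lum)
    print(f"luminosity {lum:.2f}: white {white:.2f}, black {black:.2f}, "
          f"temperature {T_planet - 273.15:5.1f} C")
```

  As the daisy populations shift, the planetary temperature in the printout stays close to the growth optimum over a wide range of sunshine, which is the self-regulation the model was written to demonstrate.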

  Composing computer programs to solve scientific problems is like writing poetry. You must choose every word with care and link it with the other words in perfect syntax. There is no place for verbosity or carelessness. To become fluent in a computer language demands almost the antithesis of modern loose thinking. It requires many interactive sessions, the hands-on use of the device. You do not learn a foreign language from a book, rather you have to live in the country for years to let the language become an automatic part of you, and the same is true of computer languages. There is little chance of this happening in most universities and institutes. All too often the scientists have to ask another group, the programmers, to draft the mathematical steps of their problem in the language the computer can understand. Often the languages chosen, like Fortran and C, are for the convenience of the programmers, not the scientists. This means that there is little chance of the scientists interacting directly with their computers and getting that feedback so essential in the development of the solution of a problem. To delegate a problem to a programmer is appropriate when the computer is merely performing a range of tedious calculations, far beyond any individual’s patience and stamina, but when a scientist faces important steps in understanding, delegation is unwise. Long ago, I learned that whenever one of my programs did not work, the error was nearly always in my thoughts, not in the computer. The hands-on interaction, sitting at the keyboard, looking at the screen, was like talking to a fair and honest critic. Are you sure, it seemed to say, that you have asked the right questions? This invaluable advice is hopelessly lost when it goes through several hands.

  At the JPL and at Hewlett Packard, individual scientists used their bench-top computers interactively in the design of instruments and space hardware; it was a hands-on, personal way of working. In the 1960s, there were few computer languages available and we had to communicate in the raw set of instructions, which was all the chips themselves could understand. It was hard and tedious work, but the rewards were such that few minded the task of learning. We had the same urge as a holiday traveller has to learn the language of a foreign country. By the 1970s, a number of so-called high-level languages had evolved: ALGOL, FORTRAN, C, LISP, and PASCAL. HP had started with its own version of ALGOL that they called HPL, and this was my first high-level language. I loved it and grew fluent in it to the point of composing elegant and parsimonious code. In 1964, Professors JG Kemeny and TE Kurtz of Dartmouth College, New Hampshire, introduced a new language, BASIC, the acronym of ‘Beginner’s All-purpose Symbolic Instruction Code’. They intended it as a clear and comprehensible language with which to teach computer science. However, like human languages in the streets, dialects of this popular language soon evolved. Each tribe spoke its own version. These dialects of BASIC were often almost impossible for anyone not of the tribe to read. Teachers in universities soon damned them, and the BASIC from which they had evolved. This was bad, for BASIC was the computer equivalent of English and therefore a flexible language. It was sad to see FORTRAN and PASCAL taught in its place. These languages are like French, kept inflexible by the dictates of an academy. Logical consistency was preserved, but at the cost of freezing evolution.

  Hewlett Packard was unusual in that it sold its computers mainly to engineers and other professionals rather than to academics. And obstinately, but with wisdom insufficiently appreciated, they persisted in using their own version of BASIC for their computers. The computer language, HP BASIC, was every bit as thorough and professional as the other computer languages, but much easier to use. The popularity of BASIC led Hewlett Packard to discard their ALGOL-like HPL language and I was obliged to learn their version of BASIC, since it was the only language of their desktop computers. Since the late 1970s, I have worked solely in this, although, as Microsoft grew more professional, it also chose to favour BASIC, and in the 1990s, their version, called VISUAL BASIC, has served me well. I envy the young scientist now with a personal computer, who can explore the sheer elegance of the language of Mathematica, which I think is the best of all computer languages for interactive use. Computers seduce and enchain the mind so that communing with them becomes so large a part of life that the rest of it is in danger. Nothing else, apart from an affair, would keep me up past midnight or make me neglect food. Fortunately the normal mind has its own way of restoring balance—the obsession calms to become a state like that of a good marriage. What does matter is the near total failure of those who use computers to realize that the elegant charts and illusions that they produce are products of their own imaginations. The models, the simulations of the world that evolve on computer screens, enchant us, but it is all too easy to forget that they are Pygmalion constructions, Galateas, and not the real world.

  The Ozone War, the environmental cause that drew attention to the threat to stratospheric ozone from CFCs, started from a scientific hypothesis produced by the American atmospheric chemists, MJ Molina and FS Rowland. My part in this environmental cause is described in Chapter 8; I mention it now because of the importance of computer models in its development. The arguments that led to the first legislation banning the use of CFCs came from these models and not from measurements in the real world. This weakness was recognized by the scientists concerned, and in the 1980s, they flew instruments on satellites orbiting the Earth to measure the fluctuations of the ozone layer. These instruments were accurate and reliable, but no one watched their measurements on a dial or a digital display; the measurements accumulated in the data banks of a computer. At intervals, the scientists instructed the computer to analyse and present this data as illustrated maps and summaries. The programmers of the computers that handled the data had to make allowances for the natural errors of the instruments. Even the best instruments generate noise—random signals that are not observations—and in the real world events such as solar flares produce transient changes in the ozone layer. Therefore, the programmers told their computers to ignore signals that were outside what the scientists thought was the reasonable range of ozone values. The scientists used values predicted by their models to set what this reasonable range should be. The satellite instruments observed ozone every day throughout the 1980s, when ozone was rapidly declining over the Poles, especially over Antarctica. The instruments saw this and sent their message to the Earth, but the computers on Earth, following their instructions, disregarded these low values as wrong. This was why the big and expensive science with satellites and computers failed to discover the ozone hole. Human observers, Joe Farman and Brian Gardiner, found it in Antarctica by looking up at the sky with a simple Dobson spectrometer.
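
  The trap is easy to reproduce. In the sketch below, a quality-control filter built from model expectations discards exactly the readings that matter; the range and the Dobson-unit values are invented for illustration and are not the actual satellite data or code.

```python
# Illustrative only: a range filter keyed to model expectations rejects
# genuinely low ozone readings as if they were instrument noise.
EXPECTED_RANGE = (180.0, 650.0)   # hypothetical 'reasonable' range, Dobson units

def accept(reading, lo=EXPECTED_RANGE[0], hi=EXPECTED_RANGE[1]):
    """Keep a reading only if it falls inside the model-derived range."""
    return lo <= reading <= hi

# Invented springtime readings over Antarctica: the very low values of an
# ozone hole never reach the analyst because the filter throws them away.
readings = [320.0, 305.0, 150.0, 140.0, 135.0, 310.0]
print([r for r in readings if accept(r)])   # -> [320.0, 305.0, 310.0]
```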

  The Royal Society

  In 1972, Archer Martin and his wife, Julia, came to Bowerchalke to see me. I was happy with anticipation. A visit from the Martins was a rare treat and I wondered what was important enough for him to make the long journey to Bowerchalke. We had just had tea in our dining room and were chatting about nothing in particular when Archer turned to me and said, ‘I would like to propose you for the Fellowship of the Royal Society. Is that acceptable to you?’ I was astonished and delighted at the same time. I had thought that going solo, becoming a scientific maverick, put me outside all the perquisites and honours of the cosy scientific community. Yet, here was Archer proposing me for the Fellowship of that wonderful and historic society of scientists, the world’s first scientific academy. Now I knew the purpose of his visit. ‘I’d be honoured and so pleased,’ I replied. ‘You realize,’ said Archer, ‘you may have to wait years before they elect you, if ever. Almost no one is elected on the first round and very few on the second.’ Privately, I thought what a kind thing Archer had done, but so low did I rate my chances that I dismissed thoughts of it from my mind.

  Then, in March 1974, the telephone rang just after breakfast. I picked up the handset and heard Sir Christopher Andrewes’s voice saying, ‘Is that James Ephraim Lovelock?’ No one uses my middle name and I wondered what my old friend, the sawfly catcher, was up to now. ‘I just saw your name on a short list that came by post this morning, new Fellows of the Royal Society. You mustn’t tell anyone until you hear from them; it’s very private you know.’ I was dazed and the happy daze lasted a whole week. Then came the purple cardboard tube with a Certificate of Fellowship and I knew it was really true and not one of Sir Christopher’s jokes. The certificate is strange: written as a letter inviting one to attend the next meeting, it asked me to sign that historic book that bears the signatures of King Charles II and all the monarchs that followed him, in juxtaposition with the distinguished scientists elected each year: names like Newton, Wren, Maxwell, Rutherford, Huxley, and Darwin, down to the present day. I went to London for the meeting in April. Together with the other new Fellows admitted that day, I was shown the facilities of our National Academy: the press and travel agencies, the library and the small suite of rooms on the top floor that provide bed and breakfast for Fellows. The ceremony itself, when the President, Sir Alan Hodgkin, called the new Fellows to come forward and sign the book, was a moving occasion and has sustained me ever since.

  In its charter of 1663 the Society chose as its motto ‘Nullius in Verba’, an expression of its determination to stand against dogma and verify all statements by an appeal to facts. ‘Nullius in Verba’ translates as ‘take nothing on authority’. It says something of the monarch, King Charles II, that he accepted it. Cynics have preferred to translate it as ‘put nothing in writing’. I sometimes wonder if it should be emblazoned on every document that comes from the Society to remind us that we pledged ourselves to appeal to facts, and not to the consensus of chatterers.

  Among the letters of congratulation that I received was one from Sir Stanley Hooker, the engineer who designed the Rolls-Royce RB211 engines that powered so many jet aircraft. He said he had been Chairman of the Ad Hoc Committee that had recommended my election to the Council, and added, ‘We need scientists like you who are also inventors.’ The Ad Hoc Committee served to elect scientists outside the main divisions of science—chemistry, physics and biology. The Council of the Royal Society closed it shortly after my election. I doubt there was any connection between the two events, but its closure marked a change in the Society. It was not the Royal Society alone that was changing; science throughout the world was becoming career employment instead of a vocation. Governments everywhere were funding and therefore assuming the right to interfere with the running of science. Science, like sport, was becoming less an international pursuit and more a matter of national interest. It may seem hard to believe now, but in the 1950s and earlier, society respected its scientists and paid them well. They were seen neither as threatening inventors of nuclear power and bombs, nor as makers of poisonous chemicals to destroy the environment. They suffered a mild disrespect, in that the public saw them as figures of fun, but it regarded its mad professors with affection and certainly never saw them as a threat. There were about ten times fewer graduate scientists in the United Kingdom then than now, yet nearly half of all the Nobel Prizes awarded came to the United Kingdom. Of the hundred or so scientists present during my time at the Mill Hill Institute, six were or were to become Nobel Laureates.

  We scientists in those days ran our own affairs and we ran them well. I have never seen a satisfactory explanation of the decline of science in the United Kingdom. Few of us, indeed, will admit that it has declined. Scientists now complain about their lack of recognition and poor pay, and about their bad image as portrayed by the media. They go on to say that government does not spend enough money on science. I suspect that we, as taxpayers, contribute in real terms at least ten times as much now to science as we did in the fecund 1950s. Why is there no proportional increase in scientific output? True enough, as our Chief Scientist, Sir Robert May, points out, we still do moderately well compared with the rest of the world. If papers published are anything to go by, we come second after the United States. But we have fallen relative to our past performance—in the 1950s and earlier we led the world. When I meet scientists employed in government service, in universities and in commercial enterprises, their conversation seems to have changed over the years. Now, there are fewer impassioned accounts of their new ideas and more mere talk, interspersed with complaints of bureaucratic interference and lack of funds. Many of the good scientists seem to have lost heart and talk wistfully of early retirement.

  There may be many reasons for our decline, but I would blame the perverse anti-elitism of our present culture. It led to the building of many new universities so that all could have the benefits of education. This push for numbers sounds good to the public, but what if scientific progress comes from a tiny group of truly creative scientists? What if they are born, not made by higher education? When I started science at the NIMR in the 1940s and 1950s, peer review hardly existed. Editors of journals like Nature were prepared to decide themselves whether or not a paper was good enough to publish, or, if in doubt, merely to telephone a few experts in the field and ask their opinion. It helped that there were then so few scientists, perhaps a tenth as many as there are now, and that they tended to know one another. Also, there was no ‘publish or perish’ ethos. At good institutes like the NIMR, one good paper in three years was quite acceptable as evidence of a satisfactory career. Indeed, it was preferred over ten minor contributions.

 
