
A Point of View


by Clive James


  The philosopher Wittgenstein often turns up in these broadcasts because when I was an undergraduate at Cambridge he was my ideal example of what a thinker should be. When he was teaching at Cambridge he made zero impact in this new sense. Even under the outgoing Research Assessment system he would have been a liability to his department, because he published only one philosophical book in his lifetime. The book was the Tractatus Logico-Philosophicus and it had a huge influence in the long run, but it might not have scored many funding points after he told the assessment board they were a bunch of dummies. He wasn’t just incapable of diplomacy, he disapproved of it.

  If he were teaching now under the incoming Research Excellence system he would be a disaster for his department. You couldn’t imagine him making contact with a television producer and saying, ‘Look, I’ve got this terrific idea for a programme about a man obsessed with language and it’s perfect for Daniel Day-Lewis.’ He would have been hopeless. But that was just what I liked about him. It was what I liked about all the dons, even the crazy ones. There was one guy who was given a fellowship in about 1923 and spent the rest of his life walking around town with a bundle of newspapers under his arm. But that was the price a great university was willing to pay for extending to its scholars the freedom to pursue an interest for its own sake.

  In the years I spent pretending to study for a Ph.D., I would sit in the Copper Kettle cafe opposite King’s College and read Wittgenstein’s Philosophical Investigations. Occasionally I would look up to make a philosophical investigation of a passing undergraduette. Then I looked down again to puzzle at another brilliantly compressed paragraph. Wittgenstein was having his impact, and it was an impact that couldn’t be measured. A university is, or should be, a place where you can’t yet tell what will be useful to the outside world, because it deals with the inside world, the most inside of all worlds, the mind. For all I know, that was what my uncle Harold was doing. He had done his time in the outside world, fighting for democracy against Japanese soldiers on the Kokoda trail, and now he was gazing within.

  Postscript

  At the family house in Cambridge I lead the life of a civilian on an army base. I live surrounded by so many academics that if I spent too much time listening I would be stifled with feelings of inadequacy and unable to write a line. Sometimes, though, it pays to tune in. I was alerted to the Impact boondoggle by a raft-load of dedicated young scholars who had taken one look at the new rules and realized that not only would the subjects be under threat, but that the whole of the humanities would inevitably suffer if these mad plans were carried out. Whichever committee decided that the world of the humanities didn’t need to know about, say, the early trans-Baltic Normans would soon decide that it didn’t need to know about the ancient Greeks either.

  I like to think that I would quite soon have been alerted to the topic on my own account, if only by the manic intensity with which its proponents abused the English language. For the academy, anyone talking managerial jargon should count as an invader. Wherever there is an invasion, however, there will be collaborators, and one of the marks of the new barbarism in academic life has always been the number of begowned figures who magically pop up and start cooperating. In Cambridge, at a time when almost every ancient college was sprouting a new building or two, invariably the ugliest new buildings were approved by committees of dons. With their appreciation of the most pretentious and brutalist strains in modern architecture, the dons were merely laying the foundations for their subsequent accommodation to managerial double-talk. And as with any other cast of experts, they were deadlier in committee than when alone. Nigel Balchin, in his fine novel The Small Back Room, wrote a pioneering study of how supervisory committees are bad at attaining a desirable goal. R. V. Jones, in his essential memoir Most Secret War, put the study on what should have been a permanent basis when he demonstrated that a scientific committee could defeat its own ends simply by being too big, or by having even one self-serving member. (According to Jones, any committee containing Sir Robert Watson-Watt, nominal inventor of radar, would have been useless even if radar had been its subject.) It’s a condition of academic freedom that the universities should organize their own affairs, but the freedom can be fatal when exercised in furtherance of a governmental diktat conceived in the name of efficiency, equality, diversity or any other measure except high standards of knowledge.

  That being said, several fans of Nicolas Cage told me I had been unfair to him. They were probably right. I had been made angry by how, in Leaving Las Vegas, he kept getting between me and Elisabeth Shue. Male film stars obsess me because they are crucial to my fantasy projects. One of those is a script about Wittgenstein, but in my nightmares the studio wants Brad Pitt for the lead. What a wonderful subject, though. It would be ideal for Daniel Day-Lewis, but Rupert Everett might be even better. You will judge correctly that I am whistling to keep my spirits up. It never occurred to me that the universities, in my time, would become the entry portals for the space invaders. Like most goof-offs I was counting on the continued stability of the institutions I goofed off from.

  HERMIE’S GHOST

  Dates of show: 11 and 13 December 2009

  About forty years ago now, the world used to hear a lot from a futurologist called Herman Kahn. Of ample girth and unquenchable volubility, Herman Kahn, who died in 1983, was always making confident pronouncements about what would happen in the future. So and so, he would say, would happen ten, twenny, twennyfive years from now. It wouldn’t happen tomorrow, so that you could check up on it straight away, but it would happen ten, twenny, twennyfive years from now. Some of us realized that he had invented a new unit of time, and we gave it a name. As an echo of the Fermi, the tiny unit of length used for measuring atomic nuclei, we called his new unit of time the Hermie.

  The merit of the Hermie, as a unit of measurement, was that, while being vague, it sounded impressive. The prediction itself might or might not have been right. Herman Kahn predicted that within one Hermie everyone in the West would fly his own helicopter and have access to free-fall sex. That didn’t happen within one Hermie, but it still might happen in the next Hermie. All we can be sure of is that Herman Kahn’s language exemplified an impressive way of talking about the future, a way of sounding impressive that sounded less impressive only when you realized that sounding impressive was its main motive. Big things would happen. It was big talk. And it paid the penalty of all big talk. As you got used to it, you got tired of it.

  Over the last ten years we have heard a lot about how civilization would be in trouble if it didn’t soon do something drastic about global warming. But this impressive message tended to sound less impressive as time went on. It wasn’t just that the globe uncooperatively declined to get warmer during the last ten years. It was that the language of alarm wore out its welcome as it became ever more assertive about what had not yet happened.

  The world, when it resumed warming again – this brief, unarguably still hot period when it had somehow refused to grow any hotter was soon explained, although it seemed strange that it had not been predicted – the once-again warming world would heat up by so many degrees, or so many more degrees than that, and within ten, twenny, twennyfive years – within a single Hermie – there would be the corpses of fried polar bears floating past your penthouse window.

  According to the media, scientists were agreed, the science was settled, science said, that all this would happen. The media promoted this settled science, and the politicians went along with the media. The whole deal had the United Nations seal of approval. The coming catastrophe that had to be averted wasn’t exactly like knowing when the asteroid would arrive so you could send Bruce Willis, but unless we did something, irreversible damage, if not certain doom, was only a Hermie or two away.

  Today, after recent events at the University of East Anglia’s Climatic Research Unit, that supposedly settled science is still the story, but the story is in question. The top guy at the UN’s Intergovernmental Panel on Climate Change has called for an investigation into at least part of the science that was supposed to have been settled. Suddenly there are voices to pronounce that the reputation of science will lie in ruins for the next fifty years. For two Hermies at least, nobody will trust a single thing that a scientist says.

  Well, even to a non-scientist like myself, that last prediction sounds suspiciously like the others. My own view is that true science, the spirit of critical enquiry that unites all scientists, or is supposed to, is reasserting itself after being out-shouted by at least half a Hermie of uninterrupted public relations. But I hasten to admit that my view is not only not the view of a scientist, it is the view of somebody who can still remember the first day he was exposed to calculus and realized that he had no more chance of understanding it than of rising into the air like a helicopter carrying its pilot in search of free-fall sex.

  As I said in one of these broadcasts earlier in the season, before the events at the Climatic Research Unit, my only position on the matter of man-made global warming, formed from my own layman’s background reading, was that the reportedly unanimous scientific verdict, that global warming is man-made and likely to be catastrophic, had always been the subject of more active scientific debate than you would have guessed from the way the media told the story.

  Just saying that much was enough to get me condemned by one of the broadsheet environmentalist gurus. He said I was an old man resistant to the facts because I didn’t care what happened to the world after I was gone. As I bounced my granddaughter on my knee, rather hoping that in the course of the next Hermie she would not be obliged to star in a remake of Waterworld as the sea rose thirty feet above her house, I bit back a rude word. But the guru still had a point when he said that my scepticism about the settled science was a wilful defiance of established fact. Unfortunately the fact had been established largely by the media, who had been telling only one story. If you said the story might have two sides, that sounded like scepticism.

  People in my position had to get used to being called sceptics, as if scepticism were a bad thing. We have even had to get used to being called denialists, although that is clearly an unscrupulous word. We were also called, are still called, flat-earthers by people like Gordon Brown and Ed Miliband, but that kind of abuse is comparatively easy to take, because everybody knows that neither man would be capable of proving mathematically that the earth is not a cube.

  So what happened at the Climatic Research Unit? Well, basically nothing new. A bunch of e-mails got hacked, or perhaps leaked. Some of the phrases that supposedly reveal skulduggery reveal a lot less when you put them in the context of locker-room enthusiasm. In the correspondence columns of the scientific websites – where the level of discussion has consistently been miles above anything the mainstream media has provided for the last decade – there are already wise voices to warn that the sceptics should not make the same mistake as the believers by treating any slip they can find in the arguments of their opponents as evidence of the biggest fraud since Bernie Madoff made off with the money.

  That would be Hermie talk, and self-defeating, because the more absolutist case for man-made global warming has always looked vulnerable enough simply from the way its proponents have been reluctant to listen to opposing voices, no matter how well qualified. There has never been any point, and there is no point now, in calling the alarmists a bunch of devious conspirators against the truth. All you ever had to do was notice how their more strident representatives didn’t want to hear any other opinions, even when the opinions came from within their own ranks.

  Far from there having been unanimity among scientists on the subject of catastrophic man-made global warming, there has scarcely been unanimity among climate scientists. It only takes one dissenting voice to punch a hole in the idea of unanimity, if that voice has a chance of being right. There was a time when almost every scientist except Einstein thought that Newton had buttoned up the subject of celestial mechanics. And this time, on the subject of global warming, there were always, right from the beginning, a number of climate scientists who didn’t endorse the alarmist picture.

  You could say that the number was small, and a few of them were vengeful because they had been side-lined for not being sufficiently doom-laden in their claims. But a few of them were older men who just wouldn’t go along with the prevailing emphasis. One of these few was Professor Lindzen of MIT. I never could convince myself that the Professor of Meteorology at the Massachusetts Institute of Technology knew less about the earth’s climate than I did, so I started to watch him. Hopeless on the media, Lindzen is the sort of pundit with a four-figure IQ who can somehow never figure out that you are supposed to talk into the microphone. His fellow anti-alarmist Professor Fred Singer was even worse. Singer not only formed a thought too slowly for radio, he was too slow for smoke signals.

  But gradually, as I watched the side roads, it seemed to me that these few dissenting scientists with zero PR skills increased in number. The number of scientists who endorsed the orthodox view increased also, but the number of those who didn’t went up instead of down. I couldn’t do the calculus, but I could count heads. There were scores of eminent scientists who signed the 2007 open letter to the Secretary-General of the United Nations, and then later on there were hundreds quoted in the US Senate minority reports. It could be said that few of them had expertise in climate science, but that argument looked less decisive when you considered that climate science itself was exactly what they were bringing into question.

  So science was not speaking with one voice on the matter. It only seemed to be, because the media, on the whole, was giving no other story. Then this Climatic Research Unit thing happened, and it was the end of the monologue. The dialogue has begun again. The scientists are arguing about the matter, which is the proper thing for science to do, because in science the science is never settled. Some say that the argument about how all this happened will go on for another two Hermies at least. We can hear, from deep underground, the contented purr of Herman Kahn. It’s all turning out exactly as he predicted.

  Postscript

  This one I had to fight for, and I give Mark Damazer credit that he let me win. As I have said, the BBC, mistakenly in my view, had copied other major outlets of the worldwide mainstream media – the New York Times was a powerful example – in taking up a position on Catastrophic Anthropogenic Global Warming which condemned anyone who expressed doubts about it to the status of an eccentric. Theoretically the Web is proof against the print media’s urge to conform, but in practice Wikipedia sets the tone – as well as being the port of entry for every print journalist in search of quick background – and Wikipedia was resolutely pro-catastrophe throughout this period, all the way until October 2010, when it finally summoned up the resolve to dismiss the single zealot who had been laundering the stories as they came in. But there was never any excuse for the BBC – under no real commercial pressure except from those administrators of its pension fund who liked the idea of alternative energy futures – to embrace the doom scenario just because everybody else had. Yet a rigid orthodoxy was maintained by the Corporation even as the term Global Warming modulated into the term Climate Change, thus to gloss over the awkward fact that the globe had not warmed significantly in fifteen years. This embarrassing datum was actually admitted by the UEA-CRU’s leading light Professor Phil Jones in his temporary panic after the e-mails scandal broke, but he, like his colleagues, soon discovered that all they had to do was sit tight, because it would be in the interests of their university, if it investigated the matter, to make sure they were absolved. It caused me no pleasure to see that happen: as the holder of an Honorary Doctorate from the UEA, I was in the position of having been shown respect by an institution which now seemed intent on losing all the respect it could. Luckily the UEA’s excellent record as a patron of modern writing will always ensure that it stays high in the esteem of any modern writer.

  In the following year the UEA-CRU crew were duly found to have indulged in nothing much more reprehensible than schoolboy pranks. A foretaste of that absurd clemency can be found in my script, at the point where, yielding to heavy pressure, I said that there was not much in their unprepossessing behaviour beyond the standard manoeuvrings of ordinary group dynamics. This was humbug on my part and I knew it was at the time, but I was ready to soft-pedal a particular point in order to get my general message on the air. In fact, the CRU team had constructed a neo-Swiftian machine for rewriting the past in order to lend weight to their fantasies about the future, and had taken steps to make certain, by managing the ‘peer review’ in which they claimed to set such store, that it would not be challenged in the learned journals. They had behaved, that is, almost exactly as the Wegman committee of 2006 had said they had already behaved, and would go on behaving unless they submitted their conjectures to proper statistical discipline. Not to do so, after Wegman caught them out, was their fatal mistake. The same applied to Michael Mann’s outfit at Penn State, and to the rest of the surprisingly small cluster of ad-hoc research centres (the one in New Zealand was the first to fold) which had looked like guaranteeing their members a secure career in predicting planetary disaster until the moment when the grown-up scientists arrived at the door of the nursery to find out why the kids inside were making such a row.

  Easily surviving one tap on the wrist after another, they all got away with it in the narrow sense, but in the wider sense they were finished, and the thing began to come apart. At the time of writing it is still coming apart. The disintegration happens very gradually because the mainstream media outlets are even slower than governments to let go of the standard story. When the Obama administration, late in 2010, appointed a chief climate adviser – as a career alarmist, he had once been an advocate of Global Cooling – who announced that the proper name for Climate Change was Climate Disruption, the story was over as far as governments were concerned, even if the useless wind farms continued to be erected. The concept of Global Warming had just about survived the declension to Climate Change, but the further declension to Climate Disruption could mean anything. It was widely realized, although rarely mentioned, that something that could mean anything meant nothing. The prospect of coordinated global action by developed nations was as dead as the Kyoto protocol. But in the media the story continued, because too many of the major outlets had no way of switching it off: they had handed the responsibility for geophysical prediction to science correspondents who had correctly assessed that tales of oncoming disaster would play best to the gallery, and unless the science correspondents got fired, the outlets they worked for would continue to propagate a world vision indistinguishable from that of Steven Seagal. How the media got themselves into such a mess has by now become the true subject, thereby falling, if I may say so, within my only area of expertise, which is the language of communication, and what happens when it is duped into communicating poppycock. As my illustrious compatriot the late Murray Sayle used to say, what matters most isn’t the story: it’s the story of the story.

 
