by John Carey
These paragraphs of figures, with so many noughts in them, may confuse rather than enlighten. Their purpose was to show that the acquisition of our three-pound brain is full of contradiction. It was a tremendous increase; yet it would have been of little note had it occurred in some other kind of tissue. The three pounds are only beginning to achieve their potential for a very few people in modern times; yet they were developed for all our ancestors in relatively simple times. The brain’s cells and synapses are merely numerous; the quantity of interconnections is about as infinite as anything we know. The brain’s size is plainly crucial; and yet those individuals with twice the brain of others are none the wiser for it. Its growth was undoubtedly critical for the emergence of Homo sapiens, and for the development of this species; yet its size was probably curtailed by the practical demands of a relatively minor portion of anatomy, namely the elasticity at birth of the pelvic canal. It was easy for evolution to permit a steady growing of the foetal head; but birth must have been an increasing problem. Teleologically speaking, it was a good time for the mammals to introduce live birth in place of the egg birth that had ruled, more or less, since life began; but viviparity meant, in time, a limitation to head size. (Even so, we now have a brain more than suited to our needs. Perhaps it will teach us one day how to tap its real potential.)
Source: Anthony Smith, The Mind, London, Hodder & Stoughton, 1984.
On Not Discovering
Ruth Benedict (1887–1948), the author of this extract, was an American anthropologist and poet who did fieldwork among the Pueblo, Apache and Blackfoot Indians. Her most famous book was Patterns of Culture (1934).
History is full of examples of apparently simple discoveries that were not made even when they would be surpassingly useful in that culture. Necessity is not necessarily the mother of invention. Men in most of Europe and Asia had adopted the wheel during the Bronze Age. It was used for chariots, as a pulley wheel for raising weights, and as a potter’s wheel for making clay vessels. But in the two Americas it was not known except as a toy in any pre-Columbian civilization. Even in Peru, where immense temples were built with blocks of stone that weighed up to ten tons, these huge weights were excavated, transported, and placed in buildings without any use of wheels.
The invention of the zero is another seemingly simple discovery which was not made even by classic Greek mathematicians or Roman engineers. Only by the use of some symbol for nothingness can the symbol 1 be used so that it can have the value either of 1 or 10 or 100 or 1000. It makes it possible to use a small number of symbols to represent such different values as 129 and 921. Without such an invention figures cannot be added or subtracted by writing them one above another, and multiplication and division are even more difficult. A Roman who had to divide CCCLVIII by XXIV faced immense difficulty. It was not the Egyptians or the Greeks or the Romans who first invented the zero, but the Maya Indians of Yucatán. It is known that they had a zero sign and positional values of numbers by the time of the birth of Christ. Quite independently the Hindus made these inventions in India some five to seven centuries later. Only gradually was the zero adopted in medieval Europe, where the new notation was known as Arabic notation because it was introduced there by the Arabs.
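The arithmetic advantage Benedict describes is easy to demonstrate. The sketch below is an editor's illustration, not part of the extract (the converter and the name roman_to_int are assumptions for the example): once the Roman symbols are translated into positional notation, the division becomes a mechanical routine.

```python
# Symbol values for the standard Roman numerals.
ROMAN = {'I': 1, 'V': 5, 'X': 10, 'L': 50, 'C': 100, 'D': 500, 'M': 1000}

def roman_to_int(s):
    """Sum the symbol values, subtracting when a smaller numeral
    precedes a larger one (the subtractive pair, e.g. IV = 4)."""
    total = 0
    for i, ch in enumerate(s):
        value = ROMAN[ch]
        if i + 1 < len(s) and ROMAN[s[i + 1]] > value:
            total -= value  # smaller numeral before a larger one
        else:
            total += value
    return total

dividend = roman_to_int('CCCLVIII')  # 358
divisor = roman_to_int('XXIV')       # 24
print(dividend, divisor, divmod(dividend, divisor))  # 358 24 (14, 22)
```

In positional notation the quotient 14 and remainder 22 fall out of ordinary column arithmetic; performed directly on CCCLVIII and XXIV there is no comparable mechanical procedure, which is Benedict's point.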
Source: Ruth Benedict’s essay in Man, Culture and Society, ed. Harry L. Shapiro, New York, Oxford University Press, 1956.
Negative Predictions
Of all scientific Nobel Prize-winners from the English-speaking world, the British zoologist Sir Peter Medawar (1915–87) is perhaps the most remarkable for wit and panache, as evidenced in his autobiography, Memoirs of a Thinking Radish (1986). In Pluto’s Republic (1982), from which this extract is taken, he does battle with several of his pet hates – psychoanalysts, mystical theologians, believers in ‘rhapsodical intellection’ and peddlers of paradoxes (‘a paradox’, wrote Medawar, ‘has the same significance for the logician as the smell of burning rubber has for the electronics engineer’). He and Macfarlane Burnet were awarded the Nobel Prize in 1960 for their work on immunological tolerance in mice, which showed for the first time that the problem of transplanting tissues from one individual to another was soluble, and so opened the way for transplant surgery.
No kind of prediction is more obviously mistaken or more dramatically falsified than that which declares that something which is possible in principle (that is, which does not flout some established scientific law) will never or can never happen. I shall choose now from my own subject, medical science, a bouquet of negative predictions chosen not so much for their absurdity as for the way in which they illustrate something interesting in the history of science or medicine.
My favourite prediction of this kind was made by J. S. Haldane (the distinguished physiologist father of the geneticist J. B. S. Haldane), who in a book published in 1930 titled The Philosophy of Biology declared it to be ‘inconceivable’ that there should exist a chemical compound having exactly the properties since shown to be possessed by deoxyribonucleic acid (DNA). DNA is the giant molecule that encodes the genetic message which passes from one generation to the next – the message that prescribes how development is to proceed. The famous paper in the scientific journal Nature in which Francis Crick and James Watson described the structure of DNA and how that structure qualifies it to fulfil its genetic functions was published not so many years after Haldane’s unlucky prediction. The possibility that such a compound as DNA might exist had been clearly envisaged by the German nature-philosopher Richard Semon in a book The Mneme, a reading of which prompted Haldane to dismiss the whole idea as nonsense.
In the days before the introduction of antisepsis and asepsis, wound infection was so regular and so grave an accompaniment of surgical operations that we can hardly wonder at the declaration of a well-known surgeon working in London, Sir John Erichsen (1818–96), that ‘The abdomen is forever shut from the intrusions of the wise and humane surgeon.’ Of course, the coming of aseptic surgery to which I refer below, combined with the improvement of anaesthesia, soon made nonsense of this and opened the door to the great achievements of gastrointestinal surgery in the first decade of our century.
One of the very greatest surgeons of this period was Berkeley George Moynihan of Leeds (1865–1936), a man whose track-record for erroneous predictions puts him in a class entirely by himself.
Around 1900 the famous British periodical, the Strand magazine (the first to publish the case records of Sherlock Holmes), thought that at the turn of the century its readers would be interested to know what was in store for them in the century to come; ‘a Harley Street surgeon’ (unmistakably Moynihan) was accordingly invited to tell them what the future of surgery was to be. Evidently not spectacular, for Moynihan opined that surgery had reached its zenith and that no great advances were to be looked for in the future – nothing as dramatic, for example, as the opening of the abdomen, an event regarded with as much awe as the opening of Japan.
Moynihan’s forecast was not the hasty, ill-considered opinion of a busy man: it represented a firmly held conviction. In a Leeds University Medical School magazine in 1930 he wrote: ‘We can surely never hope to see the craft of surgery made much more perfect than it is today. We are at the end of a chapter.’ Moynihan repeated this almost word for word when he delivered Oxford University’s most prestigious lecture, the Romanes Lecture, in 1932. He was a vain and arrogant man, and if these quotations are anything to go by a rather silly one too, but surgery is indebted to him nevertheless, for he introduced the delicacy and fastidiousness of technique that did away for ever with the image of the surgeon as a brusque, over-confident and rough-and-ready sawbones. Moreover Moynihan, along with William Stewart Halsted of Johns Hopkins (1852–1922), introduced into modern surgery the aseptic technique with all the rituals and drills that go with it: the scrupulous scrub-up, the gown, cap and rubber gloves, and the facial mask over the top of which the pretty young theatre nurse gazes with smouldering eyes at the handsome young intern who is planning to wrong her. These innovations may be said to have made possible the hospital soap opera and thus in turn TV itself – for what would TV be without the hospital drama, and what would the hospital drama be without caps and masks and those long, meaningful stares?
The full regalia of the surgical operation did not escape a certain amount of gentle ridicule – in which we may hear the voice of those older, coarser surgeons whom Moynihan supplanted. Moynihan was once described as ‘the pyloric pierrot’, and upon seeing Moynihan’s rubber shoes a French surgeon is said to have remarked ‘Surely he does not intend to stand in the abdomen?’
Source: Peter Medawar, Pluto’s Republic, London, Oxford University Press, 1982.
Clever Animals
Lewis Thomas, a distinguished American research pathologist, won bestsellerdom with The Lives of a Cell (1974), a collection of his scientific journalism. This extract is from Late Night Thoughts (1983).
Scientists who work on animal behavior are occupationally obliged to live chancier lives than most of their colleagues, always at risk of being fooled by the animals they are studying or, worse, fooling themselves. Whether their experiments involve domesticated laboratory animals or wild creatures in the field, there is no end to the surprises that an animal can think up in the presence of an investigator. Sometimes it seems as if animals are genetically programmed to puzzle human beings, especially psychologists.
The risks are especially high when the scientist is engaged in training the animal to do something or other and must bank his professional reputation on the integrity of his experimental subject. The most famous case in point is that of Clever Hans, the turn-of-the-century German horse now immortalized in the lexicon of behavioral science by the technical term, the ‘Clever Hans Error.’ The horse, owned and trained by Herr von Osten, could not only solve complex arithmetical problems, but even read the instructions on a blackboard and tap out infallibly, with one hoof, the right answer. What is more, he could perform the same computations when total strangers posed questions to him, with his trainer nowhere nearby. For several years Clever Hans was studied intensively by groups of puzzled scientists and taken seriously as a horse with something very like a human brain, quite possibly even better than human. But finally in 1911, it was discovered by Professor O. Pfungst that Hans was not really doing arithmetic at all; he was simply observing the behavior of the human experimenter. Subtle, unconscious gestures – nods of the head, the holding of breath, the cessation of nodding when the correct count was reached – were accurately read by the horse as cues to stop tapping.
Whenever I read about that phenomenon, usually recounted as the exposure of a sort of unconscious fraud on the part of either the experimenter or the horse or both, I wish Clever Hans would be given more credit than he generally gets. To be sure, the horse couldn’t really do arithmetic, but the record shows that he was considerably better at observing human beings and interpreting their behavior than humans are at comprehending horses or, for that matter, other humans.
Cats are a standing rebuke to behavioral scientists wanting to know how the minds of animals work. The mind of a cat is an inscrutable mystery, beyond human reach, the least human of all creatures and at the same time, as any cat owner will attest, the most intelligent. In 1979, a paper was published in Science by B. R. Moore and S. Stuttard entitled ‘Dr. Guthrie and Felis domesticus or: tripping over the cat,’ a wonderful account of the kind of scientific mischief native to this species. Thirty-five years ago, E. R. Guthrie and G. P. Horton described an experiment in which cats were placed in a glass-fronted puzzle box and trained to find their way out by jostling a slender vertical rod at the front of the box, thereby causing a door to open. What interested these investigators was not so much that the cats could learn to bump into the vertical rod, but that before doing so each animal performed a long ritual of highly stereotyped movements, rubbing their heads and backs against the front of the box, turning in circles, and finally touching the rod. The experiment has ranked as something of a classic in experimental psychology, even raising in some minds the notion of a ceremony of superstition on the part of cats: before the rod will open the door, it is necessary to go through a magical sequence of motions.
Moore and Stuttard repeated the Guthrie experiment, observed the same complex ‘learning’ behavior, but then discovered that it occurred only when a human being was visible to the cat. If no one was in the room with the box, the cat did nothing but take naps. The sight of a human being was all that was needed to launch the animal on the series of sinuous movements, rod or no rod, door or no door. It was not a learned pattern of behavior, it was a cat greeting a person.
Source: Lewis Thomas, Late Night Thoughts, London, Oxford University Press, 1985. First published in the USA as Late Night Thoughts on Listening to Mahler’s Ninth Symphony, New York, Viking, 1983.
Great Fakes of Science
A celebrated popularizer of science and mathematics, Martin Gardner has devoted much of his life to debunking ESP, ‘psychic’ phenomena, metal-bending and other paranormality. This excerpt is from Science Good, Bad and Bogus (1983).
Politicians, real-estate agents, used-car salesmen, and advertising copy-writers are expected to stretch facts in self-serving directions, but scientists who falsify their results are regarded by their peers as committing an inexcusable crime. Yet the sad fact is that the history of science swarms with cases of outright fakery and instances of scientists who unconsciously distorted their work by seeing it through lenses of passionately held beliefs.
Gregor Johann Mendel, whose experiments with garden peas first revealed the basic laws of heredity, was such a hero of modern science that scientists in the thirties were shocked to learn that this pious monk probably doctored his data. R. A. Fisher, a famous British statistician, checked Mendel’s reports carefully. The odds, he concluded, are about 10,000 to 1 that Mendel gave an inaccurate account of his experiments.
Brother Mendel was a Roman Catholic priest who lived in an abbey in Brünn, now part of Czechoslovakia. More than a century ago, working alone in a monastery garden, he found that his plants were breeding according to precise laws of probability. Later, these laws were explained by the theory of genes (now known to be sections along a helical DNA molecule), but it was Brother Mendel who laid the foundations for what later was called Mendelian genetics. His great work was totally ignored by the botanists of his time, and he died without knowing he would become famous.
Most of the monk’s work was with garden peas. Seeds from dwarf pea plants always grow into dwarfs, but tall pea plants are of two kinds. Seeds from one kind produce only talls. Seeds from the other kind produce both talls and dwarfs. Mendel found that when he crossed true-breeding talls with dwarfs he got only talls. When he self-pollinated these tall hybrids he got a mixture of ¼ true-breeding talls, ¼ dwarfs, and ½ talls that did not breed true.
Today one says that tallness in garden peas is dominant, dwarfness is recessive. Mendel’s breeding experiment is like shaking an even mixture of red and blue beads in a hat, then taking out a pair. The probability is ¼ you will get red-red, ¼ you will get blue-blue, and ½ you will get red-blue. These, however, are ‘long-run’ probabilities. Make such a test just once, with (say) 200 evenly mixed beads, and the odds are strongly against your getting exactly 25 red pairs, 25 blue, and 50 mixed. Statisticians would be deeply suspicious if you reported results that precise.
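Gardner's bead analogy can be checked with a short simulation. The sketch below is an editor's illustration, not Gardner's code (the function name draw_pairs is an assumption): it shakes an even mixture of red and blue beads, draws them out in pairs, and tallies the outcomes. The long-run ratios approach ¼, ¼, ½, but a single run of 100 pairs almost never lands on 25/25/50 exactly, which is just the statistician's ground for suspicion.

```python
import random
from collections import Counter

def draw_pairs(n_pairs, seed=None):
    """Shake 2*n_pairs beads (half red, half blue) in a 'hat' and
    draw them out two at a time, tallying each pair's colours."""
    rng = random.Random(seed)
    beads = ['red'] * n_pairs + ['blue'] * n_pairs  # an even mixture
    rng.shuffle(beads)
    tally = Counter()
    for i in range(0, 2 * n_pairs, 2):
        a, b = beads[i], beads[i + 1]
        tally['mixed' if a != b else a + '-' + a] += 1
    return tally

# One run of 100 pairs: usually close to 25 red-red, 25 blue-blue
# and 50 mixed, but rarely those exact figures.
print(draw_pairs(100, seed=1))
```

Run it a few times with different seeds and the counts scatter around the expected ¼/¼/½ split; a report that hit the theoretical figures dead-on, run after run, is precisely what made Fisher doubt Mendel's tallies.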
Mendel’s figures are suspect for just this reason. They are too good to be true. Did the priest consciously fudge his data? Let us be charitable. Perhaps he was guilty only of ‘wishful seeing’ when he classified and counted his talls and dwarfs.
Geologists find strange things in the ground, but none so strange as the ‘fossils’ unearthed by Johann Beringer, a learned professor of science at the University of Würzburg. German Protestants of the early eighteenth century, like so many American fundamentalists today, could not believe that fossils were the relics of life that flourished millions of years ago. Professor Beringer had an unusual theory. Some fossils, he admitted, might be the remains of life that perished in the great flood of Noah, but most of them were ‘peculiar stones’ carved by God himself as he experimented with the kinds of life he intended to create.
Beringer was ecstatic when his teen-age helpers began to dig up hundreds of stones that supported his hypothesis. They bore images of the bodies of strange insects, birds, and fishes never seen on earth. One bird had a fish’s head – an idea God had apparently discarded. Other stones showed the sun, moon, five-pointed stars, and comets with blazing tails. He began to find stones with Hebrew letters. One had ‘Jehovah’ carved on it.
In 1726 Beringer published a huge treatise on these marvelous discoveries. It was written in Latin and impressively illustrated with engraved plates. Colleagues tried to convince Beringer he was being bamboozled, but he dismissed this as ‘vicious raillery’ by stubborn, establishment enemies.
No one knows what finally changed the professor’s mind. It was said that he found a stone with his own name on it! An inquiry was held. One of his assistants confessed. It turned out that the peculiar stones had been carved by two peculiar colleagues, one the university’s librarian, the other a professor of geography.
Poor trusting, stupid Beringer, his career shattered, spent his life’s savings buying up copies of his idiotic book and burning them. But the work became such a famous monument to geological gullery that twenty-seven years after Beringer’s death a new edition was published in Germany. In 1963 a handsome translation was issued by the University of California Press. Beringer has become immortal only as the victim of a cruel hoax.