by David Boyle
Magic is about breaking out of categories, words and definitions, and I should declare an interest – I want a bit more of it. Measuring things takes away the childish sense of wonder in which anything really is possible. A serious-looking man with a white coat and clipboard – one of those disinterested people who count a lot but feel little – will have to put me right, and tell me off for filling people’s minds with airy-fairy nonsense.
But don’t blame me. I was plunged into this frame of mind as a teenager when I came across a poem by D. J. Enright called ‘Blue Umbrellas’, which in a few short lines summed up the poverty of definitions:
The thing that makes a blue umbrella with its tail –
How do you call it? You ask. Poorly and pale
Comes my answer. For all I can call it is peacock.
Now that you go to school, you will learn how we call all sorts of things;
How we mar great works by our mean recital.
You will learn, for instance, that Head Monster is not the gentleman’s accepted title;
The blue-tailed eccentrics will be merely peacocks; the dead bird will no longer doze
Off till tomorrow’s lark, for the letter has killed him.
The dictionary is opening, the gay umbrellas close.
Bizarre measurement No. 1
Guz
(Middle Eastern measurement of variable length. One Guz = 27 inches in Bombay, 37 inches in Bengal, 25 inches in Arabia and 41 inches in Iran.)
* * *
Americans who claim to have been abducted by aliens: 3.7 million
Speed of London traffic in 1900: 12 mph
Speed of London traffic in 1996: 12 mph
Average time US patients are allowed to speak before being interrupted by their doctors: 18 seconds
Chapter 1
A Short History of Counting
Know then thyself, presume not God to scan,
The proper study of mankind is man.
Alexander Pope
I have often admired the mystical way of Pythagoras, and the
secret magic of numbers
Sir Thomas Browne, Religio Medici
I
It was 12 September 1904. The Kaiser was on the throne, the Dreadnought was no more than a few rivets on the ground and Freud was in his Vienna consulting rooms, thinking the unthinkable. In Berlin, the unthinkable seemed to be becoming real.
As many as 13 of the city’s greatest scientific minds were convinced. The leading psychologists, veterinary surgeons, physiologists – even the director of the Berlin Zoo – had come away from the demonstration shaking their heads, worrying slightly for their professional reputations. Yet they had just signed the paper: the horse they had spent the day watching was not responding to signals from its owner when it demonstrated its considerable mathematical powers. Clever Hans, in other words, was officially not a circus act. He really was clever.
Clever Hans sounds like the title of a Grimm fairy tale or one of Freud’s more spectacular patients. Actually he was a horse belonging to a retired maths teacher called Wilhelm von Osten, who believed passionately in his ability to do complicated multiplication and division – even fractions – tapping out the answer with his hoof and manipulating sets of numbers up to six decimal places. What’s more, by converting his answers into numbers, Hans could also read, spell and identify musical tones. Zeros he communicated with a shake of the head.
Wearing a hard black hat over his streaming white hair and beard, von Osten exhibited Hans in a northern suburb of the city every day at noon. He refused to take money for the show, rewarding Hans with a pile of bread and carrots for answering the questions of the daily audience who gathered around.
A leading biologist had become fascinated with the Hans phenomenon, and had invited the 13 eminent scientists – the so-called Hans Commission – to defend him and von Osten from ridicule in the press. The commission recommended further study by a rising young psychologist, Oskar Pfungst. In the six weeks that followed, Pfungst had been severely bitten by Hans, von Osten had withdrawn his horse in a rage, and (with a sigh of relief) modern science had cracked the mystery of the counting horse.
First of all, Pfungst noticed that Hans got excited if he could not see the questioner, and made strenuous efforts to see round his blindfold so that he could. He also found that the horse lost the arithmetical plot if he was asked questions that the questioners didn’t know the answer to themselves. Clearly he must be responding to some kind of unconscious signal from the person asking the questions. When the implications of the blindfold experiment sank in, von Osten exploded with fury at Hans, but the following day he had regained his ardent belief and took the horse away.
It was too late. Pfungst’s report became a legend in experimental psychology. He argued, completely convincingly, that Hans was able to pick up the slight incline of the questioners’ heads when they had finished asking the question and expected the answer to be tapped out. When Hans had reached the right number of taps, he was able to notice the tiny relaxation, the minute straightening up or raised eyebrow with which the questioners betrayed themselves, and he stopped tapping. Hans also tapped faster when he knew it was a long answer (a practice that added to his intellectual reputation) and this too, said Pfungst, he was able to deduce from tiny changes of facial expression.
Pfungst’s own reputation was made and modern science had been vindicated: animals could not count. Von Osten died a few months later. History does not relate what happened to Hans, but I’m not hopeful.
It was, of course, the dawn of the century of numbers. A hundred years later, we prove our humanity every time we open our newspapers with the mass of statistics on offer. Numbers are our servants, the tools of human domination. For centuries, counting was accepted as one of the key differences between human beings and animals. ‘Brutes cannot number, weigh and measure,’ said the great pioneer of quantification, the fifteenth-century cardinal Nicholas of Cusa. The arrival of a mathematical horse was a serious challenge to the numerical world view.
But 1904 was not just the year of Rolls-Royce and the entente cordiale, it was a moment of fantasy and wish-fulfilment. Peter Pan was on stage for the first time, British troops were taking the mysterious Tibetan city of Lhasa, and there was an absolute rash of ‘clever’ animals on offer, each one challenging the accepted view of numeracy as exclusively human. There was the English bulldog Kepler, owned by Sir William Huggins, which barked out its numerical answers. There was Clever Rosa, the so-called Mare of Berlin, and doyenne of the local music-hall stage. There was the clever dog of Utrecht, the reading pig of London, all forerunners of Babe in their own way. Pfungst despatched many of their reputations, but he was too old later to investigate Lady, the talking and fortune-telling horse of Virginia.
Lady managed to count and tell fortunes by flipping up letters on a special chart. Pfungst’s biographer told the story of a colleague of his who had visited Lady to ask where his missing dog had gone. The horse spelled out the word DEAD. Actually, the dog turned up alive and well a few days later, and, following Pfungst – who had studied Hans in such detail – he gave his opinion that Lady had probably been able to sense the man’s conviction that the dog was dead.
So we can all breathe a sigh of relief – animals can’t count; numbers are safely human. But a century later, I still want to shake them all and say: ‘Hang on a minute!’ Here was a horse that was apparently able to read minds and spell correctly, never mind counting.
The accepted order of things is not absolutely safe, but we will never be able to set the clock back long enough to find out. Lady and Hans have long since gone to the knackers, and modern science is blind to strange phenomena like that. But the issue of counting and who is entitled to do so is still with us. Numbers have been in constant use for the past 6,000 years, but we have never quite resolved what they are. Are they intellectual tools for humans, invented by us for our own use? Or are they fantastical concepts, pre-existing in the universe before Adam, which we had to discover along with America and the laws of thermodynamics? Which came first: man or numbers? Are they available for any species to use or just an aspect of mankind? Are they real or human?
The consensus moves backwards and forwards through the centuries, and always with political implications. If numbers are a mysterious aspect of the universe put there by God, we tend to become subject to control and manipulation by accountant-priests. If they are a method by which humanity can control chaos, they become part of the tools of a technocratic scientific elite. The modern world is firmly in the second camp. We have rejected rule by priests in favour of rule by science. Measuring is something humans have invented for themselves, and animals – by definition – can’t hack it. They might be able to spell or pick up astonishingly subtle body language, but it is important for our world view that they can’t count.
The other view – that numbers have meaning in their own right – was represented by the Greek philosopher Pythagoras, in the sixth century BC, who was the great believer in the natural God-given beauty of numbers. For Pythagoras, numbers corresponded to a natural harmony in the universe, as bound up with the music of the spheres as they are with calculations. Music and beauty were underpinned by numbers. The story goes that Pythagoras listened to a blacksmith hammering away and heard the musical notes made by the anvil. He realized that they were generated by different lengths of hammer, and that there were perfect ratios of halves, thirds and quarters which generated perfect chords. They were the secret harmonies generated by the real numbers in nature. Another legend says that he learned about such things from the wisest people among the Egyptians and Phoenicians, and spent 12 years studying with the Magi after being taken captive and imprisoned in Babylon.
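A brief aside on the arithmetic of ‘halves, thirds and quarters’: the consonances this legend gestures at are conventionally stated as simple whole-number ratios of a vibrating string’s length (the string version is the standard retelling, an assumption here rather than anything in the anecdote above):

\[
\tfrac{1}{2}\ \text{of a string} \;\rightarrow\; \text{octave } (2{:}1), \qquad
\tfrac{2}{3} \;\rightarrow\; \text{perfect fifth } (3{:}2), \qquad
\tfrac{3}{4} \;\rightarrow\; \text{perfect fourth } (4{:}3).
\]

Those three proportions are the whole of the ‘secret harmony’: halve a length and the pitch rises by an octave, while the other two fractions give the fifth and the fourth from which the Greek scales were built.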
Numbers existed even before the universe itself, according to Pythagoras. But even that was too mild for St Augustine of Hippo, who declared that six was such a perfect number that it would be so even if the world didn’t exist at all. ‘We cannot escape the feeling,’ said the physicist Heinrich Hertz, ‘that these mathematical formulae have an independent existence and an intelligence of their own, that they are wiser than we are, wiser even than their discoverers, that we get more out of them than was originally put into them.’
Numbers rule the universe, said Pythagoras and his followers. Anything as unruly as irrational numbers was ‘unutterable’, and initiates were sworn to secrecy about them. According to his follower Proclus, the first people who mentioned such possibilities all died in a shipwreck. ‘The unutterable and the formless must needs be concealed,’ he said. ‘And those who uncovered and touched this image of life were instantly destroyed and shall remain forever exposed to the play of the eternal waves.’
It was irrational numbers that eventually did for Pythagoras. When his descendants opened up a whole new world of paradoxes, irrationality, bizarre computations, negative numbers and square roots, nothing ever seemed the same again. And although technocrats might breathe a sigh of relief at this evidence of modern rationality breaking through, we may also have lost something of that sense of pre-existing perfection.
II
The tyranny of numbers over life began with the simple counting of things with marks on wood. You find notched reindeer antlers from 15,000 BC, well before Britain separated itself from continental Europe. These methods lasted into modern times, and were known in the English medieval treasury as ‘tally sticks’. Tally sticks were finally abandoned by the British civil service as a method of keeping track of public spending as late as 1783. After that, the old ones hung around for a generation or so, piled into the Court of Star Chamber until the room was needed. Someone then had the bright idea of burning them in the furnace that was used to heat the House of Lords. The furnace set light to the panelling, starting the conflagration of 1834 which burned down the Palace of Westminster and gave us the world-famous monstrosity we know today, complete with Big Ben and mock Gothic.
A few more of these dangerous items were found during repairs to Westminster Abbey in 1909, and they were put safely into a museum, where they could do less damage.
Notches probably came before language. Prehistoric people probably had words for ‘one’, ‘two’ and ‘three’, and made do with ‘many’ for anything more complicated. In fact, sometimes ‘three’ might mean ‘many’. Take the French, for example: ‘trois’ (three) and ‘très’ (very). Or the Latin: ‘tres’ (three) and ‘trans’ (beyond). A tribe of cave dwellers discovered in the Philippines in 1972 couldn’t answer the question ‘How many people are there in your tribe?’ Yet they could write down a list of all 24.

But then counting is a philosophical problem, because you have to categorize. You have to be able to see the similarity in things and their differences, and decide which are important, before you can count them. You have to be able to do Venn diagrams in your head. ‘It must have required many ages to discover that a brace of pheasants and a couple of days were both instances of the number two,’ said the philosopher Bertrand Russell. And once you have grasped that concept, there are still many other categories you have to create before you can count how many people there are in your tribe. Do you count children? Do you count foreigners who happen to live with you? Do you count people who look completely different from everybody else?

Counting means definition and control. To count something, you have to name it and define it. It is no coincidence that it was the ancient Sumerian civilization, the first real empire, which developed the idea of writing down numbers for the first time. They had to if they were going to manage an imperial culture of herds, crops and people. Yet any definition you make simply has to be a compromise with the truth. And the easier it is to count, the more the words give way to figures, the more counting simplifies things which are not simple. Because although you can count sheep until you are blue in the face, no two sheep are actually the same.
The old world did not need precision. If Christ’s resurrection was important, it wasn’t terribly vital to know what the actual date was. Instead Europeans used numbers for effect – King Arthur was described as killing tens of thousands in battles all by himself. Modern politicians are the last remaining profession to do this, claiming unwieldy figures which they have achieved personally and pretending a spurious accuracy by borrowing the language of statistics, when actually they are using the numbers for impact like a medieval chronicler. Nor were the old numbers much good for calculation. Nowadays Roman numerals only exist for things which powerful people want to look permanent – like television programmes or the US Super Bowl – but which are actually very impermanent indeed.
The new world needed accuracy and simplicity for its commerce. Although they were briefly banned by an edict at Florence in 1299, the new Arabic numbers – brought back from the Middle East by the crusaders – began to be spread by the new mercantile classes. These were the literate and numerate people – with their quill pens tracing the exchange of vast sums – plotting the despatch of fleets for kings, managing the processing of wool with the new counting boards.
And soon everybody was counting with the same precision. King John’s Archbishop of Canterbury, Stephen Langton, had already organized a system of chapters for the Bible, all numbered and meticulously indexed, which by the following century were being cited with the new Arabic numerals. Soon the new numbers were being used to measure much more elusive things. By 1245, Gossoin of Metz had worked out that if Adam had set off the moment he was created, walking at the rate of 25 miles a day, he would still have to walk for another 713 years if he was going to reach the stars. The great alchemist Roger Bacon, who tried to measure the exact arc of a rainbow from his laboratory above Oxford’s Folly Bridge, calculated shortly afterwards that someone walking 20 miles a day would take 14 years, seven months and just over 29 days to get to the moon.
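The arithmetic behind Bacon’s figure is easy to check (taking a month as roughly 30 days, which is my assumption for the purposes of the sum, not his):

\[
14 \times 365 + 7 \times 30 + 29 \approx 5{,}349 \ \text{days}, \qquad
5{,}349 \times 20 \approx 107{,}000 \ \text{miles}.
\]

Which tells us roughly what distance to the moon Bacon must have been assuming; the modern mean figure is about 239,000 miles.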
It’s a wonderful thought, somehow akin to Peter Pan’s famous directions for flying to Never Never Land: ‘second to the right, and straight on till morning’. But it was a different time then, when space was measured in the area that could be ploughed in a day and when time was dominated by the unavoidable changes between day and night. There were 12 hours in the medieval day, and 12 hours in the night too, but without proper tools for measuring time, these were expanded and compressed to make sure the 12 hours fitted into the light and the dark. An hour in the summer was much longer than an hour in the winter, and actually referred to the ‘hours’ when prayers should be said.
Nobody knows who invented clocks, though legend has it that it was the mysterious Gerbert of Aurillac, another medieval monk who spent some time in Spain learning from the wisdom of the Arabs, and who, as Sylvester II, was the Pope who saw in the last millennium. He was said to be so good at maths that contemporaries believed he was in league with the Devil. It was not for another 250 years that clocks arrived in the mass market, but once they had, you could not argue with their accuracy. From the 1270s, they dominated European townscapes, insisting that hours were all the same length and that trading times and working times should be strictly regulated. Counting in public is, after all, a controlling force, as the people of Amiens discovered in 1335 when the mayor regulated their working and eating time with a bell attached to a clock.
Clocks had bells before they had faces, and were machines of neat precision, as you can see by the fourteenth-century one still working in the nave of Salisbury Cathedral, with its careful black cogs swinging backwards and forwards, the very model of the new medieval exactitude. Soon every big city was imposing heavy taxes on itself to afford the clock machinery, adding mechanical hymns, Magi bouncing in and out and – like the one in Strasbourg in 1352 – a mechanical cockerel which crowed and waggled its wings.
Where would they stop, these medieval calculators? Scholars at Merton College, Oxford in the fourteenth century thought about how you can measure not just size, taste, motion, heat, colour, but also qualities like virtue and grace. But then these were the days when even temperature had to be quantified without the use of a thermometer, which had yet to be invented. They must have been heady days, when the whole of quality – the whole of arts and perception – seemed to be collapsing neatly into science.