Harari, Yuval Noah - Sapiens: A Brief History of Humankind

  Suppose a single modern battleship got transported back to Columbus’ time. In a matter of seconds it could make driftwood out of the Nina, Pinta and Santa Maria and then sink the navies of every great world power of the time without sustaining a scratch. Five modern freighters could have taken onboard all the cargo borne by the whole world’s merchant fleets.5 A modern computer could easily store every word and number in all the codex books and scrolls in every single medieval library with room to spare. Any large bank today holds more money than all the world’s premodern kingdoms put together.6

  In 1500, few cities had more than 100,000 inhabitants. Most buildings were constructed of mud, wood and straw; a three-storey building was a skyscraper. The streets were rutted dirt tracks, dusty in summer and muddy in winter, plied by pedestrians, horses, goats, chickens and a few carts. The most common urban noises were human and animal voices, along with the occasional hammer and saw. At sunset, the cityscape went black, with only an occasional candle or torch flickering in the gloom. If an inhabitant of such a city could see modern Tokyo, New York or Mumbai, what would she think?

  Prior to the sixteenth century, no human had circumnavigated the earth. This changed in 1522, when Magellan’s expedition returned to Spain after a journey of 72,000 kilometres. It took three years and cost the lives of almost all the crew members, Magellan included. In 1873, Jules Verne could imagine that Phileas Fogg, a wealthy British adventurer, might just be able to make it around the world in eighty days. Today anyone with a middle-class income can safely and easily circumnavigate the globe in just forty-eight hours.

  In 1500, humans were confined to the earth’s surface. They could build towers and climb mountains, but the sky was reserved for birds, angels and deities. On 20 July 1969 humans landed on the moon. This was not merely a historical achievement, but an evolutionary and even cosmic feat. During the previous 4 billion years of evolution, no organism managed even to leave the earth’s atmosphere, and certainly none left a foot or tentacle print on the moon.

  For most of history, humans knew nothing about 99.99 per cent of the organisms on the planet - namely, the microorganisms. This was not because they were of no concern to us. Each of us bears billions of one-celled creatures within us, and not just as free-riders. They are our best friends, and deadliest enemies. Some of them digest our food and clean our guts, while others cause illnesses and epidemics. Yet it was only in 1674 that a human eye first saw a microorganism, when Anton van Leeuwenhoek took a peek through his home-made microscope and was startled to see an entire world of tiny creatures milling about in a drop of water. During the subsequent 300 years, humans have made the acquaintance of a huge number of microscopic species. We’ve managed to defeat most of the deadliest contagious diseases they cause, and have harnessed microorganisms in the service of medicine and industry. Today we engineer bacteria to produce medications, manufacture biofuel and kill parasites.

  But the single most remarkable and defining moment of the past 500 years came at 05:29:45 on 16 July 1945. At that precise second, American scientists detonated the first atomic bomb at Alamogordo, New Mexico. From that point onward, humankind had the capability not only to change the course of history, but to end it.

  The historical process that led to Alamogordo and to the moon is known as the Scientific Revolution. During this revolution humankind has obtained enormous new powers by investing resources in scientific research. It is a revolution because, until about AD 1500, humans the world over doubted their ability to obtain new medical, military and economic powers. While government and wealthy patrons allocated funds to education and scholarship, the aim was, in general, to preserve existing capabilities rather than acquire new ones. The typical premodern ruler gave money to priests, philosophers and poets in the hope that they would legitimise his rule and maintain the social order. He did not expect them to discover new medications, invent new weapons or stimulate economic growth.

  During the last five centuries, humans increasingly came to believe that they could increase their capabilities by investing in scientific research. This wasn’t just blind faith - it was repeatedly proven empirically. The more proofs there were, the more resources wealthy people and governments were willing to put into science. We would never have been able to walk on the moon, engineer microorganisms and split the atom without such investments. The US government, for example, has in recent decades allocated billions of dollars to the study of nuclear physics. The knowledge produced by this research has made possible the construction of nuclear power stations, which provide cheap electricity for American industries, which pay taxes to the US government, which uses some of these taxes to finance further research in nuclear physics.

  The Scientific Revolution’s feedback loop. Science needs more than just research to make progress. It depends on the mutual reinforcement of science, politics and economics. Political and economic institutions provide the resources without which scientific research is almost impossible. In return, scientific research provides new powers that are used, among other things, to obtain new resources, some of which are reinvested in research.

  Why did modern humans develop a growing belief in their ability to obtain new powers through research? What forged the bond between science, politics and economics? This chapter looks at the unique nature of modern science in order to provide part of the answer. The next two chapters examine the formation of the alliance between science, the European empires and the economics of capitalism.

  Ignoramus

  Humans have sought to understand the universe at least since the Cognitive Revolution. Our ancestors put a great deal of time and effort into trying to discover the rules that govern the natural world. But modern science differs from all previous traditions of knowledge in three critical ways:

  a. The willingness to admit ignorance. Modern science is based on the Latin injunction ignoramus - ‘we do not know’. It assumes that we don’t know everything. Even more critically, it accepts that the things that we think we know could be proven wrong as we gain more knowledge. No concept, idea or theory is sacred and beyond challenge.

  b. The centrality of observation and mathematics. Having admitted ignorance, modern science aims to obtain new knowledge. It does so by gathering observations and then using mathematical tools to connect these observations into comprehensive theories.

  c. The acquisition of new powers. Modern science is not content with creating theories. It uses these theories in order to acquire new powers, and in particular to develop new technologies.

  The Scientific Revolution has not been a revolution of knowledge. It has been above all a revolution of ignorance. The great discovery that launched the Scientific Revolution was the discovery that humans do not know the answers to their most important questions.

  Premodern traditions of knowledge such as Islam, Christianity, Buddhism and Confucianism asserted that everything that is important to know about the world was already known. The great gods, or the one almighty God, or the wise people of the past possessed all-encompassing wisdom, which they revealed to us in scriptures and oral traditions. Ordinary mortals gained knowledge by delving into these ancient texts and traditions and understanding them properly. It was inconceivable that the Bible, the Qur’an or the Vedas were missing out on a crucial secret of the universe - a secret that might yet be discovered by flesh-and-blood creatures.

  Ancient traditions of knowledge admitted only two kinds of ignorance. First, an individual might be ignorant of something important. To obtain the necessary knowledge, all he needed to do was ask somebody wiser. There was no need to discover something that nobody yet knew. For example, if a peasant in some thirteenth-century Yorkshire village wanted to know how the human race originated, he assumed that Christian tradition held the definitive answer. All he had to do was ask the local priest.

  Second, an entire tradition might be ignorant of unimportant things. By definition, whatever the great gods or the wise people of the past did not bother to tell us was unimportant. For example, if our Yorkshire peasant wanted to know how spiders weave their webs, it was pointless to ask the priest, because there was no answer to this question in any of the Christian Scriptures. That did not mean, however, that Christianity was deficient. Rather, it meant that understanding how spiders weave their webs was unimportant. After all, God knew perfectly well how spiders do it. If this were a vital piece of information, necessary for human prosperity and salvation, God would have included a comprehensive explanation in the Bible.

  Christianity did not forbid people to study spiders. But spider scholars - if there were any in medieval Europe - had to accept their peripheral role in society and the irrelevance of their findings to the eternal truths of Christianity. No matter what a scholar might discover about spiders or butterflies or Galapagos finches, that knowledge was little more than trivia, with no bearing on the fundamental truths of society, politics and economics.

  In fact, things were never quite that simple. In every age, even the most pious and conservative, there were people who argued that there were important things of which their entire tradition was ignorant. Yet such people were usually marginalised or persecuted - or else they founded a new tradition and began arguing that they knew everything there is to know. For example, the prophet Muhammad began his religious career by condemning his fellow Arabs for living in ignorance of the divine truth. Yet Muhammad himself very quickly began to argue that he knew the full truth, and his followers began calling him ‘The Seal of the Prophets’. Henceforth, there was no need of revelations beyond those given to Muhammad.

  Modern-day science is a unique tradition of knowledge, inasmuch as it openly admits collective ignorance regarding the most important questions. Darwin never argued that he was ‘The Seal of the Biologists’, and that he had solved the riddle of life once and for all. After centuries of extensive scientific research, biologists admit that they still don’t have any good explanation for how brains produce consciousness. Physicists admit that they don’t know what caused the Big Bang, or how to reconcile quantum mechanics with the theory of general relativity.

  In other cases, competing scientific theories are vociferously debated on the basis of constantly emerging new evidence. A prime example is the debates about how best to run the economy. Though individual economists may claim that their method is the best, orthodoxy changes with every financial crisis and stock-exchange bubble, and it is generally accepted that the final word on economics is yet to be said.

  In still other cases, particular theories are supported so consistently by the available evidence that all alternatives have long since fallen by the wayside. Such theories are accepted as true - yet everyone agrees that were new evidence to emerge that contradicts the theory, it would have to be revised or discarded. Good examples of these are the plate tectonics theory and the theory of evolution.

  The willingness to admit ignorance has made modern science more dynamic, supple and inquisitive than any previous tradition of knowledge. This has hugely expanded our capacity to understand how the world works and our ability to invent new technologies. But it presents us with a serious problem that most of our ancestors did not have to cope with. Our current assumption that we do not know everything, and that even the knowledge we possess is tentative, extends to the shared myths that enable millions of strangers to cooperate effectively. If the evidence shows that many of those myths are doubtful, how can we hold society together? How can our communities, countries and international system function?

  All modern attempts to stabilise the sociopolitical order have had no choice but to rely on one of two unscientific methods:

  a. Take a scientific theory, and in opposition to common scientific practices, declare that it is a final and absolute truth. This was the method used by Nazis (who claimed that their racial policies were the corollaries of biological facts) and Communists (who claimed that Marx and Lenin had divined absolute economic truths that could never be refuted).

  b. Leave science out of it and live in accordance with a non-scientific absolute truth. This has been the strategy of liberal humanism, which is built on a dogmatic belief in the unique worth and rights of human beings - a doctrine which has embarrassingly little in common with the scientific study of Homo sapiens.

  But that shouldn’t surprise us. Even science itself has to rely on religious and ideological beliefs to justify and finance its research.

  Modern culture has nevertheless been willing to embrace ignorance to a much greater degree than has any previous culture. One of the things that has made it possible for modern social orders to hold together is the spread of an almost religious belief in technology and in the methods of scientific research, which have replaced to some extent the belief in absolute truths.

  The Scientific Dogma

  Modern science has no dogma. Yet it has a common core of research methods, which are all based on collecting empirical observations - those we can observe with at least one of our senses - and putting them together with the help of mathematical tools.

  People throughout history collected empirical observations, but the importance of these observations was usually limited. Why waste precious resources obtaining new observations when we already have all the answers we need? But as modern people came to admit that they did not know the answers to some very important questions, they found it necessary to look for completely new knowledge. Consequently, the dominant modern research method takes for granted the insufficiency of old knowledge. Instead of studying old traditions, emphasis is now placed on new observations and experiments. When present observation collides with past tradition, we give precedence to the observation. Of course, physicists analysing the spectra of distant galaxies, archaeologists analysing the finds from a Bronze Age city, and political scientists studying the emergence of capitalism do not disregard tradition. They start by studying what the wise people of the past have said and written. But from their first year in college, aspiring physicists, archaeologists and political scientists are taught that it is their mission to go beyond what Einstein, Heinrich Schliemann and Max Weber ever knew.

  Mere observations, however, are not knowledge. In order to understand the universe, we need to connect observations into comprehensive theories. Earlier traditions usually formulated their theories in terms of stories. Modern science uses mathematics.

  There are very few equations, graphs and calculations in the Bible, the Qur’an, the Vedas or the Confucian classics. When traditional mythologies and scriptures laid down general laws, these were presented in narrative rather than mathematical form. Thus a fundamental principle of Manichaean religion asserted that the world is a battleground between good and evil. An evil force created matter, while a good force created spirit. Humans are caught between these two forces, and should choose good over evil. Yet the prophet Mani made no attempt to offer a mathematical formula that could be used to predict human choices by quantifying the respective strength of these two forces. He never calculated that ‘the force acting on a man is equal to the acceleration of his spirit divided by the mass of his body’.

  This is exactly what scientists seek to accomplish. In 1687, Isaac Newton published The Mathematical Principles of Natural Philosophy, arguably the most important book in modern history. Newton presented a general theory of movement and change. The greatness of Newton’s theory was its ability to explain and predict the movements of all bodies in the universe, from falling apples to shooting stars, using three very simple mathematical laws:
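  The equations themselves appear to be missing from this digitised copy, where the original presents them as an image. In standard modern notation, Newton's three laws of motion read (this is a reconstruction in today's conventions, not the book's own typesetting):

```latex
% Newton's three laws of motion, in modern vector notation.
% 1. Inertia: with no net force, velocity does not change.
\sum \vec{F} = 0 \;\Longrightarrow\; \frac{d\vec{v}}{dt} = 0
% 2. Force equals mass times acceleration.
\vec{F} = m\vec{a}
% 3. Action and reaction: forces come in equal and opposite pairs.
\vec{F}_{AB} = -\vec{F}_{BA}
```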

  Henceforth, anyone who wished to understand and predict the movement of a cannonball or a planet simply had to make measurements of the object’s mass, direction and acceleration, and the forces acting on it. By inserting these numbers into Newton’s equations, the future position of the object could be predicted. It worked like magic. Only around the end of the nineteenth century did scientists come across a few observations that did not fit well with Newton’s laws, and these led to the next revolutions in physics - the theory of relativity and quantum mechanics.
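  The kind of cannonball prediction described above can be sketched in a few lines of modern code (an illustration of the method, not anything from the book; the speed and angle below are invented numbers): given the ball's initial velocity and the force of gravity, Newton's second law fixes its position at every future moment.

```python
import math

# Predicting a cannonball's flight from Newton's second law (F = m*a).
# The muzzle speed and launch angle are illustrative, not historical.
g = 9.81          # gravitational acceleration, m/s^2
speed = 100.0     # muzzle speed, m/s
angle = math.radians(45)

vx = speed * math.cos(angle)  # horizontal velocity (no horizontal force)
vy = speed * math.sin(angle)  # initial vertical velocity

def position(t):
    """Position (x, y) in metres after t seconds, ignoring air resistance."""
    return vx * t, vy * t - 0.5 * g * t**2

flight_time = 2 * vy / g                 # moment the ball returns to y = 0
range_x, _ = position(flight_time)
print(round(flight_time, 1), round(range_x))  # → 14.4 1019
```

Plugging different masses, angles and forces into the same two equations is all the "magic" amounted to.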

  Newton showed that the book of nature is written in the language of mathematics. Some chapters, for example, boil down to a clear-cut equation; but scholars who attempted to reduce biology, economics and psychology to neat Newtonian equations have discovered that these fields have a level of complexity that makes such an aspiration futile. This did not mean, however, that they gave up on mathematics. A new branch of mathematics was developed over the last 200 years to deal with the more complex aspects of reality: statistics.

  In 1744, two Presbyterian clergymen in Scotland, Alexander Webster and Robert Wallace, decided to set up a life-insurance fund that would provide pensions for the widows and orphans of dead clergymen. They proposed that each of their church’s ministers would pay a small portion of his income into the fund, which would invest the money. If a minister died, his widow would receive dividends on the fund’s profits. This would allow her to live comfortably for the rest of her life. But to determine how much the ministers had to pay in so that the fund would have enough money to live up to its obligations, Webster and Wallace had to be able to predict how many ministers would die each year, how many widows and orphans they would leave behind, and by how many years the widows would outlive their husbands.

  Take note of what the two churchmen did not do. They did not pray to God to reveal the answer. Nor did they search for an answer in the Holy Scriptures or among the works of ancient theologians. Nor did they enter into an abstract philosophical disputation. Being Scots, they were practical types. So they contacted a professor of mathematics from the University of Edinburgh, Colin Maclaurin. The three of them collected data on the ages at which people died and used these to calculate how many ministers were likely to pass away in any given year.
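  The core of their calculation was a simple expectation: multiply the number of ministers alive at each age by the annual probability of dying at that age, and sum. A minimal sketch of the idea (the ages, headcounts and mortality rates below are invented for illustration, not Webster and Wallace's actual data):

```python
# Expected deaths per year in a group, computed as
# sum over ages of (headcount at that age) * (annual death probability).
# All figures are illustrative assumptions, not historical data.
ministers_by_age = {30: 200, 40: 250, 50: 180, 60: 120, 70: 50}
annual_death_prob = {30: 0.008, 40: 0.012, 50: 0.020, 60: 0.035, 70: 0.070}

expected_deaths = sum(
    count * annual_death_prob[age]
    for age, count in ministers_by_age.items()
)
print(round(expected_deaths, 1))  # → 15.9
```

Given the expected number of deaths, the same arithmetic run forward over widows' remaining lifespans yields the fund's obligations, and hence the premium each minister must pay.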

 
