How to Fly a Horse

by Kevin Ashton


  Science, while pretending to be dispassionate and rational, has long been an active oppressor of women. Britain’s Royal Society of scientists barred women for almost three hundred years, on grounds including the argument that women were not “legal persons.” The first women were admitted to the society in 1945. Both were from fields similar to Franklin’s: Kathleen Lonsdale was a crystallographer, Marjory Stephenson a microbiologist.

  Marie Curie, history’s most famous female scientist, did no better. The French Academy of Sciences—the equivalent of Britain’s Royal Society—rejected her application for membership. Harvard University refused to award her an honorary degree because, in the words of Charles Eliot, then president emeritus, “Credit does not entirely belong to her.” Eliot assumed that her husband, Pierre, did all her work; so did almost all her male peers. They had no such problems assuming that credit “entirely belonged to” any of the men they wanted to honor.

  These rejections came despite Curie being the first woman to win a Nobel Prize in science and the only person, male or female, to win Nobel Prizes in two different sciences (for physics in 1903 and chemistry in 1911). The prizes were, in part, a result of her fighting for the credit she deserved. When she accepted her second Nobel Prize, Curie used the word “me” seven times at the start of her speech, stressing, “The chemical work aimed at isolating radium in the state of the pure salt, and at characterizing it as a new element, was carried out specially by me.” The second woman to win a Nobel Prize in science was Curie’s daughter Irène. Both women shared their prizes with their husbands, except for Marie’s chemistry prize, which was awarded after Pierre Curie’s death.

  The Curies’ success did not help Lise Meitner. She discovered nuclear fission only to see her collaborator Otto Hahn receive the 1944 Nobel Prize for her work. The third woman to win a Nobel in science—and the first non-Curie—was biochemist Gerty Cori, in 1947, who, like both the Curies, shared the prize with her husband. The first woman to win without her husband was physicist Maria Goeppert-Mayer, in 1963. In total, only 15 women have won Nobel Prizes in science, compared to 540 men, meaning a woman is 36 times less likely to win than a man. The odds have changed little since Marie Curie’s day: a female scientist wins a Nobel about once every seven years. Only two women other than Curie have won a prize by themselves; there has been only one year, 2009, when women won prizes in two of the three science categories at the same time; women have never won science prizes in the same category in two consecutive years; and ten of the sixteen prizes given to women have been in the physiology or medicine category. Only two women not named Curie have won Nobel Prizes in Chemistry. Only one woman not named Curie has won a Nobel Prize in Physics.
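The odds quoted above are plain arithmetic and can be checked directly. A minimal sketch, using only the figures given in the passage:

```python
# Nobel laureates in the sciences, as counted in the passage.
male_laureates = 540
female_laureates = 15

# A woman is 540 / 15 = 36 times less likely to be a laureate.
ratio = male_laureates / female_laureates
print(ratio)  # 36.0
```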

  This is not because women have less aptitude for science. Rosalind Franklin, for example, took better pictures of DNA than anyone had taken before, then used a complex mathematical tool called the “Patterson function” to analyze them. The function, developed by Arthur Lindo Patterson in 1935, is a classic technique in X-ray crystallography. Two key properties of an electromagnetic wave are its amplitude, which determines the intensity we can measure, and its phase, the wave’s position in its cycle. The image created by an X-ray diffraction experiment records amplitude but not phase, yet the phase carries much of the structural information. The Patterson function works around this limitation: from the amplitudes alone, it produces a map of the distances between the atoms in the crystal. In the 1950s, before computers or even calculators, this work took months. Franklin had to use a slide rule, pieces of paper, and hand calculations to evaluate the function for every image, each one of which represented a slice of the three-dimensional crystal molecule she was analyzing.
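The phase limitation and the Patterson workaround can be sketched numerically. What follows is a deliberately tiny one-dimensional toy with invented atom positions, using NumPy in place of Franklin’s slide rule; it is not her actual procedure. The point is only that the Patterson map, the inverse Fourier transform of the measured intensities, reveals interatomic distances even though the phases are lost:

```python
import numpy as np

# Toy "electron density": three atoms on a one-dimensional grid.
# (Hypothetical positions and weights, not crystallographic data.)
density = np.zeros(64)
density[[5, 20, 33]] = [8.0, 6.0, 6.0]

# A diffraction experiment records only the intensities |F|^2 --
# the amplitudes of the scattered waves; the phases are lost.
intensities = np.abs(np.fft.fft(density)) ** 2

# Patterson map: the inverse Fourier transform of the intensities.
# By the correlation theorem this equals the autocorrelation of the
# density, so its peaks sit at the interatomic distances
# (here 20-5=15, 33-20=13, and 33-5=28), recovered with no phases.
patterson = np.fft.ifft(intensities).real

# Largest off-origin peaks (the map is symmetric, so scan 1..31).
peak_shifts = np.argsort(patterson[1:32])[::-1][:3] + 1
print(sorted(peak_shifts.tolist()))  # [13, 15, 28]
```

Franklin had to produce the equivalent of this one `ifft` call by hand, as long sums of cosines, for every reflection in every image, which is why the work took months.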

  While Franklin was concluding this work, her King’s College colleague Maurice Wilkins showed her data and pictures to James Watson and Francis Crick, without her consent or knowledge. Watson and Crick leapt to the conclusion Franklin was diligently proving—that the structure of DNA was a double helix—published it, then shared the Nobel Prize with their secret source, Wilkins. When Rosalind Franklin died, she did not know the three men had stolen her work. Even after she was dead, they did not give her credit. She was not thanked in their Nobel acceptance speech, unlike several men who made lesser contributions. Wilkins referred to Franklin only once in his Nobel lecture and misrepresented her importance by saying that she made “very valuable contributions to the X-ray analysis,” rather than confessing that she did all the X-ray analysis and far more besides. Watson and Crick did not mention her at all in their Nobel lectures.

  3 | THE TRUTH IN CHAINS

  Rosalind Franklin was the most important person in the story of DNA’s discovery. She was the first-ever member of the human race—or any other species on earth—to see the secret of life. She answered Schrödinger’s question “What Is Life?” with a photograph she took on May 1, 1952. She pointed her camera at a single strand of DNA fifteen millimeters, or five-eighths of an inch, from the lens, set the exposure time for one hundred hours, and opened the shutter. It really was her camera. She had designed it and overseen its construction in the King’s College workshop. It tilted precisely so she could take pictures of DNA specimens at different angles. It was able to take photographs at very close range. It protected DNA specimens from humidity with a brass-and-rubber seal that also allowed Franklin to remove the air around a sample and replace it with hydrogen, a better medium for crystallography. There was nothing else like it anywhere in the world.

  Four days later, the picture was ready. It is one of the most important images in history. To any but the most trained eye, it does not look like much: a shadowy circle around something like a ghostly face, its eyes, eyebrows, nostrils, and dimples symmetrically and diagonally aligned, smiling like a Buddha or perhaps God Him- or Herself.

  It was clear to Franklin what the picture showed. DNA had the shape of two helixes, like a spiral staircase with no central support. The shape gave a clear indication of how life reproduced. The spiral staircase could copy itself by unwinding and replicating.

  Franklin knew what she had, but she did not run through the King’s College corridors shouting some equivalent of “Eureka!” She was determined not to leap to conclusions. She wanted to work through the math and have proof before she published, and she was determined to keep an open mind until she had gathered all the data. So she gave the image the serial number 51 and continued her work. She was still completing her Patterson function calculations, and there were many more pictures to take. Then Maurice Wilkins showed picture 51 to James Watson and Francis Crick, and the three men were awarded the Nobel Prize for a woman’s work.

  It was the same when Marietta Blau, an unpaid woman working at the University of Vienna, developed a technique for photographing atomic particles. Blau could not get a paid position anywhere, even though her work was a major advance in particle physics. C. F. Powell, a man who “adopted and improved” her techniques, was awarded the Nobel Prize in 1950. Agnes Pockels was denied a college education because she was a woman, taught herself science from her brother’s textbooks, created a laboratory in her kitchen, and used it to make fundamental discoveries about the chemistry of liquids. Her work was “adopted” by Irving Langmuir, who won a Nobel Prize for it in 1932. There are many similar stories. A lot of men have won Nobel Prizes in science for discoveries made in whole or part by women.

  4 | THE HARRIET EFFECT

  Even in our new post-genomic age, the game of claims is rigged in favor of white men. One reason is an imbalance first recorded fifty years ago by a sociologist named Harriet Zuckerman. Zuckerman was trying to find out if scientists were more successful alone or in teams. She interviewed forty-one Nobel Prize winners and discovered something that forever changed the direction of her research: after winning the prize, many Nobel laureates became wary of joining teams because they found they received too much individual credit for things the group had done. One said, “The world is peculiar in this matter of how it gives credit. It tends to give the credit to already famous people.” Another: “The man who’s best known gets more credit, an inordinate amount of credit.” Almost every Nobel Prize–winning scientist said the same thing.

  Until Zuckerman, most scholars assumed that the strata of science were more or less meritocratic. Zuckerman showed that they are not. More-recognized scientists get more recognition, and less-recognized scientists get less recognition, no matter who does the work.

  Zuckerman’s discovery is known as the Matthew effect, after Matthew 25:29—“For whoever has will be given more, and they will have an abundance. Whoever does not have, even what they have will be taken from them.” This was the name Robert Merton, a far more eminent sociologist, gave Zuckerman’s findings. Zuckerman discovered the effect, then experienced it: the credit for Zuckerman’s work went to Merton. Merton gave Zuckerman full acknowledgment, but it made little difference. As she’d predicted, he had recognition and so was given more. There were no hard feelings. Zuckerman collaborated with Merton, then married him.

  The Matthew effect—or perhaps more correctly the Harriet effect—is part of the broader problem of seeing what we think, instead of seeing what is. It is unusual that the scientists in Zuckerman’s study were honest enough to know they were getting credit they did not deserve. As we are prejudiced about others, so we are prejudiced about ourselves. For centuries, white men have tried to persuade other people that white men are superior. In the process, many white men have become convinced of their own superiority. People often give and take credit based on their prejudices. If there is a person from a “superior” group in the room when something is created, members of the group often assume that the “superior” person did most of the work, even when the opposite is true. Most of the time, the “superior” person makes the same assumption.

  I was once forwarded an e-mail that a senior, white, male scientist had sent to a junior, non-white, female scientist. She was applying for a patent. The male scientist demanded to be listed as an inventor on her patent, on the grounds that her research might have been “connected” to him. He claimed he had no interest in getting credit—he was only “making sure she did things correctly.” Patent law is complicated, but the patent office’s definition of inventorship is not. “Unless a person contributes to the conception of the invention,” it reads, “he is not an inventor.” If the female scientist named the male scientist as an inventor, she risked invalidating her patent. If she did not, she risked her career. The male scientist’s ploy works: he is named as an inventor on nearly fifty patents, an improbable number given that most of his patents list many inventors, while the average number of people who “contribute to the conception” of an invention is two. The man sincerely believed he must have had something to do with the woman’s invention, even though the first time he heard of it was when he saw her patent application.

  5 | SHOULDERS, NOT GIANTS

  Harriet Zuckerman’s husband, Robert Merton, was a magnet for credit, and not just because he was a man—he was also one of the most important thinkers of the twentieth century. Merton founded a field called the “sociology of science,” which, along with his friend Thomas Kuhn’s “philosophy of science,” scrutinizes the social aspects of discovery and creation.

  Merton dedicated his life to understanding how people create, especially in science. Science claims to be objective and rational, and while its results sometimes are, Merton suspected that its practitioners are not. They are people, capable of being as subjective, emotional, and biased as everybody else. This is why “scientists” have been able to justify so many wrong things, from racial and gender inferiority to canals on Mars and the idea that the body is made of “humors.” Scientists, like all creative people, operate in environments—Merton divided them into what he called microenvironments and macroenvironments—which shape what they think and do. The way of seeing that Kuhn called a “paradigm” is part of the macroenvironment; whose contributions are recognized and why is part of the microenvironment.

  One of Merton’s observations was that the very idea of giving sole credit to any individual is fundamentally flawed. Every creator is surrounded by others in both space and time. There are creators working alongside them, creators working across the hall from them, creators working across the continent from them, and creators long dead or retired who worked before them. Every creator inherits concepts, contexts, tools, methods, data, laws, principles, and models from thousands of other people, dead and alive. Some of that inheritance is readily apparent; some of it is not. But every creative field is a vast community of connection. No creator deserves too much credit because every creator is in so much debt.

  In 1676, Isaac Newton described this problem when he wrote, “If I have seen further it is by standing on the shoulders of giants.” This may seem like modesty, but Newton used it in a letter where he was arguing with rival scientist Robert Hooke about credit. The comment became famous, and Newton is frequently cited as if he coined the phrase. But Newton was already standing on the shoulders of another when he wrote that sentence. Newton got it from George Herbert, who in 1651 wrote, “A dwarf on a giant’s shoulders sees farther of the two.” Herbert got it from Robert Burton, who in 1621 wrote, “A dwarf standing on the shoulders of a giant may see farther than a giant himself.” Burton got it from a Spanish theologian, Diego de Estella, also known as Didacus Stella, who probably got it from John of Salisbury, 1159: “We are like dwarfs on the shoulders of giants, so that we can see more than they, and things at a greater distance, not by virtue of any sharpness of sight on our part, or any physical distinction, but because we are carried high and raised up by their giant size.” John of Salisbury got it from Bernard of Chartres, 1130: “We are like dwarfs standing upon the shoulders of giants, and so able to see more and see farther than the ancients.” We do not know from whom Bernard of Chartres got it.

  Robert Merton pieced this chain of custody together in a book, On the Shoulders of Giants, to exemplify the long, many-handed sequence of gradual improvement that is creation’s reality and to show how one person, usually famous, can accumulate credit they do not deserve. Newton’s line was, in fact, close to a cliché at the time he wrote it. He was not pretending to be original; it was such a common aphorism that he did not need to cite a source. His reader, Hooke, would have already been familiar with the idea.

  But there is a problem with the statement, whether we attribute it to Newton or somebody else: the idea of “giants.” If everybody sees further because they are standing on the shoulders of giants, then there are no giants, just a tower of people, each one standing on the shoulders of another. Giants, like geniuses, are a myth.

  How many people are holding us up? A human generation is about twenty-five years long. If it was not until fifty thousand years ago that our transition to Homo sapiens sapiens—creative people—was complete, then everything we make is built upon two thousand generations of human ingenuity. We do not see further because of giants. We see further because of generations.
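The “two thousand generations” figure follows directly from the numbers in the passage; a quick check of the arithmetic:

```python
# Years since the transition to Homo sapiens sapiens, per the passage.
years_of_creative_humans = 50_000
years_per_generation = 25

generations = years_of_creative_humans // years_per_generation
print(generations)  # 2000
```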

  6 | INHERITANCE

  Rosalind Franklin, master crystallographer, stood on a tower of generations when she became the first person to see the secret of life.

  Almost nothing was known about crystals at the start of the twentieth century, but they had been a subject of curiosity at least since the winter of 1610, when Johannes Kepler wondered why snowflakes had six corners. Kepler wrote a book, The Six-Cornered Snowflake, in which he speculated that solving the riddle of the snowflake, or “snow crystal,” would allow us to “recreate the entire universe.”

  Many people tried to understand snowflakes, including Robert Hooke, the recipient of Newton’s “shoulders of giants” letter. They were drawn, described, and categorized for three centuries, but never explained. No one understood what a snowflake was, because no one understood what a crystal was, because no one understood the physics of solid matter.

  The crystals’ mysteries are invisible to the eye. To see them, Rosalind Franklin needed a tool that also has its origins in Kepler’s time: the X-ray.

  While Kepler’s curiosity about snowflakes has a clear connection to crystals, the origin of the X-ray starts with something less obvious: improvements in air-pump technology that enabled scientists to wonder about vacuums. One such scientist was Robert Boyle, who used vacuums to try to understand electricity. Others improved on Boyle’s work until, almost two hundred years later, German glassblower Heinrich Geissler created the “Geissler tube,” a partial vacuum in a bottle that glowed with light whenever an electrical coil connected to it was discharged. Geissler’s invention was a novelty, an “interesting scientific toy,” during his lifetime, but decades later, it became the basis for neon lighting, incandescent lightbulbs, and the “vacuum tube”—the principal component of early radios, televisions, and computers.

  In 1869, English physicist William Crookes built on Geissler’s work to create the “Crookes tube,” which had a better vacuum than the Geissler tube. The Crookes tube led to the discovery of cathode rays, later renamed “electron beams.”

  Then, in 1895, German physicist Wilhelm Röntgen noticed a strange shimmering in the dark while he was working with a Crookes tube. He ate and slept in his lab for six weeks while he investigated, then one day positioned his wife’s hand on a photographic plate and pointed his Crookes tube at it. When he showed her the result, a picture of her bones, the first ever image of a living skeleton, she said, “I have seen my death.” Röntgen named his discovery after the symbol for something unknown: “X-ray.”

  But what were these unknown rays? Were they particles, like electrons, or waves, like light?

 
