The Best American Science and Nature Writing 2016


Edited by Amy Stewart


  For Pitts, this marked the beginning of the end. Wiener, who had taken on a fatherly role in his life, now abandoned him inexplicably. For Pitts, it wasn’t merely a loss. It was something far worse than that: it defied logic.

  And then there were the frogs. In the basement of Building 20 at MIT, along with a garbage can full of crickets, Lettvin kept a group of them. At the time, biologists believed that the eye was like a photographic plate that passively recorded dots of light and sent them, dot for dot, to the brain, which did the heavy lifting of interpretation. Lettvin decided to put the idea to the test, opening up the frogs’ skulls and attaching electrodes to single fibers in their optic nerves.

  Together with Pitts, McCulloch, and the Chilean biologist and philosopher Humberto Maturana, he subjected the frogs to various visual experiences—brightening and dimming the lights, showing them color photographs of their natural habitat, magnetically dangling artificial flies—and recorded what the eye measured before it sent the information off to the brain. To everyone’s surprise, it didn’t merely record what it saw, but filtered and analyzed information about visual features like contrast, curvature, and movement. “The eye speaks to the brain in a language already highly organized and interpreted,” they reported in the now-seminal paper “What the Frog’s Eye Tells the Frog’s Brain,” published in 1959.
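
  What such filtering means is easy to see in a toy model. The sketch below is not the 1959 experiment, only an illustration of the principle in Python (every name in it is invented for the example): a layer of units that reports local contrast, the difference between each point of light and its neighbors, rather than raw dot-for-dot brightness.

```python
# A toy "retina": given a row of light intensities, each interior unit
# reports local contrast (its intensity minus the average of its two
# neighbors) instead of passing the raw brightness along unchanged.

def local_contrast(intensities):
    """Return intensity minus neighbor average for each interior point."""
    out = []
    for i in range(1, len(intensities) - 1):
        surround = (intensities[i - 1] + intensities[i + 1]) / 2
        out.append(intensities[i] - surround)
    return out

flat = [5, 5, 5, 5, 5, 5]   # a uniform field: nothing worth reporting
edge = [5, 5, 5, 9, 9, 9]   # a step in brightness: an edge

print(local_contrast(flat))  # [0.0, 0.0, 0.0, 0.0]
print(local_contrast(edge))  # [0.0, -2.0, 2.0, 0.0] -- the edge stands out
```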

  The results shook Pitts’s worldview to its core. Instead of the brain computing information digital neuron by digital neuron using the exacting implement of mathematical logic, messy, analog processes in the eye were doing at least part of the interpretive work. “It was apparent to him after we had done the frog’s eye that even if logic played a part, it didn’t play the important or central part that one would have expected,” Lettvin said. “It disappointed him. He would never admit it, but it seemed to add to his despair at the loss of Wiener’s friendship.”

  The spate of bad news aggravated a depressive streak that Pitts had been struggling with for years. “I have a kind of personal woe I should like your advice on,” Pitts had written to McCulloch in one of his letters. “I have noticed in the last two or three years a growing tendency to a kind of melancholy apathy or depression. [Its] effect is to make the positive value seem to disappear from the world, so that nothing seems worth the effort of doing it, and whatever I do or what happens to me ceases to matter very greatly . . .”

  In other words, Pitts was struggling with the very logic he had sought in life. Pitts wrote that his depression might be “common to all people with an excessively logical education who work in applied mathematics: It is a kind of pessimism resulting from an inability to believe in what people call the Principle of Induction, or the principle of the Uniformity of Nature. Since one cannot prove, or even render probable a priori, that the sun should rise tomorrow, we cannot really believe it shall.”

  Now, alienated from Wiener, Pitts’s despair turned lethal. He began drinking heavily and pulled away from his friends. When he was offered his PhD, he refused to sign the paperwork. He set fire to his dissertation along with all of his notes and his papers. Years of work—important work that everyone in the community was eagerly awaiting—he burned it all, priceless information reduced to entropy and ash. Wiesner offered Lettvin increased support for the lab if he could recover any bits of the dissertation. But it was all gone.

  Pitts remained employed by MIT, but this was little more than a technicality; he hardly spoke to anyone and would frequently disappear. “We’d go hunting for him night after night,” Lettvin said. “Watching him destroy himself was a dreadful experience.” In a way, Pitts was still 12 years old. He was still beaten, still a runaway, still hiding from the world in musty libraries. Only now his books took the shape of a bottle.

  With McCulloch, Pitts had laid the foundations for cybernetics and artificial intelligence. They had steered psychiatry away from Freudian analysis and toward a mechanistic understanding of thought. They had shown that the brain computes and that mentation is the processing of information. In doing so, they had also shown how a machine could compute, providing the key inspiration for the architecture of modern computers. Thanks to their work, there was a moment in history when neuroscience, psychiatry, computer science, mathematical logic, and artificial intelligence were all one thing, following an idea first glimpsed by Leibniz—that man, machine, number, and mind all use information as a universal currency. What appeared on the surface to be very different ingredients of the world—hunks of metal, lumps of gray matter, scratches of ink on a page—were profoundly interchangeable.
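
  The model behind that claim, from their 1943 paper, is simple enough to sketch in a few lines. Below is a minimal illustration in Python (the function names are mine, not theirs) of a McCulloch-Pitts-style threshold neuron: it fires, returning 1, when the weighted sum of its binary inputs reaches a threshold, and a few such units wired together yield the logic gates from which digital computers are built.

```python
# A McCulloch-Pitts-style threshold unit: binary inputs, fixed weights,
# and a threshold. The unit "fires" (returns 1) when the weighted sum
# of its inputs meets or exceeds the threshold.

def neuron(inputs, weights, threshold):
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Single units configured as the elementary logic gates:
def AND(a, b): return neuron([a, b], [1, 1], threshold=2)
def OR(a, b):  return neuron([a, b], [1, 1], threshold=1)
def NOT(a):    return neuron([a], [-1], threshold=0)

for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b}  AND={AND(a, b)}  OR={OR(a, b)}  NOT a={NOT(a)}")
```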

  There was a catch, though: this symbolic abstraction made the world transparent but the brain opaque. Once everything had been reduced to information governed by logic, the actual mechanics ceased to matter—the tradeoff for universal computation was ontology. Von Neumann was the first to see the problem. He expressed his concern to Wiener in a letter that anticipated the coming split between artificial intelligence on one side and neuroscience on the other. “After the great positive contribution of Turing-cum-Pitts-and-McCulloch is assimilated,” he wrote, “the situation is rather worse than better than before. Indeed these authors have demonstrated in absolute and hopeless generality that anything and everything . . . can be done by an appropriate mechanism, and specifically by a neural mechanism—and that even one, definite mechanism can be ‘universal.’ Inverting the argument: Nothing that we may know or learn about the functioning of the organism can give, without ‘microscopic,’ cytological work any clues regarding the further details of the neural mechanism.”

  This universality made it impossible for Pitts to provide a model of the brain that was practical, and so his work was dismissed and more or less forgotten by the community of scientists working on the brain. What’s more, the experiment with the frogs had shown that a purely logical, purely brain-centered vision of thought had its limits. Nature had chosen the messiness of life over the austerity of logic, a choice Pitts likely could not comprehend. He had no way of knowing that while his ideas about the biological brain were not panning out, they were setting in motion the age of digital computing, the neural network approach to machine learning, and the so-called connectionist philosophy of mind. In his own mind, he had been defeated.

  On April 21, 1969, his hand shaking with an alcoholic’s delirium tremens, Pitts sent a letter from his room at Beth Israel Hospital in Boston to McCulloch’s room down the road at the Cardiac Intensive Care Ward at Peter Bent Brigham Hospital. “I understand you had a light coronary; . . . that you are attached to many sensors connected to panels and alarms continuously monitored by a nurse, and cannot in consequence turn over in bed. No doubt this is cybernetical. But it all makes me most abominably sad.” Pitts himself had been in the hospital for three weeks, having been admitted with liver problems and jaundice. On May 14, 1969, Walter Pitts died alone in a boarding house in Cambridge, of bleeding esophageal varices, a condition associated with cirrhosis of the liver. Four months later, McCulloch passed away, as if the existence of one without the other were simply illogical, a reverberating loop wrenched open.

  Notes

  1. All letters retrieved from American Philosophical Society, Warren S. McCulloch Papers, BM139, Series I: Correspondence 1931–1968, Folder “Pitts, Walter.”

  2. All Jerome Lettvin quotes taken from James A. Anderson and Edward Rosenfeld, eds., Talking Nets: An Oral History of Neural Networks (Cambridge, MA: MIT Press, 2000).

  3. Flo Conway and Jim Siegelman, Dark Hero of the Information Age: In Search of Norbert Wiener, the Father of Cybernetics (New York: Basic Books, 2006).

  ROSE GEORGE

  A Very Naughty Little Girl

  FROM Longreads

  SHE WAS A NAME on a plaque and a face on a wall. I ate beneath her portrait for three years and paid it little attention except to notice that the artist had made her look square. There were other portraits of women to hold my attention on the walls of Somerville, my Oxford college: Indira Gandhi, who left without a degree, and Dorothy Hodgkin, a Nobel Prize–winner in chemistry. In a room where we had our French-language classes, behind glass that was rumored to be bulletproof, there was also a bust of Margaret Thatcher, a former chemistry undergraduate. Somerville was one of only two women’s colleges of the University of Oxford while I was there, from 1988 to 1992, and the walls were crowded with strong, notable women. (The college has since gone co-ed.)

  The plaque saying VAUGHAN was on the exterior wall of my first-year student residence, a building named after Vaughan, Dame Janet Maria, the woman in the portrait and principal of Somerville between 1945 and 1967. She was still alive when I was an undergraduate, and, according to her obituaries, was known for driving around Oxford in a yellow Mini; for always dressing in tweeds; and for going to the Bodleian Library even in her late 80s and inadvertently annoying other readers when her hearing aid hummed and whistled. But when I arrived at Somerville, and was assigned to Vaughan, I thought only with some relief that everyone would finally be able to spell my Welsh third name, usually a puzzle even to English speakers. I did not think back to my two surgeries, or to my birth, where bags of someone else’s blood and plasma would have hung from hooks and saved my life, and thank Janet Vaughan for her role in helping to make that standard medical practice. But I should have.

  Blood. We all have it, this liquid that is nearly half water, red in color, that “circulates in the arteries and veins, carrying oxygen to and carbon dioxide from the tissues of the body,” an Oxford Dictionary definition. Blood is common, ubiquitous, inevitable. But it is also so much more than its dictionary description. It is more expensive than oil. Every two seconds, someone needs some. Every day, millions of people receive blood from anonymous strangers in a wondrous procedure that has become banal.

  In the United Kingdom, where I live, the system of widespread donation of blood by anonymous volunteers, and its transfusion into people who need it, dates back not even a century. The National Blood Service began as the Blood Transfusion Service in 1946; the National Health Service was founded in 1948. But humans have been curious about blood probably since the first human bled. The first date on the history timeline of the National Blood Service is 1628, when William Harvey demonstrated that blood circulates around the body, but the practice of bloodletting is usually dated back to the Egyptians of 3,000 years ago. Letting out blood was believed to ease the imbalance of the four humors—blood, phlegm, black bile, yellow bile—that were thought to govern the body.

  Transfusion—the transfer of blood from one creature to another—is also ancient. Romans, wrote Pliny the Elder, ran to drink the blood of dying or dead gladiators, to gain some of their strength and force. Blood was also thought to carry personality, so when Jean-Baptiste Denis, doctor to Louis XIV, treated a feverish 16-year-old patient with a blood transfusion in 1667, he thought the “mild and laudable” blood of a lamb his best bet. The patient recovered, though a madman treated with calf’s blood did not. Transfusion was largely avoided for 150 years. The Victorians tried, but humans given human blood kept dying, and only when Karl Landsteiner discovered blood types in 1909 (and that some mixed fatally) did transfusion become acceptable.
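
  Landsteiner’s finding comes down to a small compatibility rule: donor red cells must carry no A or B antigen that the recipient’s own blood lacks, which is why type O can give red cells to anyone and type AB can receive them from anyone. A minimal sketch of the ABO rule alone, in Python, ignoring the Rh factor and the many other blood-group systems discovered later:

```python
# Simplified ABO compatibility for red-cell transfusion: a recipient
# carries antibodies against any A or B antigen their own blood lacks,
# so donor cells must bring no antigen foreign to the recipient.
# (Rh and the other blood-group systems are deliberately ignored.)

ANTIGENS = {"O": set(), "A": {"A"}, "B": {"B"}, "AB": {"A", "B"}}

def compatible(donor, recipient):
    """True if donor red cells carry no antigen the recipient lacks."""
    return ANTIGENS[donor] <= ANTIGENS[recipient]

print(compatible("O", "A"))   # True  -- O is the universal red-cell donor
print(compatible("A", "B"))   # False -- the recipient's anti-A would attack
print(compatible("B", "AB"))  # True  -- AB is the universal recipient
```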

  Transfusion is now unremarkable. A car-accident victim can require 100 pints of blood. Coronary bypass surgery can require blood from 20 donors, while a premature baby can be saved with three teaspoonsful. In the United States 15.7 million pints of blood are donated every year, and more than 100 million globally, according to the World Health Organization. But in 1920s Oxford, when Janet Vaughan was a medical sciences undergraduate at Somerville, the speedy transfer of blood from one human to another en masse was unthinkable.

  Vaughan was born to privilege. Her father was a headmaster of fine public schools such as Rugby. Her mother, Madge Symonds, was a beauty, though Vaughan described her as “a caged butterfly or hummingbird,” and the cage was her life as a headmaster’s wife in stuffy schools where she ate chicken bones with her fingers to shock the butler. Madge was great friends with Virginia Woolf, Janet Vaughan’s second cousin, and the character of Sally Seton in Mrs. Dalloway is based on her. The Vaughans were connected, but not wealthy. Janet was given indifferent schooling, including a headmistress who described her as “too stupid to be educated.” She ignored that, read voraciously, and after three attempts, was accepted at Somerville to study medical sciences. She arrived, she said, with nothing more than “a little ladylike botany,” yet graduated with a first-class degree, and set about being a physician. For her obstetrics rotation at University College Hospital, London, she was sent into London’s slums. “Terrible poverty,” she told the journalist Polly Toynbee, when Vaughan was interviewed as one of six women chosen by the BBC to be Women of Our Century. She encountered “a woman with no bed except newspaper. A husband who said, ‘Ain’t there no male doctors? I’d rather have a black man than you.’” She saw lines of children sitting up in bed with rheumatic hearts, who would die. She saw that poverty is deadly. “How anyone could do medicine in those days and not become a socialist I find hard to understand,” Vaughan wrote. “What I hated most was people’s acceptance: ‘Yes, I have had seven children and buried six, it was God’s will.’ I hated God’s will with a burning hatred.”

  The slums introduced her to real poverty, but also to blood, her lifelong interest. Anemia was endemic, and by the time she qualified, she began to wonder why the standard cure for anemia was arsenic. She was a pathologist by now: her mother had died, and she thought her widowed father needed her. Pathology, with a more stable routine than doctoring, would make her available for him. But a pathologist can still read. She was trained at Oxford, and at Oxford you are trained to read ferociously. So she had read of the work of Dr. George Minot, an American physician who had been treating pernicious anemia with raw liver extract. Vaughan thought this made more sense than arsenic, so she approached her professor of medicine, a man who did not see a woman who was young and think both those things to be handicaps. “I went to the professor,” she told the BBC, “and said, can we test it on a patient?” He said no, but she could try it first on a dog, if she produced the extract herself. He gave her some money, and off she went to collect as many pails and mincing machines as she could. She did the rounds of her friends. She borrowed Virginia Woolf’s mincer, and minced and minced. It became a scene in Woolf’s A Room of One’s Own, Janet with her mincers, Minot’s paper propped on the kitchen table, a parody of expected domesticity.

  The extract was fed to two dogs, who sickened. Janet said, no more dogs, and took the extract herself. “The next morning when I came back to the hospital there were all the professors of medicine, chemistry, surgery, waiting on the doorstep to see if I was still alive.” It was fed to a patient, “a nice old labouring man.” The patient survived, and of course a senior professor took all the credit for the miraculous new treatment. Janet didn’t much mind: she had other things to do. Her father had remarried, and she could travel, so she got a Rockefeller Scholarship to Harvard, where there were no women. She wasn’t allowed to work with patients and she wasn’t allowed to work with mice either, for when she ordered some Boston mice, a famous lab variety, she was told there were none to spare. No matter. She sourced some “excellent Philadelphian mice,” but Harvard didn’t allow them. She ended up with pigeons, using them to do groundbreaking research on vitamin B12 in blood that wasn’t fully acknowledged for 50 years. She called them her Bloody Pigeons.

  How I love the brisk nervelessness of this woman. Some of it comes from privilege. But much of it is her own, as much as her fictional room was. She had the confidence to make fissures in patriarchal concrete, but also the confidence to get married, because she wanted to. With her husband, David Gourlay, she moved into Gordon Square in Bloomsbury, and Vaughan went to work at a hospital where no one spoke to her and physicians asked for her advice—by then her reputation was significant—only by letter. She treated more anemic patients with liver, though the patients then said, “Don’t give me any more of that medicine, doctor. It makes me hungry and I can’t afford [to eat].” She taught her patients to fight authorities to get extra milk, for the extra iron it would give them. She taught her students that to practice medicine, they must learn to deal with the public assistance board as well as the hospital dispensary. She had two daughters. She was busy, and happy.

  But war was coming. First, it came to Spain. Her Bloomsbury friends went to fight, and Vanessa Bell’s son Julian was killed. Vaughan began to work with the Committee for Spanish Medical Aid and joined the Communist Party, though she soon lapsed. She said that no one seemed to notice. She sold possessions to raise money for Basque children; she spoke on soapboxes at street corners. She welcomed to London Dr. Duran-Jordà, an exceptionally gifted Spanish hematologist who had worked out how to store large supplies of blood so it could be used in a war situation. By now, taking blood from one human and putting it in another was understood, but storing that blood so it didn’t spoil was not. Vaughan read that Russians had also stored blood, taken from fatal road accidents and kept at low temperatures, then used for civilian needs. She noted all this, and she kept it safe until 1938, when the Munich Agreement was signed, allowing Nazi Germany to annex the Sudetenland.

  We think now, in our era of wars that only happen elsewhere, of 1938 as safely prewar. But it can’t have felt like it. A bombing blitz on London was seriously expected to follow Munich. Someone came to Hammersmith and told the medical school to be ready for 57,000 casualties in London that weekend. And Vaughan realized immediately that casualties would need blood. They would need a lot of blood.

  In the Wellcome Library in London, I find a propaganda film made by the Ministry of Information in the 1940s, a time when neither “propaganda” nor “Ministry of Information” sounded sinister. The film is called Blood Transfusion, and is narrated by accents that now sound cut-glass and royal, but then were normal. It tells us that blood transfusion was widely used in World War I on the Western Front, including by Dr. Oswald Robertson of Toronto. Donors were easily on hand, and by then it had been understood that adding sodium citrate to blood stopped coagulation and made blood easier to store.

 
