
Ignorance


by Stuart Firestein


  I joined the laboratory of Professor Frank Werblin, my second great piece of luck in the mentor category. Frank’s lab worked on the retina—remember, that small piece of brain tissue that coats the back of the eyeball. In deference to Frank, I agreed to try working in the retina, at least to learn some techniques that could be applied to olfaction later on. This turned out to be a disaster. I got nowhere. As mentioned in the previous case study, the retina is a complicated but well-studied bit of brain. An excellent model system—the retina has been compared to a little brain in its complexity and function—it has five different types of brain cells that are connected to each other in a sophisticated network of circuits. The retina not only receives light rays, it operates on them in ways that produce a coherent input to the brain. But for all that, it failed to excite me. It seemed that all the big questions about how the retina worked had been answered and what was left were the details. These are important details to be sure, and, indeed, the retina remains a vigorous area of neuroscience research. But for me it was not the dark room that I wanted to venture into.

  Frank Werblin was a generous mentor and a true scientist. Working nights and weekends, I produced some data from olfactory neurons and when I showed the results to Frank he insisted, after recovering from the fact that my data came from the nose not the eye, that olfaction was where I should work. This was not an easy decision for him—his funding was for work in the retina, his excellent reputation had been made in the retina, his colleagues were all in the retina field. But Frank believed in data and in being guided by questions that mattered to you. I think he also liked being a little bit outside the normal distribution, and olfaction was certainly that.

  So I went to work on the olfactory system within his lab. This was a true stroke of good fortune because research on the retina and visual system was far more advanced than the olfactory system. I thus had my “apprenticeship” among students and postdocs in the more sophisticated vision field. I was held to higher standards than if I had just started right out in olfaction. This turned out to be quite important, itself another lesson: you always have to find the highest current standard and measure your work against that. I am to this day indebted not only to Frank but to the graduate students and postdoctoral fellows in his lab who relentlessly challenged me and generously aided me, forced me to work harder and then made that work more productive. Regardless of what the mythology may say, science is rarely done in isolation.

  I managed to complete my PhD work just as I turned 40, something of a milestone as you might imagine. I was offered a postdoctoral position by Dr. Gordon Shepherd at Yale University’s Medical School. Gordon Shepherd is one of the truly decent people in science. I have never seen him put his own self-interest above those of the students in his laboratory. He is mild mannered, decorous, and decent, almost beyond belief. But I have also seen him fight like hell over a scientific issue or a paper that was incorrectly handled by an editor or reviewer.

  I was attracted to Gordon’s laboratory because he saw olfaction as part of mainstream neuroscience. Not everyone did. Olfaction was a bit of a neuroscience cul-de-sac in those days. As a sensory system, it was thought to be idiosyncratic, somehow unique in the way it worked and therefore difficult to make headway. It was, you might say, the opposite of a model system. Gordon saw it differently. Trained at the National Institutes of Health in synaptic physiology and basic neurophysiology, Gordon eschewed the specialness of the olfactory system and instead believed that olfaction should and did obey the rules of neuroscience, known and unknown, just like every other brain system. This was critical because it meant that the advances being made in vision and hearing and touch and in other brain functions could be relevant to understanding olfaction if applied thoughtfully. As important, it meant that things we learned in olfaction could also be relevant to understanding other parts of the brain, and that gave one the feeling of both belonging and contributing. Olfaction in Gordon’s lab was not an isolated island of neuroscience.

  It wasn’t a sure bet in those days—the sense of smell was indeed puzzling, still is in many ways. How could you discriminate the many thousands of chemicals that were known to be odors? Why does one chemical compound smell like caraway and another almost identical compound smell like spearmint? How can adding a single carbon atom to a molecule change its smell from parmesan cheese to rancid sweat? How do smells evoke vivid memories that are decades old? Many of these questions remain current or have morphed into newer versions that are more sophisticated. It is not my intention here to survey the ignorance of olfaction, which would be, like any field, worthy of its own chapter. I mention these questions to show that the field was at the time wide open, full of ignorance and fallow. I was lucky to have stepped in it. I was smart to have stayed.

  I have had such good fortune with mentors that I am always surprised and a bit puzzled when I hear other people’s horror stories. And they are legion: graduate students who have been taken advantage of, mistreated, forced to work on questions they found uninteresting or unimportant, uncredited for work they do or discoveries they made, ripped off, pissed off, crapped on. How could it go so horribly wrong when it seemed so happily right for me? Was I simply lucky? But three in a row? Was it because I was older and perhaps more mature and had different expectations? I don’t know the answers, and I am disappointed to admit that there doesn’t seem to be a formula that can be followed for training graduate students. I wish I knew the prescription so that I could be sure that I would be the mentor to my students that Hal and Frank and Gordon were to me. But there you have it, as much ignorance as there may be about the brain, there is also ignorance about how to study the brain and even how to prepare to study the brain.

  …

  I hope these four case histories have provided you with a feeling for the nuts and bolts of ignorance, the day-to-day battle that goes on in scientific laboratories and scientific minds with questions that range from the fundamental to the methodological, and that initiate and sustain scientific careers. They are merely examples of how the scientific enterprise is carried on by thousands of individual scientists in many hundreds of laboratories and institutes throughout the world, an enterprise that has been continuously pursued through nearly 15 generations. Its worldview is not one that has taken hold in all cultures, and the impetus to see the world as a tractable mystery is not one that is really common. Most human cultures have been dominated by nonscientific explanations, including our own until a mere few hundred years ago. Many still are.

  We often use the word ignorance to denote a primitive or foolish set of beliefs. In fact, I would say that “explanation” is often primitive or foolish, and the recognition of ignorance is the beginning of scientific discourse. When we admit that something is unknown and inexplicable, then we admit also that it is worthy of investigation. David Helfand, the astronomer, traces how our view of the wind evolved from the primitive to the scientific: first “the wind is angry,” followed by “the wind god is angry,” and finally “the wind is a measurable form of energy.” The first two statements provide a complete explanation but are clearly ignorant; the third shows our ignorance (we can’t predict or alter the weather yet) but is surely less ignorant. Explanation rather than ignorance is the hallmark of intellectual narrowness.

  Getting comfortable with ignorance is how a student becomes a scientist. How unfortunate that this transition is not available to the public at large, who are then left with the textbook view of science. While scientists use ignorance, consciously or unconsciously, in their daily activity, thinking about science from the perspective of ignorance can have an impact beyond the laboratory as well. Let me, in a final brief chapter, suggest how ignorance can also be useful in two areas of current concern and debate: public scientific literacy and education. My intention is only to make a few remarks on each of these topics in the hopes that the reader will be inspired to use the ignorance perspective to extend his or her own thoughts about these very public issues.
  EIGHT

  Coda

  If you cannot—in the long run—tell everyone what you have been doing, your doing is worthless.

  —Erwin Schrödinger, from "Science and Humanism: Physics in Our Time," a lecture delivered in Dublin, 1950

  PUBLIC AWARENESS OF SCIENCE

  Science, more than ever, uses and requires public money. Scientists therefore have both a responsibility and, quite frankly, a necessity to educate the public, to engage them in the scientific enterprise. The beginning of Western science is often taken as the publication of Galileo's Dialogue Concerning the Two Chief World Systems in the late Renaissance. Notoriously, Galileo got into serious trouble with the Church powers over this work due, we are taught, to its heretical propositions about the universe, or what were then still called the heavens. In fact, it was not so much what Galileo said about the relation of the sun and the earth in his famous work; the Church fathers are believed to have mostly agreed with it, being intellectuals themselves, but they just hadn't worked out how to tell the literal Bible-believing public about it. The real objection was that Galileo, following the trend of the Renaissance occurring all around him, published this seminal work in Italian. It was the first book of science ever to be published in a vernacular language rather than in classical Latin or Greek, knowledge of which was restricted to a small class of intellectuals. It was not the ideas, heretical though they were, but rather their potentially wide dissemination that so worried the Church fathers.

  And those churchmen were correct, because Galileo's landmark work began a tradition of publishing science in common languages—Descartes in French, Hooke in English, Leibniz in German, and so forth. The public's direct experience of the empirical methods of science is widely regarded as responsible for the cultural transformation from the magical and mystical thinking that marked Western medieval thought to the rationality of modern discourse. Indeed, public accessibility to science may have been the most important contribution of the Renaissance to scientific progress—even more, some might say, than all the remarkable findings of the period beginning with Galileo's book in 1632. By the time of Faraday and Maxwell, for example, the public's appetite for science was voracious. Science demonstrations were put on as entertainments in performance halls, and science books sold as briskly as novels.

  Today, however, we find ourselves in a situation where science is as inaccessible to the public as if it were written in classical Latin. The citizenry is largely cut off from the primary activity of science and at best gets secondhand translations from an interposed media. Remarkable new findings are trumpeted in the press, but how they came about, what they may mean beyond a cure or new recreational technology, is rarely part of the story. The result is that the public rightly sees science as a huge fact book, an insurmountable mountain of information recorded in a virtually secret language.

  It's no small matter for the citizenry to be able to participate in science and understand how their lives are being changed by it. For some reason it seems easier to access the artistic side of the culture, while the science part is daunting. But science and empirical thinking are as indelibly a part of Western culture as the arts and humanities. Maybe more so. Precisely where and how science started, whether with the Greeks or the Arabs, the Phoenicians or the early Asians, it has flowered in the West as nowhere else. For the 15 generations since Galileo, science has molded our thinking and altered our worldview, from how we think the solar system is organized to how we communicate over this nebulous but ubiquitous thing we so appropriately call "the Web." This brand of science spread to other cultures and made itself into a global venture long before the word globalization was popularized. For better or worse, our world has been transformed in record time and to a degree unimaginable at the beginning of it all some 400 years ago. And now you live in that world. Your children grow up in that world. You rely on that world. You should know about that world.

  Another no less compelling reason to be in the know about science is that loads of your money, in tax dollars and corporate spending, are going to support it. US government support of scientific research and education is nearly 3.0% of the gross domestic product—to be more blunt about it, that’s some $420 billion annually. Corporate research budgets account for two-thirds more than government spending, amounting to an additional $700 billion. Corporate research is reflected in the price you pay for energy, for drugs, for just about everything and anything. Admittedly, these numbers include military research (although only the nonclassified part of it), but it’s all science no matter what its intended purpose, and it’s all being billed to you.

  Then there are all those thorny ethical issues that keep bubbling up from science—stem cell research, end-of-life definitions, health care expenses, nuclear power, climate change, biotech agriculture, genetic testing—and this list promises to continue growing in the future.

  Clearly what we need is a crash course in citizen science—a way to humanize science so that it can be both appreciated and judged by an informed citizenry. Aggregating facts is useless if you don’t have a context to interpret them, and this is even true for most scientists when faced with information outside their particular field of expertise. I’m a neurobiologist, but I don’t know much more about quantum physics than the average musician, and I could no sooner read a physics paper in a science journal than I could read the score of a Brahms symphony. I’m an outsider, too. I feel your pain.

  I believe this can be changed by introducing into the public discourse explanations of science that emphasize the unknown. Puzzles engage us, questions are more accessible than answers, and perhaps most important, emphasizing ignorance makes everyone feel more equal, the way the infinity of space pares everyone down to size. Journalists can aid in this cause, but scientists themselves must take the lead. They have to learn to talk in public about what they don’t know without feeling this is an admission of stupidity. In science, dumb and ignorant are not the same. We all know this; it’s how we talk to each other and to our graduate students. Can we also let the public in on the secret?

  EDUCATION

  But so soon as I had achieved the entire course of study at the close of which one is usually received into the ranks of the learned, I entirely changed my opinion. For I found myself embarrassed with so many doubts and errors that it seemed to me that the effort to instruct myself had no effect other than the increasing discovery of my own ignorance.

  —René Descartes, Discourse on the Method of Rightly Conducting the Reason and Seeking the Truth in the Sciences, 1637

  Perhaps the most important application of ignorance is in the sphere of education, particularly of scientists. Indeed I first saw the essential value of ignorance through teaching a course that failed to acknowledge it. The glazed-over eyes of students dutifully taking notes and highlighting line after line in a text of nearly 1,500 pages, the desperation to memorize facts for a test, the hand raised in the middle of a lecture to ask only, “Will that be on the exam?” These are all the symptoms of a failing educational strategy.

  We must ask ourselves how we should educate scientists in the age of Google and whatever will supersede it. When all the facts are available with a few clicks, and probably in the not very distant future by simply asking the wall, or the television, or the cloud—wherever it is the computer is hidden—then teaching these facts will not be of much use. The business model of our universities, in place now for nearly a thousand years, will need to be revised.

  In a prescient and remarkable 1949 document, "The German Universities," a report by the Commission for University Reform in Germany, appear the following lines:

  Each lecturer in a technical university should possess the following abilities:

  (a) To see beyond the limits of his subject matter. In his teaching to make the students aware of these limits, and to show them that beyond these limits forces come into play which are no longer entirely rational, but arise out of life and human society itself.

  (b) To show in every subject the way that leads beyond its own narrow confines to broader horizons of its own.

  What an extraordinary prescription, improbably coming from a joint government-academic commission no less. It is a clarion call from a half century ago for us to rethink the education of scientists. It has yet to be implemented; we would do well to heed it now.

  Instead of a system where the collection of facts is an end, where knowledge is equated with accumulation, where ignorance is rarely discussed, we will have to provide the Wiki-raised student with a taste of and for the boundaries, the edge of the widening circle of ignorance, how the data, which are not unimportant, frame the unknown. We must teach students how to think in questions, how to manage ignorance. W. B. Yeats admonished that "education is not the filling of a pail, but the lighting of a fire." Indeed. Time to get out the matches.

 
