Is the Internet Changing the Way You Think?

by John Brockman


  Brains, especially youthful ones, have an omnivorous appetite for information, novelty, and social interaction, but it is less obvious why we are so good at unconscious learning. One advantage of unconscious learning is that it allows the brain to build up an internal representation of the statistical structure of the world: the frequency of neighboring letters in words, say, or the textures, forms, and colors that make up images. Brains are also adept at adapting to sensorimotor interfaces. We first adapted to clunky keyboards, then to virtual pointers to virtual files, and now to texting with fingers and thumbs. As you become an expert at using it, the Internet, as with other tools, becomes an extension of your brain.
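
  As a purely illustrative aside (a minimal sketch, not part of the essay; the sample text and function name are my own), the few lines of Python below tally how often each pair of neighboring letters occurs within the words of a text, which is the kind of statistical regularity the brain is said to absorb without conscious effort.

    # Illustrative sketch: count how often each pair of neighboring letters
    # occurs within words, the sort of statistic implicit learning tracks.
    from collections import Counter

    def bigram_frequencies(text):
        counts = Counter()
        for word in text.lower().split():
            letters = [c for c in word if c.isalpha()]
            counts.update(zip(letters, letters[1:]))
        return counts

    sample = "brains have an omnivorous appetite for information and novelty"
    for pair, n in bigram_frequencies(sample).most_common(5):
        print("".join(pair), n)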

  Are the changes occurring in your brain as you interact with the Internet good or bad for you? Adapting to the touch and feel of the Internet makes it easier for you to extract information, but a better question is whether the changes in your brain will improve your fitness. There was a time, not long ago, when CEOs didn’t use the Internet because they had never learned to type—but these folks are going extinct and have been replaced with more Internet-savvy managers.

  Gaining knowledge and skills should benefit your survival, but not if you spend all your time immersed in the Internet. The intermittent rewards can become addictive, hijacking your dopamine neurons (which predict future rewards). But the Internet has not been around long enough—and is changing too rapidly—for us to know what the long-term effects will be on brain function. What is the ultimate price for omniscience?

  The Sculpting of Human Thought

  Donald Hoffman

  Cognitive scientist, UC Irvine; author, Visual Intelligence: How We Create What We See

  Human thought has many sculptors, and each wields special tools for distinct effects. Is the Internet in the tool kit? That depends on the sculptor.

  Natural selection sculpts human thought across generations and at geologic time scales. Fitness is its tool, and human nature, our shared endowment as members of a species, is among its key effects. Although the thought life of each person is unique, one can discern patterns of thought that transcend racial, cultural, and occupational differences; similarly, although the face of each person is unique, one can discern patterns of physiognomy—two eyes above a nose above a mouth—that transcend individual differences.

  Is the Internet in the tool kit of natural selection? That is, does the Internet alter our fitness as a species? Does it change how likely we are to survive and reproduce? Debate on this question is in order, but the burden is surely on those who argue no. Our inventions in the past have altered our fitness: arrowheads, agriculture, the control of fire. The Internet has likely done the same.

  But has the Internet changed the patterns of thought that transcend individual differences? Not yet. Natural selection acts over generations; the Internet is but one generation old. The Internet is in the tool kit but has not yet been applied. Over time, as the Internet rewards certain cognitive skills and ignores or discourages others, it could profoundly alter even the basic patterns of thought we share as a species. The catch, however, is “over time.” The Internet will evolve new offspring more quickly than Homo sapiens, and they, rather than the Internet, will alter human nature. These offspring will probably no more resemble the Internet than Homo sapiens resembles amoebae.

  Learning sculpts human thought across the lifetime of an individual. Experience is its tool, and unique patterns of cognition, emotion, and physiology are its key effects. Psychologists Marcel Just and Timothy Keller found that poor readers in elementary school could dramatically improve their skills with six months of intensive training and that white-matter connections in the left hemispheres of their brains increased measurably in the process.*

  There are, of course, endogenous limits to what can be learned, and these limits are largely a consequence of mutation and natural selection. A normal infant exposed to English will learn to speak English, but the same infant exposed to C++ or HTML will learn little.

  Is the Internet in the tool kit of learning? No doubt. Within the endogenous limits of learning set by your genetic inheritance, exposure to the Internet can alter how you think no less than can exposure to language, literature, or mathematics. But the endogenous limits are critical. Multitasking, for instance, might be a useful skill for exploiting in parallel the varied resources of the Internet, but genuine multitasking, at present, probably exceeds the limitations of the attentional system of Homo sapiens. Over generations, this limitation might ease. What the Internet cannot accomplish as a tool of learning it might eventually accomplish as a tool of natural selection.

  Epigenetics (the study of changes in appearance or gene expression caused by mechanisms other than changes in the underlying DNA sequence) sculpts human thought within a lifetime and across a few generations. Experience and environment are its guides, and shifts in gene expression triggering shifts in cognition, emotion, and physiology are its relevant effects. Neuroscientist Timothy Oberlander and colleagues found that a mother’s depression can change the expression of the NR3C1 gene in her newborn, leading to the infant’s increased reactivity to stress.* Childhood abuse similarly can lead to persistent feelings of anxiety and acute stress in a child, fundamentally altering its thought life.

  Is the Internet in the tool kit of epigenetics? Possibly, but no one knows. The field of epigenetics is young, and even the basic mechanisms by which transgenerational epigenetic effects are inherited are not well understood. But the finding that parental behavior can alter gene expression and thought life in a child certainly leaves open the possibility that other behavioral environments, including the Internet, can do the same.

  Thus, in sum, the relevance of the Internet to human thought depends on whether one evaluates this relevance phylogenetically, ontogenetically, or epigenetically. Debate on this issue can be clarified by specifying the framework of evaluation.

  What Kind of a Dumb Question Is That?

  Andy Clark

  Philosopher and cognitive scientist, University of Edinburgh; author, Supersizing the Mind: Embodiment, Action, and Cognitive Extension

  How is the Internet changing the way I think? There is something tremendously slippery—but actually, despite my attention-seeking title, interestingly and importantly slippery—about this question. To see what it is, reflect first that the question has an apparently trivial variant: “Is the Internet changing the things you think?”

  This is a question that has all kinds of apparently shallow answers. The Internet is certainly changing what I think (it makes all kinds of information and views available to me that would not be otherwise). The Internet is also changing when I think it, how long it takes me to think it, and what I do with it when I’ve finished thinking it. The Internet is even changing how I carry out lots of the thinking, making that a rather more communal enterprise than it used to be (at least in my area, which is scientifically informed philosophy of mind).

  But that all sounds kind of shallow. We all know the Internet does that. What the question means to get at, surely, is something slippery but deeper, something that may or may not be true, viz.: “Is the Internet changing the nature of your thinking?”

  It’s this question, I suggest, that divides the bulk of the respondents. There are those who think the nature of human thinking hasn’t altered at all and those who think it is becoming radically transformed. The question I want to ask in return, however, is simply this: “How can we know?” I don’t think this question has any easy answer.

  One place to start might be to distinguish what we think from the routines we employ to think it. By “routines” I mean something in the ballpark of an algorithm—some kind of computational recipe for solving a problem or class of problems. Once we make this distinction, it can seem (but this may turn out to be a deep illusion) plain sailing. For it then seems the question is simply one for science to figure out. For how would you know whether the way you were thinking had been altered? If what you tend to think alters, does that imply that the way you are thinking it must be altered, too? I guess not. Or try it the other way around. If what you tend to think and believe remains the same, does that imply that the way you’re thinking it remains the same? I guess not.

  The most we can tell from our armchairs, it seems to me, is that what we’re thinking (and when we tend to think it) is in some way altering. But of course, there can be no doubt that the Internet alters what we tend to think and when. If it didn’t, we wouldn’t need it. So that’s true but kind of trivial.

  Otherwise put: From my philosopher’s armchair, all I know is what anyone else knows, and that’s all about content. I know (on a good day) what I think. But as to the routines I use to think it, I have as little idea as I have (from my armchair) of what moves the planets. I have access to the results, not the means. Insofar as I have any ideas at all about what routines or means I use to do my thinking, those ideas are no doubt ragingly false. At best, they reflect how I think I think my thoughts, not how I do.

  So far, so good. At this point, it looks as if we must indeed turn to some kind of experimental science to find the answer to any nontrivial reading of the question.

  Is the Internet changing the way I think? Let’s put on our lab coats and go find out.

  But how?

  Suppose we go looking for some serious neural changes in heavy Internet users. Problem: There are bound to be some changes, as surfing the Web is a skill and skills alter brains. But when does some such change count as a change to the way we think? Does learning to play the piano change the way I think? Presumably not, in the kind of way that the question means. Even quite large neural changes might not effect a change in the way we think. Perhaps it’s just the same old way being employed to do some new stuff. Conversely, even a quite small neural change might amount to the installation of a whole new computational architecture (think of adding a recurrent loop to a simple neural network—a small neural change with staggeringly profound computational consequences).
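
  To make Clark’s aside about the recurrent loop concrete, here is a minimal sketch (in Python with NumPy; the weights and names are my own illustration, not anything from the essay). The same input weights are used twice: once as a plain feedforward layer, whose response depends only on the current input, and once with a recurrent connection that feeds the unit activities back in, turning a stateless mapping into a system whose response depends on its whole history.

    # Illustrative sketch: a feedforward layer versus the same layer with a
    # recurrent loop added; the loop gives the network memory of past inputs.
    import numpy as np

    rng = np.random.default_rng(0)
    W_in = rng.normal(size=(4, 3))   # input weights: 3 inputs -> 4 units
    W_rec = rng.normal(size=(4, 4))  # the added recurrent connection

    def feedforward(x):
        # Stateless: the output depends only on the current input x.
        return np.tanh(W_in @ x)

    def recurrent(xs):
        # Stateful: each step also depends on everything seen so far.
        h = np.zeros(4)
        for x in xs:
            h = np.tanh(W_in @ x + W_rec @ h)  # the recurrent loop
        return h

    xs = [rng.normal(size=3) for _ in range(5)]
    print(feedforward(xs[-1]))  # same answer whatever came before
    print(recurrent(xs))        # answer depends on the whole sequence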

  It gets worse.

  Not only is it unclear what science needs to discover, it is unclear where science ought to look to discover (or not discover) it.

  Suppose we convince ourselves, by whatever means, that as far as the basic mode of operation of the brain goes, Internet experience is not altering it one whit. That supports a negative answer only if we assume that the routines that fix the “nature of human thinking” must be thoroughly biological—that they must be routines running within, and only within, the individual human brain. But surely it is this assumption that our experiences with the Internet (and with other “intelligence amplifiers” before it) most clearly call into question. Perhaps the Internet is changing “the way you think” by changing the circuits that get to implement some aspects of human thinking, providing some hybrid (biological and nonbiological) circuitry for thought itself. This would be a vision of the Internet as a kind of worldwide supracortex. Since this electronic supracortex patently does not work according to the same routines as, say, the neocortex, an affirmative answer to our target question seems easily in the cards.

  But wait. Why look there in the first place? What exactly determines (or, better, what should determine) where we look for the circuitry whose operational profile, even assuming we can find it, determines the “way we think”?

  This is a really hard question—and, sad to say, I don’t know how to answer it. It threatens to bring us all the way back to where we started, with content. For perhaps one way to motivate an answer is to look for deep and systematic variation in human performances in various spheres of thought. But even if we find such variation, those who think that our “ways of thinking” remain fundamentally unaltered can hold their ground by stressing that the basic mode of neural operation is unaltered and has remained the same for (at least) tens of thousands of years.

  Deep down, I suspect that our two interrogative options—the trivial-sounding question about what we think and the deep-sounding one about the nature of our thinking—are simply not as distinct as the fans of either response (Yes, the Internet is changing the way we think/No, it isn’t) might wish.

  But I don’t know how to prove this.

  Dammit.

  Public Dreaming

  Thomas Metzinger

  Philosopher; director of the Theoretical Philosophy Group at the Department of Philosophy of the Johannes Gutenberg–Universität Mainz; author, The Ego Tunnel

  I heard a strange, melodic sound from the left and turned away from the Green Woman. As I shifted my gaze toward the empty landscape, I noticed that something wasn’t quite right. What I saw, the hills and the trees, were as real as could be—but somehow they hadn’t come into view as they would in real life. Somehow it wasn’t quite in real time. There was a slightly different temporal dynamics to the way the scene popped up, an almost unnoticeable delay, as if I were surfing the Web, clicking my way onto another page. But I wasn’t surfing. I had just talked to the Green Woman—and no, my right index finger wasn’t clicking and my right hand wasn’t lying on a mouse pad; it hung by my side, completely relaxed, as I gazed at the empty landscape of hills and trees. In a flash of excitement and disbelief, it dawned on me: I was dreaming!

  I have always been interested in lucid dreams and have written about them extensively. They interest consciousness researchers because you can go for a walk through the dynamics of your own neural correlate of consciousness, unconstrained by external input, and look at the way the experience unfolds from the inside. They are interesting to philosophers, too. You can ask the dream characters you encounter what they think about notions such as “virtual embodiment” and “virtual selfhood” and whether they believe they have a mind of their own. Unfortunately, I have lucid dreams only rarely—once or twice a year. The episode just recited was the beginning of my last one, and a lot of things dawned on me at once besides the fact that I was actually inside my own head. The Internet is reconfiguring my brain, not just changing the way I think. It already penetrates my dream life.

  Sure, for academics the Internet is a fantastic resource—almost all the literature at your fingertips, wonderfully efficient ways of communicating and collaborating with researchers around the world, an endless source of learning and inspiration. Something that leads you right into attention deficit disorder. Something that gets you hooked. Something that is changing us in our deepest core.

  But it’s about much more than cognitive style alone. For those of us intensively working with it, the Internet has become a part of our self-model. We use it for external memory storage, as a cognitive prosthesis, and for emotional autoregulation. We think with the help of the Internet, and it helps us determine our desires and goals. Affordances infect us, subtly eroding the sense of control. We are learning to multitask, our attention span is becoming shorter, and many of our social relationships are taking on a strangely disembodied character. Some software tells us, “You are now friends with Peter Smith!” when we were just too shy to click the Ignore button.

  “Online addiction” has long been a technical term in psychiatry. Many young people (including an increasing number of university students) suffer from attention deficits and can no longer focus on old-fashioned, serial symbolic information; they suddenly have difficulty reading ordinary books. Everybody has heard about midlife burnout and rising levels of anxiety in large parts of the population. Acceleration is everywhere.

  The core of the problem is not cognitive style but attention management. The ability to attend to our environment, our feelings, and the feelings of others is a naturally evolved feature of the human brain. Attention is a finite commodity, and it is absolutely essential to living a good life. We need attention in order to truly listen to others—and even to ourselves. We need attention to truly enjoy sensory pleasures, as well as for efficient learning. We need it in order to be truly present during sex, or to be in love, or when we are just contemplating nature. Our brains can generate only a limited amount of this precious resource every day. Today the advertisement and entertainment industries are attacking the very foundations of our capacity for experience, drawing us into a vast and confusing media jungle, robbing us of our scarce resource in ever more persistent and intelligent ways. We know all that, but here’s something we are just beginning to understand: The Internet affects our sense of selfhood, and it does so on a deep functional level.

  Consciousness is the space of attentional agency: Conscious information is exactly that information in your brain to which you can deliberately direct your attention. As an attentional agent, you can initiate a shift in attention and, as it were, direct your inner flashlight at certain targets: a perceptual object, say, or a specific feeling. In many situations, people lose the property of attentional agency, and consequently their sense of self is weakened. Infants cannot control their visual attention; their gaze seems to wander aimlessly from one object to another, because this part of their ego is not yet consolidated. Another example of consciousness without attentional control is the nonlucid dream state. In other cases, too, such as severe drunkenness or senile dementia, you may lose the ability to direct your attention—and, correspondingly, feel that your “self” is falling apart.

 
