
The Blind Giant


by Nick Harkaway


  In the same vein, Nicholas Carr (like Sven Birkerts in The Gutenberg Elegies) warns of the death of ‘deep reading’ – the focused, single-tasking, immersive style of reading he remembers from the days before the intrusion of the Internet. He feels that we are passing through a shift in the way we think, and mirrors the concerns expressed more gently by Dr Maryanne Wolf (Director of the Center for Reading and Language Research at Tufts University) in her book Proust and the Squid, that this shift in how we live and work will change the architecture of the brain itself, and thereby alter what it means to be human.

  It sounds dramatic, but the brain is a versatile and even to some extent a volatile organ. It does, even in adulthood, alter its shape to take on new skills and abilities at the cost of others. The phenomenon is called ‘neuroplasticity’, and it is actually – to a layman’s eye – remarkable. By way of example: the posterior hippocampus – the region associated with spatial memory and navigation – of a London taxi driver, seen in a magnetic resonance image, shows pronounced enlargement.1 Taxi drivers learn the streets and the flow of traffic, and that learning is reflected in the actual physical structure of their brains. In fact, whenever you learn a new skill, the brain begins to devote resources to it. Practice may not make perfect, but it does increase your aptitude for a particular task by building the area of the brain responsible for executing it.

  Perhaps the most extreme example – if not in terms of neurophysiology then certainly of practical application of the brain’s adaptability – is an American man named Daniel Kish. Kish is something of a phenomenon himself: born with a cancer of the eye, he has been completely blind since before he was two. He functions to all intents and purposes as if he can see, however – riding a mountain bike, identifying different objects at a distance, moving with confidence through space – by using echolocation. Kish actually clicks his tongue and uses his hearing – his ears are biologically ordinary – to receive a signal telling him where he is and what is around him. He has learned to interpret this so accurately that he can weave through traffic on his bike. He cannot, obviously, use this skill to read printed text or perform any other task specifically geared towards perception using light. On the other hand, his perception is not restricted to the normal field of vision. He has also passed on the skill to a new generation of echolocators; this is not something specific to Kish, however remarkable he may appear. It’s an ability you can learn.2

  Having said that, it is important not to overstate the extent of neuroplasticity. Steven Pinker, author and Johnstone Family Professor of Psychology at Harvard, points out in The Blank Slate that ‘most neuroscientists believe that these changes take place within a matrix of genetically organised structure.’ However impressive the flexibility of the brain, there are limits. ‘People born with variations on the typical plan have variations in the way their minds work … These gross features of the brain are almost certainly not sculpted by information coming in from the senses, which implies that differences in intelligence, scientific genius, sexual orientation, and impulsive violence are not entirely learned.’ The question is how far the smaller changes within the brain can take one’s identity before the brick wall of genetic structure is reached.

  The issue for Carr, Greenfield and others is that we may unknowingly be moving away from the very development that made us what we are. Reading is an act of cognition, a learned skill that is not native to the brain. We are not evolved to be readers. Rather, the brain reshapes itself to meet the demands of the reading skill, forming connections and practising it – just as you’d practise throwing and catching – until it is instinctive. You begin by spelling out words from letters, then ultimately recognize words as whole pieces, allowing you to move through sentences much faster. That moment of transition is the brain reaching a certain level of competence at the reading operation – or, rather, at the conventional reading operation, in which the reader consumes a text that is inert. Ostensibly, at least, traditional text cannot be re-edited on the go and contains no hypertextual connections that might distract you from concentrating on what is there and incorporating the information in it into your mind, or imagining the events in a fiction.

  Text in the age of digital technology is somewhat different. It is filled with links to other texts, which the reader must either follow or ignore (a split-second decision-making process that, according to Carr, breaks the deep state of concentration that is at the core of the reading experience, however briefly). Worse, the text is in competition with other media in the same environment – the device – so that email, phone calls and Twitter can interrupt the smooth uptake of what is on the page; and that’s not just an issue for anyone who wants to read a thriller without losing the thread. Reading – not in the cultural sense, necessarily, though that’s no doubt a part of it – has had a profound effect on us as individuals and therefore on our societies.

  The evolution of reading and writing, in concert with our own, seems to have triggered a subtle but vastly significant change in what it means to be human, allowing a greater sense of separation from one’s own knowledge and a greater sense of the individual self. Some thinkers suggest that written language defined a new age of the singular individual, where before our thought was more immediately experiential and our sense of self was fuzzier, more identified with the group. Written and read thought allowed us to see ourselves and our ideas from the outside, and began the long journey to technological society. What, then, will happen to us if we abandon it?

  Apart from anything else, a recent study by researchers at the University at Buffalo suggests that reading increases empathy – or even teaches it. On the one hand, the experiment is slightly alarming: reading Stephenie Meyer’s vampire novels causes you to identify more closely with words like ‘blood’, ‘fangs’ and ‘bitten’, which seems to imply that readers are empathizing with the indestructible and tortured undead; but I did that when I was fifteen and it doesn’t appear to have warped me too much. On the other hand, it seems that what is learned is the forming of an emotional connection in general rather than the creation of a connection just with those characters.3 What isn’t clear – I suspect because it’s outside the scope of the study – is whether this is a consequence of reading specifically or of concentrating on a narrative in any form. Does this effect not occur with film or video game narratives? Perhaps not: those forms are apprehended directly through the senses rather than being taken in cognitively, so maybe there is a difference. Then again, perhaps there isn’t. But the spectral possibility that reducing the amount of simple, disconnected reading we do might also reduce our capacity to empathize is worth spending some time and government money to rule out.

  This kind of concern – like many others in the digital debate – is familiar. Plato records Socrates inveighing against the notion of mass literacy, reportedly worried that if the population could read and write, they would cease to bother to remember. Their thinking might be jeopardized, too, as the new technology of writing created in them a kind of false consciousness, a simulated cognition derived from what they read rather than a real one produced by consideration of the issues from first principles. The worry might seem outlandish – except that it’s exactly the one we’re discussing now – but if some modern notions of our brain’s history are an accurate depiction of what happened, then Socrates was absolutely right. He was even right to imagine that the nature of thinking would be fundamentally altered by literacy. But he was wrong – at least superficially – in his dire prediction of a society made ignorant by letters.

  Indeed, the development of the modern mind – and perhaps even our modern concept of individuality – can in some ways be seen as starting with the written word. Abstracted thought, reflected in the new medium of letters, is one of the defining characteristics of the world we inhabit today. And, as you will know if you’ve seen a showman memorize a deck of cards in a few seconds or heard an imam who does not speak Qur’anic Arabic recite the Qur’an from memory, we can still learn the trick of extreme memorization. One way to do it with a deck of cards is to make a narrative out of the numbers and images as they pass by, telling a memorable story rather than trying to retain a random slew of numbers. (In the case of the Qur’an, Islamic tradition holds that the language is so perfect, proceeding directly from the divinity, that the verses are uniquely memorable and impossible to counterfeit.)

  The ability of the brain to acquire new skills is phenomenal, but neuroplasticity is not exclusively a blessing. It’s also the key to any number of bad habits, bad personality traits and some addictions, and – like technology itself – it’s subject to a sort of lock-in, where pathways become so well-trodden as to be hard to vary. (That said, even the most ingrained habits can ultimately be overcome and replaced with new ones.) The fear expressed by critics is that long periods of time using computers will cause the brain to adapt itself to the demands of the digital world rather than the real one – or, I suppose, rather than the one that is not inherently structured around digital technology. Instead of learning to respond to cues from face-to-face interactions, people will become used to dealing with text: a profound distinction, as sense inputs are handled in a different area of the brain from cognitive skills. More, human interactions until now have featured enormous amounts of tacit communication in the form of body language, tone, eyeline and even scent. There’s a great deal going on that is not conveyed by the technology we have now, which is why online poker players do not generally make the transition to the in-person gaming table without some problems: the in-the-flesh game is more about tells and giveaways, subtle personal indicators of confidence or bluff, than it is about knowing the odds.

  Furthermore, runs the objection, digital interactions require – and hence promote – different mental skills; in general, memory is less important (Socrates would not approve) because information can be cached, searched and recalled in the machine. Nicholas Carr makes reference to a scene from Boswell’s Life of Johnson, where the good doctor identified two types of knowledge: ‘We know a subject ourselves, or we know where we can find information upon it.’ What interests me here is the definition of the first sort of knowing. Conventionally, in the traditional textual way of learning, we learn, if not by rote, by acceptance of authority. We acknowledge the primacy of the teacher and take in not only the information they impart but also their value judgement of it, their perception of its reliability and context. Students are encouraged to consider the biases of reported facts and sources after taking them on board.

  It seems to me, though, that the digital environment fosters a far less trusting approach. It is not in the first instance important to know what some guy called Nick Harkaway thinks about the facts, but rather to figure out what they are and then consider whether Harkaway’s opinions are significant or useful. Rather than learning by rote from a single source, Carr’s ‘power browsers’ are assembling their own narrative from a variety of sources. It’s both pre-emptive de-spinning of material – we live in a world where almost nothing is not spun – and the creation of a personal viewpoint rather than the incorporation of someone else’s.

  For those who fear this shift, the gap between the digital and the traditional is profound: Birkerts, seeing reading as an act of translation from the act of looking at printed text to an immersion in the flow of ideas and narrative it conveys, wrote that print communication is active, requiring close engagement from the reader. More, the communication between reader and author is private and disconnected from the world – a kind of perfect connection degraded only, presumably, by the inevitable incompleteness of the acts of transmission and translation; no writer is so good as to convey meaning without room for misunderstanding, and no reader so empathic as to receive what is written without further mistake. Physical reading is also measured in a physical journey through the book, page by page, and a temporal one from beginning to end which is in accordance with the human experience as it is lived.

  By contrast, digital communication is inherently public, part of a larger network. Information can be taken in passively, or interacted with, neither of which is the same as the self-created immersion of the traditional form. The order of digital text can readily be rearranged, hypertext allowing different paths through a document. A greater emphasis is placed on impression and impact than logic. The branched, lateral nature of digital text affects how it is received, which is not similar to the way we live through time, but more like the rapid, convoluted succession of images and events in a Tarantino movie. That may make it less suitable for reading a conventional fictional narrative – in which case the publishing industry will either be relieved to find paper books still sell or appalled as conventional written fictions cease to be part of the culture – but it’s not clear to me what it means for non-fiction. It suggests that books are no longer read so much as they are filleted, consumed and repurposed.

  As a writer, I find myself wondering whether the traditional version of the author/reader relationship is truly so private as Birkerts believes. All reading takes place in the net of human interaction: literary critics have argued for years about the extent to which the experience of reading, say, Charles Dickens is altered by the numerous film adaptations and references to his work in popular culture. Reading A Christmas Carol after having seen Bill Murray play Scrooge is not the same as reading it beforehand. More, books are – and have long been – discussed in literary salons, in book groups and around the family table. Books exist to be experienced and that experience is not complete until it is shared; we’re a more profoundly collaborative species (though perhaps not culture) than we generally imagine. It seems to me that the inherently connected nature of the digital realm that Birkerts talks about is not so much a difference of type as a difference of speed. The pace of analogue discussion is slow, and the number of people involved in the conversation tends to be limited by physical space. Digital, by contrast, allows the same comment to be seen, considered and discussed by an unlimited number of participants at one time. Everything moves faster. That, of course, does make the experience different – but it doesn’t make it entirely foreign.

  I also question the linear nature of human experience that Birkerts leans on so hard. Our memories are intensely selective. You probably don’t remember brushing your teeth every morning and evening this week in great detail. Each of those moments most likely blends into one general recollection of slightly uncomfortable, humdrum mintiness. Similarly, you tend to allow the details of your commute to fade away each day. You may even drift off while it’s happening. Albert Einstein once observed that relativity was to be found in the fact that putting one’s hand on a hot stove for even an instant seemed endless, whereas a long time spent with an attractive woman seemed to pass impossibly quickly. Our experience of time is more like a movie – maybe even a choppy, disconnected one – than we generally acknowledge, and our subsequent memories are edited by us so that we recall the important bits and leave the dull parts behind.

  For Carr, the consequences of our love affair with digital technology are clear. He points to a 2008 study by Professor Gary Small of UCLA’s Memory and Aging Center. ‘Book readers,’ Carr explains, ‘have a lot of activity in regions associated with language, memory, and visual processing, but they don’t display much activity in the prefrontal regions associated with decision making and problem solving. Experienced Net users, by contrast, display extensive activity across all those regions when they scan and search web pages.’ Reading a hypertext page is a constant process of evaluation and judgement as well as comprehension; a cycle of reading the text, seeing a link, evaluating the likely level of interest at the other end of it, judging whether or not to click on it, then returning to the origin text (or not). The problem is that that moment of evaluation, however brief, apparently kicks the brain out of the immersive mode of reading. Societally, we are spending less time reading conventionally, and hence less time in the cognitive space that Carr is anxious to preserve.

  If this is true, it seems to be partly a matter of choice; the simplest solution, if you’re concerned, is to read a text stripped of links, and to take pains to make space in your day for uninterrupted reading. Other solutions exist for work; for a while now, some writing software has included the option of a kind of ‘quiet room’ – a working environment that shuts off access to the distractions of the Internet. If it transpires that the brain is rewiring itself away from traits we need as a consequence of digital technology, and this is detrimental to the way we live and think, surely that high-powered evaluation and decision-making skill we will have acquired will help us to see the obvious remedy: a balance of modern text, complete with connections, and the more traditional variety without them.

  If necessary, in future, we can pick different production tools and different media for different tasks. In a sense, I’ve been doing exactly that in preparing this book, switching between Scrivener (the writing software I use for work) and a pen and paper, reading some items online or on a digital reading device, and others on paper. If neuroplasticity is sufficiently extreme as to put the architecture of the human brain and the mode of living we currently have under threat or strain, then we can simply change it back. The time frame Professor Small noticed was measured in days, rather than months, and neuroplasticity flexes in both directions. An early experiment by Dr George Stratton, detailed at the Third International Congress for Psychology in 1896, involved wearing special glasses that inverted the wearer’s vision. After a few days, the brain adapted, and the wearer was able to see and move around as normal. Removing the glasses then caused a confusion akin to putting them on in the first place, but, again, the brain was able to re-train in less than a week.4

 
