Tomorrow's People

by Susan Greenfield


  Let's start with the banal. In your home even workaday physical objects can no longer be readily classified, but change shape to suit the context of the moment. And your every movement from room to room, your inner body processes and your spoken words all influence the environment from one instant to the next. You no longer think of ‘reality’ as something enduring, independent and somehow ‘out there’ – indeed the term has drifted into obsolescence. Nor is there a clear line between the cyber-world and the atomic one. Your friends and colleagues can adopt cyber-profiles, whilst all domestic facilities have human interface personas. You can watch both human and cyber-characters on the screen, roving around your home or in virtual scenes such as the supermarket. And you can choose to impose yourself and your friends into fictional settings.

  Just as it is hard to demarcate reality from fantasy, so it is difficult even to be sure about the boundary of your body: where does the physical ‘you’ actually end? After all, you probably have at least some synthetic prostheses to improve and heighten your senses and to boost flagging muscle power; not that you really need muscles, as you can now merely ‘will’ objects to move. But it is not just the non-carbon components of your body that blur the essence of ‘you’. Another boundary breached is the one between physical and mental events: you are acutely aware that every thought you have is impacting on your immune and endocrine systems and on your vital organs. Thinking and feeling are now intimately merged: you are keenly aware of how your moment-to-moment emotions, and in the longer term your health, can hinge on a worry or a compliment. So now where are the previously obvious distinctions between mental and physical, true and false, objective and subjective?

  In addition, like most people in modern life, you now carry a 24th chromosome with temporary genes that you will probably upgrade when you donate your gametes for reproduction. And if you decide to pass on your combined natural and artificial gene package, there is no longer a line between reproductive age and infertility, nor does it matter whether you are homo- or heterosexual: the production of progeny is, literally, boundless in possibilities.

  The traditional dichotomy between work and leisure has vanished too: in terms of location, everything happens from home; in terms of time, your day is completely fragmented, whilst both work and leisure activities are screen-based. And if you ever were to engage in a ‘fact-based’ pursuit, accessing the latest science or even performing a cyber-experiment, then once again you would find that old-fashioned classifications have disappeared – in this case, between the biomedical and physical sciences.

  But most of your time, as for most people nowadays, you spend oblivious to the past or future, whiling away the moments in the twilight interface between the imagined and the actual. And in interacting with others, too, everything is ambiguous. Fictional dating partners, and indeed the entire virtual family, really do seem imbued with all the inner turmoil and emotions of real people – it is just that they are eventually compliant with your whims. The stereotyped roles associated with a bygone ‘classic’ relationship are all in the past. Conventional dating and courtship patterns are no longer appropriate or necessary: instead the full gamut of emotions and sensations entailed in a complete sexual relationship can be provided artificially by a combination of IT and, most important of all, your receptive mind.

  And the ambiguity extends into all family relationships: there is no longer a clear mother–father–child nuclear structure. Even in the previous century the blended family had increasingly become the norm. Now children may have as many as six parents, including the providers of gametes, host-eggs and a womb as well as those making a purely environmental contribution to child rearing. Moreover, this generational relationship is far less clear-cut than it used to be. Because children now glean much of their education, both formal and informal, from a screen that condenses all time and space, the notion of ‘parental influence’ is regarded as a historical phenomenon that was already on the wane at the turn of the century.

  As the nuclear family vanishes, the increasing number of older people confuses the picture still further. Not only can the senior generation carry on reproducing, and be grandparents and parents simultaneously, but they are also far more agile and healthy than in the old days, so the differences between them and the younger generations are no longer so obvious. Moreover, these older people cannot be readily distinguished from their successors by what they know; everyone can now access all facts immediately. The experiences of the elderly are no different from anyone else's; for the most part life experience is second-hand, recorded, and annotated by augmented reality. Since no one actually lives life in real time anymore, or in a constant physical environment, the benefits of experience and the acquisition of wisdom to deal with unforeseen circumstances are no longer at a premium.

  The nuclear family, the icon of the post-war years but already strained by the late 20th century, has now disintegrated completely in the face of ambiguous generations, complex reproductive relations and a pervasive cyber-world. The need to belong to a group, a tribe, is catered for by the virtual and real friends you contact via your screen, who are so much more compliant and more tolerant of you and your whims and habits. But then your ‘place’ in this society is not fixed as it used to be in previous eras – not only because society is no longer divided into families, class or generations but also because you yourself are no longer a well-defined entity.

  It is almost as if the old thought experiment ‘colour-blind Mary’, popular with philosophers at the turn of the century, has come true – only in reverse. The utterly unlikely story of colour-blind Mary was conceived by the philosopher Frank Jackson to illustrate the difference between understanding and direct experience. Mary (at least she is female) is a brilliant scientist who understands everything there is to know about the physiology of colour vision, but she herself has never experienced it at first hand because she has been raised in an entirely monochrome environment. Now, if she is unleashed into the real world of vivid colours, will she learn anything new? The critical issue here is the question that we are asking about Mary's state. Will she learn anything more, understand anything more about how the visual system within the brain processes colour? Probably not. Will she have a dramatic change in her conscious state? Almost certainly yes.

  In your late-21st-century incarnation, you are a kind of Mary-in-reverse: raw, direct experience has replaced knowledge and insight. If true understanding is seeing one thing in terms of something else, of making a connection, then nowadays you understand nothing at all. In your present world no person or object or process is unambiguously and consistently linked to anything else; instead you live in a blur of pulsating sensations. You have none of Mary's understanding but all the first-hand sensation that she was missing. Your moments are visceral, sensation-laden and never abstract. Perhaps even your senses are not stimulated in the way that they would have been in the 20th century, but then how could you know?

  Just as Mary's awareness perhaps changed when she finally experienced colour, so your consciousness will be changing. But for you, it will mean very little. You do not have an attention span of sufficient duration to work out an explanation, to track a plodding sequence of events or premises so that you can place the issue in question into a context of other places, people, objects or events. Most important of all, you do not have the motivation. Why should you bother to make an effort to force a connection between random bits of information? You can just access the ready-made conclusion as a stand-alone fact. You are a Person of the Screen.

  So, your world is highly interactive, highly personalized and highly unstable – changing both within and around you according to whatever thoughts you might have. Yet these thoughts are disconnected, reactive to the moment, as fragmented as your life is and similarly with no narrative continuity, no meaning. ‘Who are you?’ or ‘What are you?’ – even ‘Where are you?’ – are all hard questions to grasp nowadays. And perhaps, if you have no delineated thoughts, ideas and body of your own, you are not an individual at all…

  This depersonalization is both a cause and effect of the most basic fact of modern life: for every move you make, or rather for every thought you have, there is a record. Your entire life narrative – from your daily bowel movements to the changing status of your immune system to your choice of entertainment – is logged and open to third-party scrutiny. Remember that, thanks to quantum computing, nothing is confidential any longer; every chemical, biological, medical, psychological, financial and social detail about you is public.

  The bitter truth, however, is that this doesn't really matter. Since everyone is so homogeneous, in health and behaviour patterns along with reproductive and relationship options, neither you nor anyone else for that matter is that interesting anymore. Even the once-individual and much-vaunted genome is now eclipsed by standardized upgrades, increasingly powerful yet reducing variety; in any case the limits of gene technology, especially in relation to mental function, are now much better understood. On one hand, intervention at the level of a single gene is truly valuable for reducing the risk of a problem where the genetic profile has been established; on the other hand, no one now expects the impossible – a vague enhancement of a highly desirable and subtle mental trait achieved by tweaking a single strand of DNA in every cell of the body. But in any event there is not much demand for that kind of thing anymore; after all, what would such enhancement get you? More lovers? Children? A better appearance? A better job? None of these values are meaningful anymore; there is, quite simply, nothing left to gain from the status that they would bring. For the first time in the culture and history of humans, status is irrelevant. Experience is all there is.

  So take a look at yourself living out your life narrative in front of a screen, watching yourself watching yourself. You are in no discomfort, all your bodily needs are satiated, and you can turn fantasy into a cyber-reality that seems as real as anything you have ever experienced. All your experiences, which are for the most part virtual, are documented – and these experiences, which can't really be distinguished from your thoughts, feed into a worldwide network.

  Early in the 20th century Pierre Teilhard de Chardin, a Jesuit, conceived a most bizarre and visionary notion, especially by the clipped and clear standards of the imperialist era: the concept of a ‘noosphere’ – from the Greek for ‘mind’, ‘reason’. His noosphere was a collective system of thinking that linked up all individuals around the globe. Now, of course, such a scenario does not seem so crazy, substantiated in part as it has been by the web and the net. There are those who have already likened the web-net to a brain, with each individual fancifully compared to a neuron, communicating incessantly with all the others. But the most important aspect of Teilhard de Chardin's vision has so far been overlooked: each person in the noosphere is completely subsumed by the greater, collective consciousness. In brief, the notion of an independent individual, with a private life and a unique portfolio of thoughts, knowledge and opinions, is finished.

  I imagine that everything about this strange future would be abhorrent were it not happily offset, for most sensible and realistic folk, by its sheer improbability. For a start, you sigh in world-weary complacency, the technology is not available and never will be, so it just could not happen for practical reasons. The Cynics, whom we met at the very beginning of this book, were right. Further, you consider, such a situation could never really arise even if we had the technology simply because of the ‘yuck factor’, that redeeming reflex in all human beings to say ‘enough is enough’. Human nature, being what it is, will surely damp down the excesses of the gadget-obsessed, dysfunctional nerd. A technology won't necessarily be realized just because it is possible, especially if it violates our sense of what is right. And anything that depersonalized us, made us less individual, would go against the grain of human nature.

  Up until this current moment, whether we were in a medieval court or the modern Amazonian rainforest, we would always have had a sense of identity: most of us adult human beings feel our individuality very keenly most of the time. We are aware that we have a mind that is like no one else's, that we see the world in our own special way. Unless we are swept up in a strong sensual or sensory experience where we, tellingly, ‘lose our minds’ or ‘let ourselves go’, we are continuously conscious of our selves as distinct entities. This individuality is, for most of us, our most treasured asset. After all, the dystopias featured in the two great novels of the mid-20th century are so chilling to encounter not because of any gadgets that they catalogue but precisely because they threaten the sense of self.

  Aldous Huxley's Brave New World paints a future dominated by, as we would now see it, genetic engineering. Society is carefully stratified into levels of humans genetically destined to have different abilities. George Orwell's Nineteen Eighty-Four threatens individuality by breaking down the privacy of the inner world, the mind. Citizens are under almost constant surveillance, and the plot thereby anticipates the possibility of a new information technology for monitoring body and brain processes that invites not only third-party scrutiny but also, as a logical consequence, third-party manipulation.

  When both these books were written, biotechnology, information technology and the science of quantum theory were still in their infancy. Half a century or so later the idea of manipulating genes, and even manipulating brain processes with implants, is no longer science fiction. Many of the diverse developments that could soon be transforming all aspects of our lives are repugnant basically because their misapplication could lead eventually to the loss of the sense of self as a distinct entity.

  Until now it may have been easy to think, as the Cynics still do, that simply being human, ‘knowing our own mind’, is the most appropriate and effective bulwark against the technology threatening to engulf us. But we have seen that our grandchildren are destined to have different fears and hopes from ours, or even none at all, engendered by different influences on their brains. The more we learn of the exquisite dynamism and sensitivity of our brain circuitry, the more the prospect of directly tampering with the personalized brain, the mind, becomes a distinct possibility. And the big issue is this: these influences have the potential to be far more pervasive and more direct than anything that has gone before, even beyond the modern solace of punching the switch of the bedside lamp, a simple yet far-reaching invention, to instantly expunge a nightmare. At the technological limit, then, just how robust will human nature turn out to be? Before we can answer this question we need first to agree on what human nature actually is.

  Until now, human nature has often been invoked to ‘explain’ illogical or irrational behaviour – that undefinable something about being human, sometimes asserting itself against all odds. Perhaps the collapse of the Berlin Wall is one example; on a smaller scale, the personalization of a bleak cubicle in an open-plan office with postcards and photos, and the increasing popularity of farmers' markets, stand testimony to ‘human’ requirements in our lives, beyond the technological and merely functional.

  I found another example of how the David of human nature has restrained the Goliath of modern technology in a new hospital that I visited recently in Oslo. As we looked down from one of the gantries spanning a broad corridor my host explained that the whole design was meant to resemble a street. The hospital was like a village with a main street, the corridor, where staff, patients and visitors could wander, browsing in the shops and cafés along the way. Particularly sensitive to our natural disposition was the way that this corridor was deliberately designed to curve rather than run in a functional straight line – simply because streets in villages have never been dead straight. The effect was indeed remarkably reassuring and comforting in what could otherwise be perceived as a frightening, impersonal atmosphere. And let's face it, it is the impersonal and the ‘depersonalizing’ large-scale that scares us – which is why some of the predictions in this book tend to be so distasteful. So how might the personal, the individual-sized human factor, be described in biological terms?

  Over a hundred years ago Charles Darwin propounded the theory that emotions are universal, and therefore surely part of what we could now call human nature. Following on from this ground-breaking insight the psychologist Paul Ekman, of the University of California Medical School, has much more recently classified human facial expressions as revealing any one of six basic emotions: fear, surprise, anger, happiness, disgust and sadness. One of his seminal studies was of an isolated, pre-literate Stone Age culture that had had no contact with television, radio or any other lines of communication with the modern world. Ekman showed members of this isolated society photographs of people with the basic range of facial expressions, and instructed them to point to the one ‘where the person is angry’ and ‘about to fight’, and next to indicate the person who has ‘just learned his child has died’. The responses were the same as they would be in our Western culture, and were the same in every society where the study was conducted.

  This recognition of facial expressions, common to all humans irrespective of their culture, is clearly deeply ingrained, and occurs automatically. Ekman can detect tiny changes in the ‘fight or flight’ systems of the body – changes in heart rate and blood pressure, sweaty palms – when he shows his subjects the six basic emotional expressions; amazingly, these reactions occur even before the subject has consciously recognized the actual identity of the individual face.

  Although these feelings are common to all adult humans, they are arguably not all exclusive to our species. True, disgust is hard to identify in animals, even in chimps, but then neither does it seem to be present at all in small children. For example, infants have no reservation about eating chocolate that in appearance and colour resembles faeces or drinking urine-coloured apple juice from a (pristine) bedpan. Once beyond a certain age, however, these behaviours are simply unacceptable and are met with, yes, disgust.

 
