Tomorrow's People


by Susan Greenfield


  The second problem is the worldwide implication of genetic enhancement. Even if the process were cheap enough to be commonplace in the developed world, it is hard to see how developing countries, such as those in sub-Saharan Africa, could catch up with this technology when the majority of their citizens are still without access to clean water and have yet to make a phone call. Gregory Stock argues that we need not fear a shrinking of our human gene pool: even if there were as many as 1 million genetically altered babies born per year, that would still be only 1 per cent of births worldwide.

  But surely there's the rub. Imagine a world where a minority from the developed countries have no congenital physical or mental defects, and are moreover healthier, stronger, slower to age and higher in IQ than those from poorer nations. So little would the two groups have in common, so different would be their interests, agendas and potential, that they would scarcely interact. Or perhaps the more realistic risk is that they would interact, but in a situation where the first exploits the second.

  So might we, ultimately, see a speciation, a divergence of the human race into two separate species, the ‘enhanced’ and the ‘naturals’? Closer to home, if enhancement were a commodity just for the rich, for the stratum of society, say, that can currently channel income into private education, then might such segregation even occur within a single society, a return to the rigid class divide of old, only more definitive? The non-negotiable hierarchy of Brave New World might become a reality.

  But there is still a further step that Aldous Huxley didn't contemplate. There remains one final tweak to a world where we manipulate genes with such ease and sophistication: a ‘synthetic’ genome. Currently the concept of a ‘synthetic gene’ means the addition of a new gene to a string of existing genes. Here the innovation, compared with traditional, longer-term methods of breeding, is partly that the same gene can flourish and function when taken from one species and introduced into another. For example, a gene taken from an insect (natural in origin, though some call it ‘synthetic’ in its new context) can make grapevines resistant to bacterial attack. These alien genes produce proteins that kill the relevant bacterium, and have now been successfully inserted into an important variety of vine.

  But entire genes do not need to be transported: one spin-off of the genome project is the development of tools for manipulating individual components within a DNA molecule. Researchers can then construct any sequence of base pairs, the rungs in the ladder of DNA, and the ultimate goal is the precise, targeted placement of artificial gene segments. ‘New’ genes may originate from another species, as when a viral or bacterial gene becomes part of the human genome, or they may come about as the result of random mutations of the existing DNA structure. Such mutations might alter either the structure of a ‘normal’ gene or the sequence of genes within the genome. One way in which a gene could mutate, or be made to mutate, would be the release from a radioactive element of a high-energy particle that could enter the cell and collide with a strand of DNA. Such an encounter fractures the DNA, and although internal cell-repair mechanisms can usually reassemble the strand in question, sometimes several base pairs end up out of order or missing entirely. In any event, however they are produced, these new genes, spliced into the existing genome, are ‘synthetic’, though naturally occurring.
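  The ladder image can be made concrete with a toy sketch in Python (an illustration only, not a research tool): the pairing rule that ties A to T and G to C means one strand dictates its partner, and a repair that reassembles a fractured stretch out of order yields a new, ‘synthetic’ yet naturally possible sequence.

```python
import random

# Watson-Crick pairing: each "rung" pairs A with T and G with C.
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand: str) -> str:
    """The partner strand implied by base pairing."""
    return "".join(PAIR[base] for base in strand)

def botched_repair(strand: str, start: int, length: int) -> str:
    """Simulate a repair that reassembles a fractured stretch out of order."""
    fragment = list(strand[start:start + length])
    random.shuffle(fragment)  # several base pairs end up out of order
    return strand[:start] + "".join(fragment) + strand[start + length:]

original = "ATGGCGTTAC"
print(complement(original))            # TACCGCAATG
print(botched_repair(original, 3, 4))  # e.g. ATGTGCGTAC
```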

  The latest technology is challenging even that constraint. Already the known sequence of a particular bacterium's genome has enabled the Institute for Genomic Research in Maryland to determine the number of genes needed for survival – about half of them. These genes can now be synthesized and inserted into artificial membranes. It is a very real possibility that these ‘cells’ could then divide, effectively creating artificial life. If so, eventually one would not need human donors or a Gamete Marketing Board at all; the virtual child could be just that, made flesh. In the future we may be able to type out a wish-list sequence of base pairs, and an automated DNA synthesis machine will produce the relevant string of DNA to order; the challenge would then be to devise base-pair sequences that code for completely new compounds.
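  A hedged sketch of that wish-list idea: the synthesis machine and its interface here are imagined, but the codon assignments are taken from the standard genetic code. One would reverse-translate a desired protein into the string of DNA to be ordered.

```python
# One representative DNA codon per amino acid used below
# (from the standard genetic code).
AMINO_TO_CODON = {
    "Met": "ATG", "Gly": "GGT", "Ser": "TCT", "Lys": "AAA",
}

def dna_to_order(wish_list: list[str]) -> str:
    """Turn a desired amino-acid sequence into a DNA string to synthesize."""
    return "".join(AMINO_TO_CODON[aa] for aa in wish_list)

# The "order" a hypothetical synthesis machine would be given.
print(dna_to_order(["Met", "Gly", "Ser", "Lys"]))  # ATGGGTTCTAAA
```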

  One of the many hazards is that, once again, there may be unforeseen consequences; for example, many non-protein compounds may be expressed by reactions similar to those by which RNA directs the making of proteins. Ultimately it may be possible to trigger synthetic life forms capable of reproducing themselves, and to use them as ‘factories’ to produce the compounds we need. After all, it is the sequence of base pairs within DNA that constitutes the genetic code; any change will change the blueprint, and hence the product.
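  That last point, that any change to the base-pair sequence changes the product, has a classic real-world instance: the sickle-cell mutation in human beta-globin, where a single A-to-T substitution swaps one amino acid for another. The sketch below is illustrative only, using just the handful of codons it needs.

```python
# Subset of the standard genetic code: only the codons used below.
CODON_TABLE = {
    "ATG": "Met", "GTG": "Val", "CAC": "His", "CTG": "Leu",
    "ACT": "Thr", "CCT": "Pro", "GAG": "Glu",
}

def translate(dna: str) -> list[str]:
    """Read a DNA coding sequence three bases (one codon) at a time."""
    return [CODON_TABLE[dna[i:i + 3]] for i in range(0, len(dna), 3)]

# Opening codons of human beta-globin.
normal = "ATGGTGCACCTGACTCCTGAGGAG"
print(translate(normal))
# ['Met', 'Val', 'His', 'Leu', 'Thr', 'Pro', 'Glu', 'Glu']

# The sickle-cell mutation: one A -> T change turns GAG (Glu) into
# GTG (Val), and the altered blueprint yields an altered protein.
mutant = normal[:19] + "T" + normal[20:]
print(translate(mutant))
# ['Met', 'Val', 'His', 'Leu', 'Thr', 'Pro', 'Val', 'Glu']
```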

  As yet no one knows the size of the minimal genome for sustaining life. We humans have some 3 billion base pairs, whilst a virus, not taxed with the strain of an autonomous existence, can get by with a mere 10,000 bases. To date, the smallest genome known for an independently living organism – some 600,000 bases – belongs to the bacterium Mycoplasma genitalium. It makes sense, therefore, to start with bugs made from ‘artificial’ genes; Clyde Hutchinson, of the Institute for Genomic Research, is convinced synthetic viruses will be available within a few years.

  The problem previously has been that within a single chain of DNA there are hundreds of thousands of molecular rungs on the ladder – the base pairs – but until recently molecular biologists could join together only a hundred or so at most. Now, however, Glen Evans, Director of the Genome Science and Technology Center, University of Texas, has got around that problem by devising a method to eliminate ‘junk’ DNA. As a result, he has mapped out the way DNA is configured to create ‘synthetic organism one’ (SO1), a microbe. ‘SO1 will have no specific function but once it is alive we can customize it. We can go back to the computer and change a gene and create other new forms of life by simply pressing a button.’

  One application of this new technology would be designer bugs that infect target tissues, such as tumours, and then kill them. Or we could have our guts infected to produce vitamin C. But just imagine the perils, intentional or accidental, of infecting humans and wildlife. If that is not awesome enough, imagine what would happen if SO1 could feed and reproduce, and thereby exist independently. Scientists like Hutchinson and Evans would then have, in the eyes of some, lived up to the Dr Frankenstein image by creating new life. By manipulating and even creating genes, future generations will lead very different lives from ours, and indeed they will view life itself in a very different way. Even taking into account the highly indirect relation of mental trait to genetic provenance, it seems likely that dysfunctions such as depression or schizophrenia will be much rarer, not only because of screening and intervention at the level of the gene but also because daily life will be more homogeneous, further removed from reality, less left to chance events ‘out there’, more rooted in a cyber-existence.

  Yet a gene-driven reduction in mental anguish, as well as the almost certain alleviation of physical suffering, might mean that human nature will be sanitized. It is of course an age-old debate whether suffering ennobles and enables human beings to reach their true potential by coming to terms with the vicissitudes of life. What if that life were no longer highly individual? We have seen not only that generations to come might have a more limited and spruced-up gene pool but also that our manipulation of those genes could eventually break down the traditional stages of our life narrative. Soon there will be little reason to have more than one child with the same partner, and indeed each child could lay claim, in theory, to a variety of parents: the genetic donors, the egg donor, the surrogate mother and the parents who bring him or her up; each of these claims on a single child could be lodged by a different individual.

  Clearly, our attitude to life, and to living, could well be transformed. If everyone is healthy and mentally agile into old age, if we all have a homogeneous lifestyle, passively receiving incoming sensory stimulation, if sex and reproduction are utterly segregated, if anyone can be a parent at any age, or indeed if parents as such can be abolished by a combination of artificial wombs, IVF, and even artificial genes, then all the milestones that mark out one's life narrative will be removed: being a child in a nuclear family, being a parent, being a grandparent, coping with the unexpected, with illness and with ageing. We may not ever be successfully enhanced to be super-clever or super-witty or good at cooking, as some might imagine, but in the future we may all adhere to a physical and mental norm, not just through the direct manipulation of genes but also through a new type of life made possible by such intervention, a life where proactive individuality, an ego, is less conspicuous, less used, less abused and less needed.

  Some, such as Francis Fukuyama or the psychologist Steven Pinker, argue that human nature can survive changes in the environment, that there is an irreducible ‘factor X’ wired into our brains and bodies that makes us so special, and so different from all other species. But our bodies and brains are composed only of genes, the proteins they make, and the other molecules that those proteins make, which in turn make cells. This shifting landscape of chemicals will be influenced dramatically in the future by gene manipulation – not to mention manipulations of the environment. In any case, we have seen that human nature is inextricably caught in an endless dialogue with human nurture, so let's now turn to the prospects for 21st-century child-raising – education.

  6

  Education: What will we need to learn?

  Education is currently in crisis. As every year marks record successes in national exams so the protestations about dumbing-down become more shrill, whilst universities remain remote and expensive enclaves of an elite. Teachers are demoralized and parents angry and anxious. Given the existing culture of core curricula, pressure, audits, consultations and experimental new ideas, it is small wonder that there is no consensus on the bigger picture: what we should be teaching the next generation to equip them for citizenship in the mid 21st century, and beyond.

  The large-scale changes in our lifestyle that might become the norm before too many decades have passed raise fundamental questions about the point of education as we know it, and most importantly about the type of mindset that 21st-century education will create. If the environment is about to change so radically then so will our minds; neuroscience and neurology are offering a wealth of examples which all illustrate a basic yet exciting notion. The human brain reflects, in its physical form and function, personal experiences with supreme fidelity.

  Soon after his birth it was clear that there was something wrong with Luke Johnson. The little boy couldn't move his right arm or leg: he had been the victim of a stroke just as he was about to enter the world. But over the next two years the paralysis slowly, seemingly miraculously, receded. Luke now has completely normal movements – and he is far from unique. A staggering 70 per cent or so of newborn babies who have suffered disabling strokes in the peri-natal period regain mobility. The brain, we now know, is able to ‘rewire’ itself. It is this ‘wiring’, the connections between our brain cells, that makes each of us the individuals we are. However, the electronic metaphor of fixed hard-wired circuits is not really appropriate, since it misses the critical point: our connections in the brain are constantly changing, adapting to our experiences as we interact with the world and live out our personal set of experiences. So just how might the experience of 21st-century life leave its mark on the brains of upcoming generations?

  When a baby is born he or she has a far greater density of connections (synapses) between neurons than an adult does. However, neuroscientists cannot easily study, on a large scale, precisely how these connections configure in different regions of the human brain from one person to the next: the distressing conduct of certain cavalier and unscrupulous pathologists in the past, combined with the conviction that human body parts are an integral part of the deceased, now means that human brains are usually destined not for laboratories but for funerals; the valuable clues about mental function that they might contain are therefore locked away inside them and lost for ever.

  Yet from the limited numbers of autopsies of the human brain that have been possible, neuropathologists have discovered that synapses in the outer layer of the brain (cortex) relating to vision peak at about ten months; after that, the density slowly declines until it stabilizes at about ten years of age. But in another region towards the front of the brain, the prefrontal cortex, the formation of connections (synaptogenesis) starts conspicuously later and the subsequent pruning of those contacts takes longer than in the visual regions; in this case the density of connections starts declining from mid adolescence and reaches a plateau only at eighteen years old.

  How do these developments in the physical brain relate to the development of mental abilities? Everyone recognizes that the first few years of life are critical for the acquisition of certain capacities and skills. The most obvious deduction, therefore, would be that the number of synapses at any one stage of development is linked to the sudden appearance of some new ability. But the problem is this: we now know that these skills carry on improving, even after the densities of connections between neurons dwindle to adult levels. So what really counts, ultimately, in determining how information gains long-term access to our neurons? It turns out that, rather than brute number, the pattern of connections must be all important. Nature provides a plethora of synapses over the first few years of life, which grow as the brain does; these connections are long enough to connect up vast tracts of brain terrain. ‘Sculpturing’ (a clichéd but highly appropriate label) then occurs, whereby supernumerary contacts disappear – and just as a statue emerges from a block of stone so a unique brain takes shape. But unlike a stone statue, the individual pattern of connections that make up your brain remains highly dynamic. As you experience each moment of life, each event will exaggerate or blur some aspect of the overall design within your head.
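  A toy model, and nothing more, can illustrate the sculpting principle: overproduce connections, strengthen the ones experience actually uses, prune the weak surplus, and what survives is a pattern shaped by use rather than a raw count of synapses. The numbers and thresholds below are arbitrary.

```python
# Toy "sculpturing": overproduction, use-dependent strengthening, pruning.
NEURONS = 8

# Overproduction: every neuron starts weakly connected to every other.
synapses = {(a, b): 0.1 for a in range(NEURONS) for b in range(NEURONS) if a != b}

# "Experience": some pathways are used repeatedly and strengthen.
experience = [(0, 1), (1, 2), (2, 3)] * 20
for a, b in experience:
    synapses[(a, b)] += 0.05

# Pruning: supernumerary, still-weak contacts disappear.
pruned = {pair: w for pair, w in synapses.items() if w > 0.5}

print(len(synapses), "initial contacts ->", len(pruned), "surviving")
print(sorted(pruned))  # the surviving pattern reflects what was experienced
```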

  In 1981 vision experts David Hubel and Torsten Wiesel won a Nobel Prize for an astonishing discovery: in the developing brain, it turned out, there are particular windows of time, ‘critical periods’, in which certain large-scale wiring occurs. One particularly moving example of critical periods at work in real life is the story of a small boy who, aged six, presented as a medical mystery: he was blind in one eye, even though the eye seemed perfect. Only after extensive questioning of the parents did it emerge that he had had a minor infection when he was less than a year old. In itself the condition was trivial but the treatment had involved bandaging the eye for several weeks, weeks that corresponded to the ‘critical period’ for the eye to establish appropriate contact with the brain. As a result, the unclaimed neuronal territory was invaded instead by connections from the working eye, so that when the bandage was removed there was no brain space left for the previously infected eye: it was useless therefore, and the boy was blind in that eye for the rest of his life.

  Usually, however, the windows of time are not as rigid nor are the effects of ‘missing’ them as irreversible. For example, there is a naturally occurring equivalent of the ocular deprivation imposed by the bandage. Sometimes babies are born with cataracts. Surgery can often help enormously and, of most relevance here, is often more effective when there has been a cataract in both eyes. If neither eye is working normally, and thus remains unstimulated by the ongoing act of seeing, then the brain territory will go unclaimed: there are no connections from a working eye to invade it. But after surgery, once the eye on each side is able to function, each will hook up with the relevant part of the brain. These observations show that some recovery of function is possible, even though certain time frames are clearly very important.

  The notion of key periods for development of a basic brain function like vision has led some educationalists to speculate that such windows of opportunity might exist for more rarefied activities, such as reading and arithmetic. As yet such a question is hard to address; so many different factors must contribute to literacy and numeracy that it would be hard to dissect, across a wide range of diverse individuals, the isolated single factor of age. Moreover, we are very likely to learn in different ways as we grow older. A young child will absorb any incoming information with scant sales resistance (remember the Jesuits' promise, ‘Give me a child until he is seven, and I will give you the man’). However, the older we become the more any experiences, including formal teaching, will be measured up and evaluated by the less accepting, maturing mind.

  These checks and balances that make up an individual's mindset have their root in the connections between brain cells. For the most part these connections converge onto zones on the target cell called ‘dendrites’, named after the Greek for tree. Dendrites do indeed resemble the branches of a tree, and, like trees, some neurons have more extensive branches than others. The greater the ramification of the dendrites, the more readily a cell will be able to receive signals from incoming neurons. The basis of brain growth is not an increase in the bulk number of neurons themselves; it is primarily a story of the proliferation of dendrites.

  These dendrites will configure in a way that reflects what has happened to you. A pivotal, classic experiment in rats shows that the post-natal environment exerts a massive influence in determining how widespread this ramification of connections will be. Scientists compared the effects of an ‘enriched’ environment, complete with rat toys – ladders, exercise wheels and the like – with the ‘normal’ lot of a lab rat – a warm home cage with food and water but little else. Post-mortem examination did indeed reveal that the brain cells of rats who had experienced enrichment had more extensive branches than their more typically housed counterparts. Although this is clear evidence of how the stimulation of everyday life can make a difference to the brain, neuroscientists nowadays are quick to point out that for a ‘normal’ feral rat ‘normality’ would be something more like the enriched environment, whilst the lab situation, sadly, is more truly akin to deprivation. Yet the findings are surely all the more chilling as a consequence.

 
