Tomorrow's People

by Susan Greenfield


  Since quantum systems allow so many possibilities to be explored simultaneously, they can outstrip our fastest conventional chips by a factor of around a billion. In practical terms, this means that a system of just forty atoms could track down a single phone number among all those in the world in about twenty-seven minutes – a task that would take a modern supercomputer a month. Needless to say, there are currently severe drawbacks to realizing such formidable computing potential. The biggest problem is that any collision between an atom in the quantum computer and a stray particle from outside counts as a measurement, collapsing the delicate superposition, and therefore the system has to be completely isolated from the external world. One way round this problem, though hardly a practical one, has been to use highly sophisticated and expensive techniques to cool the atoms to near absolute zero, thereby preventing endless random collisions.
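
  To see roughly where such figures come from, here is a back-of-envelope sketch – an illustration using the standard square-root speed-up of quantum search (Grover's algorithm), not a calculation taken from the book:

```python
import math

# Illustrative only: a register of n qubits can hold 2**n states in
# superposition, which is the usual basis for claims like the
# phone-number search described above.
n_qubits = 40
n_states = 2 ** n_qubits              # ~1.1 trillion entries

# Classical unstructured search examines ~N/2 entries on average;
# Grover's quantum search needs only ~sqrt(N) iterations.
classical_lookups = n_states // 2
grover_iterations = math.isqrt(n_states)

print(f"States representable by {n_qubits} qubits: {n_states:.2e}")
print(f"Average classical lookups:  {classical_lookups:.2e}")
print(f"Grover-style iterations:    {grover_iterations:.2e}")
print(f"Speed-up factor:            {classical_lookups // grover_iterations:,}x")
```

Even this idealized count ignores the decoherence and isolation problems just described.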

  A more realistic approach has been to use a technique – nuclear magnetic resonance (NMR) – that makes the nuclei of atoms behave themselves by pushing them into alignment with an externally applied magnetic field. The two alternative alignments of nuclear spin, parallel or antiparallel to the external field, correspond to two quantum states with different energies: the qubit.
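
  For readers who want the physics being invoked: the energy gap between the two spin alignments is the standard Zeeman splitting of a spin-½ nucleus – a textbook NMR formula rather than anything derived in the book – where $\gamma$ is the nuclear gyromagnetic ratio and $B_0$ the applied field:

```latex
E_{\pm} = \mp\tfrac{1}{2}\gamma\hbar B_0,
\qquad
\Delta E = \gamma\hbar B_0 = \hbar\omega_0
```

The qubit's two logical states are simply these two energy levels, addressed by radio pulses at the resonance (Larmor) frequency $\omega_0 = \gamma B_0$.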

  One objection is that atoms are only so obedient, even when marshalled by NMR, for a few seconds at a time. Even so, the systems built to date still permit some 1,000 operations to be performed at any one time. We have only just begun to develop this new technology, and can hardly yet appreciate its potential. But, as one quantum-computer expert has pointed out: ‘All along, ordinary molecules have known how to do a remarkable kind of computation. People were just not asking them the right questions.’

  The excitement over future quantum computers is not so much that they will replace silicon systems for word processing or email – in any case our future will probably not include such word-based communications. Instead, quantum systems would be used for large-scale applications, such as cryptography. It is a disquieting thought that no information on the internet would ever be safe again. But then, future generations may be much more habituated to living in the public eye, with privacy an outdated notion. Protection of individual data might simply be an unplanned consequence of the sheer glut of data in the cyber-world – so great that no one could be bothered to access it, even if they knew where to start: at the level of domestic lifestyle such snooping would be pointless and unappealing. More worrying, however, are the implications not so much for each citizen and their once-private life as for the commerce, politics and security of the nations in which they live: easy code-cracking and infiltration of secret material would obviously have devastating financial and military consequences.

  How might it feel to work with such powerful machines? We shall see later that a sense of insecurity will probably be the underlying tone of everyday life in the future. Generations to come may well be habituated to a lack of privacy, and take for granted the garnering of information over time frames that today would leave us breathless. Just as we are used to a computer culture that would have been impossible to comprehend in the mid 20th century, so the relatively modest shift in mindset to living with vastly more powerful computers should presumably be easy to make.

  By contrast, a basic but possibly catastrophic issue at the heart of the interconnected future of computers and work is the big problem of supplying fuel for those machines. The prospect of life without fuel is not some extremist eco-warrior's prophecy but a very real concern. Our grandchildren may well face up to a crisis where coal is too difficult and too sparse to dig up, nuclear fission considered too dangerous, and solar batteries too expensive; the bleak prospect of an effective return to the Middle Ages, to a life reduced to working, sleeping and eating with only the first two as certain.

  Remember that energy cannot be created, only transformed from a limited pool in the universe. And the traditional, easily accessible sources are running out fast. We are currently witnessing an exponential growth in output from Latin America, Asia and in particular China, which is adding some fifteen gigawatts of generating capacity a year. But according to World Bank estimates, developing countries alone will need 5 million megawatts of new generating capacity over the next thirty to forty years, whereas total world capacity at the moment is about 3 million megawatts. Of the roughly 250,000 kilocalories the average citizen of the developed world consumes in all forms each day, only about 1 per cent (some 2,500 kilocalories) goes on staying alive; the remaining 99 per cent is dedicated to making daily life more enjoyable.
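
  A quick arithmetic sketch of the figures just quoted – the numbers are the book's own; the code merely makes the ratios explicit:

```python
# Figures as quoted above (not independently verified here).
new_capacity_needed_mw = 5_000_000   # developing countries, next 30-40 years
world_capacity_now_mw = 3_000_000    # total installed capacity today

ratio = new_capacity_needed_mw / world_capacity_now_mw
print(f"New capacity needed is {ratio:.2f}x today's entire installed base")

# Per-capita energy budget: if ~2,500 kcal/day keeps a body alive and that
# is ~1 per cent of total consumption, the implied daily total is:
basal_kcal = 2_500
total_kcal = basal_kcal / 0.01       # 250,000 kcal/day in all forms
print(f"Implied total: {total_kcal:,.0f} kcal/day "
      f"({total_kcal - basal_kcal:,.0f} kcal beyond bare survival)")
```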

  As the need for power increases, reserves of fossil fuels – oil, coal and gas – are decreasing. They will be used up between 2020 and 2080 or, according to the American Physical Society, a few decades later, by 2100. Nuclear energy has a very bad public image, but the only answer for the longer term seems to lie in this direction. Yet even if public opinion could be persuaded that nuclear energy is not as unpalatable as many believe, uranium supplies are also dwindling. What will happen in the very long term? Although there is no simple plan B, the outlook is not completely bleak. Already there is more than the promise of a completely novel way of generating electricity – a different type of battery.

  We are all familiar with the batteries in radios, toys and electric toothbrushes, the primary cell that produces energy from chemicals sealed into it at manufacture and which cannot be recharged. Another type of battery is that used in mobile phones, or the lead-acid batteries in cars. These are secondary cells: they are rechargeable. However, there is a third type of battery: the fuel cell, which generates electricity from fuel supplied continuously from outside the cell. The fuel in question, usually hydrogen gas, is passed over one electrode and oxygen is passed over the other. The ensuing flow of electrons is, of course, electricity, whilst the subsequent combination of hydrogen and oxygen yields water (H2O) as a by-product.
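
  The chemistry being described is the standard pair of hydrogen fuel-cell half-reactions – textbook electrochemistry, spelled out here for clarity rather than taken from the text:

```latex
\text{Anode:}   \quad \mathrm{H_2 \rightarrow 2\,H^+ + 2\,e^-}
\qquad
\text{Cathode:} \quad \mathrm{\tfrac{1}{2}\,O_2 + 2\,H^+ + 2\,e^- \rightarrow H_2O}
\qquad
\text{Overall:} \quad \mathrm{H_2 + \tfrac{1}{2}\,O_2 \rightarrow H_2O}
```

The electrons released at the anode can reach the cathode only through the external circuit, and that flow is the usable current.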

  The basic principle of the fuel cell is hardly state-of-the-art high tech. The first battery of this type was built as long ago as 1839 by a Welsh judge, Sir William Grove. Prototype modern fuel cells already exist, and as such are now in limited use. For example, the space shuttle needed continuous energy, so a fuel cell was eventually developed that ran on hydrogen and oxygen; there were no pollutants, and the sole by-product could be drunk by the crew. So, since fuel cells do not run down, need no recharging and produce valuable water, they would seem to be the perfect means of future energy supply. But, as always with new technologies, the components are bulky and expensive, which prohibits their everyday use; a still more fundamental requirement is a cheap and easy source of hydrogen, the essential ‘fuel’ of fuel cells.

  Hydrogen has benign environmental characteristics, but it is hard to come by cheaply. It can be extracted from fossil fuels, but that would solve no problems. Another possibility is to split water into hydrogen and oxygen using electrical current (electrolysis) from other renewable energy sources: but what might those sources be? A completely different way of tackling the question, which breaks this vicious cycle, is to turn to biology.
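
  In outline – this is standard chemistry rather than anything specific to these proposals – electrolysis simply runs the fuel-cell reaction backwards, consuming at least as much energy as the cell later returns; that is the vicious cycle, unless the input electricity comes from a genuinely renewable source:

```latex
\mathrm{2\,H_2O \;\xrightarrow{\;\text{electrical energy}\;}\; 2\,H_2 + O_2}
```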

  One idea is to extract hydrogen from methanol, which could eventually come from genetically modified trees. Another alternative is proposed by Professor Tasios Melis of the University of California, Berkeley, who is modifying the properties of algae to produce hydrogen from sunlight and water. Melis has made the important discovery that sulphur deprivation dramatically changes the metabolism of the algae so that they can function without generating oxygen. Within twenty-four hours the algae become independent of oxygen (anaerobic), and in so doing activate an enzyme that produces hydrogen in the light. This ‘microbial electrochemistry’ approach has the appeal of harnessing biological sources rather than energy-inefficient and potentially toxic mechanical ones.

  Along different biological lines, Stuart Wilkinson, of the University of South Florida, has provided a very different alternative source of electricity with a prototype ‘gastro-robot’. As the name suggests, ‘Chew Chew’ gets his, or arguably her, energy from food – in this case sugar. In Chew Chew's container are E. coli bacteria which metabolize sugar, releasing electrons on one side of a fuel cell; the negatively charged electrons are drawn to oxygen on the other side, hence creating a flow of electricity. The immediate, obvious applications of Chew Chew's technology would be mowers and other farm machinery, which could use vegetation as food. Although there are still non-trivial issues of food location and identification, food gathering, chewing, swallowing, digestion (energy extraction) and waste removal, such a device – like the sulphur-deprived algae – at least holds the promise, given appropriate supplies of food, or of water and sunlight respectively, of generating electricity indefinitely.

  Future generations may well see ‘energy’ in a different light, if it becomes, one way or another, so closely linked to biology. From our early-21st-century perspective we might imagine that this would make everyone more energy-conscious; but if our successors know nothing else, taking bio-energy for granted might be yet another example of how a mindset is transformed because traditional divisions have broken down – this time between the physics of domestic appliances and living things.

  In any event, let's assume that, unconstrained by the laws of physics or energy shortages, IT will continue to shrink whilst becoming ever more powerful; how will it transform the way we work? The first and clearest impact for most of us will be in our physical surroundings. Entering your workplace and logging on to your computer will be very different: passwords will be long gone. Software such as ‘FaceIt’ already exists that scans some fourteen facial features that do not change. And once in your office you will be surrounded by interactive, smart devices…

  High-quality visual displays will be able to ‘pop out’ of small objects such as pens. Philips are already dreaming up ways in which the surface of a desk could ‘recognize’ tools placed on it. Such ‘active’ tools would include video-telephony, enabling you to communicate not just with sound but also with body language and gesture. Even the techno-equivalent of a 20th-century handshake might involve the placing of palms on the screen, thereby putting you in virtual contact with your interlocutor. Once-cumbersome monitors and processing units of PCs will be merged into desktops, whilst the currently state-of-the-art Bluetooth radio system for streamlining communications will evolve into a truly single physical system comprising voice- and email, mobile phone, fax, internet access, diary, word processor and video-conferencing facility.

  But as the Information Age matures we will not necessarily carry on working in exactly the same way as before, only more efficiently. Instead, the new information technologies will change not just how we work but what we actually do – and, most conspicuously, where. Thirty years ago the ‘new idea’ was the notion of modular office boxes with moveable walls. Another innovation was ‘hotel-ing’, whereby employees were assigned a phone and workplace on a day-to-day basis, as office nomads. Yet still we cling to the idea of an office, a physical place set aside essentially for talking – into dictaphones, into the phone or face to face. Even the current transitional phase, in which the boss dispenses with a traditional secretary because he or she can email or word process unaided, will eventually fade as voice-interface IT takes hold. A room set aside just for talking then seems increasingly odd. Indeed, the basic concept of the office is actually 150 years old, and may be about to become as obsolete as outdoor privies are today.

  Take, for example, IBM in Chicago. The company used to employ 10,000 on site; now the workforce is down to 3,500, of whom 80 per cent work from home, and the building itself is up for sale. Similarly, Motorola is planning to relocate 40 per cent of its workforce to their homes. One general prediction is that soon a third of the workforce will be doing gainful work from home. There will be a rise in virtual organizations operating more flexible, demand-oriented production networks. But as more people work from home the ‘beehive’ mentality of humans will surface as an increasingly important factor – the perhaps obvious human need to feel a valued part of a busy, thriving community, a need not met by living as an isolated hermit with no immediate incentives or constraints on performance. Perhaps, in the future, that very need might be muffled and muted by the habits of a life dominated by technology; or the technology might itself be so good that the need is met artificially. But until that transformation – either in human nature or in IT – is complete, there will be an uneasy tension between the obvious advantages of working at home and the feelings of depersonalization it could bring.

  In the meantime, not just the place and manner of work but also the type of work available is inevitably already changing as a result of the cyber-revolution. Michael Dell, when starting up his now hugely successful PC company, made the then daring decision to offer his product exclusively over the internet. In one simple step, the burgeoning internet economy is reducing the costs of transaction and of bringing a product to market. The old rationale for forming large companies – namely to reduce the costs of gathering in materials and economizing on other physical desiderata – is no longer sacrosanct; accordingly, we shall start to see an ever-increasing proliferation of alliances. The trend can only continue, therefore, towards a breakdown in monolithic organizations in favour of smaller, more virtual units that, although independent, network with each other.

  Of course, some companies will be larger than others, but the watchword in the knowledge economy will be ‘specialization’. Instead of battling other wannabe leviathans in a winner-takes-all competition, each organization will focus on doing only what it is best at, outsourcing the rest and networking synergistically with other small companies. Imagine a galaxy of small enterprises, with subcontracts, as satellites around a bigger company. The ideal, almost a caricature, company of the future would be all knowledge and no assets at all, except webs of flexible relations with suppliers and subcontractors.

  How will this change the thinking of the workforce of the future? One immediate result will be less competition, and therefore a cooperative culture, yet with less security, involving a much more frequent change of jobs. In fact, the concept of a ‘job’ as we know it may well disappear altogether. Driven by the just-in-time agenda on which small, high-risk businesses thrive, firms will perhaps bid for employee time almost on a day-to-day basis.

  Most certain of all is the prospect that this decade will deliver the final death-blow to the expectation, already on its last gasp in the late 20th century, of a job for life. We shall have to come to terms with more down-time unemployed or retraining, and will do so by taking ‘portfolio’ careers into our own hands, with little regard for corporate loyalty. This death of the ‘job’ concept and rise of the portfolio-toting freelancer is by no means limited to white-collar workers. But we should not underestimate the shift in attitudes that will be required for such changes to become universal. There will be a big shift in requirements towards personal and communication skills, rather than just intelligence. Management expert Tom Peters says that workers ‘must no longer count on corporate hierarchy for their careers or their identities’, but instead ‘must act like freelancers and entrepreneurs, building a portfolio of skills and accomplishments they can use to negotiate the next job’.

  The old structure of workers and bosses has been eroding for decades, and this erosion will soon be even more widespread. In 1970 it took some 108 men 5 days to unload a ship of a particular size; since containerization, a comparable task is now achieved by 8 men in a single day – a reduction in man-days of 98.5 per cent. The development of enterprise software is currently doing for the white-collar workforce what containerization, forklifts and robots did for blue-collar workers: the managerial sector may be in for a staggering 90 per cent reduction in the next ten years. With the loss of the low-skill, repetitive jobs that will be obsolete in the robotic age, and the rise of newly proactive other ranks, middle management will go. Not only will the rise in dotcoms offer the opportunity to cut out the middleman or -woman, but the trend for outsourcing will also cut labour costs and the attendant administration.
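
  The man-day arithmetic is worth making explicit – a trivial check using the figures just quoted:

```python
# Figures as quoted above: unloading a ship of a given size.
before_man_days = 108 * 5   # 1970: 108 men for 5 days = 540 man-days
after_man_days = 8 * 1      # post-containerization: 8 men for 1 day

reduction = (before_man_days - after_man_days) / before_man_days
print(f"{before_man_days} -> {after_man_days} man-days: "
      f"a {reduction:.1%} reduction")   # ~98.5%
```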

  E-commerce itself is ballooning as an industry – netting some $300 billion in revenues in 2001. IT has now grown to the size of the American auto-industry; almost half the workers in industry either produce IT or are intensive users of it. As well as causing massive reorganization in the workplace, the IT revolution has inevitably recast the nature of the skills required. As the US Department of Labor has indicated, you need only look at the job ads to see how rapidly the job market has changed, even within the last ten years. Only a decade or so ago there was still demand for typists, repair technicians and switchboard operators; now we search for webmasters and desktop publishers.

  Even within the professions, e-consultations and cyber-surgery may eventually oust flesh-and-blood lawyers and doctors; gradually these expensively and exhaustively trained individuals, imperfect repositories of knowledge, will give way to constantly updated robotic and computer systems that are far less fallible and can cater for every contingency. And outside of the professions one possibility is that people could become extensions of computer-driven production, a more high-tech and interactive dehumanization than the assembly line. Such operatives might become, like Silas Marner, distanced from the natural world, methodically working at a task which offers no personal satisfaction but demands rather ‘the bent, tread-mill attitude of the weaver’.

  Then again, job descriptions could become so flexible as to be meaningless. The age of just-in-time operatives, geared to meet the needs of just-in-time production, will be upon us. For most of the next generation, flexibility in learning new skills and adapting to change will be the major requirement as they work their way through smaller companies – 99 per cent of UK businesses in 2001 employed fewer than fifty people. Once again, the trend continues away from the notion of the ‘job’ towards setting your own agenda: you'll then get on with doing whatever needs to be done, in whatever way seems best, with whatever skills you have – and those you don't have you'll outsource.

 
