The Blind Giant


by Nick Harkaway


  Giddens also discusses theorist Jean-François Lyotard’s idea that the post- or late modern era is what follows the collapse of grand narratives: over-arching notions of human development and the human place in the world which have been our context for as long as humanity has been capable of abstract thought, but which are vulnerable to the inquisitorial mentality of Enlightenment analysis. Ask enough questions about a grand narrative, and it comes down to a viewpoint, rather than a fundamental truth. Religious prophecies, Marxist historical materialism – even the Enlightenment itself, with its untested and unfalsifiable belief that rationality and science would lead inexorably to a better world – have given us ways to locate ourselves in the universe, to understand what we are. With them gone or fading, we have nothing left between us and the raw world. We are like a man who has worn orange spectacles for years, and, removing them, has discovered the rest of the spectrum.

  It’s worth considering the idea that we look at the digital technologies we have made and blame them for the changes in our society because we need a fresh narrative. The Cold War, which provided the backdrop to my early life, gave way to the War on Terror, perhaps itself an attempt to create a simple, bipolar worldview. But the War on Terror is problematic, fraught with the discovery of our own ill-doings and undermined by the invasion of an unrelated nation – Iraq. Worse yet, even a cursory examination of its causes takes the investigator back into the realpolitik compromises of the Kissinger era, when we chose to endorse strongmen over democrats who might lean towards Moscow, and when we backed the mujahideen in Afghanistan against the Soviets, incidentally creating the basis of many of the organizations which today are opposed to our versions of democracy. The iconic image which undercuts the storyline we are encouraged to accept – that Bad Men came out of the desert and did Bad Things to our Free World – is the now-infamous picture from 1983 showing Donald Rumsfeld shaking hands enthusiastically with Saddam Hussein.3

  The world we live in is governed not by a single narrative, but by a multitude, a Babel of opinions and priorities, of manifest destinies and lofty goals, corporate agendas and private conspiracies. And it is the nature of the Internet and the social media realm that they reflect this. They are not a separate place, but a statement of our world’s identity. The bewildering complexity of the Internet and of our communications is a map of us, and the fact that it seems to intrude is as much because we have reached out to the world as because the world has come to us.

  The feeling of information overload we have – in some ways quite understandable, because the world is more complex, richer in conceptual information than ever before – is derived not from digital technology, but from our encounter with a world whose patterns and tides are increasingly apparent to us through the lens of that technology, and which carries a weight of history and increasing complexity that we are only now beginning to appreciate. It’s not that computers made the world more difficult to understand. It’s that the world – which, yes, is shaped in part by our technology and our scientific understanding – is difficult to understand, and now we know. As we extend ourselves into the world, we are vulnerable, and we need to develop coping strategies. But we need to be there, because the alternative is a hermetic, head-in-the-sand existence which refuses to engage with real problems and lets them pile up.

  That being the case, what are the real battlegrounds of the digital world and the society which sustains it?

  blindgiant.co.uk/chapter2

  3

  Peak Digital

  This is the High Baroque period of the digital culture. Everything that can be is being digitized, the weight of commerce and received wisdom (at least some received wisdom) asserting that this is progress, and very much for the best. Digital technology is going into everything: water bottles, car keys, pets and running shoes. You can wire your house so that it can report to you on its energy use, humidity and security. You can buy a library of digital books, and query the author directly via Kindle’s built-in @author feature. Queues form outside Apple’s flesh-and-blood stores for the release of each iteration of its products as if Megan Fox and Jack Nicholson were working the red carpet. Users become tribal over iOS (Apple’s mobile system software) and Android (Google’s competitor with the smiling robot face): which is better, which is faster, which ecosystem is more authentic? We habitually replace our desktops every three to five years because the capabilities of the new machines are so far in advance of the ones we have (whose capabilities most of us do not use) that the software no longer fits into the impoverished motherboards of the older computers.

  We’re like the proverbial man with a hammer: every problem looks like the kind you can solve by hitting things. Commercially and culturally, we are herded towards shiny consumer devices – and stampeded, it sometimes seems, to new formats that replace last year’s new format, accumulating a plastic and silicon junkyard of defunct devices and those infuriating proprietary chargers, none of which fit other almost identical products, each of which costs some ridiculous amount of money to replace when you leave them in a hotel.

  It is not sustainable.

  We live in a world of finite resources, and our digital toys and tools entail, just as much as the auto industry, our dependence on petroleum from unstable and oppressive regimes, and on scarce minerals from what are presently called ‘failed states’. A fine example of the latter is coltan, the mineral from which we derive tantalum, a material vital for the production of electronic capacitors and hence for mobile devices and computers. Coltan mining has been implicated in strife in Congo since the 1960s. A UN report from 2004 states: ‘Illegal exploitation remains one of the main sources of funding for groups involved in perpetuating conflict.’ The report is accompanied by a stark triangular diagram: coltan exploitation, conflict and arms trafficking are locked in mutual facilitation. The centre of the triangle is simply ‘impunity/insecurity’.1

  Consumer electronics, it would seem, are like blood diamonds, and our hunger for them, twinned with our unwillingness or inability to change how the system feeds horror upon horror, allows the continuation of appalling violence in Congo. Understand that the prevalence of rape in eastern Congo – as a weapon of terror, not as a civilian crime – was adjudged the worst in the world in 2007. A UN report from May 2011 acknowledged that very little progress had been made in stemming the violence in the region, and described the levels of human rights abuse as ‘alarming’. The term ‘failed state’ suggests misleadingly that ‘failure’ is a static thing, a noun. It’s not. It’s a continuing action – and it’s ours at least as much as it belongs to the inhabitants of the region. The incidental price of Congolese coltan for the world is apparently local atrocity. This is precisely the sort of information that streams into our extended hearth from the digital, connected world. The right answer, obviously, is not to ignore the information, but to solve the problem – and the only way to do that is to begin to take ownership of the consequences of our trade.

  If the humanitarian cost of the consumer digital culture is not enough, the onrushing peak oil crisis and the environmental situation surely must give us pause. There are still, of course, those who don’t accept the reality or seriousness of anthropogenic climate change. Theirs is a position for which I have great emotional sympathy, but as far as I can tell it’s scientifically bankrupt. The climate is changing and the consequences will ultimately be dire. Even if that were not the case, the acidification of the oceans is damaging our food supplies. On a purely practical level, it is clear that oil is finite and therefore is at some point going to run out, and within a humanly comprehensible span of time. If we’re not at peak oil – the moment at which supply begins to drop away as new fields are harder to reach and reserves are depleted, while demand continues to rise – then it will come within a decade or two, bringing inevitable shocks to the global economy and to our way of being. (The debate about peak oil is endless and circular and no one who is not a petrochemical geologist can truly assess the various claims. However, the occasional signs and portents we are allowed to see clearly are not good: in February 2011 a US diplomatic cable from the WikiLeaks cache appeared to suggest that Saudi Arabian oil reserves were 40 per cent smaller than had been believed; HSBC warned in May the same year that we may have no more than fifty years of oil left at current rates of consumption.)

  The always-on consumer society which idolizes digital goods is probably not stable in the longer term – at least in its present incarnation. We had a sobering warning in 2008 with the collapse of the sub-prime bubble, but we seem determined, most likely because it means getting richer on paper by doing nothing at all, to repeat that mistake. The latest area of experimentation with new financial instruments is the global food market: having seen what happened with property – as with nutmeg, tulips, South Sea shares and tech stocks before – we should all be rather unhappy about that.

  Thus, it might be fair to say that we are living in the peak digital era: the brief and impetuous flowering of digital technology during which we inhabit a fantasy of infinite resources at low market prices (because the other costs are concealed, just as – according to Raj Patel, author of The Value of Nothing – our economic and agricultural system conceals the genuine cost of a fast-food burger priced at a couple of pounds; the real cost may be in excess of a hundred).2

  And beyond that, truth be told, fashion is fickle. For the moment, digital technology is cool – at least up to a point. But it’s been the new new thing for a while, and there are newer things eyeing the throne. What if you could swallow a pill and change the way you smell, make your skin sweat beads of gold? Well, perhaps you can. A Harvard biologist and an Australian artist have announced that they are developing an orally administered perfume which responds to your body’s state of excitement and your genetics to produce a unique scent. And DNA is about to go DIY. To do DNA sequencing, you need access to a PCR (polymerase chain reaction) machine. They used to be expensive. Now, however, you can get a basic PCR machine for $600. With that kind of price tag, home biohacking becomes a genuine (and yes, somewhat alarming) possibility. Sure, you need a computer. But honestly, given the choice, which would you say was more exciting: writing code for an iPhone app, or for your goldfish or yourself?

  Finally, the mood in digital technology seems to me to be about seamlessness and integration. There’s a shift taking place from devices that do things to devices that connect you to the Cloud, to digital shopfrontage, to services. The trend in design is for minimalism and integration: it’s all about making the technology vanish into the background, leaving us free to work with the data, the content. Apple’s iPhone is not primarily a phone – that’s just one of the things it does – but a portal, a shopfront for the company’s massive media resources (or, rather, for the company’s role as sales agent for any number of digital items it does not actually produce). The drive is towards removing the sense of mediation from computer-mediated communications, to integrate systems so completely that the hardware vanishes into the world altogether, leaving people and an environment of manipulable information. When you notice your user interface, that means something has gone wrong. The physical world is being connected and integrated into the digital one, and vice versa. After a certain point, that very ubiquity guarantees a loss of the gloss of digital technology. It becomes a standard tool, and it’s hard to get anyone – except a specialist – excited about a power drill or a winch.

  I’m not suggesting that either fashion or energy problems will mean that we reject the computer – although if we really manage to deplete the planet, that could yet happen. Rather, digital technology will probably replace other, more resource-intensive options in some arenas, such as business meetings. As the cost of aviation rises and video chat technology improves, it seems likely that more and more business trips will go virtual. But at the same time our relationship with digital is already changing from awed to bored, and as we stop thinking about the technologies of the Internet and the mobile phone as new excitements, the systems they are part of and which have evolved around and on top of them will congeal. Hallowed by habit and buried under layers of commercial, governmental and social systems, they will be much, much harder to change. We will reach digital lock-in, and any bad habits and poor choices will be entrenched, just as our present ones are.

  The outcome of the tug between our love affair with digital technology and these various stresses – and others – acting on it probably won’t be clear-cut. As I see it, the polar opposite possibilities are digital ubiquity, in which every house and car is wired and almost all transactions, social and financial, have some digital element; and a retreat from digital technology into other areas, effectively brought on by economic factors such as the price of oil and rare earths and the advance of biotech. I’d hazard that even in the first case, we might still see a move away from the fetishization of digital technology, because it’s hard to fetishize the commonplace. Ubiquity could come to mean invisibility: a seamless integration of the digital world with the physical one in which objects report and can be queried for information about themselves, and buildings, vehicles and furniture go out of their way to be more agreeable to those who use them. Writer and futurologist Bruce Sterling coined the term ‘spime’ for an object which has a digital existence in this way, and whose history can be traced through the Internet, showing its location, its point of origin, its journeys.

  As more and more objects can be accessed in this way, or can communicate information to people nearby as they need it, the physical world and the digital one become more thoroughly intertwined. After a certain point, the oddity would be an object which belonged to the class of things which could reasonably be expected to communicate – a car or even a dustbin – but which could not. That point is closer than it seems; any number of items theoretically could already respond to a local request for information if they were set up to do so, and in fact we’re also tagging ourselves in this way, reporting our statuses to Facebook and geotagging (i.e. appending a GPS location to) many of our online interactions. Some tools allow us to do these things in real time: Apple’s new iPhone includes a ‘Find My Friends’ feature which allows users to let their friends check their physical location. There are already services which remind you to act in a given way based on your device’s understanding of your location in the world (for example: ‘buy milk’ if you’re at the market).

  But more likely, to my eye, is a hybrid outcome, an uneven patchwork in space and context. Hybrids – consequences of the interaction of two different strands of thinking and style – are common in our connected world, though you could argue that all the really interesting advances and fashions emerge from the encounter between two or more existing entities to create something new. A hybrid in this case would look a lot like now, only more so. In some countries, some objects would still be digitally inert, while everything else would be wirelessly reporting. In others, privacy laws or practical concerns would prevent full uptake. Some people would disable the system for their devices; others would deliberately feed in bad information. Some cultures and subcultures would simply shy away from digital technology, or disconnect it from the Net. Whatever happens, however, we are at this moment picking the basic tenets which will define our understanding of our interaction with digital technology – at least in the mainstream. Should technology be ‘free’ or commercial, ‘free’ or surveilled? And, not unrelatedly, what about us?

  It seems to me that we need to move towards our technology and get a better understanding of it, and how we work with it and where it comes from and where it takes us. I’m not worried that technology damages us; I’m concerned, sometimes, that the logic of our technologies – physical and systematic – takes us to places we do not need to visit and leaves us there.

  The human animal is a really cool piece of technology. If you have the chance to go and see the notorious Body Worlds exhibition whenever Gunther von Hagens next rolls it through your town, you should: it’s not the macabre and ghoulish spectacle it was made out to be when it came to London in 2002. Rather, it was the first chance I ever had to appreciate the mechanisms of the human body as extraordinary. In most people’s lives, the only experience they have of the body’s interior landscape is traumatic: a broken bone or an operation. The whole thing is frightening. Even pregnancy scans can give you the willies – especially as the fictional backdrop for all these moments is TV shows like House and ER, in which everything seems to be fine until you start bleeding from the eyes and it turns out you have a variant form of kuru which is transmitted in goldfish urine.

  But the human body beneath the skin when nothing is wrong is hugely impressive and, while squishy-looking and flesh-coloured, also beautiful. (It helps enormously not to have a soundtrack suggesting that the eye-bleeding is about to start.) Von Hagens is often criticized for putting plastinated human cadavers into positions suggestive of athletic leaps or games of chess, but in my view that simply showed them as what they were: human bodies. It wasn’t disrespectful; if anything, it was the opposite. This was a tribute to how amazing we are. I remember particularly a running man, the muscles and tendons which would have been moving his foot frozen in place by the plastination.

  All of which is an obvious preamble to what ought to be an obvious point: there are advantages to analogue technologies and to biological ones. Digital is just one of a string of options that all work, and our present fascination with it has as much to do with skilful marketing and pricing as it does with the technology itself. It is often the most convenient (especially at the moment, when digital is spreading into new arenas in an effort to join everything up), but not always. Sometimes, digital gear isn’t as good for a given job as something else – although that perception can raise eyebrows. By way of example, if you asked me to choose what sort of communications gear I’d take into the wilderness for a prolonged trip, I’d want a clockwork radio in there. A satellite phone, of course, is a hugely impressive bit of kit, but if something goes wrong with it I’m stuffed. With a clockwork radio, I might be able to take it apart and reassemble it – especially if there was a manual. Not so the sat phone. I won’t be popping down the mountain for a new SIM and a soldering iron.

 
