Our Own Devices: How Technology Remakes Humanity


by Edward Tenner


  In popular as well as celebrity culture, glasses have been revalued. In the technological boom of the 1990s, plastic-rimmed taped glasses were praised for their “nerd chic.” At midcentury, the spectacles of Piggy in William Golding’s Lord of the Flies (1954) have the vital power to start fires, yet Piggy himself is a born victim. In J. K. Rowling’s Harry Potter book series (beginning in 1997), a bullied child instead becomes the protagonist, and Harry Potter-style round glasses have been revived for fans.

  Glasses are for protection as well as reading, in large part as a positive unintended consequence of military research in the 1940s. Early plastic lenses made of polymethyl methacrylate (PMMA), a material marketed under the trademarks Perspex, Plexiglas, and Lucite, scratched easily. But during World War II, the sheets of glass in some U.S. bombers were held together by a bonding compound called CR-39, developed by Pittsburgh Plate Glass, to reduce weight and increase the planes’ range. An optical researcher, Dr. Robert Graham, was able to experiment with CR-39 after the war and developed technology to turn the difficult, sticky raw materials into lenses. Meanwhile, the U.S. government was mandating higher breakage-resistance standards that added to the weight of optical glass, while chemists in the late 1970s developed long-sought scratch-resistant coatings for plastic. The result was that CR-39 became the preferred material and captured most of the lens market by the 1980s. Glass became a premium material because of the extra step needed to bring it to safety standards. (Graham later used his fortune to establish a eugenically inspired fertility program called the Repository for Germinal Choice, instantly dubbed the Nobel Sperm Bank by the press.) The search for protection did not end with CR-39; polycarbonate, a more expensive material with superior impact resistance, has since become the standard for sports protection. Many factories mandate polycarbonate safety glasses for employees whether or not they need vision correction: hard hats for the eyes.39

  Military aviation also helped create the modern sunglass industry. Tinted glasses had existed in China, and in the West at least since the Renaissance—a group of friars in Don Quixote are wearing them on their travels—but were developed scientifically only in the twentieth century. Bausch & Lomb originally produced Ray-Ban green goggle lenses in the 1920s for U.S. Army aviators troubled by glare reflected from clouds. During the Depression, when plastic sunglasses cost only a quarter, Bausch & Lomb put the new glass into plastic frames and began selling them under the trademark Ray-Ban. In place of the nineteenth century’s optimistic valuation of the sun, postwar research substituted far more sobering warnings about ultraviolet light, which have made sun protection almost as important in outdoor eyewear as shatter resistance.40

  Meanwhile, a second trend was growing. Instead of supplying optics as externally mounted protection for the eyes, it aimed to augment or even reshape the surface of the eye. Like the improvements in worker seating that we have observed, this work originated in wartime necessity. An English ophthalmologist, Harold Ridley, operating on Royal Air Force fighter pilots injured in battle, discovered that their eyes had not rejected the slivers of PMMA from the cockpit hood. He wondered whether this plastic could replace the lenses removed from elderly patients when cataracts, areas of hard and cloudy dead cells, had grown excessively. Since cataracts are one of the world’s leading causes of blindness, and conventional cataract surgery required patients to use special thick glasses afterward, Ridley risked his reputation and incurred the scorn of most of his British colleagues to develop a new operation, the intraocular lens implant. Since the first procedure in 1949, 200 million people worldwide have received artificial lenses. Meanwhile, new plastics developed from the 1930s to the present have made possible a technology proposed three hundred years earlier by René Descartes, the contact lens. For the first time, laypeople were applying medical devices to the surfaces of their own eyes, and the eyeball lost something of its sacred character. Optometrists, long separated from MDs in part by their lack of training in physical contact with the eye, now began to cross a psychological boundary.41

  It is only a step from the contact lens to surgery that actually reshapes the cornea—most often to treat nearsightedness, but also for farsightedness and astigmatism. The most popular procedure, Lasik (“laser in situ keratomileusis”), was approved by the U.S. Food and Drug Administration in 1995 and now is performed on an estimated million patients a year at a cost of $2.5 billion. A special knife cuts a thin circular flap in the cornea; folded back, it exposes tissue that is then reshaped by a computer-controlled laser. The paradox of Lasik (and, to a lesser extent, other refractive surgery) is that because it has a generally high rate of satisfaction and offers unusually speedy and visible results to most patients, it has become the most common elective surgery in the United States, so that even a small proportion of complications and disappointing results (such as impaired night vision) affects tens of thousands of people. Yet both this dissatisfaction and the joy of successful results arise from the same unhappiness with the eyeglass as an external technology. The technology of the body seems to follow a trend from heavier to lighter and more flexible external devices, to even lighter ones worn against the body, and finally to the reshaping of the body itself. Thus, as Valerie Steele showed in an exhibition at the Fashion Institute of Technology in 2000, fashion has not so much rejected the corset in the twenty-first century as substituted diet programs and exercise machinery for fabric, whalebone, and steel.42

  THE REVENGE OF MYOPIA

  Even if a safe surgical procedure is developed, there will probably always be a strong market for eyeglasses. Surgical implants to correct presbyopia are still at an early experimental stage and are based on an unconventional theory of the condition’s development. Contact lenses for presbyopia likewise remain at an early stage, and many people consider even soft contacts uncomfortable, so that eyeglasses—perhaps using new materials but following principles established hundreds of years ago, as musical keyboard layouts do—are likely to remain the most common form of vision correction.43

  Correction will be needed. The twenty-first-century information society, already visible in its outlines a hundred years ago, continues to affect vision. Myopia appears to be spreading in the United States and other industrial nations. It was estimated in the early 1970s that 25 percent of the population between twelve and fifty-four years old was myopic. A 1996 study in Massachusetts, though, revealed a 60 percent rate in the twenty-three- to thirty-four-year-old bracket, declining to 20 percent among state residents older than sixty-five. Contact lenses help conceal the true rate even as they increase the cost of corrected vision.44

  The causes of myopia are still debated. Genetics influences the likelihood of developing the condition; children with two myopic parents are over six times more likely to be affected than those with at least one nonmyopic parent. But environmental influences are even more striking, and epidemiologists have found strong links to schoolwork. Among children enrolled in U.S. Orthodox Jewish secondary schools, males studied sixteen hours each school day—presumably this includes homework as well as instruction—and had an 81.3 percent myopia rate; females, who studied eight hours, had a 36.2 percent rate. (Among Jewish children attending secular high schools, with a six-hour study load, 27.4 percent of males and 31.7 percent of females were nearsighted.) Asian societies with rigorous school programs are similarly affected. In Hong Kong 75 percent of high school students and 90 percent of college students are now myopic. In Singapore, fully 98 percent of medical students are nearsighted; the national air force can find few visually qualified pilot recruits. Yet there are also puzzles in the environmental history of nearsightedness. Iceland has long been both homogeneous and literate; why should its rate of myopia have increased from 3.6 percent in 1935 to 20.51 percent in only forty years?45

  Singapore, with its social discipline, high literacy, precision industries, and educational drive, recalls the values of nineteenth-century Prussia. Seemingly repeating, with a vengeance, the causes and effects that so alarmed Dr. Hermann Cohn, it is now believed to have the world’s highest rate of myopia. A new complication has been added by the research of a biologist, Joshua Wallman, whose experiments with animals suggest to him that when children wear eyeglasses to correct myopia they may inadvertently be changing the ways in which the eyes grow, actually elongating them further. This idea still has not been proved or disproved.46

  What is certain is that even a skill as abstract as literacy has an unexpectedly strong physical aspect. In the history of humanity, our attention has shifted from the horizon to the length of our own arms: the printed page or the electronic monitor, or at the farthest the television screen. As with other technologies of the body, in changing our world we have changed ourselves—and not only ourselves. The Western metaphor of the Creator as watchmaker, popular ever since the Enlightenment, suggests even the Deity squinting through a lens, as though remade in our own image.

  CHAPTER TEN

  Hardheaded Logic

  Helmets

  IF THERE IS a distinctively twentieth- and early-twenty-first-century body technology, it is paradoxically one of the oldest external modifications of the body, possibly older than the chair (and indeed sometimes used for seating): the helmet. Of all the familiar objects we have observed, it was the helmet that appeared anachronistic for most of the eighteenth and nineteenth centuries. It was never entirely abandoned in military service, but for years it remained the specialty of firefighters and some police forces.

  Helmets, more than any other technology, defy conventional chronology. They seem to evolve like metallic and polymeric crustaceans, but not conventionally; a form may disappear for a thousand years and then reappear on a new branch. Another may keep its shape but change materials and habitats. Medievalists have advised the commanders of industrial armies, and armorers from dynasties of European craftsmen have helped tool up for new designs with classic jigs and hammers. Helmets represent risk aversion and aggression, ethnocentrism and cosmopolitanism, and sometimes both sides of each dichotomy at once.

  In itself the helmet is distinctive because it does not increase comfort or improve performance as some footwear does, and as chairs do for people accustomed to them. It does not simplify learning or communication, as musical and text keyboards do, nor does it augment or extend our senses, as spectacles do. The helmet sacrifices comfort and even performance to protection. It increases the mass of the human head by as much as two or three kilograms, impedes cooling in hot weather, and often limits both peripheral vision and hearing. Human skull shapes and sizes vary so much that fitting troops has always been challenging. And with discomfort comes the risk that soldiers will remove their helmets and expose their heads to the enemy.

  The helmet is also radical apparel. It is the only rigid prosthesis that able-bodied men and women now routinely use—an exocranium. Humanity has evolved to discriminate keenly among faces, to see subtle nuances of feeling in changing expressions. Helmets are not just mechanical shields but symbolic frames, like spectacles. A human being wearing one acquires a new persona. The Greek hoplite peering from the depths of his Corinthian helmet, the medieval knight in his sallet, the World War I-era German in his metal helmet, the World War II GI in his almost spherical pot, the astronaut, the patriotic hard hat, the deep-sea diver: heroes and villains almost become their helmets. And even flawed designs can build cohesion and morale by evoking shared values.1

  TAKING COVER

  Peoples all over the world have probably made head protection from organic materials such as shells, vegetable matter, and layers of cloth, but few objects have survived. Some helmets, like those made by the Ibo of West Africa, helped make warfare more a ritual sport than the carnage of most Mediterranean and Near Eastern conflicts. Other peoples must have developed effective armor because their wars were deadly serious. As recently as the early twentieth century, the Gilbert Islanders of the South Pacific were known for their spears studded with sharks’ teeth and for correspondingly durable armor, which included the skin of a puffer fish killed while inflated: its natural spikes protected the wearer while intimidating at least the inexperienced foe.2

  The first metal helmets probably appeared in the third millennium B.C. in the Middle East; they were linked to the development of complex, literate societies that conducted organized warfare. Head protection was part of the earliest version of the arms race, not yet between swords and shields but between maces (weapons with weights at the end of handles) and helmets. The earliest protection consisted of leather and felt caps, which in Homeric times were further protected by boars’ tusks. There must have been layers of cushioning under the shells of even the simplest metal helmets, for comfort and heat transfer. An industrial study of the 1960s confirmed the value of head protection—and showed the link between the oldest conflicts and recent industry. A metal helmet spreads the force of an impact over the surface of the indentation the blow makes. On the basis of this research, the military historians Richard A. Gabriel and Karen S. Metz have calculated that a Sumerian helmet consisting of two millimeters of copper over four millimeters of leather could spread the force of an impact so that knocking an opponent unconscious required superhuman strength.3

  As Gabriel and Metz observe, defensive equipment—helmets and body armor—drove the development of early offensive weaponry, not the other way around. The mace, despite occasional revivals as late as the European Middle Ages, was considered an ineffective weapon in combat against helmeted men, and became the ceremonial object it remains. In the early third millennium B.C., the Sumerians had already developed excellent metal helmets with protection for the ears and the back of the neck, much as the contemporary NATO helmet has; and the same people developed a new style of ax to pierce such armor, with a sharp, socketed copper blade into which a haft fit securely.4

  In Mesopotamia, with closely spaced rival city-states and peoples sharing a military culture, these innovations spread quickly. Even Egypt largely abandoned the mace for the ax after invasion by the helmeted Hyksos around 1700 B.C. Throughout the ancient world, the helmet became indispensable for most forms of combat. In the second millennium and the early first millennium B.C., the Assyrians became the first troops to be equipped with helmets on a massive scale. The helmet, usually with a conical shape ideal for deflecting downward blows, was part of the image of the Assyrian army, probably the world’s first dreaded fighting machine—all the more remarkable given Assyria’s short supplies of metal and fuel. (The Assyrians were also the first ancient army to protect feet as efficiently as heads; they wore tall leather jackboots with iron-studded soles.)5

  The Greeks were relative latecomers to the manufacture of metal helmets, yet they rather than the peoples of the Middle East are considered the founders of the Western tradition of armor. Aesthetically and technically, their achievements were as impressive as those of the Sumerians and Assyrians. Around the middle of the eighth century B.C., they developed a new style of warfare and distinctive arms and armor to accompany it. The Greek infantry soldier was typically a free farmer-citizen with a heavy civic and financial investment in a panoply of arms. (For the Greeks, armor’s essential component was not the helmet but the great shield or hoplon, of bronze-reinforced wood.) The best known Greek armor is the Corinthian helmet, first documented about 750 B.C. It reflected exceptional craftsmanship, conforming closely to the head and sweeping forward to form integrated cheekpieces so that only the wearer’s eyes and throat were even partially visible.6

  The protection of helmet and shield was cumbersome as well as costly. The classical historian Victor Davis Hanson has speculated about how the features of the Corinthian helmet, which prevailed from about 750 to 500 B.C., affected ancient Greek tactics, in other words, how a technology helped shape techniques. The helmet had no earholes and allowed only a limited range of vision. These restrictions encouraged the rigidly stylized form that Hanson believes combat took: two formations of heavy infantry several ranks deep, protecting each other with their giant shields, charging in formation until the two opposing masses collided on an open field, pushing each other with their shields and thrusting their spears to strike at throats and other gaps in armor. Only elementary commands could be heard, and all battles were fought in the helmets’ claustrophobic darkness, the pressure of comrades on all sides replacing vision. The helmet weighed about 2.2 kilograms—the same as a U.S. infantry helmet of the Vietnam era—but lacked twentieth-century suspension systems. A blow could knock it off or even break a vertebra if the head snapped backward. Battles were fought in the Greek summer heat; hoplites must have perspired profusely, yet martially correct hairstyles were long rather than close-cropped. Horsehair crests were as much for display and morale as for absorbing shocks. They added unwelcome weight and reduced stability by raising the helmets’ center of gravity. Only immediately before battle did soldiers lower the Corinthian helmet over their faces.7

  Many scholars believe there was more individual combat, sometimes without helmets or armor, than Hanson and others acknowledge. Hanson himself suggests that like later soldiers, hoplites modified and personalized helmets and other equipment, and that helmets may have been individually recognizable to members of a unit. But the Corinthian helmet still evokes the terror as well as the glory of combat for good reason.8

 
