The Glass Cage: Automation and Us

by Nicholas Carr


  A medical exam or consultation involves an extraordinarily intricate and intimate form of personal communication. It requires, on the doctor’s part, both an empathic sensitivity to words and body language and a coldly rational analysis of evidence. To decipher a complicated medical problem or complaint, a clinician has to listen carefully to a patient’s story while at the same time guiding and filtering that story through established diagnostic frameworks. The key is to strike the right balance between grasping the specifics of the patient’s situation and inferring general patterns and probabilities derived from reading and experience. Checklists and other decision guides can serve as valuable aids in this process. They bring order to complicated and sometimes chaotic circumstances. But as the surgeon and New Yorker writer Atul Gawande explained in his book The Checklist Manifesto, the “virtues of regimentation” don’t negate the need for “courage, wits, and improvisation.” The best clinicians will always be distinguished by their “expert audacity.”24 By requiring a doctor to follow templates and prompts too slavishly, computer automation can skew the dynamics of doctor-patient relations. It can streamline patient visits and bring useful information to bear, but it can also, as Lown writes, “narrow the scope of inquiry prematurely” and even, by provoking an automation bias that gives precedence to the screen over the patient, lead to misdiagnoses. Doctors can begin to display “ ‘screen-driven’ information-gathering behaviors, scrolling and asking questions as they appear on the computer rather than following the patient’s narrative thread.”25

  Being led by the screen rather than the patient is particularly perilous for young practitioners, Lown suggests, as it forecloses opportunities to learn the most subtle and human aspects of the art of medicine—the tacit knowledge that can’t be garnered from textbooks or software. It may also, in the long run, hinder doctors from developing the intuition that enables them to respond to emergencies and other unexpected events, when a patient’s fate can be sealed in a matter of minutes. At such moments, doctors can’t be methodical or deliberative; they can’t spend time gathering and analyzing information or working through templates. A computer is of little help. Doctors have to make near-instantaneous decisions about diagnosis and treatment. They have to act. Cognitive scientists who have studied physicians’ thought processes argue that expert clinicians don’t use conscious reasoning, or formal sets of rules, in emergencies. Drawing on their knowledge and experience, they simply “see” what’s wrong—oftentimes making a working diagnosis in a matter of seconds—and proceed to do what needs to be done. “The key cues to a patient’s condition,” explains Jerome Groopman in his book How Doctors Think, “coalesce into a pattern that the physician identifies as a specific disease or condition.” This is talent of a very high order, where, Groopman says, “thinking is inseparable from acting.”26 Like other forms of mental automaticity, it develops only through continuing practice with direct, immediate feedback. Put a screen between doctor and patient, and you put distance between them. You make it much harder for automaticity and intuition to develop.

  IT DIDN’T take long, after their ragtag rebellion was crushed, for the surviving Luddites to see their fears come true. The making of textiles, along with the manufacture of many other goods, went from handicraft to industry within a few short years. The sites of production moved from homes and village workshops to large factories, which, to ensure access to sufficient laborers, materials, and customers, usually had to be built in or near cities. Craft workers followed the jobs, uprooting their families in a great wave of urbanization that was swollen by the loss of farming jobs to threshers and other agricultural equipment. Inside the new factories, ever more efficient and capable machines were installed, boosting productivity but also narrowing the responsibility and autonomy of those who operated the equipment. Skilled craftwork became unskilled factory labor.

  Adam Smith had recognized how the specialization of factory jobs would lead to the deskilling of workers. “The man whose whole life is spent in performing a few simple operations, of which the effects too are, perhaps, always the same, or very nearly the same, has no occasion to exert his understanding, or to exercise his invention in finding out expedients for removing difficulties which never occur,” he wrote in The Wealth of Nations. “He naturally loses, therefore, the habit of such exertion, and generally becomes as stupid and ignorant as it is possible for a human creature to become.”27 Smith viewed the degradation of skills as an unfortunate but unavoidable by-product of efficient factory production. In his famous example of the division of labor at a pin-manufacturing plant, the master pin-maker who once painstakingly crafted each pin is replaced by a squad of unskilled workers, each performing a narrow task: “One man draws out the wire, another straights it, a third cuts it, a fourth points it, a fifth grinds it at the top for receiving the head; to make the head requires two or three distinct operations; to put it on, is a peculiar business, to whiten the pins is another; it is even a trade by itself to put them into the paper; and the important business of making a pin is, in this manner, divided into about eighteen distinct operations.”28 None of the men knows how to make an entire pin, but working together, each plying his own peculiar business, they churn out far more pins than could an equal number of master craftsmen working separately. And because the workers require little talent or training, the manufacturer can draw from a large pool of potential laborers, obviating the need to pay a premium for expertise.

  Smith also appreciated how the division of labor eased the way for mechanization, which served to narrow workers’ skills even further. Once a manufacturer had broken an intricate process into a series of well-defined “simple operations,” it became relatively easy to design a machine to carry out each operation. The division of labor within a factory provided a set of specifications for its machinery. By the early years of the twentieth century, the deskilling of factory workers had become an explicit goal of industry, thanks to Frederick Winslow Taylor’s philosophy of “scientific management.” Believing, in line with Smith, that “the greatest prosperity” would be achieved “only when the work of [companies] is done with the smallest combined expenditure of human effort,” Taylor counseled factory owners to prepare strict instructions for how each employee should use each machine, scripting every movement of the worker’s body and mind.29 The great flaw in traditional ways of working, Taylor believed, was that they granted too much initiative and leeway to individuals. Optimum efficiency could be achieved only through the standardization of work, enforced by “rules, laws, and formulae” and reflected in the very design of machines.30

  Viewed as a system, the mechanized factory, in which worker and machine merge into a tightly controlled, perfectly productive unit, was a triumph of engineering and efficiency. For the individuals who became its cogs, it brought, as the Luddites had foreseen, a sacrifice not only of skill but of independence. The loss in autonomy was more than economic. It was existential, as Hannah Arendt would emphasize in her 1958 book The Human Condition: “Unlike the tools of workmanship, which at every given moment in the work process remain the servants of the hand, the machines demand that the laborer serve them, that he adjust the natural rhythm of his body to their mechanical movement.”31 Technology had progressed—if that’s the right word—from simple tools that broadened the worker’s latitude to complex machines that constrained it.

  In the second half of the last century, the relation between worker and machine grew more complicated. As companies expanded, technological progress accelerated, and consumer spending exploded, employment branched out into new forms. Managerial, professional, and clerical positions proliferated, as did jobs in the service sector. Machines assumed a welter of new forms as well, and people used them in all sorts of ways, on the job and off. The Taylorist ethos of achieving efficiency through the standardization of work processes, though still exerting a strong influence on business operations, was tempered in some companies by a desire to tap workers’ ingenuity and creativity. The coglike employee was no longer the ideal. Brought into this situation, the computer quickly took on a dual role. It served a Taylorist function of monitoring, measuring, and controlling people’s work; companies found that software applications provided a powerful means for standardizing processes and preventing deviations. But in the form of the PC, the computer also became a flexible, personal tool that granted individuals greater initiative and autonomy. The computer was both enforcer and emancipator.

  As the uses of automation multiplied and spread from factory to office, the strength of the connection between technological progress and the deskilling of labor became a topic of fierce debate among sociologists and economists. In 1974, the controversy came to a head when Harry Braverman, a social theorist and onetime coppersmith, published a passionate book with a dry title, Labor and Monopoly Capital: The Degradation of Work in the Twentieth Century. In reviewing recent trends in employment and workplace technology, Braverman argued that most workers were being funneled into routine jobs that offered little responsibility, little challenge, and little opportunity to gain know-how in anything important. They often acted as accessories to their machines and computers. “With the development of the capitalist mode of production,” he wrote, “the very concept of skill becomes degraded along with the degradation of labor, and the yardstick by which it is measured shrinks to such a point that today the worker is considered to possess a ‘skill’ if his or her job requires a few days’ or weeks’ training, several months of training is regarded as unusually demanding, and the job that calls for a learning period of six months or a year—such as computer programming—inspires a paroxysm of awe.”32 The typical craft apprenticeship, he pointed out, by way of comparison, had lasted at least four years and often as many as seven. Braverman’s dense, carefully argued treatise was widely read. Its Marxist perspective fit with the radical atmosphere of the 1960s and early 1970s as neatly as a tenon in a mortise.

  Braverman’s argument didn’t impress everyone.33 Critics of his work—and there were plenty—accused him of overstating the importance of traditional craft workers, who even in the eighteenth and nineteenth centuries hadn’t accounted for all that large a proportion of the labor force. They also thought he placed too much value on the manual skills associated with blue-collar production jobs at the expense of the interpersonal and analytical skills that come to the fore in many white-collar and service posts. The latter criticism pointed to a bigger problem, one that complicates any attempt to diagnose and interpret broad shifts in skill levels across the economy. Skill is a squishy concept. Talent can take many forms, and there’s no good, objective way to measure or compare them. Is an eighteenth-century cobbler making a pair of shoes at a bench in his workshop more or less skilled than a twenty-first-century marketer using her computer to develop a promotional plan for a product? Is a plasterer more or less skilled than a hairdresser? If a pipefitter in a shipyard loses his job and, after some training, finds new work repairing computers, has he gone up or down the skill ladder? The criteria necessary to provide good answers to such questions elude us. As a result, debates about trends in deskilling, not to mention upskilling, reskilling, and other varieties of skilling, often bog down in bickering over value judgments.

  But if the broad skill-shift theories of Braverman and others are fated to remain controversial, the picture becomes clearer when the focus shifts to particular trades and professions. In case after case, we’ve seen that as machines become more sophisticated, the work left to people becomes less so. Although it’s now been largely forgotten, one of the most rigorous explorations of the effect of automation on skills was completed during the 1950s by the Harvard Business School professor James Bright. He examined, in exhaustive detail, the consequences of automation on workers in thirteen different industrial settings, ranging from an engine-manufacturing plant to a bakery to a feed mill. From the case studies, he derived an elaborate hierarchy of automation. It begins with the use of simple hand tools and proceeds up through seventeen levels to the use of complex machines programmed to regulate their own operation with sensors, feedback loops, and electronic controls. Bright analyzed how various skill requirements—physical effort, mental effort, dexterity, conceptual understanding, and so on—change as machines become more fully automated. He found that skill demands increase only in the very earliest stages of automation, with the introduction of power hand tools. As more complex machines are introduced, skill demands begin to slacken, and the demands ultimately fall off sharply when workers begin to use highly automated, self-regulating machinery. “It seems,” Bright wrote in his 1958 book Automation and Management, “that the more automatic the machine, the less the operator has to do.”34

  To illustrate how deskilling proceeds, Bright used the example of a metalworker. When the worker uses simple manual tools, such as files and shears, the main skill requirements are job knowledge, including in this case an appreciation of the qualities and uses of metal, and physical dexterity. When power hand tools are introduced, the job grows more complicated and the cost of errors is magnified. The worker is called on to display “new levels of dexterity and decision-making” as well as greater attentiveness. He becomes a “machinist.” But when hand tools are replaced by mechanisms that perform a series of operations, such as milling machines that cut and grind blocks of metal into precise three-dimensional shapes, “attention, decision-making, and machine control responsibilities are partially or largely reduced” and “the technical knowledge requirement of machine functioning and adjustment is reduced tremendously.” The machinist becomes a “machine operator.” When mechanization becomes truly automatic—when machines are programmed to control themselves—the worker “contributes little or no physical or mental effort to the production activity.” He doesn’t even require much job knowledge, as that knowledge has effectively gone into the machine through its design and coding. His job, if it still exists, is reduced to “patrolling.” The metalworker becomes “a sort of watchman, a monitor, a helper.” He might best be thought of as “a liaison man between machine and operating management.” Overall, concluded Bright, “the progressive effect of automation is first to relieve the operator of manual effort and then to relieve him of the need to apply continuous mental effort.”35

  When Bright began his study, the prevailing assumption, among business executives, politicians, and academics alike, was that automated machinery would demand greater skills and training on the part of workers. Bright discovered, to his surprise, that the opposite was more often the case: “I was startled to find that the upgrading effect had not occurred to anywhere near the extent that is often assumed. On the contrary, there was more evidence that automation had reduced the skill requirements of the operating work force.” In a 1966 report for a U.S. government commission on automation and employment, Bright reviewed his original research and discussed the technological developments that had occurred in the succeeding years. The advance of automation, he noted, had continued apace, propelled by the rapid deployment of mainframe computers in business and industry. The early evidence suggested that the broad adoption of computers would continue rather than reverse the deskilling trend. “The lesson,” he wrote, “should be increasingly clear—it is not necessarily true that highly complex equipment requires skilled operators. The ‘skill’ can be built into the machine.”36

  IT MAY seem as though a factory worker operating a noisy industrial machine has little in common with a highly educated professional entering esoteric information through a touchscreen or keyboard in a quiet office. But in both cases, we see a person sharing a job with an automated system—with another party. And, as Bright’s work and subsequent studies of automation make clear, the sophistication of the system, whether it operates mechanically or digitally, determines how roles and responsibilities are divided and, in turn, the set of skills each party is called upon to exercise. As more skills are built into the machine, it assumes more control over the work, and the worker’s opportunity to engage in and develop deeper talents, such as those involved in interpretation and judgment, dwindles. When automation reaches its highest level, when it takes command of the job, the worker, skillwise, has nowhere to go but down. The immediate product of the joint machine-human labor, it’s important to emphasize, may be superior, according to measures of efficiency and even quality, but the human party’s responsibility and agency are nonetheless curtailed. “What if the cost of machines that think is people who don’t?” asked George Dyson, the technology historian, in 2008.37 It’s a question that gains salience as we continue to shift responsibility for analysis and decision making to our computers.

  The expanding ability of decision-support systems to guide doctors’ thoughts, and to take control of certain aspects of medical decision making, reflects recent and dramatic gains in computing. When doctors make diagnoses, they draw on their knowledge of a large body of specialized information, learned through years of rigorous education and apprenticeship as well as the ongoing study of medical journals and other relevant literature. Until recently, it was difficult, if not impossible, for computers to replicate such deep, specialized, and often tacit knowledge. But inexorable advances in processing speed, precipitous declines in data-storage and networking costs, and breakthroughs in artificial-intelligence methods such as natural language processing and pattern recognition have changed the equation. Computers have become much more adept at reviewing and interpreting vast amounts of text and other information. By spotting correlations in the data—traits or phenomena that tend to be found together or to occur simultaneously or sequentially—computers are often able to make accurate predictions, calculating, say, the probability that a patient displaying a set of symptoms has or will develop a particular disease or the odds that a patient with a certain disease will respond well to a particular drug or other treatment regimen.
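  The correlation-based prediction described above can be made concrete with a toy sketch. The brief Python example below is purely illustrative and is not drawn from the book: the patient records, symptom names, and the estimate_probability function are hypothetical, and the method shown (a simple co-occurrence ratio over past cases) is a crude stand-in for the far larger datasets and more sophisticated statistical models real decision-support systems rely on.

```python
# Hypothetical patient records: (set of observed symptoms, diagnosed condition).
records = [
    ({"cough", "fever", "fatigue"}, "flu"),
    ({"cough", "fever"}, "flu"),
    ({"cough", "fatigue"}, "cold"),
    ({"fever", "rash"}, "measles"),
    ({"cough", "fever", "rash"}, "measles"),
    ({"fatigue"}, "cold"),
]

def estimate_probability(symptoms, condition, records):
    """Estimate P(condition | symptoms) as a co-occurrence ratio:
    of the past patients who showed all the given symptoms,
    what fraction were diagnosed with the condition?"""
    matching = [c for s, c in records if symptoms <= s]  # patients showing every symptom
    if not matching:
        return 0.0
    return sum(1 for c in matching if c == condition) / len(matching)

# Of past patients presenting with cough and fever, what share had the flu?
print(estimate_probability({"cough", "fever"}, "flu", records))  # prints 0.666...
```

  The point of the sketch is only that the “prediction” is a pattern pulled from prior cases: the more records the system has seen, the finer the correlations it can surface, which is why the gains in processing, storage, and networking that Carr describes changed the equation.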

 
