
The One Device


by Brian Merchant


  The early mobile networks that emerged to administer them were run by top-down telecoms, much like the expansive landline networks built out since the earliest days of Bell’s telephone. Cell markets remained tethered to regionally or nationally operated base stations, with one exception: the more egalitarian and consumer-oriented Nordic Mobile Telephony system.

  Scandinavian nations pioneered wireless before many others out of sheer necessity—routing telephone wires through vast expanses of rocky, snowy terrain was difficult. In the rest of Europe the telecom model was rigid, conventional: national systems run by the national telecom provider. Not so in the Nordic countries, where Swedes, Finns, Norwegians, and Danes wanted their car phones to work across borders—and the germ of roaming was planted. The Nordic Mobile Telephony system, established in 1981, marked the re-conception of the phone as something that could—and should—transcend borders, and it reshaped the way that people thought about mobile communications, from an implement useful in local markets to a more general, more universal tool. In fact, the NMT network’s stated goal was building a system in which “everyone could call everyone.” It used a computerized register to keep tabs on people’s locations while they were roaming. It would become the first automatic mobile network and a model for all advanced wireless networks to follow.

  “It has certain features which are really influential,” Agar says. “The shared Scandinavian values towards design, with open attitudes towards technologies. One of the crucial things was a willingness to set aside some of those purely national interests in favor of something that was more consumer-oriented. That was about, for example, roaming between countries.” Unsurprisingly, the more open, unbound system proved popular. So popular, in fact, that it effectively minted the model for the mobile standard that would go on to conquer the world.

  In 1982, European telecom engineers and administrators met under the banner of the Groupe Spécial Mobile to hash out the future of the continent’s cell system, and to discuss whether a unified cell standard was technically and politically possible. See, the European Commission wanted to do for all of Europe what NMT had done for the Nordic nations. It’s rarely sexy to discuss vast, slow-moving bureaucracies, but GSM is an incredible triumph of political collaboration. Though it would take a decade to orchestrate the GSM test program, complete the technical specifications, and align the politics, it was a behemoth effort of technical cooperation and diplomatic negotiation. To drastically oversimplify matters, there were those who sought a stronger, more united Europe, and those who argued its states should be more independent; GSM was seen as a vessel for uniting Europe, and thus championed by the European Commission. “The best illustration of GSM as a politically charged European project is given by the facility to roam,” Agar says. “Just as in NMT, roaming, the ability to use the same terminal under different networks, was prioritized, even though it was expensive, because it demonstrated political unity.” If citizens of different European nations could call each other on the go or easily phone home when abroad, the constellation of diverse countries might feel a bit more like they were part of the same neighborhood.

  When it finally launched in 1992, GSM would cover eight EU nations. Within three years, it covered almost all of Europe. Renamed the Global System for Mobile Communications, it soon lived up to its moniker. By the end of 1996, GSM was used in 103 countries, including the United States, though it often wasn’t the only available standard. Today, it’s pretty much everywhere—an estimated 90 percent of all cell calls in 213 countries are made over GSM networks. (The U.S. market is one of the few that’s split; Verizon and Sprint use a competing standard, called CDMA, while T-Mobile and AT&T use GSM. One easy way to tell if your phone is GSM: it comes with an easily removable subscriber identity module—a SIM card.)

  Without the EU’s drive to standardize mobile—and its push for unity—we might not have seen such wide-scale and rapid adoption of cell phones. Critics have railed against some of GSM’s specifications as being overly complicated. It’s been called the “Great Software Monster” and the “most complicated system built by man since the Tower of Babel,” but maybe that’s fitting: standardizing network access for much of the globe, allowing “everyone to talk to everyone,” was a crucial feat.

  While wireless cell networks evolved from massive government-backed projects, the main way our phones get online began as a far-flung academic hackaround. Wi-Fi began long before the web as we know it existed and was actually developed along the same timeline as ARPANET. The genesis of wireless internet harkens back to 1968 at the University of Hawaii, where a professor named Norman Abramson had a logistics problem. The university had only one computer, on the main campus in Honolulu, but he had students and colleagues spread across departments and research stations on the other islands. At the time, remote terminals could reach a computer only through wires—and stringing a cable hundreds of miles underwater to link the stations wasn’t an option.

  Not entirely unlike the way harsh northern terrain drove the Scandinavians to go wireless, the sprawling expanse of Pacific Ocean forced Abramson to get creative. His team’s idea was to use radio communications to send data from the terminals on the small islands to the computer in Honolulu, and vice versa. The project would grow into the aptly named ALOHAnet, the precursor to Wi-Fi. (One of those reverse-engineered acronyms, it originally stood for Additive Links On-line Hawaii Area.) The ARPANET is the network that would mutate into the internet, and it’s fair to say that ALOHAnet would do the same for Wi-Fi.

  At the time, the only way to remotely access a large information processing system—a computer—was through a wire, via leased lines or dial-up telephone connections. “The goal of THE ALOHA SYSTEM is to provide another alternative for the system designer and to determine those situations where radio communications are preferable to conventional wire communications,” Abramson wrote in a 1970 paper describing the early progress. It’s a frank, solutions-oriented declaration that—like E. A. Johnson’s touchscreen patent—downplays (or underestimates) the potential of the innovation it describes.

  The way that most radio operators went about sharing an available channel—a scarce resource in a place like Hawaii, where network coverage was thin—was by dividing it up into either time slots or frequency bands and assigning one to each far-flung station. Only once a party had its own frequency band or time slot could it open communication.

  On Hawaii, though, with the university’s slow mainframe, that meant dragging data transfer to a crawl. Thus, ALOHAnet’s chief innovation: it would be designed with only two high-speed UHF channels, one uplink and one downlink. The full channel capacity would be open to all, which meant that if two people tried to use it at once, a transmission could fail. In that case, they’d just have to try again. This scheme would come to be known as a random access protocol. “Unlike the ARPANET where each node could only talk directly to a node at the other end of a wire or satellite circuit, in ALOHAnet all client nodes communicated with the hub on the same frequency.”
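The random access idea is simple enough to sketch in a few lines. Here is a toy simulation (mine, not from the book) of a slotted-ALOHA-style shared channel: every station transmits in a time slot with some probability, a slot with exactly one sender succeeds, and overlapping transmissions collide and must be retried later.

```python
import random

def simulate_aloha(num_stations=5, num_slots=1000, p_transmit=0.2, seed=1):
    """Return the fraction of slots that carry exactly one successful frame.

    Toy model: each station independently decides to transmit in each slot.
    One sender -> success; two or more -> collision (everyone retries later);
    zero -> the slot simply goes idle.
    """
    rng = random.Random(seed)
    successes = 0
    for _ in range(num_slots):
        senders = [s for s in range(num_stations) if rng.random() < p_transmit]
        if len(senders) == 1:  # exactly one station grabbed the slot
            successes += 1
    return successes / num_slots

print(simulate_aloha())
```

With five stations each transmitting 20 percent of the time, theory predicts a success rate of 5 × 0.2 × 0.8⁴ ≈ 0.41 — most of the channel still gets through despite nobody coordinating, which is the insight that made ALOHAnet (and later Ethernet and Wi-Fi) practical.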

  In 1985, the FCC opened the industrial, scientific, and medical (ISM) band for unlicensed use, allowing anyone to build wireless data networks over those frequencies. A group of tech companies convened around a standard in the 1990s, and marketing flacks gave it the entirely meaningless name “wireless fidelity”—concocted to sound like hi-fi—and Wi-Fi was born.

  As GSM grew in Europe and around the world and as the cost of mobile phones fell, more users inevitably got their hands on the technology. And, as Agar says, “The key uses of a technology are discovered by users. They’re not necessarily at the fore of the mind of the designers.” So, it didn’t take long for those users to discover a feature, added as an afterthought, that would evolve into the cornerstone of how we use phones today. Which is why we should thank Norwegian teens for popularizing texting.

  A researcher named Friedhelm Hillebrand, the chairman of GSM’s nonvoice services committee, had been carrying out informal experiments on message length at his home in Bonn, Germany. He counted the characters in most messages he tapped out and landed on 160 as the magic number for text length. “This is perfectly sufficient,” he thought. In 1986, he pushed through a requirement mandating that phones on the network had to include something called short message service, or SMS. He then shoehorned SMSing into a secondary data line originally used to send users brief updates about network status.
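Hillebrand’s 160-character ceiling still shapes texting today: anything longer simply gets chopped into segments. A minimal sketch of that segmentation (the function name is mine, for illustration, not anything from the GSM specifications):

```python
def split_sms(text, limit=160):
    """Break a message into consecutive segments of at most `limit` characters."""
    return [text[i:i + limit] for i in range(0, len(text), limit)]

# A 400-character message becomes three segments of 160, 160, and 80 characters.
parts = split_sms("x" * 400)
print([len(p) for p in parts])  # → [160, 160, 80]
```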

  Its creators thought that text messaging would be useful for an engineer who was, say, out in the field checking on faulty wires—they’d be able to send a message back to the base. It was almost like a maintenance feature, Agar says. But it also enabled text messaging to appear on most phones. It was a minuscule part of the sprawling GSM, and engineers barely texted. But teenagers, drawn to a quick, surreptitious way to send messages, discovered the service. Norwegian teenagers, Agar says, took to texting in far greater numbers than any network engineers ever did. During the nineties, texting was largely a communication channel for youth culture.

  This principle plays out again and again throughout the history of technology: designers, marketers, or corporations create a product or service; users decide what they actually want to do with it. This happened in Japan at the turn of the century too: The telecom NTT DoCoMo had built a subscription service for mobile internet aimed at businessmen called i-Mode. The telecom tightly curated the websites that could appear on the screen, and pitched services like airline ticket reservations and email. It flopped with the business class, but was taken up by twentysomethings, who helped smartphones explode in Japan nearly a decade before they’d do the same in the U.S.

  The user takeover phenomenon has happened a number of times on the iPhone as well; Steve Jobs, recall, said that the iPhone’s killer app was making calls. (To his credit, phone-focused features like visual voicemail were major improvements.) Third-party apps weren’t allowed. Yet users eventually dictated that apps would be central and that making calls would be a bit less of a priority.

  As everyone began talking to everyone, as the teens started texting, and as we all hit the App Store, wireless networks stretched out across the globe. And somebody had to build, service, and repair the countless towers that kept everyone connected.

  In the summer of 2014, Joel Metz, a cell-tower climber and twenty-eight-year-old father of four, was working on a tower in Kentucky, 240 feet above the ground. He was replacing an old boom with a new one when his colleagues heard a loud pop; a cable suddenly flew loose and severed Metz’s head and right arm, leaving his body dangling hundreds of feet in the air for six hours.

  The gruesome tragedy is, sadly, not a fluke. Which is why it’s absolutely necessary to interrupt the regularly scheduled story of collaboration, progress, and innovation with a reminder that it all comes at a cost, that the infrastructure that makes wireless technology possible is physically built out by human labor and high-risk work, and that people have died to grow and maintain that network. Too many people. Metz’s death was just one of scores that have befallen cell-tower climbers over the past decade.

  Since 2003, Wireless Estimator, the primary industry portal for tower design, construction, and repair, has tallied 130 such accidental deaths on towers. In 2012, PBS Frontline and ProPublica partnered for an investigation into the alarming trend. An analysis of Occupational Safety and Health Administration (OSHA) records showed that tower climbing had a death rate ten times that of the construction industry. The investigators found that climbers were often given inadequate training and faulty safety equipment before being sent out to do maintenance work on structures that loomed hundreds of feet above the ground. If you ever want to get a taste of how gut-churningly high these workers climb, there’s always YouTube; watch, and get vicarious vertigo, on the LTE network they help keep running.

  The investigation found that one carrier was seeing more fatalities than all of its major competitors combined. Guess who, and guess when: “Fifteen climbers died on jobs for AT&T since 2003. Over the same period, five climbers died on T-Mobile jobs, two died on Verizon jobs and one died on a job for Sprint,” the report noted. “The death toll peaked between 2006 and 2008, as AT&T merged its network with Cingular’s and scrambled to handle traffic generated by the iPhone.” Eleven climbers were killed during that rush.

  You might recall the complaints about AT&T’s network that poured in after the iPhone debuted; it was soon overloaded, and Steve Jobs was reportedly furious. AT&T’s subsequent rush to build out more tower infrastructure for better coverage, ProPublica’s report indicated, contributed to hazardous working conditions and the higher-than-usual death toll.

  The following years saw fewer deaths, down to a low of just one in 2012. Sadly, after that, there was another sharp spike—up to fourteen deaths in 2013. The next year, the U.S. Labor Department warned of “an alarming increase in worker deaths.” The major carriers typically offload tower construction and maintenance to third-party subcontractors, who often have less-than-stellar safety records. “Tower worker deaths cannot be the price we pay for increased wireless communication,” OSHA’s David Michaels said in a statement.

  Tower climbing is known as a high-risk, high-reward job. Ex-climbers have described it as a “Wild West environment,” and a fraction of those who’ve died in accidents have tested positive for alcohol and drugs. Still, the subcontractors are rarely significantly penalized when deaths do occur, and with no substantial reduction in the rate of death, we have to assume that until regulators clamp down or the rate of expansion slows, the loss of lives will persist.

  We need to integrate this risk, this loss, into our view of how technology works. We might not have developed wireless radio communications without Marconi, cell phones without Bell Labs, a standardized network without EU advocates—and we wouldn’t get reception without the sacrifice of workers like Joel Metz. Our iPhones wouldn’t have a network to run on without all of the above.

  These forces combined have propelled a vast expansion of smartphones: There were 3.5 million smartphone subscribers in the U.S. in 2005—and 198 million in 2016. That’s the gravitational power of the iPhone in action; it reaches back into the networks of the past and ripples out into the drive to build the towers of the future.

  iii: Enter the iPhone

  Slide to unlock

  If you worked at Apple in the mid-2000s, you might have noticed a strange phenomenon afoot. People were disappearing. It happened slowly at first. One day there’d be an empty chair where a star engineer used to sit. A key member of the team, gone. Nobody could tell you exactly where they went.

  “I had been hearing rumblings about, well, it was unclear what was being built, but it was clear that a lot of the best engineers from the best teams had been slurped over to this mysterious team,” says Evan Doll, who was a software engineer at Apple then.

  Here’s what was happening to those star engineers. First, a couple of managers had shown up in their office unannounced and closed the door behind them. Managers like Henri Lamiraux, a director of software engineering, and Richard Williamson, a director of software.

  One such star engineer was Andre Boule. He’d been at the company only a few months.

  “Henri and I walked into his office,” Williamson recalls, “and we said, ‘Andre, you don’t really know us, but we’ve heard a lot about you, and we know you’re a brilliant engineer, and we want you to come work with us on a project we can’t tell you about. And we want you to do it now. Today.’”

  Boule was incredulous, then suspicious. “Andre said, ‘Can I have some time to think about it?’” Williamson says. “And we said, ‘No.’” They wouldn’t, and couldn’t, give him any more details. Still, by the end of the day, Boule had signed on. “We did that again and again across the company,” Williamson says. Some engineers who liked their jobs just fine said no, and they stayed in Cupertino. Those who said yes, like Boule, went to work on the iPhone.

  And their lives would never be the same—at least, not for the next two and a half years. Not only would they be working overtime to hammer together the most influential piece of consumer technology of their generation, but they’d be doing little else. Their personal lives would disappear, and they wouldn’t be able to talk about what they were working on. Steve Jobs “didn’t want anyone to leak it if they left the company,” says Tony Fadell, one of the top Apple executives who helped build the iPhone. “He didn’t want anyone to say anything. He just didn’t want—he was just naturally paranoid.”

  Jobs told Scott Forstall, who would become the head of the iPhone software division, that even he couldn’t breathe a word about the phone to anyone, inside Apple or out, who wasn’t on the team. “He didn’t want, for secrecy reasons, for me to hire anyone outside of Apple to work on the user interface,” Forstall said. “But he told me I could move anyone in the company into this team.” So he dispatched managers like Henri and Richard to find the best candidates. And he made sure potential recruits knew the stakes up-front. “We’re starting a new project,” he told them. “It’s so secret, I can’t even tell you what that new project is. I cannot tell you who you will work for. What I can tell you is if you choose to accept this role, you’re going to work harder than you ever have in your entire life. You’re going to have to give up nights and weekends probably for a couple years as we make this product.”

  And “amazingly,” as Forstall put it, some of the top talent at the company signed on. “Honestly, everyone there was brilliant,” Williamson tells me. That team—veteran designers, rising programmers, managers who’d worked with Jobs for years, engineers who’d never met him—would end up becoming one of the great, unheralded creative forces of the twenty-first century.

 
