The Perfect Weapon


by David E. Sanger


  Cook believed that giving the US government the back door it demanded into his products would be a disaster. “The most intimate details of your life are on this phone,” he said to me one day in San Francisco, holding up an iPhone. “Your medical records. Your messages to your spouse. Your whereabouts every hour of the day. That information is yours. And my job is to make sure it remains yours.”

  And so a year later, in 2014, Cook went to war with the Obama administration over encryption.

  * * *

  —

  When Cook took the stage in Cupertino in September of that year to announce the iPhone 6, Apple advertised it—not in so many words—as the phone for the post-Snowden age. Over the years, even before the Snowden revelations, Apple had gradually encrypted more and more data on its phones. Now, thanks to a software change, the phone would automatically encrypt emails, photos, and contacts using a complex mathematical algorithm keyed to a code created by, and unique to, the phone’s user.

  But the bigger news was that Apple would not hold the keys: those would be created and held by users. That change marked a huge break from the past. Until then, Apple had always held the keys, unless a user had installed a special encryption app—a complicated process for most people.

  Now encryption would become automatic. “We won’t keep those messages and we wouldn’t be able to read them,” Cook told me. “We don’t think people want us to have that right.” Even worse for those who might need to get into an iPhone, breaking any individual code would take a while: the Apple technical guide reported that it could take “more than 5½ years to try all combinations of a six-character alphanumeric passcode with lowercase letters and numbers.”
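
  The arithmetic behind Apple’s figure is easy to check. A minimal sketch, assuming roughly 80 milliseconds per attempt (the per-try delay the company’s security guide attributed to its key-derivation design), recovers the 5½-year estimate:

    # Back-of-the-envelope check of Apple's "more than 5 1/2 years" claim.
    # Assumption: each passcode attempt takes ~80 ms, the delay Apple's
    # key-derivation scheme imposed according to its iOS security guide.

    ALPHABET = 26 + 10          # lowercase letters plus digits
    LENGTH = 6                  # six-character passcode
    SECONDS_PER_ATTEMPT = 0.08  # ~80 ms per try (assumed)

    combinations = ALPHABET ** LENGTH   # 36^6, about 2.18 billion codes
    worst_case_seconds = combinations * SECONDS_PER_ATTEMPT
    years = worst_case_seconds / (365.25 * 24 * 3600)

    print(f"{combinations:,} possible passcodes")
    print(f"worst case: {years:.2f} years")   # ~5.5 years to try them all

  Each additional character multiplies the search space by another factor of thirty-six, so longer passcodes push the worst case from years into centuries.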

  Over the next week or so, the implications soaked in at FBI headquarters. If encryption were automatic, it would be nearly universal. And so if the FBI went to court and delivered an iPhone to Apple and demanded its contents, even with a valid court order taped to it, the bureau would get back a pile of encrypted gibberish. Apple would argue that without the user’s code, the company had no way of decrypting the information. If the government wanted the data, it had better start trying to break the code by brute force.

  Comey hadn’t seen a move of this scope coming. It was a complete breach of the tacit understanding he had grown up with, enshrined during the simpler days of the Cold War and the shock of 9/11, that there would always be a way around encryption. Suddenly, that assumption was changing. “What concerns me about this is companies marketing something expressly to allow people to hold themselves beyond the law,” Comey said after the announcement at a news conference devoted largely to combating terror threats from the Islamic State.

  To those who had been around since the “Crypto Wars” of the 1970s and ’80s, it was a familiar argument. At that time, the NSA wanted to control all cryptography research so that it could read anything it wanted. It had fought academics and private firms that wanted to publish cutting-edge research into the most secure ways of encrypting data, and it wanted a role in setting the standards for cryptography so that it could read messages sent around the world. In short, the NSA wanted to control the development of cryptography so that it wouldn’t be locked out of any system.

  Then, in the 1990s, the agency developed the “Clipper chip,” a chip that could be installed in computers, TVs, and early cell phones. The Clipper encrypted voice and data messages but had a back door that the agency could unlock, ensuring that with proper authorization the intelligence agencies, the FBI, and local police could decode any message. The Clinton administration endorsed the idea—for a while—arguing that once the chip went into every device, there would be no way for terrorists to avoid using it and the intelligence agencies could listen in.

  Naturally, consumers and most manufacturers rebelled, and the Clinton administration retreated. “NSA lost both battles,” noted Susan Landau, an expert on the history of these conflicts.

  Now the fight was being waged once more, and with a startling ferocity. Snowden’s revelations made tech companies more determined than ever to beef up their encryption, their case made easier as consumers read about countless hacks on companies that stored their credit-card data. Obama read the tea leaves and created an independent panel to advise him on what kind of new restrictions—if any—to put on the NSA after the Snowden revelations and to guide him on balancing privacy and security. The panel included Mike Morell, who had retired from the CIA not long after I dealt with him on Olympic Games, and a range of other former counterterrorism officials, academics, and constitutional lawyers.

  To the shock of the NSA and the FBI, Morell and his colleagues sided with Big Tech. The panel made a unanimous recommendation that the government should “not in any way subvert, undermine, weaken, or make vulnerable generally available commercial software.” Instead, it should “increase the use of encryption and urge US companies to do so.”

  No sooner had the ink dried on the panelists’ signatures than the NSA urged Obama to ignore the panel’s advice. With terror groups already turning to encrypted apps with names like Telegram and Signal, Landau noted, “the last thing the NSA wanted was to make encryption easier for everyone across the world; how would it listen in abroad?”

  The burden of arguing for government access fell on Comey, who had a bit of a flair for the dramatic—as the world later learned in his confrontations with Hillary Clinton and Donald Trump—and soon he was reaching for the most emotive example he could come up with to support his position: What would happen, he asked, when the parents of a kidnapped child came to him with a phone that might reveal the whereabouts of their kid, but whose contents could not be read because they were automatically encrypted—just so that Apple could extend its profits around the world? Comey predicted there would be a moment, soon, when those parents would come to him “with tears in their eyes, look at me, and say, ‘What do you mean you can’t’ ” decode the phone’s contents.

  “The notion that someone would market a closet that could never be opened—even if it involves a case involving a child kidnapper and a court order—to me does not make any sense,” he said. He extended the analogy to apartment doors and car trunks to which there were no keys, either of which would stymie a legal search warrant. If that wouldn’t be tolerated in the physical world, he asked, why should it be tolerated in the digital world?

  From the other coast, Tim Cook had an answer: the apartment keys and trunk keys belonged to the owner of the apartment and the car, not to the manufacturer of their locks. “It’s our job to provide you with the tools to lock up your stuff,” Cook said. At Apple and Google, company executives told me that Washington had brought these changes on itself. Because the NSA had failed to police its own insiders, the world was demanding that Apple prove its users’ data was secure, and it was up to Apple to do so. Naturally the government saw this as a deliberate dodge. And to some extent it was.

  But Cook had a bigger and better argument, one that the government could not so easily parry: if Apple created a back door into its code, that vulnerability would become the target of every hacker on Earth. The FBI was naïve to think that if the tech companies created a lock and gave the FBI a key, no one else would figure out how to pick it. “The problem is,” Cook said, “anyone with any technical skills knows that if you create an opening for the FBI, you create one for China and Russia and everyone else.”

  Discreetly, Cook took that argument to Obama himself—in quiet sessions in Washington and Silicon Valley. American spy agencies and police had all kinds of other options, he argued. They could find data in the cloud. They could use Facebook to figure out anyone’s acquaintances. But to give them access to that data inside the phone was to undercut an American expectation of privacy—and to invite the Chinese and others to do the same, for far more nefarious purposes.

  “The only way I can protect hundreds of millions of people is the way I’m doing it,” Cook told me during one of his Washington visits, fresh from making this case to Obama and his aides. But he knew that despite his admiration for Obama—“I love the guy,” he’d often repeat—he was losing the argument with him.

  Obama was looking to straddle the problem by arguing that security and privacy could be balanced. In Cook’s view, that slogan sounded nice from the White House Press Room but made no technological sense. Drilling a hole in the iPhone operating system was like drilling a hole in your windshield; it weakened the whole structure and allowed everything to fly in.

  When I spoke with Cook, it was clear that he was worried about another problem, one that American officials weren’t discussing in public because it so complicated their case. China was watching Apple’s struggle with the US government—and it was rooting for Comey. If Apple agreed to create a back door for the FBI, China’s Ministry of State Security would give Apple no choice but to create one for them too, or else be ejected from the Chinese market.

  At the White House, many officials worried about being accused of becoming an accessory to China’s growing crackdowns on dissidents. In fact, the fear paralyzed some of them. But FBI officials quickly waved away this argument. “We’re not the State Department,” one of Comey’s top aides told me. The rest of the intelligence community seemed likewise unconcerned. Just days after the Apple announcement, the director of one of America’s sixteen intelligence agencies invited me to his office to rail against Apple’s top executives.

  “This is a direct result of Snowden,” he declared, the only thing on which he and executives like Cook seemed to agree. “We’re going blind.” Smartphones, he said, were routinely part of “the pocket litter” of every terrorist tracked down by American Special Forces in Pakistan and Afghanistan, and now in the Islamic State. Most were drained of data on the spot. Now they would routinely carry levels of encryption previously available only to government agents of Russia and China.

  “It’s a terrible choice,” another spy chief told me. “We have to decide whether to attack our own companies” or live in a world in which the working assumption of the Western intelligence agencies—that they could obtain any message, break any code—would no longer apply.

  The battle lines were drawn. But the big fight would not come for another year, just as the presidential race of 2016 was heating up.

  * * *

  —

  Just before midday on Wednesday, December 2, 2015, Syed Rizwan Farook and Tashfeen Malik armed themselves with assault rifles and semiautomatic pistols and attacked a holiday party at the health department in San Bernardino, California. Fleeing the site, they left behind an explosive constructed of three pipe bombs that failed to detonate. Fourteen people were killed and twenty-two were injured. The dead ranged in age from a twenty-six-year-old woman who was raising a toddler to a sixty-year-old Eritrean immigrant who had come to the States for a safer life and raised three children with his wife.

  The attackers were killed in a shootout a few hours later, about two miles from the site of the shootings. Farook, twenty-seven, turned out to be the son of Pakistani parents who had immigrated to Illinois before he was born, making him a natural-born American citizen. Malik, a year his senior, was born in Pakistan but lived in Saudi Arabia with family before coming to the United States after meeting Farook on his hajj, or pilgrimage, to Mecca. Radicalization followed. It turned out she had pledged her allegiance to ISIS on Facebook just before the attack. But no one had noticed until it was all over.

  Then came the detail that would reignite the encryption debate for months. Farook had left behind his work-issued iPhone 5c. It was critical, because while the couple had worked hard to cover their electronic tracks prior to the attack—smashing personal phones and hard drives, deleting emails, and using a disposable burner phone—they had forgotten the work iPhone. The FBI believed this device would provide vital evidence: Farook’s communications with any associates and, most important, his GPS coordinates just before the attack. That is, if they could get past the phone’s encryption. (Farook had not uploaded his data to iCloud, where it would have been more accessible.)

  The problem was that Farook had locked his phone with a code, and now he was dead. And while the FBI could try brute-force cracking—essentially trying all possible combinations—Apple’s safety features included one that wiped all data after ten wrong passcode attempts. That feature was designed to protect users against any hacker who broke into a phone—mostly criminals seeking financial information, credit-card numbers, or information about how to gain access to a house or a safe.
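
  A minimal sketch, with hypothetical names throughout (Apple’s real logic runs inside dedicated security hardware, not application code), shows why such a policy forecloses brute force: the attacker gets ten guesses, not billions.

    # Hypothetical sketch of a wipe-after-ten-failures policy, the kind of
    # safeguard described above. Names and structure are illustrative only;
    # Apple's actual implementation lives in the phone's security hardware.

    MAX_ATTEMPTS = 10

    class LockedDevice:
        def __init__(self, passcode: str):
            self._passcode = passcode
            self._failures = 0
            self.wiped = False

        def try_unlock(self, guess: str) -> bool:
            if self.wiped:
                return False               # data is gone; nothing to unlock
            if guess == self._passcode:
                self._failures = 0         # correct code resets the counter
                return True
            self._failures += 1
            if self._failures >= MAX_ATTEMPTS:
                self.wiped = True          # erase keys; brute force is futile
            return False

  In practice iOS also imposed escalating delays between failed attempts, so even those ten guesses could not be fired off quickly.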

  For Comey, here was a case tailor-made to fit his argument. If there were other ISIS-inspired Americans or immigrants who had been in communication with Farook and Malik, they needed to be picked up quickly. And Apple, in the name of privacy and security, was arguing that it didn’t know Farook’s passcode, so it could be of no help. Comey publicly asked Apple to write new code—essentially a variant of the iPhone operating system—that would bypass the iPhone’s password-protection features and give the FBI access to Farook’s phone. Comey insisted he would use the new code with discretion. In fact, the FBI may have already possessed the technology to unlock the phone, according to a subsequent report by the Justice Department’s inspector general. But investigators were told that technology was available mostly for foreign intelligence work, and the inspector general concluded that senior FBI officials were looking forward to a court confrontation with Apple.

  The Justice Department got a federal magistrate in California to order Apple to find a way to crack the phone. Cook realized immediately that Comey saw the San Bernardino case as a chance to short-circuit the intensifying arguments about encryption and had escalated the dispute to the courts. Cook saw this moment as a chance to take a stand and to show his independence from the FBI. He wrote a 1,100-word letter to his customers that was striking for its accusation that the Obama administration was so obsessed with access that it was ready to sacrifice the privacy of its citizens.

  The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand…. Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case…. The implications of the government’s demands are chilling…ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

  For the leader of one of the most successful companies on the planet, larger than some European economies, it was a remarkable accusation. Cook was charging an administration that treasured its reputation as a progressive force for civil rights with seeking to undermine a core constitutional principle about individual freedom. With Apple and the FBI at a standoff, Obama dispatched some of his senior intelligence officials to Silicon Valley to talk Cook off the ceiling and look anew for a compromise. Cook wasn’t interested. Though he could not yet reveal it publicly, the FBI’s demand that Apple break into the San Bernardino phone was just one of four thousand law-enforcement requests to the company in the second half of 2015.

  Comey wasn’t about to back down; he told aides that the publicity around the San Bernardino case would, if anything, remind criminals, child pornographers, and terrorists to use encryption. This was the moment to settle the encryption wars, he said, once and for all.

  * * *

  —

  It didn’t turn out that way. In the end the FBI paid at least $1.3 million to a firm it would not name—believed to be an Israeli company—to hack into the phone. The FBI refused to say what the technical solution was, or to share it with Apple, apparently for fear that the company would seal up whatever hardware or software loophole was discovered by the hired hackers. Later the FBI told Congress it didn’t actually know what technology was used: It hired a locksmith, who picked the lock. The bureau deliberately didn’t ask how it was done—because the White House, under its own rules about disclosing most vulnerabilities to manufacturers, might have been forced to clue in Apple.

  Obama, the constitutional law professor, never solved this problem. And he never implemented the recommendation from his own advisory panel that the government encourage the use of more and more encryption. He told his aides that years of daily warnings in the President’s Daily Brief about terrorist activity around the world had altered his view: The United States simply could not agree to any rules that locked it out of some conversations. It was a breach with the tech community that he simply never overcame.

  “If, technologically, it is possible to make an impenetrable device or system, where the encryption is so strong that there is no key, there is no door at all, then how do we apprehend the child pornographer?” Obama asked publicly a few years later. “How do we disrupt a terrorist plot?”

  If the government cannot crack a smartphone, Obama concluded, “then everyone is walking around with a Swiss bank account in your pocket.”

  Obama had accurately described—but hadn’t solved—one of the central dilemmas of the cyber age.

  CHAPTER V

  THE CHINA RULES

  I mean, there are two kinds of big companies in the United States. There are those who’ve been hacked by the Chinese and those who don’t know they’ve been hacked by the Chinese.

  —James Comey, then FBI director, October 5, 2014

  The boxy twelve-story building along Datong Road on the outskirts of Shanghai was easy to overlook. In the jumble of a city of 24 million people—China’s most populous, and among its most high-tech—it was just another bland, white high-rise. The only hint that the unmarked building was actually a base for the People’s Liberation Army and its pioneering cyber force, Unit 61398, came if you looked at the protections surrounding the tower—or the security forces who came after you if you dared to take a picture of it.

 
