The One Device


by Brian Merchant


  He built two test points on the board to test the current—and found, oddly, that there was no current flowing at all. “This puzzled him, and everyone else, so we prodded the board, and we discovered that the main five-volt supply to the processor wasn’t actually connected. There was a fault on the board. So he was trying to measure the current flowing into that five-volt supply, and there wasn’t any,” Wilson says.

  The thing was, though, the processor was still running. Apparently with no power at all.

  How? The processor was running on leakage from the circuits next to it, basically. “The low-power big thing that the ARM is most valued for today, the reason that it’s on all your mobile phones, was a complete accident,” Wilson says. “It was ten times lower than Steve had expected. That’s really the result of not having the right sort of tools.”

  Wilson had designed a powerful, fully functional 32-bit processor that consumed about one-tenth of a watt of power.

  As the CPU critic Paul DeMone noted, “it compared quite favorably to the much more complicated and expensive designs, such as the Motorola 68020, which represented the state of the art.” The Motorola chip had 190,000 transistors. ARM’s had a mere 25,000 but used its power much more efficiently and squeezed more performance out of its lower transistor count.

  Shortly after, in an effort to continue to simplify their designs, Wilson and company built the first so-called System on Chip, or SoC, “which Acorn did casually, without realizing it was a world-defining moment.” The SoC basically integrates all the components of a computer into one chip—hence the name.

  Today, SoCs are rampant. There’s one in your iPhone, of course.

  The Acorn RISC Machine was a remarkable achievement. As Acorn’s fortunes wavered, ARM’s promise grew. It was spun off into its own company in 1990, a joint venture between Acorn and the company it’d once tried to beat out alphabetically—Apple. CEO John Sculley wanted to use the ARM chips in Apple’s first mobile device, the Newton. Over the years, as the Newton sputtered out, Apple’s stake in the company declined. But ARM surged, thanks to its low-power chips and unique business model. It’s that model, Wilson insists, that launched ARM into ubiquity.

  “ARM became successful for a completely different reason, and that was the way the company was set up,” she says. Wilson calls it the “ecosystem model”: ARM designers would innovate new chips, work closely with clients with specific demands, and then license the end designs, as opposed to selling or building them in-house. Clients would buy a license giving them access to ARM’s design catalog; they could order specifications, and ARM would collect a small royalty on each device sold.

  In 1997, Nokia turned to ARM to build chips for its classic 6110 handset, which was the first cell phone to feature the processor. It turned out to be a massive hit, thanks in part to the more advanced user interface and long battery life its low-power chips afforded it. Oh, and it had Snake, one of the first mobile games to become a pop-culture staple. If you’re old enough to remember using a cell phone at the turn of the century, you will remember playing Snake while waiting in line somewhere.

  ARM’s popularity grew alongside mobile devices throughout the early aughts, and it became the obvious choice for the emergent boom of smart electronics.

  “This is a company that supplies all the other companies in the world but is entrusted with their secrets,” Wilson says. “Where the partners know ARM will keep its word and be a solid partner. It’s in everybody’s interest for it to work.”

  So in the end, it was twin innovations—a powerful, efficient, low-power chip alongside a collaboration-centric, license-based business model—that pushed the ARM deeper into the mainstream than even Intel. Yet you’ve probably heard of Intel, but maybe not ARM.

  Today, Wilson is the director of integrated circuits at Broadcom. When ARM was split off from Acorn, she remained a consultant. Wilson came out as transgender in 1992 and she keeps a low profile, but is nonetheless celebrated as an inspiration to women in STEM by LGBT blogs and tech magazines aware of her work. She was recognized as one of the fifteen most important women in tech by the likes of Maximum PC. The blog Gender Science named her Queer Scientist of the Month. She cuts against the stereotype of the straight male inventor that rose to prominence in the 1980s, and it’s hard not to wonder whether her genius would be more widely recognized today if she better fit the Steve Jobs mold.

  Even though I knew I was testing my luck, I put the same question to her that that other hapless interviewer had: How did she feel about the rise of ARM in 2016?

  “I stopped being shocked at ten billion sold.”

  Transistors multiplying like viruses and shrinking like Alice onto integrated, low-power ARM chips. What’s it mean for us?

  Well, in 2007, when the iPhone launched, that ARM-architecture chip, loaded with 1.57 million transistors (and designed and manufactured by a cohort of Samsung chipmakers working closely on-site with Apple in Cupertino, but more on that later), meant something you know very well: iOS.

  That powerful, efficient processor enabled an operating system that looked and felt as fluid and modern as a Mac’s, stripped down though it was to run on a phone-size device.

  And it meant iOS-enabled apps.

  At first, it was just a few of them. In 2007, there was no App Store. What Apple made for the platform was what you got.

  Though apps held the key to propelling the iPhone into popularity and transformed it into the vibrant, diverse, and seemingly infinite ecosystem that it is today, Steve Jobs was at first adamantly opposed to allowing anyone besides Apple to create apps for the iPhone. It would take a deluge of developers calling for access, a band of persistent hackers jailbreaking the walled garden, and internal pressure from engineers and executives to convince Jobs to change course.

  It was, essentially, the equivalent of a public protest stirring a leader to change policy.

  Steve Jobs did in fact use the phrase killer app in that first keynote presentation—and it’s telling what he believed that killer app would be.

  “We want to reinvent the phone,” Jobs declared. “What’s the killer app? The killer app is making calls! It’s amazing how hard it is to make calls on most phones.” He then went on to demonstrate how easy Apple had made it to organize contact lists, check visual voicemail, and set up conference calls.

  Remember, the original iPhone, per Apple’s positioning, was to be

  • A wide-screen iPod with touch controls

  • A phone

  • An internet communicator

  The revolutionary “there’s an app for that” mentality was nowhere on display.

  “You don’t want your phone to be like a PC,” Jobs told the New York Times. And “you don’t want your phone to be an open platform,” Jobs told the tech journalist Stephen Levy the day of the iPhone launch. “You don’t want it to not work because one of three apps you loaded that morning screwed it up. Cingular doesn’t want to see their West Coast network go down because of some app. This thing is more like an iPod than it is a computer in that sense.”

  The first iPhone shipped with sixteen apps, two of which were made in collaboration with Google. The four anchor apps were laid out on the bottom: Phone, Mail, Safari, and iPod. On the home screen, you had Text, Calendar, Photos, Camera, YouTube, Stocks, Google Maps, Weather, Clock, Calculator, Notes, and Settings. There weren’t any more apps available for download and users couldn’t delete or even rearrange the apps. The first iPhone was a closed, static device.

  The closest Jobs came to hinting at the key function that would drive the iPhone to success was his enthusiasm for its mobile internet browser, Safari. Most smartphones offered what he called “the baby internet”—access to a text-based, unappealing shadow of the multimedia-rich glories of the web at large. Safari let you surf the web for real, as he demonstrated by loading the New York Times’ website and clicking around. But letting developers outside Apple harness the iPhone’s new platform was off the menu.
  “Steve gave us a really direct order that we weren’t going to allow third-party developers onto our device,” Andy Grignon, a senior engineer who worked on the iPhone, says. “And the primary reasons were: it’s a phone above anything else. And the second we allow some knucklehead developer to write some stupid app on our device, it could crash the whole thing and we don’t want to bear the liability of you not being able to call 911 because some badly written app took the whole thing down.”

  Jobs had an intense hatred of phones that dropped calls, which might have driven his prioritizing the phone function in those early days.

  “I was around when normal phones would drop calls on Steve,” Brett Bilbrey, who served until 2013 as the senior manager of Apple’s Advanced Technology Group, recalls. “He would go from calm to really pissed off because the phone crashed or dropped his call. And he found that unacceptable. His Nokia, or whatever it was that he was using at the time, if it crashed on him, the chances were more than likely that he’d fling and smash it. I saw him throw phones. He got very upset at phones. The reason he did not want developer apps on his phone is he did not want his phone crashing.”

  But developers persisted in trying, even before the phone was launched. Many had been developing Mac apps for years and were eager to take a crack at the revolutionary-looking iPhone system. They aimed blog posts and social-media entreaties at Apple, asking for developer access.

  So, just weeks before the release of the first iPhone, at Apple’s annual developer’s conference in San Francisco, Jobs announced that they would be doing apps after all—kind of. Using the Safari engine, developers could write web 2.0 apps “that look exactly and behave exactly like apps on the iPhone.”

  John Gruber, perhaps the best-known Apple blogger and a developer himself, explained that the “message went over like a lead balloon.” Indeed. “You can’t bullshit developers.… If web apps—which are only accessible over a network; which don’t get app icons in the iPhone home screen; which don’t have any local data storage—are such a great way to write software for iPhone, then why isn’t Apple using this technique for any of their own iPhone apps?” He signed off that blog post with a blunt assessment: “If all you have to offer is a shit sandwich, just say it. Don’t tell us how lucky we are and that it’s going to taste delicious.”

  Even the iPhone engineers concurred. “They kind of opened it up to web developers, saying, ‘Well, they’re basically apps, right?’” Grignon says. “And the developer community was like, ‘You can eat a dick, we want to write real apps.’”

  So, developers were irked. And by the time it debuted, at the end of June 2007, other smartphones already allowed third-party apps. Developers did try making those web apps for the iPhone, but, as Steve might have said himself, they mostly sucked. “The thing with Steve was that nine times out of ten, he was brilliant, but one of those times he had a brain fart, and it was like, ‘Who’s going to tell him he’s wrong?’” Bilbrey says.

  The demand for real, home-screen-living, iPhone-potential-exploiting apps led enterprising hackers to break into the iOS system for no reason other than to install their own apps on it.

  Hackers started jailbreaking the iPhone almost immediately. (More on that later.) Essentially, the iPhone was one of the most intuitive, powerful mobile-computing devices ever conceived—but it took an enterprising group of hackers to let consumers really use it like one. Their exploits were regularly covered by tech blogs and even the mainstream media, demonstrating to the public—and to Apple—a thirst for third-party apps.

  With demonstrable public demand, executives and engineers behind the iPhone, especially senior vice president of iOS software Scott Forstall, started pushing Jobs to allow for third-party apps. The engineers behind the operating system and the apps that shipped with the original iPhone had already cleared the way internally for a third-party app-development system. “I think we knew we’d have to do it at some point,” Henri Lamiraux, vice president of iOS software engineering, tells me. But, he says, in the initial rush to ship the first iPhone, “We didn’t have time to do the frameworks and make the API clean.” An API, or application program interface, is a set of routines, protocols, and tools for writing software applications. “It’s something we are very good at, so you’re very careful what you make public.”

  “So at the beginning we were modeling the phone after the iPod,” Nitin Ganatra says. “Everything that you could do was built into the thing.”

  Still, it was decided early on that the functionalities—email, web browser, maps—would be developed as apps, in part because many were ported over from the Mac OS. “We had created the tools to make it so that we could make these new apps very quickly,” Ganatra says. “There was an understanding internally that we don’t want to make it so there’s this huge amount of setup just to build the next app that Steve thinks up.”

  In other words, they had an API for iOS app development more or less waiting in the wings, even if it was unpolished at launch time. “In hindsight, it was awesome that we had worked that way, but very early on we were mostly just doing that out of convenience for ourselves,” Ganatra says.

  If months of public outcry from developers, concerted jailbreaking operations from hackers, and mounting internal pressure from Apple’s own executives weren’t enough, there was one more key ingredient that Jobs and any other pro-closed-system executives surely noticed.

  “The iPhone was almost a failure when it first launched,” Bilbrey says. “Many people don’t realize this. Internally, I saw the volume sales—when the iPhone was launched, its sales were dismal. It was considered expensive; it was not catching on. For three to six months. It was the first two quarters or something. It was not doing well.”

  Bilbrey says the reason was simple. “There were no apps.”

  “Scott Forstall was arguing with Steve and convinced him and said, Look, we’ve got to put developer apps on the phone. Steve didn’t want to,” he says. “If an app took out the phone while you were on a phone call that was unacceptable. That could not happen to an Apple phone.”

  “Scott Forstall said, ‘Steve, I’ll put the software together, we’ll make sure to protect it if a crash occurs. We’ll isolate it; we won’t crash the phone.’”

  In October 2007, about four months after the iPhone launched, Jobs changed course.

  “Let me just say it: We want native third party applications on the iPhone, and we plan to have an SDK [software developer’s kit] in developers’ hands in February,” Steve Jobs wrote in an announcement posted to Apple’s website. “We are excited about creating a vibrant third party developer community around the iPhone and enabling hundreds of new applications for our users.” (Note that even then, Jobs, along with almost everyone, was underestimating the behemoth the app economy would become.)

  Even members of the original iPhone crew credit the public campaign to open Apple’s walled garden for changing Jobs’s mind.

  “It was the obvious thing to do,” Lamiraux says. “We realized very quickly we could not write all the apps that people wanted. We did YouTube because we didn’t want to let Google do YouTube, but then what? Are we going to have to write every app that every company wants to do?”

  Of course not. This was arguably the most important decision Apple made in the iPhone’s post-launch era. And it was made because developers, hackers, engineers, and insiders pushed and pushed. It was an anti-executive decision. And there’s a recent precedent—Apple succeeds when it opens up, even a little.

  “The original iPod was a dud. There were not a lot sold. iPod really took off when they added support in iTunes in Windows, because most people didn’t have Macs then. This allowed them to see the value of the Apple ecosystem. I’ve got this player, I’ve got a music store, I’ve got a Windows device that plays well. That’s when iPod really took off. And you could say the same thing for the phone,” Grignon says. “It got a lot of rave reviews, but where it became a cultural game-changer was after they allowed developers in. Letting them build their software. That’s what it is, right? How often do you use your phone as an actual phone? Most of the time, you’re sitting in line at the grocery store Tweeting about who-gives-a-fuck. You don’t use it as a phone.”

  No, we use the apps. We use Facebook, Instagram, Snapchat, Twitter. We use Maps. We text, but often on nonnative apps like Messenger, WeChat, and WhatsApp too.

  In fact, of the use cases that Apple imagined for the product—phone, iPod, internet communicator—only one truly made the iPhone transformative.

  “That’s how Steve Jobs used his product,” says Horace Dediu, the Apple analyst. If you think about how people spend time on their mobile devices today, you’re not going to think about phone calls. “The number-one job is social media today. Number two is probably entertainment. Number three is probably directions or maps or things of that nature,” Dediu says. “The basic communications of emails and other things was there, but the social media did not exist at the time. It had been invented, but it hadn’t been put into mobile and hadn’t been transformed. Really, it was Facebook, if I may use them sort of metaphorically, that had figured out what a phone is for.”

  The developer kit came out late winter of 2008, and in the summer, when Apple released its first upgrade of the iPhone, the iPhone 3G, it launched, at long last, the App Store. Developers could submit apps for an internal review process, during which they would be scanned for quality, content, and bugs. If the app was approved and if it was monetized, Apple would take a 30 percent cut. And that was when the smartphone era entered the mainstream. That’s when the iPhone discovered that its killer app wasn’t the phone, but a store for more apps.

  “When apps started showing up on the phone, that was when the sales numbers took off,” Bilbrey says. “The phone all of a sudden became a phenomenon. It wasn’t the internet browser. It wasn’t the iPod player. It wasn’t the cell phone or anything like that.”

 
