MEMS architecture
The gyroscope is one of an array of sensors inside your phone that provide it with information about how, exactly, the device is moving through the world and how it should respond to its environment. Those sensors are behind some of the iPhone’s more subtle but critical magic—they’re how it knows what to do when you put it up to your ear, move it horizontally, or take it into a dark room.
To understand how the iPhone finds its place in the universe—especially in relation to you, the user—we need to take a crash course through its most crucial sensors, and two of its location-tracking chips.
When it first launched, the iPhone had only three sensors (not counting the camera sensor): an accelerometer, a proximity sensor, and an ambient-light sensor. In its debut press release, Apple extolled the virtues of one in particular.
The Accelerometer
“iPhone’s built-in accelerometer detects when the user has rotated the device from portrait to landscape, then automatically changes the contents of the display accordingly,” Apple’s press team wrote in 2007, “with users immediately seeing the entire width of a web page, or a photo in its proper landscape aspect ratio.”
That the screen would adjust depending on how you held the device was a novel effect, and Apple executed it elegantly—even if it wasn’t a particularly complex technology (remember that Frank Canova had planned that feature for his canceled follow-up to the Simon, the Neon, in the 1990s).
The accelerometer is a tiny sensor that, as its name implies, measures the rate of acceleration of the device. It isn’t quite as old as the gyroscope: It was initially developed in the 1920s, primarily to test the safety of aircraft and bridges. The earliest models weighed about a pound and “consisted of an E-shaped frame containing 20 to 55 carbon rings in a tension-compression Wheatstone half-bridge between the top and center section of the frame,” according to industry veteran Patrick L. Walter. Those early sensors were used for “recording acceleration of an airplane catapult, passenger elevators, aircraft shock absorbers and to record vibrations of steam turbines, underground pipes and forces of explosions.” The one-pound accelerometer cost $420 in 1930s bucks. For a long time, the test and evaluation community—the industry that tries to make sure our infrastructure and vehicles don’t kill us—was the reason accelerometer tech continued to improve. “This group,” Walter wrote, “with their aerospace and military specifications and budgets, drove the market for 50–60 years.” By the 1970s, Stanford researchers developed the first MEMS accelerometers, and from the 1970s into the 2000s, the automotive industry became an important driver, using them in crash tests for airbag sensors.
And then, of course, the MEMS moved into computing. But before they’d be put in the smartphone, they’d have to make a pit stop.
“The sensors became important,” says Brett Bilbrey. “Like the motion accelerometer. Do you know why it was originally put into Apple devices? It first showed up in a laptop.
“Do you remember the Star Wars light-saber app?” he asks. “People were swinging their laptops around and all that?” In 2006, two years before the idea would be turned into an app that joined the impressive cluster of mostly-useless-but-entertaining novelty apps on the iPhone, there was MacSaber—a useless-but-entertaining Mac app that took advantage of the computer’s new accelerometer. That accelerometer had a purpose beyond enabling laptop light-saber duels, of course; when someone knocked a laptop off a table and it went into freefall, the sensor would detect the drop and park the hard drive’s heads to protect the data.
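The underlying trick is simple enough to sketch. An accelerometer sitting on a table reads roughly 1 g of proper acceleration; in freefall, the whole reading collapses toward zero, and the protection logic only has to watch for that collapse. A minimal Swift sketch of the check, with an illustrative type, threshold, and function name rather than anything from Apple's actual firmware:

```swift
import Foundation

// At rest the sensor reads about 1 g; in freefall the whole vector collapses toward
// 0 g. The names and the 0.2 g threshold here are illustrative assumptions.
struct AccelerationSample {
    let x: Double, y: Double, z: Double        // in units of g

    var magnitude: Double {
        (x * x + y * y + z * z).squareRoot()
    }
}

func isLikelyFreefall(_ sample: AccelerationSample, threshold: Double = 0.2) -> Bool {
    sample.magnitude < threshold
}

// A sample taken mid-fall reads near zero on every axis:
let falling = AccelerationSample(x: 0.02, y: 0.05, z: 0.03)
if isLikelyFreefall(falling) {
    print("Freefall detected: park the drive heads")   // the Sudden Motion Sensor's job
}
```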
So Apple already had a starting point. “Then it was like, well, we want to put more sensors into the phone, we’ll move the accelerometer over to the phone,” Bilbrey says.
The Proximity Sensor
Let’s get back to that original iPhone announcement, which noted that the “iPhone’s built-in light sensor automatically adjusts the display’s brightness to the appropriate level for the current ambient light, enhancing the user experience and saving power at the same time.” The light sensor is pretty straightforward—a port-over from the similar laptop feature—and it’s still in use today. The story behind the proximity sensor, though, is a bit more interesting.
Proximity sensors are how your iPhone knows to automatically turn off its display when you lift it to your ear and automatically revive it when you go to set it back down. They work by emitting a tiny, invisible burst of infrared radiation. That burst hits an object and is reflected back, where it’s picked up by a receiver positioned next to the emitter. The receiver detects the intensity of the light. If the object—your face—is close, then it’s pretty intense, and the phone knows the display should be shut off. If it receives low-intensity light, then it’s okay to shine on.
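In software terms, the decision reduces to a threshold test on that reflected intensity. A minimal sketch, assuming a normalized intensity value and a made-up cutoff; Apple's actual driver is, of course, more involved:

```swift
// The receiver reports how strongly the infrared pulse bounced back; the logic
// reduces to a threshold test. The normalized scale and the 0.8 cutoff are
// assumptions for illustration, not Apple's actual values.
enum DisplayState {
    case on, off
}

func displayState(forReflectedIntensity intensity: Double,
                  nearThreshold: Double = 0.8) -> DisplayState {
    // A strong reflection means something (a cheek, an ear) is against the glass,
    // so the screen should go dark; a weak one means it is safe to stay lit.
    intensity >= nearThreshold ? .off : .on
}

// During a call, each new reading updates the display:
let reading = 0.93                                   // hypothetical normalized value
print(displayState(forReflectedIntensity: reading))  // off
```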
“Working on that proximity sensor was really interesting,” says Brian Huppi, one of the godfathers of the proto-iPhone at Apple. “It was really tricky.”
Tricky because you need a proximity sensor to work for all users, regardless of what they’re wearing or their hair or skin color. Dark colors absorb light, while shiny surfaces reflect it. Someone with dark hair, for instance, risks not registering on the sensor at all, and someone wearing a sequined dress might trigger it too frequently.
Huppi had to devise a hack.
“One of the engineers had really, really dark black hair and, basically, I told them, ‘Go get your hair cut, get me some of your hair, and I’ll glue it to this little test fixture.’” The engineer returned to work with, well, some spare hair. Huppi was true to his word: they used it to test and refine the nascent proximity sensor.
The hair almost completely absorbed light. “Light hitting this was like the worst-case scenario,” Huppi says.
Even when they had the sensor up and running, it was still a precarious affair. “I remember telling one of the product designers, ‘You’re going to need to be really careful about how this gets mechanically implemented because it’s super-sensitive,’” Huppi says. When he ran into the designer months later, he told Huppi, “Man, you were right. We had all sorts of problems with that damn thing. If there was any misalignment, the thing just didn’t work.”
Of course, it eventually did, and it provided another subtle touch that helped more seamlessly and gently fuse the iPhone into daily use.
GPS
Your phone’s proximity to your head can be sussed out by a sensor; determining its proximity to everything else requires a globe-spanning system of satellites. The story of why your iPhone can effortlessly direct you to the closest Starbucks begins, as so many good stories do, with the space race.
It was October 4, 1957, and the Soviets had just announced that they’d successfully launched the first artificial satellite, Sputnik 1, into orbit. The news registered as a global shock as scientists and amateur radio operators around the planet confirmed that the Russians had indeed beaten other world powers into orbit. In order to win this first leg of the space race, the Soviets had eschewed the heavy array of scientific equipment they’d initially intended to launch and instead outfitted Sputnik with a simple radio transmitter.
Thus, anyone with a shortwave radio receiver could hear the Soviet satellite as it made laps around the planet. The MIT team of astronomers assigned to observe Sputnik noticed that the frequency of the radio signals it emitted increased as it drew nearer to them and decreased as it drifted away. This was the Doppler effect at work, and they realized that they could track the satellite’s position by measuring its radio frequency—and, conversely, that a satellite whose position was known could be used to pinpoint their own.
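The arithmetic behind that insight is a first-order Doppler approximation. A rough sketch with illustrative numbers, not the observers' actual calculation:

```swift
// For a transmitter moving at radial velocity v (positive when approaching) relative
// to a receiver, the received frequency shifts by roughly f * v / c. This first-order
// approximation is plenty for an ~8 km/s satellite; the velocities here are made up.
let speedOfLight = 299_792_458.0        // meters per second

func receivedFrequency(transmitted f: Double, radialVelocity v: Double) -> Double {
    f * (1 + v / speedOfLight)
}

// Sputnik transmitted near 20.005 MHz. As the satellite approaches, the tone sits
// high; as it recedes, it sits low; the shift crossing zero marks closest approach.
let base = 20.005e6
let approaching = receivedFrequency(transmitted: base, radialVelocity: 7_000)
let receding = receivedFrequency(transmitted: base, radialVelocity: -7_000)
print(approaching - base, receding - base)   // roughly +467 Hz and -467 Hz
```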
It took the U.S. Navy only two years from that point to develop the world’s first satellite navigation system, Transit. In the 1960s and 1970s, the U.S. Naval Research Laboratory worked in earnest to establish the Global Positioning System.
Geolocation has come a long way since the space race, of course, and now the iPhone is able to get a very precise read on your whereabouts—and on your personal motion, movements, and physical activities. Today, every iPhone ships with a dedicated GPS chip that trilaterates your position from satellite signals and refines the fix with Wi-Fi signals and cell towers to give an accurate reading of your location. It also reads GLONASS, Russia’s Cold War–era answer to GPS.
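Stripped down to a flat plane, trilateration is just intersecting circles: three known points, three measured distances, one position. A toy sketch, with made-up beacon coordinates standing in for satellites and towers; real GPS also works in three dimensions and solves for the receiver's clock error:

```swift
// Three known points and three measured distances pin down one position. The beacon
// coordinates and distances below are invented for illustration.
struct Beacon {
    let x: Double, y: Double
    let distance: Double
}

func trilaterate(_ a: Beacon, _ b: Beacon, _ c: Beacon) -> (x: Double, y: Double)? {
    // Subtracting the circle equations pairwise leaves a 2x2 linear system.
    let a1 = 2 * (b.x - a.x), b1 = 2 * (b.y - a.y)
    let c1 = a.distance * a.distance - b.distance * b.distance -
             a.x * a.x + b.x * b.x - a.y * a.y + b.y * b.y
    let a2 = 2 * (c.x - b.x), b2 = 2 * (c.y - b.y)
    let c2 = b.distance * b.distance - c.distance * c.distance -
             b.x * b.x + c.x * c.x - b.y * b.y + c.y * c.y
    let det = a1 * b2 - a2 * b1
    guard abs(det) > 1e-9 else { return nil }   // collinear beacons: no unique fix
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
}

// Beacons at (0,0), (10,0), and (0,10); the distances put the receiver near (3, 4).
if let fix = trilaterate(Beacon(x: 0, y: 0, distance: 5),
                         Beacon(x: 10, y: 0, distance: 8.062),
                         Beacon(x: 0, y: 10, distance: 6.708)) {
    print("Position: \(fix.x), \(fix.y)")   // approximately 3.0, 4.0
}
```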
The best-known product of this technology is Google Maps. It remains the most popular mapping application worldwide and may be the most popular map ever made, period. It has essentially replaced every previous conception of the word map.
But Google Maps did not, in fact, originate with Google. It began as a project headed up by Lars and Jens Rasmussen, two Danish-born brothers who were both laid off from start-ups in the wake of the first dot-com bubble-burst. Lars Rasmussen, who describes himself as having the “least developed sense of direction” of anyone he knows, says that his brother came up with the idea after he moved home to Denmark to live with his mother.
The two brothers ended up leading a company called Where2 in Sydney, Australia, in 2004. After years of failing to interest anyone in the technology—people kept telling them they just couldn’t monetize maps—they eventually sold the thing to Google, where it would be transfigured into an app for the first iPhone.
It was, probably, the iPhone’s first killer app.
As the tech website the Verge noted, “Google Maps was shockingly better on the iPhone than it had been on any other platform.” Pinch-to-zoom simply made navigating feel fluid and intuitive. When I asked the iPhone’s architects what they thought its first must-use function was, Google Maps was probably the most frequent answer. And it was a fairly last-minute adoption; it took two iPhone software engineers, who had access to Google’s data as part of that long-forgotten early partnership, about three weeks to create the app that would forever change people’s relationship to navigating the world.
Magnetometer
Finally, of the iPhone’s location sensors, there’s the magnetometer. It has the longest and most storied history of all—because it’s basically a compass. And compasses can be traced back at least as far as the Han Dynasty, around 206 B.C.
Now, the magnetometer, accelerometer, and gyroscope all feed their data into one of Apple’s newer chips: the motion coprocessor, a tiny chip that the website iMore describes as Robin to the main processor’s Batman. It’s an untiring little sidekick, computing all the location data so the iPhone’s brain doesn’t have to, saving time and power. The iPhone 6 chip is manufactured by NXP Semiconductors (a Dutch spin-off of Philips), and it’s a key component in so-called wearable functionalities; it tracks your daily footsteps, travel distances, and elevation changes. It’s the iPhone’s internal Fitbit—it knows whether you’re riding your bike, walking, running, or driving. And it could, eventually, know a lot more.
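Apps reach that coprocessor's step log through Apple's Core Motion framework. A minimal sketch of the kind of query a fitness app might make, with the motion-permission prompt and most error handling omitted:

```swift
import CoreMotion

// A minimal sketch of reading the coprocessor's step history through Core Motion;
// this is how an app can recover counts the chip logged while the app wasn't running.
let pedometer = CMPedometer()

if CMPedometer.isStepCountingAvailable() {
    let end = Date()
    let start = end.addingTimeInterval(-24 * 60 * 60)   // the last 24 hours
    pedometer.queryPedometerData(from: start, to: end) { data, error in
        if let data = data {
            print("Steps in the last day: \(data.numberOfSteps)")
        } else if let error = error {
            print("Pedometer query failed: \(error)")
        }
    }
}
```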
“Over the long term, the chip could help advance gesture-recognition apps and sophisticated ways for your smartphone to anticipate your needs, or even your mental state,” writes MIT Technology Review’s David Talbot. Whipping your phone around? It might know you’re angry. And the accelerometer can already interpret a shake as an input mechanism (“shake to undo,” or “shake to shuffle”). Who knows what else our black rectangles might learn about us by interpreting our minor movements and major motions.
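The shake-as-input idea is exposed directly to developers in UIKit: a view controller can override motionEnded(_:with:) and treat a shake the way it treats a tap. A minimal sketch, with a placeholder standing in for whatever action the app wants to undo:

```swift
import UIKit

// A shake arrives as a motion event on the first responder; the undo action here
// is a placeholder for this sketch.
class SketchViewController: UIViewController {
    // Motion events are only delivered to the first responder.
    override var canBecomeFirstResponder: Bool { true }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        becomeFirstResponder()
    }

    override func motionEnded(_ motion: UIEvent.EventSubtype, with event: UIEvent?) {
        guard motion == .motionShake else { return }
        undoLastAction()
    }

    private func undoLastAction() {
        print("Shake detected: undoing the last action")
    }
}
```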
These features are not altogether uncontroversial, however, mostly because they enable constant location tracking and technically can never be turned off. Take the story of Canadian programmer Arman Amin, who inadvertently made waves when he posted to Reddit a story about traveling with his iPhone, shortly after the M chips started showing up in the 5s.
“While traveling abroad, my iPhone cable stopped working so my 5s died completely,” Amin wrote. “I frequently use Argus [a fitness app] to track my steps… since it takes advantage of the M7 chip built into the phone. Once I got back from my vacation and charged the phone, I was surprised to see that Argus displayed a number of steps for the 4 days that my phone was dead. I’m both incredibly impressed and slightly terrified.”
Even after Amin’s battery was too depleted to power the phone, it apparently continued dripping power to the super-efficient M7 chip. It was a stark demonstration of a common fear that has accompanied the rise of the iPhone, and the smartphone in general—that our devices are tracking our every move. It was a reminder that even with your phone off, even with the battery dead, a chip is tracking your steps. And it feeds into the concerns raised by the iPhone’s location services too, a setting that, unless disabled, regularly sends Apple data about your whereabouts.
The motion tracker helps illustrate a defining paradox of the smartphone zeitgeist: we demand always-on convenience but fear always-on surveillance. This suite of technologies, from GPS to accelerometer to motion tracker, has all but eliminated paper maps and rendered giving directions a dying art form. Yet our very physicality—our movements, migrations, relationships to the spatial world—is being uncovered, decoded, and put to use.
Over a century ago, scientists like Foucault built devices to help humanity understand the nature of our location in the universe—those devices still draw crowds of observers, myself among them, who bask in the grand, nineteenth-century demonstration of planetary motion sensing. As the old pendulum swings, the physics it proved out is working to determine the location of the devices in our pockets.
And that science is still advancing.
“For many years my group looked at expanding the sensors of the phone,” Bilbrey says, referring to Apple’s Advanced Technology Group. “There are still more sensors and things I shouldn’t talk about.… We’re going to see more sensors showing up.”
CHAPTER 8
Strong-ARMed
How the iPhone grew its brain
“You want to see some old media?”
Alan Kay grins beneath his gray mustache and leads me through his Brentwood home. It’s a nice place, with a tennis court out back, but given the upper-crust Los Angeles neighborhood it sits in, it’s hardly ostentatious. He shares it with his wife, Bonnie MacBird, the author and actress who penned the original script for Tron.
Kay is one of the forefathers of personal computing; he’s what you can safely call a living legend. He directed a research team at the also-legendary Xerox PARC, where he led the development of the influential programming language Smalltalk, which paved the way for the first graphical user interfaces. He was one of the earliest advocates, back in the days of hulking gray mainframes, for using the computer as a dynamic instrument of learning and creativity. It took imagination like his to drive the computer into the public’s hands.
The finest distillation of that imagination was the Dynabook, one of the most enduring conceptual artifacts of Silicon Valley—a handheld computer that was powerful, dynamic, and easy enough to operate that children could use it, not only to learn but to create media and write their own applications. In 1977, Kay and his colleague Adele Goldberg published Personal Dynamic Media, and described how they hoped it would operate.
“Imagine having your own self-contained knowledge manipulator,” they directed the reader—note the language and the emphasis on knowledge. “Suppose it had enough power to outrace your senses of sight and hearing, enough capacity to store for later retrieval thousands of page-equivalents of reference materials, poems, letters, recipes, records, drawings, animations, musical scores, waveforms, dynamic simulations, and anything else you would like to remember and change.”
Some of the Dynabook’s specs should sound familiar. “There should be no discernible pause between cause and effect. One of the metaphors we used when designing such a system was that of a musical instrument, such as a flute, which is owned by its user and responds instantly and consistently to its owner’s wishes,” they wrote.
The Dynabook, which looks like an iPad with a hard keyboard, was one of the first mobile-computer concepts ever put forward, and perhaps the most influential. It has since earned the dubious distinction of being the most famous computer that never got built.
I’d headed to Kay’s home to ask the godfather of the mobile computer how the iPhone and a world where two billion people owned smartphones compared to what he had envisioned in the 1960s and ’70s.
Kay believes nothing has yet been produced—including the iPhone and the iPad—that fulfills the original aims of the Dynabook. Steve Jobs always admired Kay, who had famously told Newsweek in 1984 that the Mac was the “first computer worth criticizing.” In the 1980s, just before he was fired from his first stint at Apple, Jobs had been pushing an effort to get the Dynabook built. Jobs and Kay talked on the phone every couple of months until Steve’s passing, and Jobs invited him to the unveiling of the iPhone in January 2007.
“He handed it to me afterwards and said, ‘What do you think, Alan—is it worth criticizing?’ I told him, ‘Make the screen bigger, and you’ll rule the world.’”
Kay takes me into a large room. A better way to describe it might be a wing; it’s a wide-open, two-story space. The first floor is devoted to a massive wood-and-steel organ, built into the far side of the wall. The second is occupied by shelves upon shelves of books; it’s like a stylish public library. Old media, indeed.
We’ve spent the last couple of hours discussing new media, the sort that flickers by on our Dynabook-inspired devices in a barrage of links, clips, and ads. The kind that Alan Kay fears, as did his late friend the scholar and critic Neil Postman, whose book Amusing Ourselves to Death remains a salient critique of the modern media environment we’re drowning in. In 1985 Postman argued that as television became the dominant media apparatus, it warped other pillars of society—education and politics, chiefly—to conform to standards set by entertainment.