The One Device


by Brian Merchant


  Fortunately, there was a consumer technology out there that already allowed users to do something like that, if not quite in the exact form the ENRI crew was after. In fact, one of Apple’s engineers was using it. Around that time, Tina Huang had shown up to work with an unusual black plastic touchpad marketed to computer users with hand injuries. It was made by a small Delaware-based company called FingerWorks. “At the time I was doing a lot of work that involved me testing with a mouse,” Huang tells me, including a lot of dragging and dropping. “So I was having some wrist troubles and I think that definitely motivated me to get the FingerWorks.”

  The trackpad allowed her to use fluid hand gestures to communicate complex commands directly to her Mac. It let her harness what was known as multitouch finger tracking, and, Chaudhri says, it inspired the group to examine the technology.

  “We kind of started playing around with multitouch and that was the thing that resonated with a lot of people,” Strickon says. He was familiar with the upstart company, and he suggested they reach out.

  “I was like, you know, we’ve actually seen these guys,” Huppi says. They’d been in and out of Cupertino taking meetings over the last couple of years, but had never gotten much traction. FingerWorks was founded by a brilliant PhD student, Wayne Westerman, and the professor advising him on his dissertation. Despite generally agreeing that the core technology was impressive, Apple’s marketing department couldn’t figure out how they would use multitouch, or sell it. “We said, well, it’s time to look at it again,” Huppi says. “And it was like, Wow, they really have figured out how to do this multitouch stuff with capacitive sensing.” It’s impossible to understand the modern language of computing, or the iPhone, without understanding what that means.

  On Touch

  At the time, touch tech was largely limited to resistive screens—think of old ATMs and airport kiosks. In a resistive touchscreen, the display is composed of layers—sheets coated with resistive material and separated by a tiny gap. When you touch the screen with your finger, you press the two layers together; this registers the location of said touch. Resistive touch is often inexact, glitchy, and frustrating to use. Anyone who’s ever spent fifteen minutes mashing their fingers onto a flight-terminal touchscreen only to get flickering buttons or random selections is keenly aware of the pitfalls of resistive touch.

  Instead of relying on force to register a touch, capacitive sensing puts the body’s electrochemistry to work. Because we’re all electrical conductors, when we touch a capacitive surface, it creates a distortion of the screen’s electrostatic field, which can be measured as a change in capacitance and pinpointed rather precisely. And FingerWorks appeared to have mastered the technology.
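
  To make that concrete, here is a minimal sketch, in Python, of the kind of arithmetic a capacitive controller might perform. It is not Apple’s or FingerWorks’ code; the grid size, threshold, and function name are invented for illustration. A finger registers as a localized change against each electrode’s baseline reading, and a weighted centroid of those changes pinpoints the touch.

    # A toy model of capacitive sensing: each electrode in a grid has a baseline
    # reading, and a finger shows up as a localized change against that baseline.
    # The touch location is the weighted centroid of the cells whose change
    # exceeds a noise threshold. All values and names here are illustrative.
    import numpy as np

    def locate_touch(baseline, reading, threshold=5.0):
        """Return the (row, col) of a touch, or None if nothing is pressed."""
        delta = np.abs(reading - baseline)        # change in capacitance per cell
        weights = np.where(delta > threshold, delta, 0.0)
        total = weights.sum()
        if total == 0:
            return None                           # everything within the noise floor
        rows, cols = np.indices(delta.shape)
        # The weighted centroid gives sub-cell precision, which is why a capacitive
        # touch can be located far more exactly than one on a resistive screen.
        return (rows * weights).sum() / total, (cols * weights).sum() / total

    # Simulate a quiet 8x8 sensor with a finger pressing near row 2.5, column 5.5.
    baseline = np.zeros((8, 8))
    reading = baseline.copy()
    reading[2:4, 5:7] += 20.0
    print(locate_touch(baseline, reading))        # roughly (2.5, 5.5)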

  The ENRI team got their hands on a FingerWorks device and found that it came with diagrams detailing dozens of different gestures. Huppi says, “I sort of likened it to a very exotic instrument: not too many people can learn how to play it.” As it had in the past, Apple’s tinkerers saw the chance to simplify. “The core of the idea was there, which was that there were some gestures, like pinch to zoom,” Huppi says. “And two-finger scrolling.”
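
  As a rough illustration of what telling those gestures apart involves, here is a small sketch, again with invented names and thresholds rather than anything from Apple or FingerWorks: given two tracked fingertips in consecutive frames, a change in the distance between them reads as a pinch, while a shared translation reads as a two-finger scroll.

    import math

    def classify_two_finger_gesture(prev, curr, move_thresh=2.0):
        """prev and curr are ((x, y), (x, y)) fingertip pairs from consecutive
        frames; returns a gesture label plus its parameter."""
        (a0, b0), (a1, b1) = prev, curr
        spread0, spread1 = math.dist(a0, b0), math.dist(a1, b1)
        if abs(spread1 - spread0) > move_thresh:
            return "pinch", spread1 / max(spread0, 1e-6)    # zoom factor
        # Average motion of both fingers approximates the scroll delta.
        dx = ((a1[0] - a0[0]) + (b1[0] - b0[0])) / 2
        dy = ((a1[1] - a0[1]) + (b1[1] - b0[1])) / 2
        if math.hypot(dx, dy) > move_thresh:
            return "scroll", (dx, dy)
        return "idle", None

    # Fingers spreading apart: pinch to zoom.
    print(classify_two_finger_gesture(((10, 10), (20, 10)), ((5, 10), (25, 10))))
    # Both fingers moving in the same direction: two-finger scroll.
    print(classify_two_finger_gesture(((10, 10), (20, 10)), ((10, 18), (20, 18))))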

  A new, hands-on approach to computing, free of rodent intermediaries and ancient keyboards, started to seem like the right path to follow, and the ENRI team warmed to the idea of building a new user interface around the finger-based language of multitouch pioneered by Westerman—even if they had to rewrite or simplify the vocabulary. “It kept coming up—we want to be able to move things on the screen like a piece of paper on the table,” Chaudhri says.

  It would be ideal for a trackpad as well as for a touchscreen tablet, an idea long pursued but never perfected in the consumer market—and one certainly interesting to the vets of the Newton (which had a resistive touchscreen) who still hoped to see mobile computing take off.

  And it wouldn’t be the first time a merry band of Apple inventors plumbed another organization for UI inspiration. In fact, Silicon Valley’s premier Prometheus myth is rife with parallels: In 1979, a young squad of Apple engineers, led by Steve Jobs, visited the Xerox Palo Alto Research Center and laid eyes on its groundbreaking graphical user interface (GUI) boasting windows, icons, and menus. Jobs and his band of “pirates” borrowed some of those ideas for the embryonic Macintosh. When Bill Gates created Windows, Jobs screamed at him for stealing Apple’s work. Gates responded coolly: “Well, Steve, I think there’s more than one way of looking at it. I think it’s more like we both had this rich neighbor named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it.”

  While the brass set up meetings with the Delaware touch scholars, the ENRI team set to thinking about how they could start experimenting with multitouch in the meantime, and how splicing FingerWorks tech into a Mac-powered device might work. There was, from the outset, a major obstacle to confront: They wanted to interact with a transparent touchscreen—but FingerWorks had used the technology for an opaque keyboard pad.

  The solution? An old-school hardware hack.

  Rigged

  To find inspiration for a prototype, the team turned to the internet. They found videos of engineers doing direct manipulation, Huppi says, by projecting over the top of an opaque screen. “And we’re like, ‘This is exactly what we’re talking about.’”

  They brought in a Mac, set up a projector to hang over a table, and positioned the trackpad beneath it. The idea was to beam down whatever was being shown on the screen of the Mac, so that it’d become a faux ‘screen’ atop the trackpad. “We basically had this table with a projector over the top, and there’s this trackpad, looking like it was like an iPad sitting on the table,” Huppi says.

  Problem was, it was hard to focus the new “screen.” “I literally went home that day and got some crazy close-up lenses out of my garage, and taped them onto the projector,” Greg Christie says. The lens did the trick. “If you focus everything the right way, you could actually project an image of the screen onto this thing,” Huppi says.

  Finally, they needed a display. For that, they went low-tech. They put a white piece of printer paper over the touchpad, and the touchscreen simulation was complete. Clearly, it wasn’t perfect. “You got a bit of a shadow from your fingers,” Bas Ording says, but it was enough. “We could start exploring what we could do with multitouch.”

  The Mac/projector/touchpad/paper hybrid worked—barely—but they also needed to customize the software if they were going to experiment with the touch dynamics in earnest, and put their own spin on the interface. That’s where Josh Strickon came in. “I was writing a lot of the processing algorithms for doing the finger detection,” Strickon says, as well as “glue software” that gave them access to multitouch data generated by the experiments.

  Freewheeling as it was, the project nonetheless began to take on a secret air. “I don’t remember a day when someone said, ‘Okay, we’re not allowed to talk about it anymore,’” Strickon says, but that day came. With good reason: The ENRI team’s experiment had suddenly become an exciting prospect, but if Steve Jobs found out about it too early and didn’t like what he saw, the whole enterprise stood to get shut down.

  User Testing

  The experimental rig was perfect for its new home—that empty old user-testing facility. It was spacious, about the size of a small classroom. Giant surveillance cameras dangled from the ceilings, and behind the one-way mirror, there was a room that looked like an old recording studio, replete with mixing boards. “I’m sure it was all top-of-the-line stuff back in the eighties,” Huppi says. “We laughed that the video recording equipment was all VHS!” And they needed security clearance to get in—Christie was one of the few in the company who had access at the time.

  “It was kind of a weird space,” Strickon says. “The irony was, we were trying to solve problems about the user experience in a user-testing room without ever being able to bring an actual user into it.”

  Ording and Chaudhri spent hours down there, hammering out demos and designs, building the fundaments of a brand-new interface based entirely on touch. They used Strickon’s data feed to create tweaked versions of the FingerWorks gestures and tested new ideas of their own. They homed in on fixing the ENRI group’s shit list: pinch to zoom replaced the magnifying-glass icon, and a simple flick of the finger replaced click-and-drag scrolling.

  Throughout, their creative partnership made for a powerful symbiosis. “Bas was a bit better with the technical side,” Chaudhri says, “and I could contribute more with the artistic elements.” From an early age, Chaudhri was chiefly interested in how technology intersected with culture. “I wanted to be at one of three places: the CIA, MTV—or Apple,” he says. After interning at Apple’s Advanced Technology Group, he was offered a job in Cupertino. His friends were dubious. “You’ll spend all your time designing little icons,” they said. He laughed them off and took it. “It turns out they’d only be thirty percent right.”

  His talent with icons, however, made him an ideal fit for Bas’s animated experiments. “We worked together quite well,” Ording says. “He was doing more icons and nice graphics—he’s really good at setting up the whole style. I was a little bit better at doing the interactive prototyping and the feel and dynamic parts.” He’s being modest, of course. Mike Slade, a former adviser to Steve Jobs, described Ording as a wizard: “He’d take ninety seconds pecking away, he’d hit a button, and there it was—a picture of whatever Steve had asked for. The guy was a god. Steve just laughed about it. ‘Basification in progress,’ he’d announce.” Ording’s father ran a graphic design company outside of Amsterdam, and he learned to code as a kid—maybe it was in his blood. Regardless, industry giants like Tony Fadell hail him as a visionary. One of his peers from the iPhone days puts it this way: “I don’t know what else to say about Bas, that guy’s a genius.”

  The new touch-based template proved so promising, even exhilarating, that Chaudhri and Ording would pass entire days down there, sometimes without realizing it—UI’s Lennon and McCartney at work.

  “We’d go in with the sun and leave with the moon,” Chaudhri says. “We’d forget to eat. If you’ve ever been in love, not having a single care, that’s what it was like. We knew it was big.”

  “There were no windows, it was kind of like a casino,” Ording says. “So you could look up and it’d be four o’clock and we’d worked right through lunch.”

  They started to shield their work from outsiders, even from their boss, Greg Christie—they didn’t want anything to intrude on the flow and momentum of the progress. “At that point,” Chaudhri says, “we stopped talking to people. For the same reasons that start-ups go into stealth mode.” And they didn’t want it to get shut down before they could effectively demonstrate the full scope of their gestating UI’s potential. Naturally, their boss was irked.

  “I remember we were going to Coachella, and Christie told us, ‘Maybe when you get back from that orgy in the desert, you can tell me what the hell you’re doing down there,’” Chaudhri says.

  They cooked up compelling demos that showcased the potential of multitouch: maps you could zoom and rotate, and pictures that you could bounce around the screen with a quick pull of your fingers. They uploaded vacation photos and subjected them to multitouch experiments. “They were the masters of coming up with the UI stuff,” Huppi says. People would gather around as Ording used two fingers to rotate and zoom in on globs of color, manipulating the pixels with smooth, responsive motion. Ording and Chaudhri say it was already clear that what they were working on had the potential to be revolutionary.

  “Right away, there was something cool about it,” Ording says. “You could play with stuff, and drag stuff around the screen, and it would bounce, or you could pinch zoom, all that kind of stuff.” You know, the kind of stuff that would become the baseline for a newfangled mobile machine-human symbiosis.

  It was time to put some of that genius to the test.

  Showtime

  With a handful of working demos and a reasonably reliable rig in place, Duncan Kerr showed the early prototype to Jony Ive and the rest of the ID group. “It was amazing,” core member Doug Satzger said, sounding taken aback. Among the most impressed was Ive. “This is going to change everything,” he said.

  But he held off on sharing the project with Jobs—the model was in a cumbersome, inelegant conceptual state, and he worried that Jobs would dismiss it. “Because Steve is so quick to give an opinion, I didn’t show him stuff in front of other people,” Ive said. “He might say, ‘This is shit,’ and snuff the idea. I feel that ideas are very fragile, so you have to be tender when they are in development. I realized that if he pissed on this, it would be so sad because I knew it was so important.”

  Just about everyone else was already sold. “It’s just one of those things where instantly anyone who saw it was like, ‘This is the coolest thing I’ve ever seen,’” Huppi recalls. “People’s eyes would light up when they used it, when they saw this thing and played with it. And so we knew that there was something really magical about this.”

  The question was—would Steve Jobs think so too? After all, Jobs was the ultimate authority—he could kill the project with a word if he didn’t see the potential.

  The rig worked. The demos were compelling. They made it clear that instead of clicking and typing, you could touch, drag, toss, and manipulate information in a more fluid, intuitive way.

  “Jony felt it was time to show it to Steve Jobs,” Huppi says. At this point, it was as much a matter of timing as anything else. “If you caught Steve on a bad day, everything he saw was shit, and it was like, ‘Don’t ever show this to me again. Ever.’ So you have to be very careful about reading him and knowing when to show him things.”

  In the meantime, the table-sized rig sat in the secret surveillance lab on Infinite Loop, projecting the shapes of the future onto a blank white sheet of paper.

  CHAPTER 1

  A Smarter Phone

  Simon says, Show us the road to the smartphone

  Stop me if you’ve heard this one before.

  One day, a visionary innovator at one of the world’s best-known technology companies decided that the future of communication lay in combining mobile phones with computing power. The key, he believed, was ensuring this new device functioned intuitively, so a user could pick it up and it would already feel familiar. It would have a touchscreen you could use to control the device with your fingers. It would have an easy-to-navigate home screen filled with icons you could tap to activate. It would have internet access and email. It would have games and apps.

  First, though, a prototype had to be built in time to show it off to the world at a very public demonstration, where the eyes of the media would be fixed on him. In order to meet the deadline, the visionary pushed his team to the breaking point. Tensions mounted. Technology failed, then worked, then failed again. Miraculously, on the day of the big demo, the new hybrid phone was—barely—a go.

  The innovator stepped into the spotlight and promised a phone that would change everything.

  And so the smartphone was born.

  The year was 1993.

  The visionary inventor was Frank Canova Jr., who was working as an engineer in IBM’s Boca Raton, Florida, labs. Canova conceived, patented, and prototyped what is widely agreed to be the first smartphone, the Simon Personal Communicator, in 1992. That was a year before the World Wide Web was opened to the public and a decade and a half before Steve Jobs debuted the iPhone.

  While the iPhone was the first smartphone to go fully mainstream, it wasn’t actually a breakthrough invention in its own right.

  “I really don’t see the iPhone as an invention so much as a compilation of technologies and a success in smart packaging,” says Chris Garcia, a curator at the Computer History Museum, the world’s largest collection of computer-related artifacts. “The iPhone is a confluence technology. It’s not about innovation in any field,” he says.

  The most basic innovation of the smartphone was the introduction of a computer into a device that every household in the nation had access to: the telephone. The choices that were made to render the phone smart, like using a touchscreen interface and foregrounding apps, would have serious ramifications in shaping the modern world. And those foundations were set well over two decades ago.

  As the renowned computer scientist Bill Buxton puts it, “The innovations of the Simon are reflected in virtually all modern touchscreen phones.”

  By 1994, Frank Canova had helped IBM not just invent but bring to market a smartphone that anticipated most of the core functions of the iPhone. The next generation of the Simon, the Neon, never made it to the market, but its screen rotated when you rotated the phone—a signature feature of the iPhone. Yet today, the Simon is merely a curious footnote in computing history. So the question is: Why didn’t the Simon become the first iPhone?

  “It’s all about time frames,” Canova says with a wry smile, holding up the first smartphone. It’s black, boxy, and the size of a brick. “The technologies actually just barely allowed us to make this kind of phone.”

  We’re sitting in his spacious, slightly cluttered office in Santa Clara, the heart of Silicon Valley, in the shadow of the amusement park Great America. Canova now works for Coherent, an industrial laser company, managing a team of engineers. It’s a twenty-minute drive to Cupertino. And he’s holding the third smartphone ever to roll off an assembly line: the Simon, serial number 3. He won’t have it for much longer—historians are finally wising up to its value, and he’s about to ship it off to the Smithsonian.

 
