The One Device

by Brian Merchant


  A lot of that work is coming from development groups or companies who want apps to reach an international audience. I met a young man named Kennedy Kirdi who worked for iHub’s consulting arm and was building an app, funded by the UN, to help park rangers fight poachers.

  And a lot of it was top-down—the U.S., investors, developers, and well-meaning charities saw the idea of the app as a handy vessel for motivating change and growth in less-developed areas. Africa had become famous for leap-frogging landlines and for its mass adoption of cell phones, so a lot of the app-boom ideology was initially copy-and-pasted onto the region, even though most Kenyans didn’t have smartphones yet.

  “The market share wasn’t big enough for a smartphone economy to work,” Eleanor Marchant, a University of Pennsylvania scholar who is embedded with iHub, tells me. “App development was faddish.

  “Perception really influences how things get built,” she says, “even if it wasn’t an accurate perception. There was this perception of Kenya being really good at mobile. There ended up being a false equivalency,” Marchant says, and a lot of early, donor-funded start-ups didn’t pan out. There hadn’t been another Ushahidi or M-Pesa for years. “Even, for example, the M-Pesa interface itself, it’s very slow, designed to be used on an old Nokia phone, where apps don’t work or don’t really work. It was a text-based intervention. And it still exists that way.”

  In other words, the idea of a mobile revolution or an app-based revolution ported poorly from the U.S. or Europe, where it was a cultural phenomenon, to Kenya, where the reality was much different. For one thing, smartphones were simply not widely affordable yet.

  “It just didn’t work on the ground,” Kinyamu says. I’d gone to visit Kinyamu at his new tech space, the Founder’s Club, which was about to host its grand opening. A pair of monkeys ran up and down the trees in the parking lot, playfully swinging around the branches.

  “We had a mix of both, say, locals, who either started up in the U.S. and came back, and we had guys from the U.S. or Europe who had come and put together a fund,” he says. “And a lot of those guys actually lost their money. There were just a lot of non-African folks who were very optimistic, excited about Africa… trying out lots of things, from hackathons to system development to boot camps. There was lots of fluff. It was not the right reasons.”

  “It was donor-driven, so they just say, ‘We’re looking for something in energy,’ and they’re just throwing money at things,” he says. “All these things, the UN, NGOs, people who decide, ‘This is what Africa needs.’ There was lots of that money too, and it has to be spent or deployed somehow before 2015. A lot of that was supply-driven. People looked at it more as a charity case. So if you can make—it’s fluffy but it tells a better story.

  “Yeah, you’ve won this one-million grant, but then what?” he says. “That’s where the inequality comes in: There’s no level playing field. If you don’t have the access to the networks, to the conferences, to the ones with the money, who are quite often not based here, that’s it.”

  Part of the problem, he says, was that investors and donors tried to import the Silicon Valley mentality to Nairobi. “Before then, it was expats lecturing guys. You know: ‘This is how it works in Silicon Valley, it should work for you.’”

  But Kenyans couldn’t simply make a killer app and expect investors to notice it and acquire it for millions like they would in San Francisco. “People just didn’t know what they were signing up for. Entrepreneurship here is more about survival. It’s really hard. Start-ups are hard. But it’s harder here. Because you’re fixing things that the government should be able to do.”

  So the story of apps shifted from generalized, world-changing apps to more localized functions. “A lot of new things have in a way been localized, to put into context and apply here. It’s more of, before you build programs, you understand which parts of the system you’re trying to fix. You kind of have to give yourself a year of some runway. You’re thinking revenue very early on. So the customer is your only source of money. And so the really solid founders you get are really doing it for food. It’s about, I need to make X to pay rent, to pay my two or three guys and keep it running. I’m not doing it for the acquisition,” Kinyamu says.

  Thus, the newest wave of start-ups seem targeted at distinctly Kenyan interests. One app that kept coming up in conversations there was Sendy. “It’s basically using the little motorcycles you see around here, turning them into delivery nodes,” Hersman says. “They’re moving up the chain into their Uber round.” Kinyamu is trying to raise awareness about Kenya’s deficient infrastructure with the Twitter-based platform What Is a Road?, which encourages users to document potholes and creates an ongoing database. And one of the fastest-rising and most successful apps of late is a way for roadside fruit vendors to coordinate with farmers. BRCK, Hersman and Rotich’s latest venture, is a hardy mobile router that can give smartphones Wi-Fi in remote locations.

  It helps that today, 88 percent of Kenyans have cell phones, and according to Human IPO, 67 percent of all new phones sold in Kenya are smartphones—one of the fastest adoption rates on the continent. The government has approved funds for a controversial Konza City, a fourteen-billion-dollar “techno city” that would ideally support up to 200,000 IT jobs. IT accounts for 12 percent of the economy, up from 8 percent just five years ago. Apple doesn’t have much of a presence here, though the entrepreneurs I meet tell me that founders and other players wield iPhones as status symbols and brandish them about in important meetings.

  And many developers seem intent on bridging profit and social entrepreneurship. “There is a sense of awakening that, hey, the binary operative of looking at start-ups as either for profit or for social entrepreneurship is flawed,” Nelson Kwame says. “There is a sense that SE needs to have profit, at least for sustainability.” And it goes the other way. “Beyond just making a lot of money. The whole system is moving towards this synergy.” He estimates that just 30 percent of new start-ups follow the donor-funded, social-enterprise model.

  Some of the core values of making apps are becoming more prominent—instead of going after grants exclusively, developers like Kwame are focusing more on what interests them. “When you build something and people use it, it’s like, ‘Whoa,’” he says. “After a while, there’s a sense of recognition.”

  The story of the “mobile revolution” I found in Nairobi was anything but clear-cut; as always, it was a web of some forward-thinking innovation—by citizen bloggers and the national telecom—good marketing and storytelling, and slow progress that anointed the city as the Silicon Savanna. And while the actual impact of that designation is complex, and the app economy probably hasn’t benefited most Kenyans much, the perception, both from inside and outside the iHub walls, has imbued its tech scene with a sense of drive and identity.

  These are not well-off kids who trek to San Francisco to try to do an app and change the world. Most of the ones I met were extremely smart, ambitious developers (if they were in the valley of silicon instead of the savanna, they’d probably be millionaires) working, paycheck to paycheck, to bring that promised change closer to home.

  Apps have, at the very least, reshaped the way people think about the delivery of software and core services. But there’s another thing about the app economy: It’s almost all games. In 2015, games accounted for 85 percent of the App Store’s revenue, clocking in at $34.5 billion. This might not be a surprise, as some of the most popular apps are games; titles like Angry Birds or Candy Crush are impossibly ubiquitous. It’s just that the app economy is often touted in its revolutionary, innovation-ushering-in capacity, not as a place to squander hours on pay-to-play puzzle games.

  The App Store also became a place where lone-wolf operations can mint overnight viral successes—a Vietnamese developer, Dong Nguyen, created a simple, pixelated game called Flappy Bird that rapidly became one of the App Store’s most feverishly downloaded apps. The game was difficult and now iconic; its rapid rise spawned wide discussion across the media. The game was estimated to be making fifty thousand dollars a day through the tiny ad banners displayed during play.

  The other major-grossing segment of the app market besides games is subscription services. As of the beginning of 2017, Netflix, Pandora, HBO Go, Spotify, YouTube, and Hulu all ranked among the twenty top-grossing apps on the App Store. Apart from Tinder, the dating app, the rest were all games.

  This too, like a fluency with gesture-based multitouch, like the omnipresence of our cameras, like social networking, is a freshly permanent element of our lives that we can’t ignore; you always have the option to dissolve your senses in a mind-obliterating app, tapping the screen and winning yourself tiny little dopamine rushes by beating levels. For all the talk about new innovative apps revolutionizing the economy, just bear in mind that when we actually put our money where our mouth is, 85 percent of the time, it’s paying for distractions.

  Which isn’t to say there aren’t great apps that can help the device serve as the knowledge manipulator Alan Kay once imagined. Or plenty of free ones that offer worthy cultural contributions without generating much revenue. But an awful lot of the app money is going to games and streaming media—services that are engineered to be as addictive as possible.

  Almost as soon as Flappy Bird rose to international fame, Nguyen decided to kill it. “Flappy Bird was designed to play in a few minutes when you are relaxed,” he told Forbes in an interview. “But it happened to become an addictive product. I think it has become a problem.” Critics couldn’t fathom why he would give up a revenue stream of fifty thousand dollars a day. He had built the app himself; it was making him rich. But he was stressed and guilt-ridden. The app was too addictive. “To solve that problem, it’s best to take down Flappy Bird. It’s gone forever.”

  Which brings us back to Alan Kay. “New media just ends up simulating existing media,” he says. “Moore’s law works. You’re going to wind up with something that is relatively inexpensive, within consumer buying habits. That can be presented as a convenience for all old media and doesn’t have to show itself as new media at all,” Kay says. “That’s basically what’s happened.” Which explains why we’re using our bleeding-edge new devices to stream sitcoms and play video games.

  “So if you look at what the computer is actually good for, what’s important or critical about it, well, it’s not actually that different than the printing press,” he says. And that “only had to influence the autodidacts in Europe, and it didn’t know where they were. So it needed to mass-produce ideas so a few percent of the population could get them. And that was enough to transform Europe. Didn’t transform most people in Europe; it transformed the movers and shakers. And that’s what’s happened.”

  Coincidentally, some observers have noted that the once-equal-opportunity App Store now caters more to big companies who can advertise their apps on other platforms or through already popular apps on the App Store.

  And the immense, portable computing power of the iPhone is, by and large, being used for consumption. What do you use your phone for most? If you fit the profile of the average user, per Dediu, then you’re using it to check and post on social media, consume entertainment, and as a navigation device, in that order. To check in with your friends, to watch videos, to navigate your surroundings. It’s a stellar, routine-reordering convenience and a wonderful source of entertainment. It’s a great accelerator. As Kay says, “Moore’s law works.” But he also laments the fact that the smartphones are designed to discourage more productive, creative interaction. “Who’s going to spend serious time drawing or creating art on a six-inch screen? The angles are all wrong,” he says.

  What Kay is talking about is more a philosophical problem than a hardware problem. He says we have the technological capacity, the ability, to create a true Dynabook right now, but the demands of consumerism—specifically as marshaled by tech companies’ marketing departments—have turned our most mobile computers into, fittingly, consumption devices. Our more powerful, more mobile computers, enabling streaming audio, video, and better graphics, have hooked us. So what could have been done differently?

  “We should’ve included a warning label,” he says with a laugh.

  CHAPTER 9

  Noise to Signal

  How we built the ultimate network

  From the top of a cell phone tower, hundreds of feet above this wide-open field, you’d swear you can see the curvature of the Earth itself, the way the horizon stretches off into the distance and gently bends out of sight. At least, that’s what it looks like on YouTube—I’ve never been anywhere near the top of a cell tower, and I’m not about to anytime soon.

  In 2008, the head of the Occupational Safety and Health Administration, Edwin Foulke, deemed tower climbing—not coal mining, highway repair, or firefighting—“the most dangerous job in America.”

  It’s not hard to see why. The act of scaling a five-hundred-foot tower on a narrow metal ladder, with a thirty-pound toolkit dangling below you on a leash and only a skimpy safety tie standing between you and a sudden plunge at terminal velocity, is, well, inherently terrifying. (It’s also the reason that white-knuckle videos like this—tower-climber-with-a-GoPro is a surprisingly vibrant subgenre on YouTube—rack up millions of views.) Yet people do it every day to maintain the network that keeps our smartphones online.

  After all, our iPhones would be nothing without a signal.

  We only tend to feel the presence of the networks that grant us a near-constant state of connectivity to our phones when they’re slow, or worse, absent altogether. Cell coverage has become so ubiquitous that we regard it as a near-universal expectation in developed places: We expect a signal much the way we expect paved roads and traffic signage. That expectation increasingly extends to wireless data and, of course, to Wi-Fi—we’ve also taken to assuming homes, airports, chain coffee shops, and public spaces will come with wireless internet access. Yet we rarely consider the human investment it took—and still takes—to keep us all online.

  As of 2016, there were 7.4 billion cell phone subscribers in a world of 7.43 billion people (along with 1.2 billion LTE wireless data subscribers). And the fact that they can call each other with relative ease is a colossal political, infrastructural, and technological achievement. For one thing, it means a lot of cell towers—in the United States alone, there are at least 150,000 of them. (That’s an industry group estimate; good tower data is scarce as it’s hard to keep track of them all.) Worldwide, there are millions.

  The roots of these vast networks go back over a century, to when the technology first emerged, bankrolled by nation-states and controlled by monopolies. “For the better part of the twentieth century,” says Jon Agar, a historian who investigates the evolution of wireless technologies, telecoms worldwide were “organized by large single, national or state providers of the technologies,” telecoms like the one run by Bell Telephone Company, started by our old friend Alexander Graham Bell in the 1890s and that “most valuable patent”—which helped it become the largest corporation in U.S. history by the time it was broken up in 1984. “The transition from that world to our world, of many different types of individual private corporations,” Agar says, is how we end up at the iPhone. It was the iPhone that “broke the carriers’ backs,” as Jean-Louis Gassée, a former Apple executive, once put it.

  The Italian radio pioneer Guglielmo Marconi built some of the first functioning wireless-telegraphy transmitters, and he did so at the behest of a well-heeled nation-state—Britain’s Royal Navy. A wealthy empire like that was pretty much the only entity that could afford the technology at the time. So Britain fronted the incredible costs of developing wireless communication technology, mostly in the interest of enabling its warships to keep in contact through the thick fog that routinely coated the region. There aren’t many other ways to develop such large-scale, infrastructure-heavy, expensive, and difficult-to-organize networks: You need a state actor, and you need a strong justification for the network—for instance, in the U.S., police radio.

  Not long after Bell Labs had announced both the invention of the transistor and cell phone technology (its creators thought the network layout looked like biological cells, hence the term “cellular”), the federal government was among the first to embrace it. That’s why some of the earliest radiotelephones were installed in American police cars; officers used them to communicate with police stations while they were out on patrol. You’re probably still familiar with the vestiges of this system, which continues to find use, even in the age of digital communications—in what other vehicle do you still typically picture radio-dispatcher equipment?

  Wireless technology remained the province of the state through most of the 1950s, with one emerging exception: wealthy business folk. Top-tier mobile devices might seem expensive now, but they’re not even in the same league as the first private radio-communications systems, which literally cost as much as a house. The rich didn’t use radio to fight crime, of course. They used it to coordinate with their chauffeurs and personal drivers, and for business.

  By 1973, the networks were broad and the technology advanced enough that Motorola’s Martin Cooper was able to debut the first prototype mobile phone handset, famously making a public call on the toaster-size plastic cell. But the only commercially available mobile phones were car-based until the mid-1980s arrival of Motorola’s DynaTAC—the series of phones Frank Canova would experiment with to create the first smartphone. These were still ultra-expensive and rare, meeting the demands of a narrow niche—the rich futurist businessman, or Gordon Gekko in Wall Street, pretty much. There wouldn’t be a serious consumer market for mobile phones until the 1990s.

 