The One Device

by Brian Merchant


  As I’m walking through the flurry of lithium flakes, a hard hat on and earplugs in, past salt-jammed pipes and furiously wobbling pumps, I’m struck by the fact that much of the world’s battery power originates right here. I reach out and grab a palmful, spread it around in my hands. I’m touching one part of the tangled web of a supply chain that creates the iPhone—all this just to refine a single ingredient in the iPhone’s compact and complex array of technology.

  From here, the lithium will be shipped from a nearby port city to a battery manufacturer, probably in China. Like most of the component parts in the iPhone, the li-ion battery is manufactured overseas. Apple doesn’t make its battery suppliers public, but a host of companies, from Sony to Taipei-based Dynapack, have produced them over the years.

  Even today, the type of battery that will roll off their assembly lines isn’t insanely more complicated than Volta’s initial formulation; the battery in the iPhone 6 Plus model, for instance, uses lithium cobalt oxide for the cathode, graphite for the anode, and a polymer for its electrolyte. It’s hooked up to a tiny computer that prevents it from overheating or draining so much of its juice that it becomes unstable.

  “The battery is the key to a lot of the psychology behind these devices,” iFixit’s Kyle Wiens points out. When the battery begins to drain too fast, people get frustrated with the whole device it powers. When the battery’s in good shape, so is the phone. Predictably, the lithium-ion battery is the subject of a constant tug-of-war; as consumers, we demand more and better apps and entertainment, more video rendered in ever-higher res. Of course, we also pine for a longer-lasting battery, and the former obviously drains the latter. And Apple, meanwhile, wants to keep making thinner and thinner phones.

  “If we made the iPhone a millimeter thicker,” says Tony Fadell, the head of hardware for the first iPhone, “we could make it last twice as long.”

  About two hours after departing the world’s largest lithium refinery, Jason and I got our batteries stolen. Along with the stuff they powered. We’d just left the comfy clutches of SQM; our driver had dropped us off at the bus station.

  The complex looks like a dying strip mall, thick with the air of exhausted purgatory that attends midcity bus stations. I was wandering around looking for food, and Jason was watching our stuff. An old man approached him and asked him where the bus that’d just arrived was heading. While they were speaking, an accomplice strapped on my backpack and made haste for the exit. When I returned seconds later, we realized what had happened and ran frantically through the station screaming, “¿Mochila azul?” No dice.

  We lost two laptops, our recording equipment, a backup iPhone 4s, and assorted books and notes. But I didn’t lose this, my book, because I’d set iCloud to save my document files automatically.

  It forced me to report on the rest of the trip using my iPhone alone—voice-recording, note-taking, photo-snapping—which, if I’d had just a little more data storage left, would’ve been entirely comfortable.

  “Phone, wallet, passport,” Jason took to saying as we passed through borders or left hostels, checking off our post-stolen-laptop essentials. “The only three things we have or need.” It became a half-assed mantra; we’d lost a lot of valuable stuff, but we still had all the tools we needed to do everything we’d done before.

  The Chilean police were friendly enough when we reported the crime but basically told us to move on—Apple products were rare and expensive in Chile, and our devices would likely be resold on the black market pronto.

  Influential as li-ion has been, Goodenough believes that a new and better battery—one whose key ingredient is sodium, not lithium—is on the horizon. “We are on the verge of another battery development that will also prove societally transformational,” he says. Sodium is heavier and more volatile than lithium, but cheaper and more easily accessible. “Sodium is extracted from the oceans, which are widely available, so armies and diplomacy are not required to secure the chemical energy in sodium, as is the case of the chemical energy stored in fossil fuels and in lithium,” he says. There’s a nonzero chance that your future iPhone battery will be powered by salt.

  To which the product reviewers of the world will say: Fine, but will our iPhone batteries ever get better? Last longer? Whittingham thinks yes. “I think they could get double what they get today,” he tells me. “The question is, would everyone be willing to pay for it?

  “If you pull an iPhone apart, I think the big question we ask folks like Apple is, Do you want more efficient electronics or more energy density in your battery?” Whittingham says. They can either squeeze more juice into the battery system or tailor electronics to drain less power. To date, they’ve leaned primarily on the latter. For the future, who knows? “They won’t give you an answer. It’s a trade secret,” Whittingham says.

  There have been serious advances in how electronics consume power. For one thing, “every lithium battery has total electronic protection,” Whittingham notes, in the form of the computer that monitors the energy output. “They don’t want you discharging it all the way,” which would fry the battery.

  Batteries are going to keep improving. Not just for the benefit of iPhone consumers, obviously, but for the sake of a world that’s perched on the precipice of catastrophic climate change.

  “The burning of fossil fuels releases carbon dioxide and other gaseous products responsible for global warming and air pollution in cities,” Goodenough mentions repeatedly. “And fossil fuel is a finite resource that is not renewable. A sustainable modern society must return to harvesting its energy from sunlight and wind. Plants harvest sunlight but are needed for food. Photovoltaic cells and windmills can provide electricity without polluting the air, but this electric power must be stored, and batteries are the most convenient storage depot for electric power.”

  Which is exactly why entrepreneurs like Elon Musk are investing heavily in them. His Gigafactory, which will soon churn out lithium-ion batteries at a scale never before seen, is the clearest signal yet that the automotive and electronics industries have chosen their horse for the twenty-first century.

  The lithium-ion battery—conceived in an Exxon lab, built into a game-changer by an industry lifer, turned into a mainstream commercial product by a Japanese camera maker, and manufactured with ingredients dredged up in the driest, hottest place on Earth—is the unheralded engine driving our future machines.

  Its father just hopes that we use the powers it gives us responsibly.

  “The rise of portable electronics has transformed the way we communicate with one another, and I am grateful that it empowers the poor as well as the rich and that it allows mankind to understand the metaphors and parables of different cultures,” Goodenough says. “However, technology is morally neutral,” he adds. “Its benefits depend on how we use it.”

  CHAPTER 6

  Image Stabilization

  A snapshot of the world’s most popular camera

  “Okay, here,” David Luraschi whispers, furtively nodding at a man with longish greasy hair and a wrinkled leather jacket who’s loping toward us on the boulevard Henri-IV. As soon as he passes, Luraschi quietly whips around, holding my iPhone vertically in front of his chest, and starts tapping the screen in rapid bursts. I, meanwhile, shove my hands in my pockets and awkwardly glance around the bustling Paris intersection. I’m trying to look inconspicuous, but feel more like a caricature of a hapless American spy.

  Stalking around Paris with a professional street photographer, I felt like that pretty much all day. I’ve turned my iPhone over to Luraschi, who’s offered to show me how he does what he does, which, I’ve discovered, involves a lot of waiting for an interesting subject to happen by, then tailing that person for as long as is comfortably possible.

  “I walk around with the camera on and earbuds in,” he says. “I use the volume buttons, here, to snap”; this has the added benefit of giving him a bit of a disguise. “Sometimes it’s hard. You have to watch if you’re creeping,” he says, darting a quick smile at me, eyes scanning the crowd. “You don’t want to creep.”

  We circle around the towering Bastille monument, draped in nets and scaffolding. Luraschi homes in on a woman who’s dancing as she walks and snaps a beautiful photo of her, her right hand floating in the air. We pass Parisians of every stripe—stylish twenty-somethings in high heels and trench coats, rumpled men with plucked-at beards, Muslim women in hijabs; Luraschi follows them all.

  Luraschi, a French-American fashion photographer, had, like many artists, initially resisted the rise of Instagram and its focus on image-enhancing filters. “It’s like Steven Spielberg when he throws in a bunch of violins in a film about the Holocaust,” he says. “It’s like, you don’t need that. The world is already pretty stylized.”

  But he eventually joined Instagram and surprised himself when he rose to prominence with a series of photos that shared a strong thematic link: They were all taken from behind, with the subject entirely unaware of the camera. It’s harder than it looks.

  The shots had a powerful symbiosis with social media—perhaps because each faceless photo could have been of just about anyone in the increasingly crowded, often anonymous public spaces of the web. Whatever the reason, the series started attracting hundreds and then thousands of shares and Likes, and before long, he was being touted as an up-and-coming phenom. People from all over the world began emailing him photos they’d taken from behind.

  Instagram is of course one of the iPhone’s most popular and important apps. Mashable, a website obsessed with digital culture, ranks it as the iPhone’s number-one app, bar none. Released in 2010 on the heels of the suspiciously similar Hipstamatic (which pioneered the Millennial-approved photo-filter approach but wasn’t free to download), it quickly developed a massive following and was scooped up by Facebook for a then-outrageous, now-bargain one billion dollars.

  For Luraschi, the Instagram fame translated into more paid work, though the fear in the industry is that free amateur photos will lead to lower salaries and less contract work for professionals. Luraschi doesn’t seem concerned.

  “I’ve always enjoyed experimenting with digital technologies,” he says. “As much as I’m attached to the traditional practice of photography, of shooting on film, I like how the phone doesn’t try to be a digital camera. Voyeurism has always been a big thing of photography, to not be noticed, to get access to somewhere, striking gold,” he tells me.

  “I’ve found that it being mobile and being able to fit in your pocket—it makes it easier.”

  Makes it easier.

  If future archaeologists were to dust off advertisements for the most popular mass-market cameras of the nineteenth and twenty-first centuries, they would notice some striking similarities in the sloganeering of the two periods.

  Exhibit A: You Press the Button, We Do the Rest.

  Exhibit B: We’ve taken care of the technology. All you have to do is find something beautiful and tap the shutter button.

  Exhibit A comes to us from 1888, when George Eastman, the founder of Kodak, thrust his camera into the mainstream with that simple eight-word slogan. Eastman had initially hired an ad agency to market his Kodak box camera but fired them after they returned copy he viewed as needlessly complicated. Extolling the key virtue of his product—that all a consumer had to do was snap the photos and then take the camera into a Kodak shop to get them developed—he launched one of the most famous ad campaigns of the young industry.

  Exhibit B, of course, is Apple’s pitch for the iPhone camera. The spirit of the two campaigns, separated by over a century, is unmistakably similar: both focus on ease of use and aim to entice the average consumer, not the photography aficionado. That principle enabled Kodak to put cameras in the hands of thousands of first-time photographers, and now it describes Apple’s approach to its role as, arguably, the biggest camera company in the world.

  An 1890 article in the trade magazine Manufacturer and Builder explained that Eastman had “the ingenious idea of combining with a camera, of such small dimensions and weight as to be readily portable, an endless strip of sensitized photographic film, so adjusted within the box of the camera, in connection with a simple feeding device, that a succession of pictures may be made—as many as a hundred—without further trouble than simply pressing a button.”

  Kodak’s Brownie was not the first box camera; France’s Le Phoebus preceded it by at least a decade. But Eastman took a maturing technology and refined it with a mass consumer market in mind. Then he promoted it. Here’s Elizabeth Brayer, biographer of George Eastman: “Creating a nation (and world) of amateur photographers was now Eastman’s goal, and he instinctively grasped what others in the photography industry came to realize more slowly: Advertising was the mother’s milk of the amateur market. As he did in most areas of his company, Eastman handled the promotional details himself. And he had a gift for it—almost an innate ability to frame sentences into slogans, to come up with visual images that spoke directly and colorfully to everyone.” Remind you of anyone?

  Kodak set in motion a trend of tailoring cameras to the masses. In 1913, Oskar Barnack, an executive at a German company called Ernst Leitz, spearheaded development of a lightweight camera that could be carried outdoors, in part because he suffered from asthma and wanted to make a more easily portable option. That would become the Leica, the first mass-produced, standard-setting 35 mm camera.

  In the beginning, the 2-megapixel camera that Apple tacked onto its original iPhone was hardly a pinnacle of innovation. Nor was it intended to be.

  “It was more like, every other phone has a camera, so we better have one too,” one senior member of the original iPhone team tells me. It’s not that Apple didn’t care about the camera; it’s just that resources were stretched thin, and it wasn’t really a priority. It certainly wasn’t considered a core feature by its founder; Jobs barely mentions it in the initial keynote.

  In fact, at the time of the phone’s release, its camera was criticized as being subpar. Other phone makers, like Nokia, had superior camera technology integrated into their dumb phones in 2007. It would take the iPhone’s growing user base—and photocentric apps like Instagram and Hipstamatic—to demonstrate to Apple the potential of the phone’s camera. Today, as the smartphone market has tightened into an arms race for features, the camera has become immensely important, and immensely complex.

  “There’s over two hundred separate individual parts” in the iPhone’s camera module, Graham Townsend, the head of Apple’s camera division, told 60 Minutes in 2016. He said that there were currently eight hundred employees dedicated solely to improving the camera, which in the iPhone 6 is an 8-megapixel unit with a Sony sensor, optical image-stabilization module, and a proprietary image-signal processor. (Or that’s one of the cameras in your iPhone, anyway, as every iPhone ships with two cameras, that one and the so-called “selfie camera.”)

  It’s not merely a matter of better lenses. Far from it—it’s about the sensors and software around them.

  Brett Bilbrey was sitting in Apple’s boardroom, trying to keep his head down. His boss at the time, Mike Culbert, was to his right, and the room was half full. They were waiting for a meeting to start, and everyone except Steve Jobs was seated.

  “Steve was pacing back and forth and we were all trying to not catch his attention,” Bilbrey says. “He was impatient because someone was late. And we’re just sitting there going, Don’t notice us, don’t notice us.” Steve Jobs in a bad mood was already the stuff of legend.

  Someone in the meeting had a laptop on the conference table with an iSight perched on top. Jobs stopped for a second, turned to him, looked at the external camera protruding inelegantly from the machine, and said, “That looks like shit.” The iSight was one of Apple’s own products, but that didn’t save it from Jobs’s wrath. “Steve didn’t like the external iSight because he hated warts,” Bilbrey says, “he hated anything that wasn’t sleek and design-integrated.”

  Incidentally, the early iSight had been built by, among others, Tony Fadell and Andy Grignon, two men who would later become key drivers of the iPhone. The poor iSight user froze up.

  “The look on his face was I don’t know what to say. He was just paralyzed,” Bilbrey says. “And without thinking at the moment, I said, ‘I can fix that.’”

  Well.

  “Steve turned to me as if to say, Okay, give me this revelation. And my boss, Mike Culbert, slapped his forehead and said, ‘Oh, great.’”

  A new iMac was coming out, and Apple was currently switching its processor system to chips made by Intel—a huge, top secret effort that was draining the company’s resources. Everyone knew that Jobs was worried that there weren’t enough new features to show off, other than the new Intel chip architecture, which he feared wouldn’t wow most of the public. He was on the lookout for an exciting addition to the iMac.

  Jobs walked over to Bilbrey, the room dead silent, and said, “Okay, what can you do?” Bilbrey said, “Well, we could go with a CMOS imager inside and—”

  “You know how to make this work?” Jobs said, cutting him off.

  “Yeah,” Bilbrey managed.

  “Well, can you do a demonstration? In a couple weeks?” Jobs said impatiently.

  “And I said, yeah, we could do it in a couple weeks. Again I hear the slap of the forehead of Mike next to me,” Bilbrey tells me.

  After the meeting, Culbert took Bilbrey aside. “What do you think you’re doing?” he said. “If you can’t do this, he’s going to fire you.”

  This was hardly a new arena for Brett Bilbrey, but the stakes were suddenly high, and two weeks wasn’t exactly a lot of time. During the 1990s, he’d founded and run a company called Intelligent Resources. Apple had hired him in 2002 to manage its media architecture group. He’d been brought on due to his extensive background in video image processing; Bilbrey’s company had made “the first video card to bridge the computer and broadcast video industries digitally,” he says. Its product, the Video Explorer, “was the first computer video card to support HD video.” Apple had hired him specifically because, like just about every other tech company, it had a video problem. The clumsy external camera was only part of it.

 
