by Andrew Blum
A few miles later I came over the top of a ridge and was rewarded with the Internet tourist’s jackpot: a fiber-optic regeneration station, housing the equipment that amplified the light signals on their journey across the country. A barbed-wire fence enclosed an area the size of two tennis courts, with a gravel parking lot and three small buildings. When I got out for a closer look, the high desert wind whipped around and slammed the car door shut. Two of the buildings were steel walled, like shipping containers. The third was concrete and stucco, with multiple doors, like a self-storage locker. A zeppelin-shaped fuel tank the size of a sofa stood between them. On the fence was a white sign with red-and-black lettering: NO TRESPASSING. LEVEL 3. IN CASE OF EMERGENCY CALL.... Given that there was nothing around, none of the signal stopped here; all of it was merely received and re-sent by the racks of equipment inside, a necessary pit stop on the photons’ journey through glass.
Directly across the road was an earlier generation’s version of the same idea: an AT&T microwave site, hardened against nuclear attack, its bunkerlike building, bigger than a house, looking sinister with its spindly antennae. The Level 3 encampment looked like the kind of shed you’d find behind a gas station. I remembered the same contrast in Cornwall, between British Telecom’s brutalist bunker and Global Crossing’s more discreet house. I thought of the tilt-up concrete buildings in Ashburn, and the corrugated steel sheds in Amsterdam. The Internet had no master plan, and—aesthetically speaking—no master hand. There wasn’t an Isambard Kingdom Brunel—the Victorian engineer of Paddington Station and the Great Eastern cable ship—thinking grandly about the way all the pieces fit together, and celebrating their technological accomplishment at every opportunity. On the Internet there were only the places in between, places like this, trying to disappear. The emphasis wasn’t on the journey; the journey pretended not to exist. But obviously it did. I climbed up on the car for a better vantage point, doing what I could to reach up into that big sky. There was nobody around, the highway was empty, and there wasn’t another house in sight. It was too windy to hear the hum of the machines.
Prineville was seventy-five more miles down the road. While The Dalles is within spitting distance of Portland, Prineville is tucked away in the middle of Oregon, far from the nearest interstate, in an area so remote that it was among America’s last to be inhabited—until westward settlers noticed gold stuck in the hooves of lost cattle. Prineville is still a cowboy town, home to the Crooked River Roundup, a big annual rodeo. It’s a place that has always fought to get its own, right down to founding a city-owned railroad after the main line passed it by—a nineteenth-century middle-mile network Nolan Young would appreciate. The City of Prineville Railway is still in operation (“Gateway to Central Oregon”), the last municipally owned tracks in the entire country. But Prineville’s biggest struggle came with the recent loss of its last major employer, Les Schwab Tire Centers—known for its “free beef” promotion—to Bend, a booming ski town with twelve Starbucks and a big Whole Foods, thirty miles away. Prineville fought to bring in Facebook, touting tax breaks that would save the company as much as $2.8 million a year, as a perk of its presence in an enterprise zone on the outskirts of town. But where Google insisted on secrecy, Facebook moved into Prineville with fanfare. Driving the town’s main drag, with its intact 1950s roadside architecture, I spotted a WELCOME FACEBOOK sign in a store window.
The data center sits on a butte above town, overlooking the Crooked River. Lining both sides of the wide road leading up to it are more white-and-orange fiber markers, like bread crumbs leading to the data center door. My first impression was of its overwhelming scale—long and low like a truck distribution center on the side of the highway. It was surprisingly beautiful, more visually assertive than almost any piece of the Internet I’d seen, set on a shallow crest like a Greek temple. Where Google’s buildings were aesthetically loose, with loading docks and appendages pointing every which way, Facebook’s seemed tightly rational, a crisp human form in the sagebrush. It sits alone in the empty landscape, a clean concrete slab topped by a penthouse of corrugated steel. At the time I visited it was still under construction, with only the first large data center room finished. Three more phases extended out the back, like a caterpillar growing new body segments—the last one still showing the yellow insulating panels of its endoskeleton. All together the four sections would total three hundred thousand square feet of space, the equivalent of a ten-story urban office tower. Construction on another building the same size would begin soon, and there was room on the property for a third. Across the country in Forest City, North Carolina, Facebook had begun construction on a sister building of the same design—which also happened to be fifty miles from Google’s own massive data center in Lenoir, North Carolina.
Before I visited, I was predisposed to think of these big data centers as the worst kind of factories—black smudges on the virgin landscape. But arriving in Prineville I discovered what an industrial place it was already, from the vast hydroelectric infrastructure of the region to the remnant buildings of the timber industry that dotted the town. The notion of this data center despoiling the landscape was absurd. Prineville has long been a manufacturing town—and at the moment, what it needed most of all was more industry. What amazed me was that the data center had ended up here at all. This enormous building, set down in the brush, was an astonishing monument to the networked world. What’s here is also in Virginia and Silicon Valley—that isn’t surprising—but the logic of the network led this massive warehouse, this huge hard drive, to this particular town in Oregon.
I found Ken Patchett inside, leaning back in a brand-new Aeron chair with its tags still on. He sat at his desk in the sunny, open offices, his white iPhone earbuds stuck in his ears, finishing up a conference call. Before coming to Prineville to manage the data center, he’d held the same job in The Dalles, but it was hard to imagine him at Google. “My dog had more access to me there than my family!” he said. For all the Google-bots’ silence, Patchett was uncensored. He’s an enormous extrovert, with a booming voice and a winking sense of humor. At six feet, four inches, when he put his hard hat on for a walk through the building’s unfinished sections, he looked like the iron workers still on the site. That fit: his job at Facebook wasn’t about shaping information (at least not entirely), but about the proper functioning of this huge machine.
Patchett grew up a military brat and then lived with his grandparents on their farm in New Mexico, milking cows before school. He wanted to be an iron worker or a policeman, but he dropped out of college when he couldn’t play football anymore. He traveled the country for a job managing and servicing equipment at sawmills. “You want to talk about a wood chip, I am the guy to talk about a wood chip,” he said. “And if your wood chips are no good, I can tell you how to make them better.” The work even brought him to Prineville, where as a twenty-four-year-old he installed a chipper in the town mill for the current city manager. He had four kids and got into computers for the money. At a job fair in Seattle in 1998, he heard about a contract position at a Microsoft data center, paying $16 an hour. When he arrived at the place, he realized he’d helped build it—one summer as an iron worker. “I walked in there and was, like, hey, I’ve been in here before! Then I saw this fellow lumber by and I thought he was a janitor, and next thing you know he comes out and it’s the guy who’s going to be interviewing me.” He’d taken a couple of technical classes, and the guy looked over his paperwork and asked, “What does that mean? Why should I hire you?” That taught him a lesson: “I don’t know nothing. I know enough to figure my way around, but what I learned from taking all these classes is I don’t know enough.”
A month later, Microsoft had him running a data center. “Like any good manager, I came in and painted the walls and brought in the plastic flowers.” Then Microsoft moved him into the global networking team, and he celebrated the turn of the millennium sitting on the top of the AT&T building in Seattle with a satellite phone in his hand, in case the world ended. At Google a few years later, he started out managing The Dalles but soon was promoted out and ended up building data centers in Hong Kong, Malaysia, and China. He was in Beijing the day Google pulled out in 2010. “We left some boxes in there but they’re not doing anything, just blinky lights,” he assured me.
As we began talking about Facebook, I told Patchett how I was interested in why this building was here, of all places, seemingly in the middle of nowhere, but he cut me off. “Just because they don’t have this one thing here, does that mean you forsake the whole community?” He shook his head. “Do you say screw ’em, let them eat cake?” It would be naive to think that Facebook came to Prineville to benefit the community—and indeed, Patchett came on board long after the site was chosen. But now that Facebook was there, he was determined for Facebook to be a part of Prineville. Facebook’s ethos was bringing people together—perhaps, occasionally, more than they wanted to be. That extended to the data center. “We’re not here to change the culture, just to integrate and be a part of this,” Patchett offered.
To some extent this was a concerted PR effort to avoid repeating the mistakes that Google made in The Dalles, and the bad press that followed. Where Google had kept everything top secret, threatening legal action against anyone who even spoke its name, Facebook was determined to be wide open to this community. But it came wrapped in a broader statement about the openness of technology. At a press conference soon after I visited Prineville, Facebook launched the Open Compute project, where it shared the schematics of the entire data center, from the motherboard to the cooling system, and challenged others to use it as the starting point for improvement. “It’s time to stop treating data centers like Fight Club,” Facebook’s director of infrastructure declared. But you could also look at the difference between Google and Facebook from the other angle: Facebook played fast and loose with our privacy while Google vehemently protected it. At the least, Patchett was happy to show off Facebook’s data center. “Want to see how this shit really works?” he asked. “This has nothing to do with clouds. It has everything to do with being cold.”
We started out in the glass-walled lobby filled with modern furniture in bright colors and photos of old-time Prinevillians on the wall. Facebook had hired an art consultant to poke around the town archives and choose images to decorate the place. (It made sense to me: If you’re going to spend half a billion on hard drives, why not a few thousand on art?) Patchett leaned in close to one. “Look at the people—how’d you like to make her angry? And look at the hats. Everybody has their own hat,” he said. “They all have their own style.” He winked—that was a Facebook joke.
We passed the conference rooms named for local beers and entered a long, wide hallway with cavernous ceilings, like the stockroom of an IKEA. The overhead lights went on as we walked. Patchett swiped us through another doorway and into the first data center room, still in the process of being turned on. It was spacious and shiny, as big as a hotel ballroom, brand-new and uncluttered. On either side of an open central corridor were narrow aisles formed by high racks of black servers. In scale and shape, and with the concrete floors, the place felt like the underground stacks of a library. But in place of books were thousands of fluttering blue lights. Behind each light was a one-terabyte hard drive; the room contained tens of thousands of them; the building had three more rooms this size. It was the most data I’d ever seen in one place—the Grand Canyon of data.
And it was important stuff. This wasn’t the dry database of a bank or government agency. Somewhere in here was stuff that was at least partly mine—among the most emotionally resonant bits around. But even knowing that, it still felt abstract. I knew Facebook as the thing on the screen, as a surprisingly rich medium for delivering personal news—of friends’ new babies and jobs, health scares and vacations, first days of school and heart-wrenching memorials. But I couldn’t avoid the breathtaking obviousness of what was physically in front of me: A room. Cold and empty. It all seemed so mechanical. What had I handed over to machines—these machines in particular?
“If you blew the ‘cloud’ away, you know what would be there?” Patchett asked. “This. This is the cloud. All of those buildings like this around the planet create the cloud. The cloud is a building. It works like a factory. Bits come in, they get massaged and put together in the right way, then packaged up and sent out. But everybody you see on this site has one job, that’s to keep these servers right here alive at all times.”
To minimize energy usage, the temperature in the data center is controlled with what amounts to a swamp cooler, rather than normal air conditioners. Cool outside air is let into the building through adjustable louvers near the roof; deionized water is sprayed into it; and fans push the conditioned air down onto the data center floor. “When the fans aren’t on, and the air isn’t being sucked through here—it’s like a real cloud, dude,” Patchett said. “I fogged this whole place up.” Given Prineville’s cold, dry climate, the cooling is free for most of the year. We stood beneath a broad hole in the ceiling, almost big enough to call an atrium. Daylight was visible along its upper edges. “If you stand right here and look up, you can see the fan bank,” Patchett said. “The air hits this concrete floor and roils left and right. This whole building is like the Mississippi River. There’s a huge amount of air coming in, but moving really slowly.”
We went out the far end of the huge room and into another wide hallway. “Here’s my own personal storage room for stuff I don’t really need,” Patchett said. “And here’s a bathroom I had no idea was back here until they put that sign on it.” Behind another door was the second large data center space, a match to the one we’d just crossed but filled with server racks in various states of assembly. Behind this room would be two more like it—phases A, B, C, and D, ready for growth. Equipment had been arriving by the truckload every day. “We swarm on it like little server fairies, and by the morning, whhheeeee, there’s all the blinky lights arranged in a nice order,” Patchett said.
“But you’ve got to understand what your growth curve looks like. You want to make sure you don’t overbuild. You want to be 10 percent ahead, although you’re always 10 percent behind. But I’d rather be 10 percent behind than have a half a billion dollars of data center space sitting there.” That reminded me that Patchett had the keys to Facebook’s biggest single line item. The social network had recently raised a billion and a half dollars through a controversial private offering, orchestrated by Goldman Sachs. A significant fraction of that ended up in the back of a semitruck, chugging up this hill. But Patchett had been around long enough to be wary. “The Internet is a fickle thing,” he said. “She’s a crazy lady! So don’t spend everything you’ve got up front because you may or may not use it. People get a swinging dick complex. Google built these monstrous data centers that are empty, you know why? Because it’s fucking cool!”
After lunch, we climbed into Patchett’s huge pickup truck and drove out on a dirt road through the woods behind the data center. Above us was the spur power line that Facebook had built off the main branch—a onetime expense that would pay for itself many times over. At a wide spot in the road, I could see the main lines running off into the northwest in the direction of The Dalles—and Portland, Seattle, Asia. At the edge of a bluff, we got out of the truck and looked out over the town of Prineville and toward the Ochoco Mountains. Immediately below us was an old sawmill, with a new power cogeneration plant built a decade ago but never used. “I think it’s important to think about locally significant stuff when you’re here,” Patchett said. “What if when we’re all grown up and ready to do something, what if we help get that back up and deliver twenty megawatts of power?” It wasn’t a real plan, only a dream. In fact, Facebook had come under fire from Greenpeace for relying too heavily on coal power. But for Patchett, it was tied up in a broader vision about the future of data centers, and America. “If you lose rural America, you lose your infrastructure and your food. It’s incumbent on us to wire everybody, not just urban America. The 20 percent of the people living on 80 percent of the land will be left behind. Without what rural America provides to urban America, urban America couldn’t exist. And vice versa. We have this partnership.” If in Oregon that was once about timber and beef, it now extended to data, of all things. The Internet was unevenly distributed. It wasn’t everywhere at all—and the places where it wasn’t suffered for it.
We climbed back in the truck and bounced back toward the data center, which emerged out of the woods like an ocean liner. Patchett was fiddling with his iPhone as he drove along the rutted road. “I just got an email,” he said. Testing on the data center was done. “We are live on the Internet right now.”
Epilogue
As everyone from Odysseus on down has pointed out, a journey is really only understood upon arriving home. But what did that mean when the place I was coming home from was everywhere?
The morning I left Oregon, I’d opened my laptop in the airport lounge to write some emails, read a few blog posts, and do the things I always do while sitting in front of the screen. Then, even more strangely, I did the same thing on the plane, paying the few bucks for the inflight Wi-Fi, flying above the earth but still connected to the grid. It was all one fluid expanse, the vast continent be damned—on the Internet’s own terms, at least.
But I hadn’t traveled tens of thousands of miles, crossed oceans and continents, to believe that was the whole story. This may not have been the most arduous of journeys—the Internet settles in mostly pleasant places—but it was a journey nonetheless. The science fiction writer Bruce Sterling voiced a popular sentiment when he wrote, “As long as I’ve got broadband, I’m perfectly at ease with the fact that my position on the planet’s surface is arbitrary.” But that ignores too much of the reality of how most of us live in the world. We’re not merely connected, but rooted.