The Silo Effect


by Gillian Tett


  Bloomberg tried to apply the same silo-busting principles in a much wider sense. He declared that the different departments needed to work together far more closely than before, breaking down the long-established barriers. Indeed, he was so determined to promote change that he appointed a non–New Yorker, Stephen Goldsmith, as deputy mayor for operations. Before coming to New York, Goldsmith had worked as mayor of Indianapolis, where he had earned acclaim by overhauling how its city bureaucracy functioned, breaking down silos to make the system more efficient. Bloomberg was eager to replicate that in New York.

  However, Goldsmith quickly discovered that it was much harder to promote a revolution in New York than in Indianapolis. Shuffling around furniture in the bullpen was one thing. Persuading bureaucrats to change their working habits was quite another. “The unions are really strong in New York and they want to protect everyone,” explained Goldsmith. “It’s a huge place. There are 2,500 different job categories in the New York government—yes, 2,500!—and all these different silos are so entrenched.” But even if the rhetoric of Bloomberg’s plans did not play out as he hoped, everybody in City Hall knew the direction of his ambition. That appealed to Flowers, and in 2010 he agreed to join the New York government, hoping to try some experiments.

  THE FIRE AT 2321 Prospect Avenue in the Bronx presented the first big chance for Flowers to test some ideas. Soon after he arrived, Flowers placed an advertisement on Craigslist, the online classifieds site, seeking young data crunchers. Nobody in City Hall usually recruited staff that way. But Flowers quickly assembled a team of recent college graduates: Ben Dean, Catherine Kwan, Chris Corcoran, and Lauren Talbot.19 “I wanted somebody fresh out of college with skills in mathematical economics, someone who could give me a fresh pair of eyes.” Then he installed the “kids,” as he called them, in a downtown warehouse.

  A few days after the deaths of the Garcia family, Flowers asked the team to comb through the data that New York was collecting about fire risk. He wanted to see if there was anything that might predict when fires would break out. At first glance, there did not appear to be any obvious clues. The Fire Department had extensive information about previous fires and reports about illegally converted buildings that had been logged via the 311 telephone line that was normally used when people wanted to complain to the government. But—oddly enough—although most calls about illegal conversions came from lower Manhattan, that was not the place with most fires, nor where most illegal conversions were found. Those happened in the outer boroughs, such as the Bronx and Queens. That was because many poor immigrants (like the Garcia family) were too scared of the authorities to actually report problems. Those 311 calls were not a good predictor of fires.

  So was there a better way to guess where fires might break out? What would happen, Flowers asked, if you looked at data from other sources—outside the Fire Department? Flowers asked his kids to leave their computers for a few days and go on “ride-alongs” with inspectors from the Sheriff’s Office and the Police, Fire, Housing, and Buildings Departments. What, he asked, were the essential features of fire traps? How could you spot them?

  Many of the inspectors were initially suspicious. The New York Fire Department, for example, has a long, proud history, and the inspectors did not like outsiders meddling in their operations. They tended to be scornful of City Hall, and there was a plethora of rules that stipulated that building inspectors could only look for some types of problems—while fire inspectors hunted for others. But Flowers was determined to break down these boundaries, and his time in Baghdad had left him convinced that if you wanted to understand a problem there was no substitute for getting out and watching real life unfold. Life could not be put into neat, predefined boxes or just observed from an office or computer program; you had to be willing to watch, listen, and rethink your assumptions.

  So he sternly told his kids to be humble—and keep an open mind about what might help predict fires. “We listened to firemen, to policemen, to inspectors from the Buildings Department, from Housing Preservation and Development, the Water Department. I asked them: ‘When you go to a place that’s a dump, what do you see?’ What are the clues? We listened, listened, listened.” Gradually, a pattern emerged. Dangerous buildings, the kids learned, tended to have been built before 1938, when the building codes were tightened in New York. They were usually located in poor neighborhoods, their owners were often delinquent on their mortgages, and the buildings had generated complaints about issues such as vermin before.20

  So Flowers’s kids hunted for data on those issues. It was surprisingly difficult. In theory, New York was a gold mine for data-loving geeks, since the forty-odd agencies that sit directly under City Hall’s control have long collected extensive records of their activity. City Hall officials were so proud of this data stash that when Bloomberg created his bullpen in City Hall, he installed computer screens on the walls, between the historic oil paintings, to display these beloved statistics. But there was one big catch: the data was held in dozens of different databases, since not only were the agencies separated from each other, but there were subdivisions within the agencies. The numbers were as crazily fragmented as the people.

  However, the kids used a database of properties known as “PLUTO” (the Primary Land Use Tax Lot Output)21 to isolate a subset of 640,000 houses in New York that were registered with City Hall to hold one to three families. Due to a peculiar quirk of New York law, the Fire Department was only allowed to inspect about half of these; the rest fell under the control of the Department of Buildings. However, the kids tracked through all the different—separate—records from the Fire and Buildings Departments for data about house fires and illegal conversion complaints. They also scoured the Department of Finance and Department of Investigation—separate bodies that dealt with tax and fraud respectively—for information about previous tax and mortgage defaults, and checked with the Buildings Department for a list of properties built before 1938. Finally they combined those data pools in a single statistical model. Slowly, a pattern emerged. Whenever all four of the risk factors cropped up together at an address, there was a dramatically higher incidence of house fires and illegal conversions, even if nobody had ever complained about problems. Or to put it another way, if you wanted to predict which houses were likely to be fire traps, the best clue came not from 311 calls or specific complaints about fires, but by combining disparate data on mortgage defaults, violation of building codes, data on the age of structures, and myriad indicators of neighborhood poverty.
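  The essence of that joined-up step can be sketched in a few lines of code. This is a deliberately simplified illustration, not the city's actual model: the addresses are invented, and each set stands in for one of the fragmented agency databases described above.

```python
# Each set stands in for one agency's records (addresses are invented):
pre_1938 = {"2321 Prospect Ave", "14 Elm St"}           # Buildings Dept: built before 1938
mortgage_default = {"2321 Prospect Ave", "9 Oak Rd"}    # Dept of Finance: delinquent mortgages
poor_neighborhood = {"2321 Prospect Ave", "14 Elm St"}  # poverty indicators
prior_complaints = {"2321 Prospect Ave"}                # e.g., vermin complaints on file

# The silo-busting step is simply a join: intersect the separate data pools
# to find addresses where all four risk factors co-occur.
high_risk = pre_1938 & mortgage_default & poor_neighborhood & prior_complaints
print(sorted(high_risk))  # → ['2321 Prospect Ave']
```

No single database flags the dangerous address; only the intersection across all four does, which is the point Flowers's team was making.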

  So Flowers went to the Buildings Department inspectors, with the support of Goldsmith, and asked them to inspect the houses that were violating the building code and looked dangerous on the aggregated data. “At first they didn’t like this idea at all—they said we were nuts!” Flowers recalled. But eventually the Buildings Department backed down and used his data. The results were stunning. Traditionally, when the inspectors had looked at buildings, they had only uncovered actual problems in 13 percent of the places. With the new method, problems were uncovered 70 percent of the time.22 At a stroke—and without spending any more money—the fire detection process had become more than five times as effective.

  Was that just a lucky fluke? The team tested the same technique with larger buildings. Initially, the results were poor. So Flowers dispatched some of his young data scientists out on the rounds with the inspectors again, and told them to do more on-the-ground research: was there something that made big buildings different from small buildings? Days passed without any clear clue. But then one of the data crunchers heard a veteran inspector remark by chance, as they drew up outside a large building: “This building is fine—just look at the brickwork!” The data cruncher asked why the brickwork mattered, and the inspector explained that years of inspections had shown that landlords who paid to install new bricks did not tolerate fire hazards. So the kids changed tack—and looked at some data on brick deliveries across New York, another unnoticed data pile in another corner of the New York bureaucracy. When that was plugged into the statistical model, the accuracy of their predictions surged. In isolation, those records about bricks were not revealing; joined up with other data points, they were dynamite.

  Then the kids applied the same silo-busting approach elsewhere. Cigarettes were a case in point. In previous decades, the city had suffered a big problem with tobacco smuggling, since cigarettes cost twice as much in New York as in Virginia (due to tax) and the city only had fifty sheriffs to inspect 14,000 news dealers.23 But by cross-checking business licenses against tax fraud data, Flowers’s team dramatically increased the detection rates. They performed a similar trick with illegal sales of OxyContin, the oft-abused prescription drug. Since the city had thousands of pharmacies, it had traditionally been hard to spot illegal OxyContin sales by random inspections. But after combining fragmented databases, Flowers’s team determined that just 1 percent of all the pharmacies accounted for around 24 percent of Medicaid reimbursements for the most potent types of OxyContin prescriptions.24 Detection rates soared.
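  The pharmacy finding is a concentration measurement: once the fragmented reimbursement records are joined into one table, you rank pharmacies by total reimbursements and ask what share the top 1 percent accounts for. A minimal sketch, with invented numbers chosen only to reproduce the rough shape of the result quoted above:

```python
# Invented data: 100 pharmacies, one extreme outlier. In reality the joined
# Medicaid records would supply these totals.
reimbursements = {f"pharmacy_{i}": 100 for i in range(99)}
reimbursements["pharmacy_99"] = 3200  # the outlier

total = sum(reimbursements.values())
# Top 1 percent of pharmacies by reimbursement total (at least one).
n_top = max(1, len(reimbursements) // 100)
top_slice = sorted(reimbursements.values(), reverse=True)[:n_top]
share = sum(top_slice) / total
print(f"{share:.0%}")  # → 24%
```

With real data the outliers become the natural targets for inspection, which is why detection rates rose without any extra inspectors.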

  The kids even dove into the unpleasant problem of “yellow grease,” the fat used in restaurant deep-fryers. There are an estimated 24,000 restaurants in New York, and many deep-fry their food. “Just think of those fries, spring rolls, whatever!” Flowers liked to say, pointing at his belly. Under New York law, restaurants are supposed to get rid of this grease by taking out contracts with a waste disposal company. However, many have traditionally flouted that law and just tipped the fat down manholes into the New York sewage system instead.

  For years it had been almost impossible to prevent these illegal dumps, since the grease was usually thrown down the manholes late at night. But the skunkworks kids collected reports from the environmental department about yellow grease pollution, and compared them with separate pools of data on business licenses, tax returns, and kitchen fires. They plotted out restaurants that had not applied for waste disposal licenses—and created a list of likely grease dumpers. Then the team approached a separate department of the City Hall bureaucracy that was trying to promote biodiesel recycling, and asked if they might collaborate with the health and safety inspectors, and the fire services, to persuade restaurants to stop dumping yellow grease into manholes, and sell it to recycling groups instead. “When the inspectors go into a restaurant now and find yellow grease dumps, they don’t just go in there and say: ‘Hey, knock it off! Pay us a $25,000 fine!’ ” Flowers later recalled. “Instead they say: ‘Don’t be dumb—get paid for getting rid of this stuff ! Sell it to the biodiesel companies! There is a whole industry out there that actually wants to buy yellow grease!’ ”
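  The grease-dumper list is, at bottom, a set difference between two silos: restaurants that hold a business license but do not appear in the waste-disposal contract records. A toy sketch with invented restaurant names:

```python
# Invented stand-ins for two separate City Hall databases:
licensed_restaurants = {"Fry Shack", "Noodle House", "Corner Diner"}
waste_contracts = {"Noodle House"}  # restaurants with a grease-disposal contract

# Licensed to operate, but with no legal way to dispose of grease:
likely_dumpers = licensed_restaurants - waste_contracts
print(sorted(likely_dumpers))  # → ['Corner Diner', 'Fry Shack']
```

Cross-referencing against pollution reports and kitchen-fire records, as the team did, would then narrow this candidate list further.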

  Indeed, the benefits of silo-busting were so obvious and powerful that Flowers often wondered why nobody had thought of it before. After all, statistical geeks have been using advanced data-sampling techniques for years, and are trained to look for correlations. Why had nobody tried combining the databases before? Flowers knew the answer to his own question even before he asked it: New York’s government was marred by so many silos that people could not see problems and opportunities that sat just before their noses. The story of the skunkworks, in other words, was not really a story about statistics. It was a tale about how we organize our lives, our data, our departments, and our minds. “Everything here is arranged in a fragmented way. It’s tough to join it all up. When you do, it’s obvious that you get much better outcomes,” Flowers observed.

  “But somehow [this joined-up approach] doesn’t happen much. You just gotta ask: why?”

  THE PARADOX OF AN INTERCONNECTED WORLD

  The story of New York’s City Hall is not at all unusual. On the contrary, if you look around the world today, our twenty-first-century society is marked by a striking paradox. In some senses, we live in an age where the globe is more interlinked, as a common system, than ever before. The forces of globalization and technological change mean that news can flash across the planet at lightning speed. Digital supply chains link companies, consumers, and economies across the globe. Ideas—good and bad—spread easily. So do people, pandemics, and panics. When trades turn sour in a tiny corner of the financial markets, the global banking system can go topsy-turvy. We live, in short, in a world plagued by what the economist Ian Goldin has dubbed the “Butterfly Defect”: a system that is so tightly integrated that there is an ever-present threat of contagion.25 “The world has become a hum of interconnected voices and a hive of interlinked lives,” as Christine Lagarde, head of the International Monetary Fund, observes. “[This is a] breakneck pattern of integration and interconnectedness that defines our time.”26

  But while the world is increasingly interlinked as a system, our lives remain fragmented. Many large organizations are divided, and then subdivided into numerous different departments, which often fail to talk to each other—let alone collaborate. People often live in separate mental and social “ghettos,” talking and coexisting only with people like themselves. In many countries, politics is polarized. Professions seem increasingly specialized, partly because technology keeps becoming more complex and sophisticated, and is only understood by a tiny pool of experts.

  There are many ways to describe this sense of fragmentation: people have used words like “ghettos,” “buckets,” “tribes,” “boxes,” “stovepipes.” But the metaphor I find useful is “silo.” The roots of this come from the ancient Greek term siros, which literally means “corn pit.”27 Even today, the word retains that original sense: according to the Oxford English Dictionary,28 a silo is a “tall tower or pit on a farm used to store grain.”29 However, in the middle of the twentieth century, the Western military adopted the word to describe the underground chambers used to store guided missiles. Management consultants then imported the phrase to describe a “system, process, department, etc. that operates in isolation from others,” as the Oxford English Dictionary says. The word “silo” today is thus not just a noun, but can be employed as a verb (to silo) and adjective (silo-ized). And the crucial point to note is that the word “silo” does not just refer to a physical structure or organization (such as a department). It can also be a state of mind. Silos exist in structures. But they exist in our minds and social groups too. Silos breed tribalism. But they can also go hand in hand with tunnel vision.

  This book is not “anti-silo.” It does not argue that silos are always bad, or that we should just issue a moratorium and “abolish all silos!” (Although that might sometimes seem tempting.) On the contrary, a starting point of this book is that the modern world needs silos, at least if you interpret that word to mean specialist departments, teams, and places. The reason is obvious: we live in such a complex world that humans need to create some structure to handle this complexity. Moreover, as the flood of data grows, alongside the scale of our organizations and complexity of technology, the need for organization is growing apace. The simplest way to create a sense of order is to put ideas, people, and data into separate spatial, social, and mental boxes. Specialization and expertise usually deliver progress. After all, as Adam Smith, the eighteenth-century economist, observed, societies and economies flourish when there is a division of labor.30 Without that division, life is far less efficient. If those 150,000 staff working in New York’s government were not organized into expert teams, there would be chaos. A dedicated team of trained firefighters is likely to be better at fighting fires than a random group of amateurs. Silos help us to tidy up the world, classify and arrange our lives, economies, and institutions. They encourage accountability.

  But silos can also sometimes cause damage. People who are organized into specialist teams can end up fighting with each other, wasting resources. Isolated departments, or teams of experts, may fail to communicate, and thus overlook dangerous and costly risks. Fragmentation can create information bottlenecks and stifle innovation. Above all else, silos can create tunnel vision, or mental blindness, which causes people to do stupid things.

  The world around us is littered with examples of this. One of the reasons the Great Financial Crisis of 2008 erupted, for example, was that the financial system was so fragmented that it was almost impossible for anyone to take an interconnected view of how risks were developing in the markets and banking world. Gigantic financial companies were split into so many different departments, or silos, that the leaders who were supposed to be running the groups did not understand what their own traders were doing. But this is not just a problem affecting banks. In 2010, BP revealed that one of its rigs had suffered an explosion in the Gulf of Mexico. As oil spurted out into the sea, causing terrible pollution, recriminations flew around. Then, as investigators dug into the issues, a familiar pattern started to emerge: BP was a company beset by numerous bureaucratic silos, with technocratic geeks scurrying around in specialized fields. Though the oil company had a technical team monitoring safety, that group was not connected with the team that handled the day-to-day operations on the Macondo oil rig. Messages did not get passed across, or not until it was far too late.31

  In the spring of 2014 General Motors admitted that some of its compact cars, such as the Chevrolet Cobalt and Pontiac G5, had been fitted with a faulty ignition switch that could flip from the “run” position to the “accessory” position while driving, cutting the engine power and disabling the airbags. The company admitted that some engineers had been aware of this fault since 2001 and had known that it would have cost a mere 90 cents per car to fix it. However, they had not changed the switch, even as people died in car crashes, because the information about the switches sat in one tiny, bureaucratic silo. Worse still, the engineers who handled the switches had minimal contact with the legal team that was worrying about reputational risk. General Motors, in other words, was a company that was riddled with silos—and where staff had little internal incentive to collaborate in a proactive way. Like the bankers, or the safety managers at BP, the individual teams protected their own interests, even when this threatened to damage the company as a whole.32 “We have to find a way to break these silos down,” Mary Barra, the newly appointed CEO, lamented to staff after the damning report came out.33

 
