At this stage, if anything’s holding the movement back, it’s awareness.
That’s why the President’s continuing emphasis is so significant. So are the efforts, throughout the administration, to publicize activities in this area, for those with expertise in everything from public safety to health care to education to energy to global affairs and so forth. As Todd Park noted, “If you are in these spaces and do not know this stuff is available, then it’s like being in the navigation business and not knowing that GPS exists. There are big, game-changing data resources that are being made available through government action that every entrepreneur is going to want to know about, and some already do.”
The mission?
That everybody will. And then, as Todd Park puts it, “entrepreneurs can turn open data into awesomeness.”
Chapter 6
Standards and Convening
It all started on a Sunday morning in 1904, when, according to legend, someone dropped a lit cigar or cigarette through a cracked glass block in a sidewalk that served as a skylight for the basement of the John Hurst & Company dry goods building. The resulting blaze triggered an explosion that left much of Baltimore burning.1
The city could not find the means to stop the flames. More than 1,200 firefighters answered the urgent call for assistance, arriving by train from as far as New York, eager and seemingly well equipped. Yet they were largely ineffective, because their hose couplings could not connect to the fire hydrants. In fact, the couplings weren’t even compatible from one building to the next. So, as the firefighters scrambled to use other methods, the fire raged for more than 30 hours, reducing 1,500 buildings on 70 blocks to rubble, killing five people, and leaving thousands unemployed.
It might not have gone on so long, or done so much damage, had compatibility been a priority in the fire equipment industry. Instead, market incentives led manufacturers to design entirely proprietary systems, including different couplings for each vendor. After all, a city that purchased a particular system would be entirely dependent on that system’s manufacturer for any improvements or upgrades. A more interoperable system would give city officials more market choices, all the way down to the spare parts level, meaning that a manufacturer would lose some leverage, and likely some margin.
The question became whether some good could emerge from the ashes. The winds of the progressive era were beginning to blow strong—the Roosevelt administration had brought a major antitrust suit two years before, the Department of Commerce and Labor had been established the previous year, and the landmark Pure Food and Drug Act would pass two years later. There was an understanding that government had a role to play in protecting Americans from the dangers of their rapidly industrializing and modernizing country. So, naturally, after the Baltimore fire, there was a call for stricter building codes and the use of more fireproof materials. But there was also awareness that those improvements might not be sufficient to prevent a similar calamity from occurring elsewhere. The industry needed to come together and drive toward greater standardization, so firefighters wouldn’t again be stymied by intentionally ill-fitting parts.
If you owned a Betamax video recorder in the 1970s, you can probably relate to the challenges of competing formats. Its manufacturer, Sony, tried to dictate a standard for the rest of the industry. Instead, JVC formed a broader coalition to commercialize its own technology, VHS, which was not compatible with Betamax players and vice versa. VHS came to dominate the market and quickly rendered Betamax irrelevant. Nearly two decades later, history could have repeated itself in the rollout of the DVD. Initially, there were two different formats, each backed by a number of prominent companies, with Sony and Philips on one side, and Toshiba and JVC on the other. But a new market force, the computer industry, took a leadership role in applying the lessons from the VHS/Betamax fiasco. After its Technical Working Group (TWG) threatened to boycott all formats other than a single, standardized one, the DVD manufacturers ultimately came together and produced a common standard.2
Consumers benefited, getting higher-quality images on a more durable disk that could hold additional material. So did Hollywood. After resisting DVD production out of fear of copyright infringement, the motion picture industry felt quite differently in 2004, when its studios booked a record $15 billion in DVD sales and rentals, compared to $9.4 billion in revenues at the box office.3 And while the creators of the original DVD standard couldn’t have predicted it at the time, the lightweight nature of the product would later fuel one of the early twenty-first century’s breakout companies: Netflix, which could send feature films around the country for the price of a stamp.
That’s an example of the private sector succeeding in standards development and application without the influence of the government, and you can find plenty of those throughout the past two-plus centuries. But public sector engagement in this area is also as old as the American republic, with government not always content to wait for the private sector to solve a standards problem.
In his first annual message to Congress as President, back in 1790, George Washington spoke of the importance of “uniformity in the current weights and measures of the United States,” and even directed his Secretary of State, Thomas Jefferson, to prepare a plan. From 1830 through 1901, an Office of Standard Weights and Measures, operating under the U.S. Treasury Department, oversaw much of the work, collaborating with manufacturers, scientists, educators, government officials, and the public at large to standardize everything from length and mass to temperature, light, and time. But its mandate, funding, and testing capacity were modest, if not minuscule. Congress largely adhered to the 10th Amendment, leaving decisions about scientific research investments in standardization to the states, which really meant much of that work wouldn’t get done.
By the turn of the twentieth century, the need for standardization was even more acute, partly due to American society’s increasingly mobile and sprawling, yet interconnected, nature. Previously, most commerce had been confined to the local community, because that’s where people stayed: a gallon of milk was a gallon of milk because that’s how the local dairy measured it, and that’s what the consumer, not knowing better, came to accept. But now consumers were expecting conformity wherever they traveled. And increasingly large companies, in an increasingly large country, needed to think beyond their most proximate market and be assured that their products could compete on a fair unit of measure around the country.
The introduction of electrification served as another impetus for the government to seek greater conformity in technology. For as many industries as possible to benefit, the producers and distributors of electricity needed to settle on some standardized way of measuring volts and kilowatt-hours, among other things. And for that technical work, most of which would be deemed precompetitive—more commercially relevant than typical university research but not designed to advantage any single firm—some scientists and engineers argued for a role for government. According to an official historical review provided by the National Institute of Standards and Technology (NIST), “The builders of America’s industrial complex had little interest in standards as such, but the scientists, engineers, and experimenters working for industry or independently found themselves increasingly hampered without them.”4
Further, according to NIST’s historical documents, “The burgeoning electrical industry showed that simple standards for mass, length and time were no longer sufficient.” The nation needed uniform standards for electrical units, as well as units and standards for new discoveries such as x-rays and radioactivity. This required research, mostly in physics and chemistry. And that meant “simple offices of weights and measures had to be replaced with more sophisticated institutions.”
Finally, after nearly two decades of debate over whether the government would be overstepping into the economy by engaging in proactive standards work, the National Bureau of Standards (NBS) was formed in 1901; it would retain that name until it became NIST in 1988.5 Originally directed by physicist Samuel W. Stratton and staffed by only 12 members, including a chemist, an engineer, and five technical assistants, the new agency restricted its work to that which—to paraphrase the NIST historical documents—would cooperate with university research laboratories, support private enterprise, and promote general welfare. Following the Great Baltimore Fire, the shipping industry also raised concerns about fire hoses and couplings. In response, the Commerce Department enlisted the Bureau of Standards to collect over 600 sizes and variations of hose couplings in use across the country. One year later, the National Fire Protection Association, with the support of the NBS, established a national standard diameter and threads per inch for hose couplings and fire hydrants, while endorsing an interchangeable device for nonstandard couplings. Widespread adoption proved a greater struggle; for reasons ranging from expense to inertia, many cities took years or decades to comply. Still, the overall fire hose standardization effort left a significant legacy, as one of the first major examples of the federal government responding to a crisis by galvanizing a private sector industry behind a laudable goal—in this case, public safety—and then convening government officials and scientific experts to find solutions.
But why wait for a crisis? Not long after the establishment of the fire standards, the federal government would use similar means—initiating action without imposing mandates—to achieve economic ends. It would apply its convening authority in the aviation industry, in order to spur R&D and growth.
The government had little to do with Wilbur and Orville Wright getting off the ground at Kitty Hawk, back on December 17, 1903. Rather, their flying machine—carrying a pilot, rising by its own power, descending without damage—was a credit to their imagination, experimentation, and determination. In the decade that followed, however, America failed to fully capitalize on their creativity, undermined by ongoing, acute issues of safety and reliability. In 1908, Orville Wright himself was flying above 2,000 spectators when a propeller and rudder broke, sending his plane nose first into the ground and killing his twenty-six-year-old passenger, Lieutenant Thomas Selfridge. By 1913, America ranked 14th in government spending on aircraft development and, by 1916, had produced only 411 aircraft.
It was around that time, however, that the government identified a way to contribute. In 1915, the Woodrow Wilson administration tucked the creation of a new committee into a Naval Appropriation Bill. The National Advisory Committee for Aeronautics (NACA), while low profile and modestly funded, represented a rather significant shift in the scope of government. Its 12 unpaid members were commissioned to conduct research and development on engines and wings. NACA sought to develop a catalog of wing curvatures (or airfoils), so that the appropriate shape could be safely matched with the corresponding aircraft.
This variety could not come, however, until the committee settled on a standardized way of testing each airfoil design. That breakthrough, coupled with American entry into World War I, supercharged aviation production. In just nine months spanning 1917 and 1918, the government procured more than 12,000 planes for use in that conflict.6
But, in the year that followed, in the absence of government demand, production again nosedived. The new Commerce Secretary, a millionaire mining engineer and investor named Herbert Hoover, was determined not to allow America to cede leadership to Europe in this promising new industry. Hoover was obsessed with efficiency—he endeavored to eliminate, from his position in government, much of the waste in the postwar manufacturing economy. That required him to reconcile his guiding mission with his conservative governing philosophy: one based on individualism, industry autonomy, and an aversion to what he deemed the traditional, intrusive models of government intervention.
Hoover would thread that needle through convening rather than coercion, and his vision of an “associative state.”7 He eschewed the top-down planning approach widely espoused throughout Europe, instead using the state’s power to encourage the formation of voluntary and flexible trade associations that represented dozens of industries and saw value in cooperation, even among fierce competitors. Those associations would remain independent of the government, but would benefit from the government’s “friendly interest,” allowing access to its scientific research experts. They would work together to unlock opportunities and achieve growth, by identifying the technical barriers in a particular industry, designing standardized and simplified parts and procedures to address those issues, and attempting to validate their assertions and methods with the help of a government lab. In this sense, the government wouldn’t be getting in an industry’s way, so much as clearing a path, enabling that industry to grow and thrive, through the recognition and implementation of the best possible practices.
“We are passing from one period of extremely individual action into a period of associated activities,” Hoover told the U.S. Chamber of Commerce in Cleveland. “We are upon the threshold, if these agencies can be directed solely to constructive performance in the public’s interest.”8
This is the approach that Hoover would apply in aviation. Hoover was aware of NACA’s standards for wing designs in advance of the war, and their positive impact on the safety of military airplanes. And he was concerned about the collapse in airplane production following the armistice, from 21,000 per year to a total of 263 in 1922. After casualties associated with poor aircraft design, both military and commercial, America needed to reduce the risk to commercial operators and potential investors, and relieve the worries of would-be passengers, which meant elevating and expanding upon the safety work that NACA had initially done. Hoover called for the organization of a trade association called the Aeronautical Chamber of Commerce, and pushed for the passage of the Air Commerce Act of 1926, to better coordinate the government’s capacity to collaborate with that association, opening research laboratories for the purpose of technical breakthroughs and safety upgrades. The two leading manufacturers of that era, Douglas and Boeing, wholly adopted various NACA standards in their production of the popular DC-3 and 247 aircraft. Those standards, and their subsequent iterations, would play a role in aircraft acquisitions for World War II. Standardization also set the stage for the commercial aviation boom that continues to this day. According to Bureau of Transportation Statistics figures, commercial airlines currently employ more than 500,000 Americans on a full-time or part-time basis, more than 600 million passengers board domestic flights alone in the United States each year, and Boeing projected in 2013 that the world would need 35,280 new planes, valued at $4.8 trillion, by 2032.
During two terms as Commerce Secretary that spanned the Warren Harding and Calvin Coolidge administrations, Hoover doubled the size of the Bureau of Standards and engaged nearly 100 industries in the collective process of creation and deployment. According to his Presidential library, he was intent on ensuring “that industries voluntarily cooperated in improving our national progress and improving Americans’ standards of living. To Hoover, no product or industry was too mundane for review and reform: flashlight cases, lumber, chinaware, mattresses, and bricks all merited primers on elimination of waste.” That’s right, even bricks. Under Hoover, standardization reduced the varieties of paving brick from 66 to five. At the conclusion of Hoover’s tenure—and prior to his election as President—the Commerce Department calculated that its standardization and simplification efforts had generated roughly $600 million in economic impact across America’s $18 billion manufacturing sector.9
Over the next several decades, Commerce Department officials in Republican and Democratic administrations largely adhered to Herbert Hoover’s model in their governmental approaches to affecting economic activity: avoiding the ideological extremes of industrial policy (picking winners and losers) and laissez-faire (letting everyone be). Many also embraced his belief in the power of collective action, especially that of interested parties in the same industry, with the guidance of and access to—though not interference from—the government.
In that sense, the Council on Competitiveness was a philosophical descendant of those trade associations that Hoover had called into action.10 Nonpartisan and nongovernmental, the Council was created in 1986 in response to concerns that America was losing economic leadership around the world, notably to Asia. It consisted of an all-star team of CEOs, university presidents, and labor leaders, who came together to assess America’s place in the global marketplace, identify obstacles and opportunities, and generate public policy solutions.
As technology advanced, the corporations represented in the Council encountered more challenges that required collective consideration and action. By the early twenty-first century, several of America’s manufacturing behemoths had invested millions in high performance computing (HPC), including modeling and simulation activities that were intended to dramatically reduce production times and costs by allowing designs to be optimized prior to the physical testing stages. In 2005, the Council undertook the High Performance Computing initiative. Three years later, a Council report confirmed earlier findings that “virtually all U.S. businesses that have adopted HPC consider this technology indispensable for their competitiveness and corporate survival.”11 The report cited some examples. Boeing used HPC to cut its expensive “live” experimental tests for the 787 Dreamliner to 11, down from 77 for the 777. Walmart used HPC to better manage its worldwide stores from its Arkansas headquarters, in everything from determining shelf space to turning out the lights.
Still, the report noted that, among U.S. industries, only energy firms had truly integrated HPC into critical business functions, while suppliers to all of those top firms had lagged behind, with many not using HPC at all. It called that situation “troublesome” in light of HPC’s potential to reduce costs and speed time to market; the gap was especially concerning because it came at the same time that international firms were “driving HPC through their supply chains more aggressively.” There was an explanation for the holdup: suppliers, mostly small- and medium-size manufacturers, typically could not afford to employ the expensive new technologies. Nor had the large manufacturers standardized a method for sharing computer-generated models across their respective supply chains. These issues had undermined progress, with the large firms often held back by their smaller, but essential, brethren.