
Innovative State


by Aneesh Chopra


  How could the government help democratize access to this high performance modeling and simulation technology so America’s manufacturing sector could run more efficiently and build new products more rapidly? In 2011, the Commerce Department joined the Council on Competitiveness—including General Electric, John Deere, Lockheed Martin, and Procter & Gamble—in the launch of a new public-private project called the National Digital Engineering and Manufacturing Consortium (NDEMC).12 Seeded with $5 million, two-thirds of that from the private sector, NDEMC ran a pilot program in the Midwest, leveraging research universities and aimed at making modeling and simulation software and training available to small- and medium-size manufacturers. The large manufacturers, such as John Deere, invited their supply chains to participate in the program—in John Deere’s case, to source more cost-effective tractor parts. Others offered to help any small- to medium-size manufacturer in the hope of validating the hypothesis that access to such technologies could strengthen American manufacturing.

  Nearly 30 suppliers took advantage of the initiative within the first couple of years. One was Jeco Plastics, an Indiana-based company of 25 employees, which sought to supply plastic shipping pallets to a major German manufacturer, a task that had previously fallen to a Chinese supplier. The order was contingent on making a couple of key cosmetic changes and, while doing so, not diminishing the pallets’ ability to handle the required weight. Facing seemingly insurmountable cost and time constraints to make these irreversible alterations, Jeco CEO Craig Carson turned to NDEMC, and the access it afforded to supercomputers and staff at Purdue University. Testing its models rapidly and at no cost, Jeco adequately upgraded the pallets, increased the purchase order fivefold to $2.5 million, and received enough recurring income as part of a long-term contract that it was able to expand its workforce by 60 percent. Its successful experience with modeling, simulation, and analysis (MS&A) even led to additional contracts, including one with NASA.13

  NDEMC is trying to scale the program by encouraging the development of new, low-cost software products that serve the nation’s small- to medium-size manufacturers. It is also addressing the issue of standards, a necessity considering the diversity of the U.S. supply chain. For context, consider that the Department of Defense alone works with more than 30,000 manufacturing suppliers in the United States, suppliers that represent approximately 10 percent of the total number of the nation’s small- and medium-size manufacturers. Many of these suppliers also provide parts and services to other manufacturers, making it impossible for them to implement a specific method for each one. As part of its broader vision of a “digital industrial commons,” NDEMC is working toward standardizing programming language so each supplier can more easily share advanced models and simulations regardless of the manufacturing counterparty.14

  Further, in May 2013, President Obama announced the launch of a program spearheaded by the Defense Department to build a Digital Manufacturing and Design Innovation (DMDI) Institute—one of three new manufacturing hubs that received $70 million in federal funding, plus an expected financial match from private sources.15 The DMDI seeks to inject the full potential of information technology into a new, “smarter” manufacturing economy, one that allows for the safe, secure sharing of product designs, quality improvement through faster feedback loops from sensors and data analysis, and faster delivery of products. And, in conjunction with the private sector, it will address a growing array of standards activities, related to data interoperability, definitions, mapping, security, and other areas.

  As President Obama said on the day of the announcement: “The economy is dynamic. Technology is constantly changing. That means we’ve got to adapt as well.”16

  In the Obama administration, we envisioned this approach—leading through coordination and collaboration rather than fiat—working in other sectors. Ideally, public officials would convene parties to encourage the development and deployment of standards that can spark innovation in a given sector of the economy; entrepreneurs would put those standards to work in the development of new products and services; and forward-leaning communities would serve as early test beds for those products and services.

  On all of these points, we had willing partners on the legislative side, many of whom saw the value in expanding the reach of an agency that had already experienced considerable growth. Throughout the decades, the Bureau of Standards—and in its latest incarnation as NIST—had been granted greater responsibilities, capabilities, and financial resources. For instance, from 1969 to 1993, there were 79 separate pieces of legislation that directed the agency to conduct laboratory research and support technologies related to everything from energy conservation and recycling to the metric system to computer security.17 And, in 2007, with the passage of the America COMPETES Act, NIST would be on a 10-year trajectory to double its budget; by 2013, it had already crossed $1 billion. That legislation also created a new, more prominent, position for the NIST director—Under Secretary of Commerce for Standards and Technology—while directing NIST to collaborate with the private sector on initiatives as varied as cloud computing standards and high performance building standards.18

  Still, while standards activities grew along with NIST, my colleagues at the White House, including Cass Sunstein, the Director of the Office of Information and Regulatory Affairs (OIRA), and Ambassador Miriam Sapiro, the Deputy United States Trade Representative, sought to revisit the policy President Clinton established in 1998. That policy had directed agencies to use voluntary consensus industry standards rather than create their own. It essentially told us, as well as other government officials, what we could not do. We couldn’t impose our will on others. But we needed to clarify what government could do, and what role it could play in assisting the private sector, to reach its own consensus on standards. And in doing so, we needed to provide some specific guidelines to federal agencies, so they clearly understood the appropriate areas and limits of intervention.

  Over the course of two years, we engaged hundreds of stakeholders from the public and private sectors, and those interactions would inform the memo we created to institutionalize our approach.19 That memo started with a clear edict: all standards activities, in the U.S. policy context, must involve the private sector. Yet it added that involvement of the federal government, either in the form of active engagement or a convening role, was appropriate “where a national priority has been identified in statute, regulation, or Administration policy” and that involvement “could accelerate standards development and implementation to help spur technological advances and broaden technology adoption.

  “In these instances, the Federal Government can help catalyze advances, promote market-based innovation, and encourage more competitive market outcomes,” the memo continued. “The Federal Government should clearly define its role, and then work with private sector standardization organizations in the exercise of that role.”

  We cited, as an example, the role that the administration had begun to play in the energy sector, since Congress authorized its involvement with the 2007 passage of the Energy Independence and Security Act (EISA). That legislation had directed NIST and the Department of Energy to convene the private sector for the development of standards that would underpin the modernization of the electrical grid.

  That work was long overdue. Following Thomas Edison’s invention of the lightbulb, America had embarked on what the National Academy of Engineering regarded as one of our great achievements, the construction of “an advanced and complex electrical system that would become a model for the world,” thanks to public and private investments.20 And yet, at an event in June 2011, Energy Secretary Steven Chu referenced Edison to illustrate the industry’s recent stagnation. What if Edison were transported in a time machine from the 1800s to the present day? He wouldn’t recognize the modern manifestations of his inventions in lighting and sound recording, such as LEDs and iPods. “On the other hand, he would feel really at home with most of today’s power-generating system,” Chu said. “That’s in the last half of the nineteenth century, and here we are at the beginning of the twenty-first century.”

  As Chu argued, we need a modernized electrical grid, a twenty-first-century system for the twenty-first-century economy. We need widespread implementation and ongoing expansion of a “smart grid.” This is a grid that, as defined by the U.S. Department of Energy, uses information and communications technology to improve “the efficiency, reliability, economics, and sustainability of the production and distribution of electricity.” Such a grid uses digital versions of millions of pieces and parts, such as controls, meters, and transmission lines, upgrades that reflect the power of modern computing and wireless broadband. While many of these remain in relatively primitive stages, the expectation is that, after testing and tinkering, this technology will fully enable real-time communication between the utility and the customer, to accelerate the recording of, and responses to, electrical demand. Real-time information about the state of the grid has value in times of normalcy and distress, for both consumers and utilities. For consumers, knowing the cost of supplying energy at a specific time, such as when demand is greatest, allows them to alter their habits, related to when to do the laundry, run the dishwasher, or merely remove a plug from an outlet. For utilities, it helps to know as soon as possible that a few solar-paneled homes in a neighborhood are requesting more energy than usual. That could speak to cloud cover, and might allow those utilities to better prepare for a surge in energy requests from other homes in the area.

  Improving interactions between utilities and customers, in a way that specifically targets the efficiency end of the energy equation, is consistent with President Obama’s oft-stated goal of cutting energy waste in half over 20 years, as a complement to ongoing efforts to increase and diversify energy production.21 Yet there have been holdups, and some can be traced, at least in part, to the way in which the sector is organized. Simply, America has never had one nationalized electric utility system; instead, it has over 3,000 local and regional systems governed by local and state regulators. Each state has adopted a regulatory system with different financial incentives for the utility—from rewarding production at the lowest costs to allowing utilities to recover more money for producing costlier renewable energy. Those incentives impact both the pace of smart grid technology adoption and the effectiveness of those technologies in lowering energy usage and upping reliability, to name just two improvements.

  So far, innovation has been slow. At the time of the aforementioned June 2011 White House event, only about half the states had adopted specific policies related to smart grid technology, and most of those policies were modest. Most utilities make more money if they sell more power, not the inverse. That’s their incentive. So, why should a utility invest in something that reduces power consumption to the detriment of its shareholders? And for the regulators who oversee those utilities, and whose primary concern is to keep costs down for the rate payers to whom they are accountable, why add any expense without clear benefit? The math doesn’t make sense.

  A few leading states have tackled the incentive problem by implementing policies to decouple the production of energy from the sale of energy services that fuel homes. Others have needed a push. That was the idea driving the White House Strategy for a 21st Century Grid, which, released on that June day, explicitly called for aligning the incentives to encourage the deployment of smart grid technologies, in the name of “a clean, smart, national electricity system that will create jobs, reduce energy use, and expand renewable energy production.”22

  To demonstrate how anyone could make a difference, Secretary Chu invited Shreya Indukuri and Daniela Lapidous to share the stage on the day he released the report. The high school seniors had raised money to install a smart energy system at the Harker School in San Jose, California, using off-the-shelf smart submetering devices, dedicated to individual buildings, as well as an intuitive online dashboard that allowed the school superintendent to learn exactly where energy usage was greatest on the school campus. Within a week of installation, several anomalies became apparent, especially the excess usage in the gym. Further investigation revealed that the air-conditioning had been running over the weekend, without anyone knowing or needing it. A flip of the switch saved several thousand dollars and, over one year, Harker saw a 13 percent savings on its energy bill and a 250 percent return on investment. Lapidous proudly touted the low barrier to entry for such a campus-based “smart meter project”: a cost of between $10,000 and $20,000 per school with an 18-month payback period. Then the teenager drew laughs with a blunt conclusion: “Even if you’re not an environmentalist, it’s pretty hard to argue with a 250 percent ROI.”
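The figures Lapidous cited can be sanity-checked with simple payback and ROI arithmetic. The sketch below is illustrative only: the install cost uses the midpoint of her $10,000-to-$20,000 range, and the savings figures are hypothetical, chosen to show what an 18-month payback and a 250 percent ROI would each imply.

```python
def payback_months(install_cost, annual_savings):
    """Simple payback period: months of savings needed to recoup the install cost."""
    return 12 * install_cost / annual_savings

def roi_percent(install_cost, total_savings):
    """Simple ROI: net gain over the investment, expressed as a percentage."""
    return 100 * (total_savings - install_cost) / install_cost

cost = 15_000  # hypothetical: midpoint of the $10,000-$20,000 range cited

# An 18-month payback implies annual savings of two-thirds the install cost:
print(payback_months(cost, 10_000))  # -> 18.0 (months)

# A 250 percent ROI implies cumulative savings of 3.5x the install cost:
print(roi_percent(cost, 52_500))     # -> 250.0
```

Nothing in the chapter states Harker’s actual dollar savings, so the annual and cumulative savings inputs here are placeholders that merely make the two headline numbers concrete.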

  It certainly is. And if two teenagers could accomplish so much, energy savings stories similar to theirs certainly should become commonplace, rather than seem extraordinary. But three things need to happen. First, the incentives for the utility companies need to be aligned with those of their customers, no easy task.23 Second, those utilities need to grant access to the sort of data that even the Harker students didn’t have at their disposal. Third, all 3,000 utilities need to come together to standardize the way that information is shared, so that it could be understood and implemented by third-party developers and ultimately by consumers. If all of that occurred, it might result in a vibrant marketplace, as simple and appealing as the iPhone app store, competing to aid the Harker students, and those like them, in identifying energy waste.

  So NIST and the Department of Energy, represented by George Arnold and Pat Hoffman, respectively, worked together to convene the Smart Grid Interoperability Panel. That public-private partnership was led by the existing private sector standards bodies to design and deploy the necessary standards, and aided by $10 million that President Obama had allocated in the Recovery Act in 2009. The stakeholders—utility sector entities, manufacturers, and technology firms, to name a few—recognized they needed each other. Good faith, plus good leadership, can go a long way. According to Arnold, in light of the structure of the utility sector, the government “is really the only entity that can provide that coordination leadership role.” NIST prioritized the panel’s work by emphasizing more than a dozen areas critical to jump-starting the smart grid industry. Among them: standardizing how utilities communicate with customers on energy usage information.

  By February 2011, the participants endorsed what Arnold characterized as “a very robust toolkit” of standards. Now it was time for the next giant leap: deployment of those standards in a sector that, according to Hoffman, was “ripe for an information revolution” of the sort that manufacturing had already begun to experience. Months later, I addressed a leading forum for utility executives interested in grid modernization, and raised a question: “How can we safely and securely provide customers electronic access to their energy information, thereby supporting the continuing development of innovative new products and services in the energy sector?”

  The answer would come through enlisting a coalition of utility companies, those willing to implement the agreed-upon standards. It was a strategy based on what a number of insurance companies and medical providers were already working toward in the health care space: standardizing and simplifying the method in which Medicare recipients and veterans could download personal health data, through the placement of a Blue Button icon on patient-facing websites. Why not create a Green Button that would have a similar role and payoff in the energy sector—allowing utility customers to download their own usage data and do with it what they wished, including sharing it with a growing array of third-party applications that competed to provide money-saving tips?

  The appeal was well received, especially by those who had been quietly at work on the underlying standards and saw an opening for faster implementation. Still, the movement called for a champion, someone who understood the importance of engaging the customer to spark innovation. That champion would come from California, where policymakers had long been working to enable the utilities and their customers to benefit from more efficient usage. They had begun to do so in the 1980s through a process called “decoupling,” to separate energy sales from profits, and give utilities state-approved incentives to encourage conservation and the use of renewable energy. Then, in 2011, Karen Austin came aboard as the new CIO of Pacific Gas & Electric.24 Austin had devoted her career to recognizing, understanding, and improving customer relationships in the private sector, while establishing herself as an e-commerce retailing pioneer at Kmart and Sears, with customer-friendly programs such as Buy Online, Pick Up in Store.

  I called Austin, assuming she would be receptive to the Green Button proposal. “I thought the idea was fantastic,” Austin recalled. “Of course we should give our customers this information. Let’s do it!”

  Seeing it as a win-win-win, for the customer, PG&E, and the environment, she called other California utilities, including Southern California Edison and San Diego Gas & Electric, and convened a meeting within a couple of weeks: highly unusual alacrity for the typically sluggish utility sector. At the meeting, she—and I, as a government representative—would be sitting at the same table with a small group of public, private, and nonprofit leaders, not on a dais, looking down.
