Army of None


by Paul Scharre


  141

  Patriot crew was unharmed: “F-16 Fires on Patriot Missile Battery,” Associated Press, March 25, 2003, http://www.foxnews.com/story/2003/03/25/f-16-fires-on-patriot-missile-battery.html.

  143

  Two PAC-3 missiles launched automatically: Pamela Hess, “Feature: The Patriot’s Fratricide Record,” UPI, accessed June 7, 2017, http://www.upi.com/Feature-The-Patriots-fratricide-record/63991051224638/. “The Patriot Flawed?,” CBS News, April 24, 2003, http://www.cbsnews.com/news/the-patriot-flawed-19-02-2004/.

  143

  U.S. Navy F/A-18C Hornet: Ibid.

  143

  both missiles struck his aircraft: Ibid.

  143

  “substantial success”: The Defense Science Board Task Force assessed the Patriot’s performance as a “substantial success.” This seems perhaps overstated. It’s worth asking at what point a system’s fratricide rate negates its operational advantages. Defense Science Board, “Report of the Defense Science Board Task Force on Patriot Performance,” 1.

  144

  “unacceptable” fratricide rate: Hawley, “Looking Back at 20 Years of MANPRINT on Patriot.”

  144

  IFF was well understood: Defense Science Board, “Report of the Defense Science Board Task Force on Patriot Performance.”

  144

  “trusting the system without question”: Hawley, “Looking Back at 20 Years of MANPRINT on Patriot.”

  144

  “unwarranted and uncritical trust”: John K. Hawley, “Not by Widgets Alone: The Human Challenge of Technology-intensive Military Systems,” Armed Forces Journal, February 1, 2011, http://www.armedforcesjournal.com/not-by-widgets-alone/. Patriot operators now train on this and similar scenarios to avoid the problem of unwarranted trust in the automation.

  145

  more than 30,000 people a year: “Accidents or Unintentional Injuries,” Centers for Disease Control and Prevention, http://www.cdc.gov/nchs/fastats/accidental-injury.htm.

  145

  advanced vehicle autopilots: For example, “Intelligent Drive,” Mercedes-Benz, https://www.mbusa.com/mercedes/technology/videos/detail/title-safety/videoId-fc0835ab8d127410VgnVCM100000ccec1e35RCRD.

  146

  “No, Ken said that”: Bin Kenney, “Jeopardy!—The IBM Challenge (Day 1—February 14),” video, https://www.youtube.com/watch?v=i-vMW_Ce51w.

  146

  Watson hadn’t been programmed: Casey Johnston, “Jeopardy: IBM’s Watson Almost Sneaks Wrong Answer by Trebek,” Ars Technica, February 15, 2011, https://arstechnica.com/media/news/2011/02/ibms-watson-tied-for-1st-in-jeopardy-almost-sneaks-wrong-answer-by-trebek.ars.

  146

  “We just didn’t think it would ever happen”: Ibid.

  147

  2016 fatality involving a Tesla Model S: Neither the autopilot nor the driver applied the brakes when a tractor-trailer turned in front of the vehicle. Anjali Singhvi and Karl Russell, “Inside the Self-Driving Tesla Fatal Accident,” New York Times, July 1, 2016, https://www.nytimes.com/interactive/2016/07/01/business/inside-tesla-accident.html. “A Tragic Loss,” Tesla, June 30, 2016, https://www.tesla.com/blog/tragic-loss.

  148

  The Sorcerer’s Apprentice: Sorcerer’s Apprentice—Fantasia, accessed June 7, 2017, http://video.disney.com/watch/sorcerer-s-apprentice-fantasia-4ea9ebc01a74ea59a5867853.

  148

  German poem written in 1797: Johann Wolfgang von Goethe, “The Sorcerer’s Apprentice,” accessed June 7, 2017, http://germanstories.vcu.edu/goethe/zauber_e4.html.

  149

  “When you delegate authority to a machine”: Bob Work, interview, June 22, 2016.

  150

  “Traditional methods . . . fail to address”: U.S. Air Force Office of the Chief Scientist, Autonomous Horizons: System Autonomy in the Air Force—A Path to the Future (June 2015), 23, http://www.af.mil/Portals/1/documents/SECAF/AutonomousHorizons.pdf?timestamp=1435068339702.

  150

  “We had seen it once before”: Interestingly, this random move may have played a key role in shaking Kasparov’s confidence. Unlike AlphaGo’s 1-in-10,000 surprise move that later turned out to be a stroke of brilliance, Kasparov could see right away that Deep Blue’s 44th move was tactically nonsensical. Deep Blue resigned the game one move later. Later that evening, while poring over a re-creation of the final moves, Kasparov discovered that in 20 moves he would have checkmated Deep Blue. The implication was that Deep Blue made a nonsense move and resigned because it could see 20 moves ahead, a staggering advantage in chess. Nate Silver reports that this bug may have irreparably shaken Kasparov’s confidence. Nate Silver, The Signal and the Noise: Why So Many Predictions Fail (New York: Penguin, 2015), 276–289.

  150

  recent UNIDIR report on autonomous weapons and risk: UN Institute for Disarmament Research, “Safety, Unintentional Risk and Accidents in the Weaponization of Increasingly Autonomous Technologies,” 2016, http://www.unidir.org/files/publications/pdfs/safety-unintentional-risk-and-accidents-en-668.pdf. (I was a participant in a UNIDIR-hosted workshop that helped inform this project and I spoke at a UNIDIR-hosted panel in 2016.)

  151

  “With very complex technological systems”: John Borrie, interview, April 12, 2016.

  151

  “Why would autonomous systems be any different?”: Ibid.

  151

  Three Mile Island incident: This description is taken from Charles Perrow, Normal Accidents: Living with High-Risk Technologies (Princeton, NJ: Princeton University Press, 1999), 15–31; and United States Nuclear Regulatory Commission, “Backgrounder on the Three Mile Island Accident,” https://www.nrc.gov/reading-rm/doc-collections/fact-sheets/3mile-isle.html.

  153

  Apollo 13: For a very brief summary of the incident, see National Aeronautics and Space Administration, “Apollo 13,” https://www.nasa.gov/mission_pages/apollo/missions/apollo13.html. NASA’s full report on the Apollo 13 disaster can be found at National Aeronautics and Space Administration, “Report of the Apollo 13 Review Board,” June 15, 1970, http://nssdc.gsfc.nasa.gov/planetary/lunar/apollo_13_review_board.txt. See also Perrow, Normal Accidents, 271–281.

  154

  “failures . . . we hadn’t anticipated”: John Borrie, interview, April 12, 2016.

  154

  Challenger (1986) and Columbia (2003): On Challenger, see National Aeronautics and Space Administration, “Report of the Presidential Commission on the Space Shuttle Challenger Accident,” June 6, 1986, http://history.nasa.gov/rogersrep/51lcover.htm. On the Columbia accident, see National Aeronautics and Space Administration, “Columbia Accident Investigation Board, Volume 1,” August 2003, http://spaceflight.nasa.gov/shuttle/archives/sts-107/investigation/CAIB_medres_full.pdf.

  154

  “never been encountered before”: Matt Burgess, “Elon Musk Confirms SpaceX’s Falcon 9 Explosion Was Caused by ‘Frozen Oxygen,’ ” WIRED, November 8, 2016, http://www.wired.co.uk/article/elon-musk-universal-basic-income-falcon-9-explosion. “Musk: SpaceX Explosion Toughest Puzzle We’ve Ever Had to Solve,” CNBC, video accessed June 7, 2017, http://video.cnbc.com/gallery/?video=3000565513.

  154

  Fukushima Daiichi: Phillip Y. Lipscy, Kenji E. Kushida, and Trevor Incerti, “The Fukushima Disaster and Japan’s Nuclear Plant Vulnerability in Comparative Perspective,” Environmental Science and Technology 47 (2013), http://web.stanford.edu/~plipscy/LipscyKushidaIncertiEST2013.pdf.

  156

  “A significant message for the”: William Kennedy, interview, December 8, 2015.

  156

  “almost never occur individually”: Ibid.

  156

  “The automated systems”: Ibid.

  156

  “Both sides have strengths and weaknesses”: Ibid.

  156

  F-16 fighter aircraft: Guy Norris, “Ground Collision Avoidance System ‘Saves’ First F-16 In Syria,” Aviation Week, February 5, 2015, http://aviationweek.com/defense/ground-collision-avoidance-system-saves-first-f-16-syria.

  156

  software-based limits on its flight controls: Dan Canin, “Semper Lightning: F-35 Flight Control System,” Code One, December 9, 2015, http://www.codeonemagazine.com/f35_article.html?item_id=187.

  157

  software with millions of lines of code: Robert N. Charette, “This Car Runs on Code,” IEEE Spectrum: Technology, Engineering, and Science News, February 1, 2009, http://spectrum.ieee.org/transportation/systems/this-car-runs-on-code. Joey Cheng, “Army Lab to Provide Software Analysis for Joint Strike Fighter,” Defense Systems, August 12, 2014, https://defensesystems.com/articles/2014/08/14/army-f-35-joint-strike-fighter-software-tests.aspx. Robert N. Charette, “F-35 Program Continues to Struggle with Software,” IEEE Spectrum: Technology, Engineering, and Science News, September 19, 2012, http://spectrum.ieee.org/riskfactor/aerospace/military/f35-program-continues-to-struggle-with-software.

  157

  0.1 to 0.5 errors per 1,000 lines of code: Steve McConnell, Code Complete: A Practical Handbook of Software Construction (Redmond, WA: Microsoft Press, 2004), http://www.amazon.com/Code-Complete-Practical-Handbook-Construction/dp/0735619670.

  157

  some errors are inevitable: The space shuttle is an interesting exception that proves the rule. NASA has been able to drive the number of errors in the space shuttle’s code down to zero through a labor-intensive process employing teams of engineers. However, the space shuttle has only approximately 500,000 lines of code, and this process would be entirely infeasible for more complex systems with millions of lines of code. The F-35 Joint Strike Fighter, for example, has over 20 million lines of code. Charles Fishman, “They Write the Right Stuff,” FastCompany.com, December 31, 1996, http://www.fastcompany.com/28121/they-write-right-stuff.
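
  The scale these figures imply is easy to check. Below is a minimal back-of-the-envelope sketch in Python, using the defect densities and line counts cited in these notes; it is illustrative arithmetic only, not a claim about the actual defect counts of either program:

```python
# Expected latent errors at industry-typical defect densities
# (0.1-0.5 errors per 1,000 lines of code, per McConnell, cited above).
# The shuttle row shows what industry rates would predict; NASA's
# labor-intensive process drove the actual count far lower.

def expected_errors(lines_of_code: int, errors_per_kloc: float) -> float:
    """Expected residual errors at a given defect density."""
    return lines_of_code / 1_000 * errors_per_kloc

for system, loc in [("Space shuttle", 500_000), ("F-35", 20_000_000)]:
    low, high = expected_errors(loc, 0.1), expected_errors(loc, 0.5)
    print(f"{system} ({loc:,} lines): {low:,.0f} to {high:,.0f} errors")

# Space shuttle (500,000 lines): 50 to 250 errors
# F-35 (20,000,000 lines): 2,000 to 10,000 errors
```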

  157

  F-22 fighter jets: Remarks by retired Air Force Major General Don Shepperd on “This Week at War,” CNN, February 24, 2007, http://transcripts.cnn.com/TRANSCRIPTS/0702/24/tww.01.html.

  157

  hack certain automobiles: Andy Greenberg, “Hackers Remotely Kill a Jeep on the Highway – With Me in It,” Wired, July 21, 2015, http://www.wired.com/2015/07/hackers-remotely-kill-jeep-highway/.

  158

  A study of Nest users: Rayoung Yang and Mark W. Newman, “Learning from a Learning Thermostat: Lessons for Intelligent Systems for the Home,” UbiComp’13, September 8–12, 2013.

  158

  “As systems get increasingly complex”: John Borrie, interview, April 12, 2016.

  159

  Air France Flight 447: “Final Report: On the accident on 1st June 2009 to the Airbus A330-203 registered F-GZCP operated by Air France flight 447 Rio de Janeiro—Paris,” Bureau d’Enquêtes et d’Analyses pour la sécurité de l’aviation civile [English translation], 2012, http://www.bea.aero/docspa/2009/f-cp090601.en/pdf/f-cp090601.en.pdf. William Langewiesche, “The Human Factor,” Vanity Fair, October 2014, http://www.vanityfair.com/news/business/2014/10/air-france-flight-447-crash. Nick Ross and Neil Tweedie, “Air France Flight 447: ‘Damn it, We’re Going to Crash,’” The Telegraph, April 28, 2012, http://www.telegraph.co.uk/technology/9231855/Air-France-Flight-447-Damn-it-were-going-to-crash.html.

  159

  Normal accident theory sheds light: In fact, Army researchers specifically cited the Three Mile Island incident as having much in common with the Patriot fratricides. Hawley, “Looking Back at 20 Years of MANPRINT on Patriot.”

  160

  “even very-low-probability failures”: Defense Science Board, “Report of the Defense Science Board Task Force on Patriot Performance.”

  10 Command and Decision: Can Autonomous Weapons Be Used Safely?

  161

  aircraft carrier flight decks: Gene I. Rochlin, Todd R. La Porte, and Karlene H. Roberts, “The Self-Designing High-Reliability Organization: Aircraft Carrier Flight Operations at Sea,” Naval War College Review, Autumn 1987, https://fas.org/man/dod-101/sys/ship/docs/art7su98.htm. Gene I. Rochlin, Todd R. La Porte, and Karlene H. Roberts, “Aircraft Carrier Operations At Sea: The Challenges of High Reliability Performance,” University of California, Berkeley, July 15, 1988, http://www.dtic.mil/dtic/tr/fulltext/u2/a198692.pdf.

  161

  High-reliability organizations: Karl E. Weick and Kathleen M. Sutcliffe, Managing the Unexpected: Sustained Performance in a Complex World, 3rd ed. (San Francisco: Jossey-Bass, 2015).

  161

  militaries as a whole would not be considered: Scott A. Snook, Friendly Fire: The Accidental Shootdown of U.S. Black Hawks over Northern Iraq (Princeton, NJ: Princeton University Press, 2002).

  162

  “The SUBSAFE Program”: Paul E. Sullivan, “Statement before the House Science Committee on the SUBSAFE Program,” October 29, 2003, http://www.navy.mil/navydata/testimony/safety/sullivan031029.txt.

  162

  seventy submarines in its force: Department of Defense, “Quadrennial Defense Review 2014,” http://archive.defense.gov/pubs/2014_Quadrennial_Defense_Review.pdf.

  164

  “You can mix and match”: Peter Galluch, interview, July 15, 2016.

  164

  “kill or be killed”: Ibid.

  165

  “there is no voltage that can be applied”: Ibid.

  165

  “Absolutely, it’s automated”: Ibid.

  166

  “You’re never driving around”: Ibid.

  167

  “there is a conscious decision to fire”: Ibid.

  167

  “ROLL GREEN”: The command, as reported on the Navy’s website, is “roll FIS green.” U.S. Navy, “Naval Terminology,” http://www.public.navy.mil/surfor/Pages/Navy-Terminology.aspx.

  169

  “terrible, painful lesson”: Peter Galluch, interview, July 15, 2016.

  169

  “tanker war”: Ronald O’Rourke, “The Tanker War,” Proceedings, May 1988, https://www.usni.org/magazines/proceedings/1988-05/tanker-war.

  170

  Iran Air 655: This account comes from an ABC Four Corners special on the Vincennes incident, which relies on first-hand interviews and video and audio recordings from the Vincennes during the incident. ABC Four Corners, “Shooting Down of Iran Air 655,” 2000, https://www.youtube.com/watch?v=Onk_wI3ZVME.

  171

  “unwarranted and uncritical trust”: Hawley, “Not by Widgets Alone.”

  171

  “spent a lot of money looking into”: John Hawley, interview, December 5, 2016.

  171

  “If you make the [training]”: Ibid.

  171

  “sham environment . . . the Army deceives”: Ibid.

  172

  “consistent objective feedback”: Ibid.

  172

  “Even when the Army guys”: Ibid.

  172

  “Navy brass in the Aegis”: Ibid.

  172

  “too sloppy an organization”: Ibid.

  172

  “Judging from history”: Ibid.

  173

  training tape left in a computer: Patricia Lewis, Heather Williams, Benoît Pelopidas, and Sahil Aghlani, “Too Close for Comfort: Cases of Near Nuclear Use and Options for Policy,” The Royal Institute of International Affairs, London, April 2014, https://www.chathamhouse.org/sites/files/chathamhouse/field/field_document/20140428TooCloseforComfortNuclearUseLewisWilliamsPelopidasAghlani.pdf, 12–13.

  173

  faulty computer chip: Ibid., 13. William Burr, “The 3 A.M. Phone Call,” The National Security Archive, March 1, 2012, http://nsarchive.gwu.edu/nukevault/ebb371/.

  173

  brought President Boris Yeltsin the nuclear briefcase: Lewis et al., “Too Close for Comfort,” 16–17.

  174

  “erosion” of adherence: Defense Science Board Permanent Task Force on Nuclear Weapons Surety, “Report on the Unauthorized Movement of Nuclear Weapons,” February 2008, http://web.archive.org/web/20110509185852/http://www.nti.org/e_research/source_docs/us/department_defense/reports/11.pdf. Richard Newton, “Press Briefing with Maj. Gen. Newton from the Pentagon, Arlington, Va.,” October 19, 2007, http://web.archive.org/web/20071023092652/http://www.defenselink.mil/transcripts/transcript.aspx?transcriptid=4067.

  174

  thirteen near-use nuclear incidents: Lewis et al., “Too Close for Comfort.”

  174

  “When I began this book”: Scott D. Sagan, The Limits of Safety: Organizations, Accidents, and Nuclear Weapons (Princeton, NJ: Princeton University Press, 1993), 251.

  174

  “the historical evidence . . . nuclear weapon systems”: Ibid., 252.

  175

  “the inherent limits of organizational safety”: Ibid., 279.

  175

  “always/never dilemma”: Ibid., 278.

  175

  this is effectively “impossible”: Ibid., 278.

  175

  Autonomous weapons have an analogous problem to the always/never dilemma: Special thanks to Heather Roff at Arizona State University for pointing out this parallel.

  177

  “You can go through all of the kinds of training”: John Hawley, interview, December 5, 2016.

  178

  “planned actions”: William Kennedy, interview, December 8, 2015.

  11 Black Box: The Weird, Alien World of Deep Neural Networks

  180

  object recognition, performing as well or better than humans: Kaiming He et al., “Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification,” February 2015, https://arxiv.org/abs/1502.01852.

  180

  Adversarial images: Christian Szegedy et al., “Intriguing properties of neural networks,” February 2014, https://arxiv.org/pdf/1312.6199v4.pdf.

  180

  usually created by researchers intentionally: In at least one case, the researchers were intentionally evolving the images, but they were not attempting to fool the deep neural network by making nonsensical images. The researchers explained, “we were trying to produce recognizable images, but these unrecognizable images emerged.” “Deep neural networks are easily fooled: High confidence predictions for unrecognizable images,” Evolving Artificial Intelligence Laboratory, University of Wyoming, http://www.evolvingai.org/fooling. For more, see Anh Nguyen, Jason Yosinski, and Jeff Clune, “Deep Neural Networks Are Easily Fooled: High Confidence Predictions for Unrecognizable Images,” Computer Vision and Pattern Recognition (CVPR ’15), IEEE, 2015.
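
  To make the failure mode concrete, here is a minimal, hypothetical sketch of the underlying idea (not Nguyen et al.’s actual evolutionary method, which evolved images against a trained deep network): simple gradient ascent on a toy linear-softmax “classifier” turns pure noise into an input the model labels with near-total confidence.

```python
# Toy "fooling image" sketch: gradient ascent drives a nonsense input
# toward high classifier confidence. Illustrative only: a random
# linear-softmax model stands in for a trained deep network, and the
# sizes, step count, and step size are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_classes = 64, 10
W = rng.normal(size=(n_classes, n_pixels))   # stand-in "trained" weights

def softmax(z):
    e = np.exp(z - z.max())                  # numerically stable softmax
    return e / e.sum()

x = rng.uniform(0, 1, n_pixels)              # start from random noise
target = 3                                   # class we want the model to "see"
for _ in range(200):
    p = softmax(W @ x)
    grad = W.T @ (np.eye(n_classes)[target] - p)  # d log p[target] / dx
    x = np.clip(x + 0.1 * grad, 0, 1)        # keep "pixels" in a valid range

print(f"model confidence in class {target}: {softmax(W @ x)[target]:.3f}")
# Prints a confidence near 1.0 for an input that is still pure noise.
```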

 
