Army of None


by Paul Scharre


  FLA       Fast Lightweight Autonomy
  GGE       Group of Governmental Experts
  GPS       Global Positioning System
  ICRAC     International Committee for Robot Arms Control
  ICRC      International Committee of the Red Cross
  IEEE      Institute of Electrical and Electronics Engineers
  IFF       identification friend or foe
  IHL       international humanitarian law
  IMU       inertial measurement unit
  INF       Intermediate-Range Nuclear Forces
  IoT       Internet of Things
  J-UCAS    Joint Unmanned Combat Air Systems
  LIDAR     light detection and ranging
  LOCAAS    Low Cost Autonomous Attack System
  LRASM     Long-Range Anti-Ship Missile
  MAD       mutual assured destruction
  MARS      Mobile Autonomous Robotic System
  MMW       millimeter-wave
  NASA      National Aeronautics and Space Administration
  NGO       nongovernmental organization
  NORAD     North American Aerospace Defense Command
  ONR       Office of Naval Research
  OODA      observe, orient, decide, act
  OPM       Office of Personnel Management
  PGM       precision-guided munition
  PLC       programmable logic controller
  RAS       IEEE Robotics and Automation Society
  R&D       research and development
  ROE       rules of engagement
  SAG       surface action group
  SAR       synthetic aperture radar
  SAW       Squad Automatic Weapon
  SEC       Securities and Exchange Commission
  SFW       Sensor Fuzed Weapon
  SORT      Strategic Offensive Reductions Treaty
  START     Strategic Arms Reduction Treaty
  SUBSAFE   Submarine Safety
  TASM      Tomahawk Anti-Ship Missile
  TBM       tactical ballistic missile
  TJ        Thomas Jefferson High School
  TLAM      Tomahawk Land Attack Missile
  TRACE     Target Recognition and Adaption in Contested Environments
  TTO       Tactical Technology Office
  TTP       tactics, techniques, and procedures
  UAV       uninhabited aerial vehicle
  UCAV      uninhabited combat aerial vehicle
  UK        United Kingdom
  UN        United Nations
  UNIDIR    UN Institute for Disarmament Research
  U.S.      United States
  WMD       weapons of mass destruction

  Illustration Credits

  (All photographs courtesy of Paul Scharre unless otherwise indicated.)

  Text images:

  here: Center for a New American Security
  here: U.S. Navy
  here: © Lockheed Martin
  here: © Lockheed Martin
  here: Center for a New American Security
  here: Center for a New American Security
  here: Anh Nguyen, Jason Yosinski, Jeff Clune
  here: Christian Szegedy, Wojciech Zaremba, Ilya Sutskever, Joan Bruna, Dumitru Erhan, Ian Goodfellow, Rob Fergus
  here: Anh Nguyen, Jason Yosinski, Jeff Clune
  here: U.S. Air Force

  Insert images:

  1. U.S. Marine Corps Historical Division Archives
  2. John Warwick Brooke / Imperial War Museum collection
  3. Mass Communication Specialist 3rd Class Eric Coffer / U.S. Navy
  4. U.S. Navy
  5. Mass Communication Specialist Seaman Anthony N. Hilkowski / U.S. Navy
  6. Israel Aerospace Industries
  7. U.S. Navy
  8. Glenn Fawcett / Department of Defense
  9. Senior Airman Christian Clausen / U.S. Air Force
  10. NASA
  11. Alan Radecki / U.S. Navy courtesy of Northrop Grumman
  12. Elizabeth A. Wolter / U.S. Navy
  13. Israel Aerospace Industries
  14. John F. Williams / U.S. Navy
  15. John F. Williams / U.S. Navy
  16. John F. Williams / U.S. Navy
  17. DARPA
  18. DARPA
  19. DARPA
  20. Specialist Lauren Harrah / U.S. Army
  21. DARPA
  22. Daryl Roy / Aegis Training and Readiness Center
  23. Mass Communication Specialist 2nd Class Michael Ehrlich / U.S. Navy
  24. Mass Communication Specialist 2nd Class Michael Ehrlich / U.S. Navy
  25. John F. Williams / U.S. Navy
  26. Ben Santos / U.S. Forces Afghanistan
  27. Campaign to Stop Killer Robots
  28. DJI Technology, Inc.
  29. Courtesy Paul Scharre
  30. Staff Sergeant Sean Harp / Department of Defense
  31. Mass Communication Specialist 3rd Class Sean Weir / U.S. Navy

  Index

  Page numbers listed correspond to the print edition of this book. You can use your device’s search function to locate particular terms in the text.

  Page numbers followed by f indicate figures and illustrations; page numbers followed by m indicate maps; page numbers followed by t indicate tables.

  A800 Mobile Autonomous Robotic System (MARS), 114

  AAAI (Association for the Advancement of Artificial Intelligence), 243

  AACUS (Autonomous Aerial Cargo/Utility System) helicopter, 17

  ABM (Anti-Ballistic Missile) Treaty (1972), 301

  accidents, see failures

  accountability gap, 258–63

  acoustic homing seeker, 39

  acoustic shot detection system, 113–14

  active seekers, 41

  active sensors, 85

  adaptive malware, 226

  advanced artificial intelligence; see also artificial general intelligence

  aligning human goals with, 238–41

  arguments against regarding as threat, 241–44

  building safety into, 238–41

  dangers of, 232–33

  drives for resource acquisition, 237–38

  future of, 247–48

  in literature and film, 233–36

  militarized, 244–45

  psychological dimensions, 233–36

  vulnerability to hacking, 246–47

  “advanced chess,” 321–22

  Advanced Medium-Range Air-to-Air Missile (AMRAAM), 41, 43

  Advanced Research Projects Agency (ARPA), 76–77

  adversarial actors, 177

  adversarial (fooling) images, 180–87, 181f, 183f, 185f, 253, 384n

  Aegis combat system, 162–67

  achieving high reliability, 170–72

  automation philosophy, 165–67

  communications issues, 304

  and fully autonomous systems, 194

  human supervision, 193, 325–26

  Patriot system vs., 165–66, 171–72

  simulated threat test, 167–69

  testing and training, 176, 177

  and USS Vincennes incident, 169–70

  Aegis Training and Readiness Center, 163

  aerial bombing raids, 275–76, 278, 341–42

  Afghanistan War (2001– ), 2–4

  distinguishing soldiers from civilians, 253

  drones in, 14, 25, 209

  electromagnetic environment, 15

  goatherder incident, 290–92

  moral decisions in, 271

  runaway gun incident, 191

  AGI, see advanced artificial intelligence; artificial general intelligence

  AGM-88 high-speed antiradiation missile, 141

  AI (artificial intelligence), 5–6, 86–87; see also advanced artificial intelligence; artificial general intelligence

  AI FOOM, 233

  AIM-120 Advanced Medium-Range Air-to-Air Missile, 41

  Air Force, U.S.

  cultural resistance to robotic weapons, 61

  future of robotic aircraft, 23–25

  Global Hawk drone, 17

  nuclear weapons security lapse, 174

  remotely piloted aircraft, 16

  X-47 drone, 60–61

  Air France Flight 447 crash, 158–59

  Alexander, Keith, 216, 217

  algorithms

  life-and-death decisions by, 287–90

  for stock trading, see automated stock trading

  Ali Al Salem Air Base (Kuwait), 138–39

  Alphabet, 125

  AlphaGo, 81–82, 125–27, 150, 242

  AlphaGo Zero, 127

  AlphaZero, 410

  al-Qaeda, 22, 253

  “always/never” dilemma, 175

  Amazon, 205

  AMRAAM (Advanced Medium-Range Air-to-Air Missile), 41, 43

  Anderson, Kenneth, 255, 269–70, 286, 295

  anthropocentric bias, 236, 237, 241, 278

  anthropomorphizing of machines, 278

  Anti-Ballistic Missile (ABM) Treaty (1972), 301

  antipersonnel autonomous weapons, 71, 355–56, 403n

  antipersonnel mines, 268, 342; see also land mines

  anti-radiation missiles, 139, 141, 144

  anti-ship missiles, 62, 302

  Anti-Submarine Warfare Continuous Trail Unmanned Vessel (ACTUV), 78–79

  anti-vehicle mines, 342

  Apollo 13 disaster, 153–54

  appropriate human involvement, 347–48, 358

  appropriate human judgment, 91, 347, 358

  approval of autonomous weapons, see authorization of autonomous weapons

  Argo amphibious ground combat robot, 114

  Arkhipov, Vasili, 311, 318

  Arkin, Ron, 280–85, 295, 346

  armed drones, see drones

  Arms and Influence (Schelling), 305, 341

  arms control, 331–45

  antipersonnel weapons, 355–56

  ban of fully autonomous weapons, 352–55

  debates over restriction/banning of autonomous weapons, 266–69

  general principles on human judgment’s role in war, 357–59

  inherent problems with, 284, 346–53

  legal status of treaties, 340

  limited vs. complete bans, 342–43

  motivations for, 345

  preemptive bans, 343–44

  “rules of the road” for autonomous weapons, 356–57

  successful/unsuccessful treaties, 332–44, 333t–339t

  types of weapons bans, 332f

  unnecessary suffering standards, 257–58

  verification regimes, 344–45

  arms race, 7–8, 117–19

  Armstrong, Stuart, 238, 240–42

  Army, U.S.

  cultural resistance to robotic weapons, 61

  Gray Eagle drone, 17

  overcoming resistance to killing, 279

  Patriot Vigilance Project, 171–72

  Shadow drone, 209

  ARPA (Advanced Research Projects Agency), 76–77

  Article 36, 118

  artificial general intelligence (AGI); see also advanced artificial intelligence

  and context, 238–39

  defined, 231

  destructive potential, 232–33, 244–45

  ethical issues, 98–99

  in literature and film, 233–36

  narrow AI vs., 98–99, 231

  timetable for creation of, 232, 247

  as unattainable, 242

  artificial intelligence (AI), 5–6, 86–87; see also advanced artificial intelligence; artificial general intelligence

  “Artificial Intelligence, War, and Crisis Stability” (Horowitz), 302, 312

  Artificial Intelligence for Humans, Volume 3 (Heaton), 132

  artificial superintelligence (ASI), 233

  Art of War, The (Sun Tzu), 229

  Asaro, Peter, 265, 285, 287–90

  Asimov, Isaac, 26–27, 134

  Assad, Bashar al-, 7, 331

  Association for the Advancement of Artificial Intelligence (AAAI), 243

  Atari, 124, 127, 247–48

  Atlas ICBM, 307

  atomic bombs, see nuclear weapons

  ATR (automatic target recognition), 76, 84–88

  attack

  decision to, 269–70

  defined, 269–70

  human judgment and, 358

  atypical events, 146, 176–78

  Australia, 342–43

  authorization of autonomous weapons, 89–101

  DoD policy, 89–90

  ethical questions, 90–93

  and future of lethal autonomy, 96–99

  information technology and revolution in warfare, 93–96

  past as guide to future, 99–101

  Auto-GCAS (automatic ground collision avoidance system), 28

  automated machines, 31f, 32–33

  automated (algorithmic) stock trading, 200–201, 203–4, 206, 210, 244, 387n

  automated systems, 31

  automated weapons

  first “smart” weapons, 38–40

  precision-guided munitions, 39–41

  automatic machines, 31f

  automatic systems, 30–31, 110

  automatic target recognition (ATR), 76, 84–88

  automatic weapons, 37–38

  Gatling gun as predecessor to, 35–36

  machine guns, 37–38

  runaway gun, 190–91

  automation (generally)

  Aegis vs. Patriot, 165–66

  and complex systems, 156–59

  and “moral buffer,” 277

  role in accidents, 155–56

  automation bias, 144–45, 170, 278–79, 324–25

  automobiles

  autonomous features, 28

  self-driving, 28, 31–32, 147, 217, 277

  Autonomous Aerial Cargo/Utility System (AACUS) helicopter, 17

  autonomous cyberweapons, 222–30

  autonomous machines, 31f, 32–33

  autonomous navigation, autonomous targeting vs., 123–24

  autonomous swarms, 11–13

  autonomous targeting, 116, 123–24, 187

  autonomous weapons, 130–33

  accountability gap, 258–63

  antipersonnel weapons, 71, 355–56, 403n

  arms race in, 117–19

  authorization of, 89–101

  automatic weapons as predecessor to, 37–38

  bans of antipersonnel weapons, 355–56

  basics, 35–56

  Brimstone missile, 105–8

  and communications disruption, 303–4, 328

  complete ban of, 352–55

  consequences of, 272

  cyber warfare, 211–30

  danger of delegating authority to, 192–95, 192f

  destructive potential of out-of-control algorithms, 207–10

  early history, 35–40

  ethical issues, 6–8, 271–96

  experimental programs, 59–77

  FLA as step towards, 70

  flash wars, 229–30

  as fundamentally inhuman, 285–87

  future of, 54–56, 96–99

  ground combat robots, 111–17

  hacking of, 246–47

  human dignity and, 287–90

  human role in deployment, 52–53; see also human judgment

  inevitability of accidents, 175–79

  laws of war and, 251–70

  legal status, 258–59

  limitations, 53–54

  limited autonomy of homing munitions, 42

  LRASM designation, 63–65, 68

  mines, 50–51

  outside of U.S., 102–19

  PGMs, 40–41

  potential for behaving more ethically than humans, 279–84

  potential to be more humane than conventional weapons, 6

  potential to inflame crises, 317–18

  problems inherent in banning of, 346–53

  regulation, see arms control

  risk of failure, 189–95

  “rules of the road” treaties, 356–57

  Samsung SGR-A1 robot, 104–5

  secret development of, 8

  stability and, 297–318

  swarms, 11–13

  autonomous weapon systems, 44–45

  defined, 44

  failures in, 137–60

  fully autonomous systems, 46–50

  human intervention and risks, 147–49

  supervised, see supervised autonomous weapon systems

  unanticipated consequences of failures, 145–47

  autonomy

  dimensions of, 27–33

 
