Army of None


by Paul Scharre


  Heaton, Jeff, 132

  Her (film), 235

  Herr, Andrew, 232

  heuristics, 239–40

  Heyns, Christof, 287–89, 295

  hierarchical coordination, 19, 20f

  high-frequency trading, 199, 201–2; see also automated stock trading

  high-reliability organizations, 161, 170–72, 176–78

  High-speed Anti-Radiation Missile (HARM), 47–48, 47f, 353

  “Highway of Death,” 279

  Hiroshima, 279

  Holocaust, 340

  homing munitions, 41, 42

  Horowitz, Michael, 265, 302, 303, 312

  hors de combat, 252, 258–61

  Hui, Fan, 126

  human-assisted automated weapons, see centaur warfighters

  human–automation failure, 159

  human dignity, 287–90

  human judgment

  and attacks, 270

  and autonomous weapons, 80–81

  and crisis stability, 305–6

  and DoD Directive 3000.09, 75, 90–95

  and laws of war, 357–59

  to reduce risk in autonomous systems, 147–49

  and SDI, 309–10

  and Taranis drone, 109

  treaty language about, 357–59

  human-machine hybrid cognitive systems, see centaur warfighters

  human-machine relationship

  and CODE, 73

  as dimension of autonomy, 28–30

  and future of AI, 244

  and Samsung SGR-A1 robot, 104–5

  Human Rights Watch, 252, 282

  humans

  and laws of war, 270

  as moral agent and fail-safe, 323–25

  and “unmanned aircraft,” 16

  human-supervised weapons systems, 147

  Hussein, Saddam, 7, 340

  Hutus, 288

  hybrid human-machine cognitive systems, see centaur warfighters

  I, Robot (film), 27

  IBM Jeopardy! Challenge, 146–47

  ICRAC (International Committee for Robot Arms Control), 285

  ICRC (International Committee of the Red Cross), 269–70

  identification friend or foe (IFF) signal, 138, 139, 144, 379n

  IEDs (improvised explosive devices), 14

  IEEE-RAS Technical Committee on Roboethics, 280–81

  IEEE Spectrum, 104–5

  IFF signal, see identification friend or foe signal

  “I Have No Mouth and I Must Scream” (Ellison), 234

  IHL, see international humanitarian law

  ImageNet dataset, 129

  improvised explosive devices (IEDs), 14

  Inception-v3 neural network, 129

  incomprehensibility of complex systems, 153–54

  indoor flight/reconnaissance, 68–71, 121–24

  inertial measurement unit (IMU), 123

  inertial navigation, 42

  information technology, see cyberwarfare

  INF (Intermediate-Range Nuclear Forces) Treaty, 301

  Innocent II (pope), 331

  intelligence

  autonomy and, 28–33, 50

  spectrum of, 31f

  “intelligence explosion,” 233

  intelligent machines, rise of, 231–48; see also advanced artificial intelligence; artificial general intelligence

  Intermediate-Range Nuclear Forces (INF) Treaty, 301

  International Committee for Robot Arms Control (ICRAC), 285

  International Committee of the Red Cross (ICRC), 269–70

  International Court of Justice, 262

  international humanitarian law (IHL)

  and autonomous weapons bans, 348

  core principles, 251–52

  and human dignity, 295

  human judgment and, 358–59

  Martens clause, 263–66

  precautions in attack, 258

  principle of distinction, 252–55

  principle of proportionality, 255–57

  and rogue states, 268–69

  unnecessary suffering, 257–58

  internecine wars, 288

  Internet of Things (IoT), 219–20

  Internet Worm (1988), 212, 225

  Introduction to Artificial Intelligence, 245

  introspective systems, 226

  Iran

  cyberattacks by, 213

  RQ-170 drone incident, 209

  Stuxnet attack on nuclear facilities, 214

  swarming of U.S. ships, 22, 107

  U.S. military and, 207

  Iran Air Flight 655, 169–70, 262

  Iran-Iraq War (1980–1988), 169–70

  Iraq War (1991), see Gulf War

  Iraq War (2003–2011)

  chemical weapons and, 340

  distinguishing soldiers from civilians, 253–55

  drones in, 14, 25

  electromagnetic environment, 15

  Patriot missile fratricides, 137–43, 160, 192, 278; see also Patriot missile system

  Israel

  ground robots, 5, 102

  Harpy drone, 5, 47–48, 47f, 52, 55, 117, 353

  robotic boat, 102–3

  Trophy system, 92

  Israel Defense Forces, 92

  Japan

  Fukushima Daiichi nuclear plant disaster, 154–55

  poison gas use in WWII, 340

  Senkaku Islands incident, 208

  U.S. bombing campaigns in WWII, 279, 282

  JASON, 186, 187

  Java, 131

  Jennings, Ken, 146

  Jeopardy! Challenge, 146–47

  Jet Propulsion Laboratory, 235

  Johnson, Lyndon, 389n

  Joint Unmanned Combat Air Systems (J-UCAS), 60–61

  Just and Unjust Wars (Walzer), 273

  “just war,” 294

  Kahn, Herman, 311

  Kalashnikov, 116, 124

  Kasparov, Garry, 150, 321–22, 380n–381n

  Kelly, Kevin, 5

  Kendall, Frank, 90–93, 120–21

  Kennedy, John F., 317

  Kennedy, William, 155–56, 178

  Khalitov, Vyacheslav, 116

  Khmer Rouge, 288

  Khrushchev, Nikita, 307, 317–18

  kill box, 106–8

  kill switches, 202, 230

  Knight Capital Group, 201–2

  Korea, see North Korea; South Korea

  Kosovo, U.S. air campaign over, 322–23

  Kubrick, Stanley, 312

  Kumar, Vijay, 70

  Kuwait, 169

  land mines

  and arms control, 267, 331–32, 342

  CCW regulations, 268

  humanitarian consequences, 50–51

  and public conscience, 265–66

  unbounded autonomy of, 270

  Jody Williams’s campaign to ban, 271

  Lawrence, Peter, 205

  laws of war, 251–70

  accountability gap and, 258–63

  autonomous weapons and, 282–83

  core principles, 251–52

  debates over restriction/banning of autonomous weapons, 266–69

  and dictates of public conscience, 263–66

  hors de combat, 258–61

  legal status of humans vs. machines, 269–70

  need for human judgment’s place in, 357–59

  precautions in attack, 258

  principle of distinction, 252–55

  principle of proportionality, 255–57

  and unnecessary suffering, 257–58

  learning systems, see deep learning neural networks

  Ledé, Jean-Charles, 72

  Lee, Daniel, 70

  LeMay, Curtis, 279, 296

  lethal autonomy; see also autonomous weapons

  debate over, 6–8

  DoD policy, 25, 95

  liability, autonomous weapons and, 258–59

  LIDAR (light detection and ranging), 121, 122

  Lieber Code, 259

  life-and-death choices, 2–8


  limited weapons bans, 342

  Limits of Safety (Sagan), 174–75

  Lin, Patrick, 223

  LOCAAS (Low Cost Autonomous Attack System), 49

  Lockheed Martin, 63

  Loew ben Bezalel, Rabbi Judah, 234

  loitering munitions, 46–47, 97–98, 354

  London Blitz, 342

  Long-Range Anti-Ship Missile (LRASM), 62–68, 66f–67f, 353

  loosely coupled systems, 152

  low probability of intercept/low probability of detection (LPI/LPD), 72–73

  M2 .50 caliber heavy machine gun (“Ma Deuce”), 38

  M249 SAW light machine gun, 191

  machine guns, 37–38, 190–91, 276–77, 299

  machine intelligence, 231; see also intelligent machines

  MAD (mutual assured destruction), 314, 315

  “mad robot theory,” 315–16

  Main, Kevin, 138, 139

  Making of a Fly (Lawrence), 205

  mala in se, 285

  malware, 211–13, 225, 246–47

  management by exception/management by consent, 404n

  maritime Anti-Submarine Warfare Continuous Trail Unmanned Vessel (ACTUV), 78–79

  Marrison, Clive, 109

  MARS (A800 Mobile Autonomous Robotic System), 114

  Marshall, Andy, 94

  Marshall, S. L. A., 275

  Martens Clause, 263–66, 351

  Matrix trilogy, 134

  Maxim gun, 37, 38

  Mayhem system, 217–23, 226

  McGrath, Bryan, 53–54

  McNamara, Robert, 279, 310–11, 348, 389n

  meaning, failure of autonomous weapons to interpret, 6; see also context

  meaningful human control, 68, 347, 352, 358

  mercy, in war, 273–74, 403n

  Merkava tank, 92

  Messinger, Eric, 224

  Micire, Mark, 69

  Microsoft, 87

  military necessity, 348

  millimeter-wave (MMW) radar seeker, 106–8

  mines, see land mines

  Minot Air Force Base, 174

  missile(s), 40–41, 54, 114–15, 139, 141–43; see also specific missiles, e.g.: Tomahawk

  missile defense shields, 300

  missile launch false alarm (1983), 1–2

  Mk 60 CAPTOR encapsulated torpedo mine, 51

  MMW (millimeter-wave) radar seeker, 106–8

  Mobile Autonomous Robotic System (MARS), 114

  mobile autonomous weapons, 353, 354

  Montgomery, USS, 169

  “moral buffer,” 277

  moral injury, 290

  morality, see ethics

  MQ-25 Stingray tanker drone, 60–62

  MRK-002-BG-57 “Wolf-2,” 114

  Musk, Elon, 154, 232, 246

  mutual assured destruction (MAD), 314, 315

  Nagasaki, 279

  narrow AI, 95–99, 127, 231, 241–43

  NASDAQ, 199, 202, 204

  Natanz, Iran, nuclear facility, 214

  National Aeronautics and Space Administration (NASA), 76, 154, 235

  National Defense Authorization Act (1988–1989), 309–10

  NATO (North Atlantic Treaty Organization), 115

  Naval Postgraduate School, 11–13, 18

  Navy, U.S.

  Aegis combat system, see Aegis combat system

  cultural resistance to robotic weapons, 61–62

  Fire Scout Drone, 209

  high-reliability organizations within, 161–62

  MQ-25 Stingray drone, 60

  robot boat experiments, 22–23

  Sea Hunter, 79

  TASM, 49

  Vincennes incident, 169–70, 177, 262

  X-47B experimental drone, 17

  near-miss accidents (false alarms), 1–2, 173–75, 299, 311, 314

  Nest thermostat, 33–34, 158

  networking, 54–56

  neural networks, see deep learning neural networks

  neutron bomb, 301–2, 407n

  New Orleans, Battle of (1815), 304

  New York Stock Exchange, 200, 202; see also “Flash Crash”

  NGOs (nongovernmental organizations), 7, 348–50

  Nintendo, 239

  Nixon, Richard M., 315

  “no cities” nuclear doctrine, 348

  non-cooperative targets, 85

  nongovernmental organizations (NGOs), 7, 348–50

  non-lethal, nonkinetic autonomous weapons, 89

  nonproliferation regimes, bans vs., 349

  non-state groups, drone use by, 102

  NORAD (North American Aerospace Defense Command), 173

  Normal Accidents (Perrow), 174–75

  “normal accidents,” 150–54, 159–60

  North Korea, 105, 208, 260

  Norway, 173

  nuclear false alarms, 299

  nuclear missile launch false alarm (1983), 1–2

  Nuclear Non-Proliferation Treaty, 340

  nuclear power, 151–55, 176–77

  nuclear weapons; see also Cuban Missile Crisis

  and arms control, 332, 340

  Hiroshima and Nagasaki, 279

  missile launch false alarm (1983), 1–2

  and near-miss accidents, 173–75

  and stability, 298–302

  offense-dominant regime, 299–300

  Office of Naval Research (ONR), 22, 100, 235

  Office of Personnel Management (OPM), 212

  Office of the Secretary of Defense, 7, 25

  offset strategies, 59

  Ohio-class submarines, 173

  Oko (Soviet missile alert system), 1–2, 314

  Omohundro, Steve, 63, 237–38, 240

  On Killing (Grossman), 275

  ONR, see Office of Naval Research

  OODA (observe, orient, decide, act)

  basics, 23–24

  for Patriot missile system, 142f

  weapon system loop, 43

  Operation Desert Storm, see Gulf War (1991)

  Operation Iraqi Freedom, 160; see also Iraq War (2003–2011)

  OPM (Office of Personnel Management), 212

  optical flow, 122–23

  Ottawa Treaty, 51, 268

  Outer Space Treaty (1967), 301, 344

  PAC-3 missile, 143

  Pascal Visual Object Classes database, 129

  passive sensors, 41, 113

  patches, 217–22, 226

  Patriot missile system, 140f

  Aegis system vs., 165–66, 171–72, 175–76

  assessment study, 143–45, 147, 159–60

  and automation bias, 170, 278–79, 324–25

  automation philosophy, 165–66

  decision-making process, 142f

  Iraq War fratricides, 137–43, 160, 192, 278

  Patriot Vigilance Project, 171–72

  Pentagon, 61, 96

  perfidy, 259

  Perimeter system, 313–14, 409n

  Perrow, Charles, 153, 174–75, 189

  Persian Gulf War (1991), see Gulf War

  perverse instantiation, 239

  Petrov, Stanislav, 1–2, 6, 144–45, 207, 305

  PGMs (precision-guided munitions), 39–41, 281–83

  Phalanx Close-In Weapon System (CIWS), 111

  Phoenix, Joaquin, 235

  Platform-M, 111–14

  Playground (TensorFlow tutorial), 128–29

  PLC (programmable logic controllers), 214

  PMK-2 encapsulated torpedo mine, 51

  poison gas, 340–43

  poker, 242

  post-traumatic stress, 290

  posturing, 275

  Powell, Colin, 279

  Power, Thomas, 307

  precautions in the attack, 252

  precision-guided munitions (PGMs), 39–41, 281–83

  Predator drones, 16, 17

  preemptive weapons bans, 343–44, 352

  principle of distinction, 252–55

  profnath (online bookseller), 205
  programmable logic controllers (PLC), 214

  projection bias, 310, 316

  proportionality, principle of, 255–57

  Protector (robotic boat), 103

  psychological distance, 275–76

  public conscience, dictates of, 263–66

  public opinion, 264–66

  Python, 131

  al-Qaeda, 22, 253

  radar, 85

  Ramage, USS, 166

  Raspberry Pi, 127, 131

  Reagan, Ronald, and administration, 1, 309–10

  Reaper drones, 16, 17

  regulation, see arms control

  Reisner, Daniel, 255

  reliability, see safety

  Remotely Piloted Aircraft Vector (U.S. Air Force report), 61

  responsibility, consequences of removing, 275–79

  retributive justice, 262

  revolutions in warfare, 93–96

  risk

  in autonomous systems, 147–49

  of autonomous weapon failure, 189–95

  of delegating autonomy to a machine, 192f

  risk mitigation, 175

  “roadmaps” for unmanned system investment, 15–17

  roboethics, 280–81

  robotic ground vehicles, 102, 111–17

  robotics

  for surveillance, 13–14

  Three Laws of, 26–27

  robot tanks, 115–16

  ROE, see rules of engagement

  Roff, Heather, 209–10

  rogue states, 7

  Roomba, 26, 30, 34, 278

  Rosoboronexport, 115

  RQ-170 stealth drone, 209

  rules-based ethics, 285; see also deontological ethics

  rules-based learning systems, 125, 179

  rules of engagement (ROE), 106, 109, 309, 328

  “rules of the road,” 356–57

  runaway gun, 190–91

  R.U.R. (Rossumovi Univerzální Roboti) (play), 234

  Russell, Stuart, 68, 71, 72, 116, 118, 356

  Russia

  cyberattacks by, 213

  ground combat robots, 5, 102, 111–17

  lack of public policy on autonomous weapons, 118

  Platform-M, 111–14

  robotized military units, 6

  Soviet era, see Cold War; Soviet Union

  tolerance for casualties and fratricide, 172

  Trident missile false alarm (1995), 173–74

  Uran-9 ground combat robot, 114–16

  U.S. military and, 207

  Vikhr robot tank, 115–16

  Wolf-2, 114

  Rutter, Brad, 146

  Rwanda, 288

  SAC (Strategic Air Command), 307

  safety (reliability)

  and Aegis combat system, 162–67

  of autonomous weapons, 161–79

  nuclear weapons and near-miss accidents, 173–75

  and USS Vincennes incident, 169–70

  SAG (surface action group), 64

  Sagan, Scott, 174–75

  Salty Dog 501/502 (X-47B experimental drone), 60–62

  Samsung SGR-A1 robot, 5, 104–5, 118, 303–4

  SAR (synthetic aperture radar), 86

 
