by Paula Guran
“I miss you. I wish I could have understood.”
Every day, after you’re done with killing, you get up from your chair and walk out of the office building and go home. Along the way you hear the birds chittering overhead and see teenagers walking by, giggling or moping, self-absorbed in their safe cocoons, and then you open the door to your home. Your spouse wants to tell you about her annoying boss and your children are waiting for you to help them with their homework, and you can’t tell them a thing you’ve done.
I think either you become crazy or you already were.
She did not want him to be defined by the number on that piece of paper her mother kept hidden at the bottom of the box in the attic.
“They counted wrong, Dad,” Kyra said. “They missed one death.”
Kyra walked down the hall dejectedly. She was done with her last interview of the day—a hot Silicon Valley startup. She had been nervous and distracted and had flubbed the brainteaser. It had been a long day, and she hadn't gotten much sleep the night before.
She was almost at the elevator when she noticed an interview schedule posted on the door of the suite next to the elevator for a company named AWS Systems. It hadn’t been completely filled. A few of the slots on the bottom were blank; that generally meant an undesirable company.
She took a closer look at the recruiting poster. They did something related to robotics. There were some shots of office buildings on a landscaped, modern campus. Bullet points listed competitive salary and benefits. Not flashy, but it seemed attractive enough. Why weren’t people interested?
Then she saw it: “Candidates need to pass screening for security clearance.” That would knock out many of her classmates who weren’t U.S. citizens. And it likely meant government contracts. Defense, probably. She shuddered. Her family had had enough of war.
She was about to walk away when her eyes fell on the last bullet point on the poster: “Relieve the effects of PTSD on our heroes.”
She wrote her name on one of the blank lines and sat down on the bench outside the door to wait.
“You have impressive credentials,” the man said, “the best I’ve seen all day, actually. I already know we’ll want to talk to you some more. Do you have any questions?”
This was what Kyra had been waiting for all along. “You’re building robotic systems to replace human-controlled drones, aren’t you? For the war.”
The recruiter smiled. “You think we’re Cyberdyne Systems?”
Kyra didn’t laugh. “My father was a drone operator.”
The man became serious. “I can’t reveal any classified information. So we have to speak only in hypotheticals. Hypothetically, there may be advantages to using autonomous robotic systems over human-operated machines. Robots.”
“Like what? It can’t be about safety. The drone operators are perfectly safe back here. You think machines will fight better?”
“No, we’re not interested in making ruthless killer robots. But we shouldn’t make people do the jobs that should be done by machines.”
Kyra’s heart beat faster. “Tell me more.”
“There are many reasons why a machine makes a better soldier than a human. A human operator has to make decisions based on very limited information: just what he can see from a video feed, sometimes alongside intelligence reports. Deciding whether to shoot when all you have to go on is the view from a shaking camera and confusing, contradictory intel is not the kind of thinking humans excel at. There’s too much room for error. An operator might hesitate too long and endanger an innocent, or he might be too quick on the trigger and violate the rules of engagement. Decisions by different operators would be based on hunches and emotions and at odds with each other. It’s inconsistent and inefficient. Machines can do better.”
Worst of all, Kyra thought, a human can be broken by the experience of having to decide.
“If we take these decisions away from people, make it so that individuals are out of the decision-making loop, the result should be less collateral damage and a more humane, more civilized form of warfare.”
But all Kyra could think was: No one would have to do what my father did.
The process of getting security clearance took a while. Kyra’s mother was surprised when Kyra called to tell her that government investigators might come to talk to her, and Kyra wasn’t sure how to explain why she had taken this job when there were much better offers from other places. So she just said, “This company helps veterans and soldiers.”
Her mother said, carefully, “Your father would be proud of you.”
Meanwhile, they assigned her to the civilian applications division, which made robots for factories and hospitals. Kyra worked hard and followed all the rules. She didn’t want to mess up before she got to do what she really wanted. She was good at her job, and she hoped they noticed.
Then one morning Dr. Stober, the head roboticist, called her to join him in a conference room.
Kyra’s heart was in her throat as she walked over. Was she going to be let go? Had they decided that she couldn’t be trusted because of what had happened to her father? That she might be emotionally unstable? She had always liked Dr. Stober, who seemed like a good mentor, but she had never worked with him closely.
“Welcome to the team,” said a smiling Dr. Stober. Besides Kyra, there were five other programmers in the room. “Your security clearance arrived this morning, and I knew I wanted you on this team right away. This is probably the most interesting project at the company right now.”
The other programmers smiled and clapped. Kyra grinned at each of them in turn as she shook their outstretched hands. They all had reputations as the stars in the company.
“You’re going to be working on the AW-1 Guardians, one of our classified projects.”
One of the other programmers, a young man named Alex, cut in: “These aren’t like the field transport mules and remote surveillance craft we already make. The Guardians are unmanned, autonomous flying vehicles about the size of a small truck, armed with machine guns and missiles.”
Kyra noticed that Alex was really excited by the weapons systems.
“I thought we make those kinds already,” Kyra said.
“Not exactly,” Dr. Stober said. “Our other combat systems are meant for surgical strikes in remote places or are prototypes for frontline combat, where basically anything that moves can be shot. But these are designed for peacekeeping in densely populated urban areas, especially places where there are lots of Westerners or friendly locals to protect. Right now we still have to rely on human operators.”
Alex said in a deadpan voice, “It would be a lot easier if we didn’t have to worry about collateral damage.”
Dr. Stober noticed that Kyra didn’t laugh and gestured for Alex to stop. “Sarcasm aside, as long as we’re occupying their country, there will be locals who think they can get some advantage from working with us and locals who wish we’d go away. I doubt that dynamic has changed in five thousand years. We have to protect those who want to work with us from those who don’t, or else the whole thing falls apart. And we can’t expect the Westerners doing reconstruction over there to stay holed up in walled compounds all the time. They have to mingle.”
“It’s not always easy to tell who’s a hostile,” Kyra said.
“That’s the heart of the issue. Most of the time, the population is ambivalent. They’ll help us if they think it’s safe to do so, and they’ll help the militants if they think that’s the more convenient choice.”
“I’ve always said that if they choose to help the militants blend in, I don’t see why we need to be that careful. They made a decision,” Alex said.
“I suppose some interpretations of the rules of engagement would agree with you. But we’re telling the world that we’re fighting a new kind of war, a clean war, one where we hold ourselves to a higher standard. How people see the way we conduct ourselves is just as important nowadays.”
“How do we do that?” Kyra asked, before Alex could further derail the conversation.
“The key piece of software we have to produce needs to replicate what the remote operators do now, only better. The government has supplied us with thousands of hours of footage from the drone operations during the last decade or so. Some of them got the bad guys, and some of them got the wrong people. We’ll need to watch the videos and distill the decision-making process of the operators into a formal procedure for identifying and targeting militants embedded in urban conditions, eliminate the errors, and make the procedure repeatable and applicable to new situations. Then we’ll improve it by tapping into the kind of big data that individual operators can’t integrate and make use of.”
The code will embody the minds of my father and others like him so that no one would have to do what they did, endure what they endured.
“Piece of cake,” said Alex. And the room laughed, except for Kyra and Dr. Stober.
Kyra threw herself into her work, a module they called the ethical governor, which was responsible for minimizing collateral damage when the robots fired upon suspects. She was working on a conscience for killing machines.
She came in on the weekends and stayed late, sometimes sleeping in the office. She didn’t view it as a difficult sacrifice to make. She couldn’t talk about what she was working on with the few friends she had, and she didn’t really want to spend more time outside the office with people like Alex.
She watched videos of drone strikes over and over. She wondered if any were missions her father had flown. She understood the confusion, the odd combination of power and powerlessness experienced when watching a man one is about to kill through a camera, the pressure to decide.
The hardest part was translating this understanding into code. Computers require precision, and the need to articulate vague hunches had a way of forcing one to confront the ugliness that could remain hidden in the ambiguity of the human mind.
To enable the robots to minimize collateral damage, Kyra had to assign a value to each life that might be endangered in a crowded urban area. One of the most effective ways for doing this—at least in simulations—also turned out to be the most obvious: profiling. The algorithm needed to translate racial characteristics and hints about language and dress into a number that held the power of life and death. She felt paralyzed by the weight of her task.
“Everything all right?” Dr. Stober asked.
Kyra looked up from her keyboard. The office lights were off; it was dark outside. She was practically the last person left in the building.
“You’ve been working a lot.”
“There’s a lot to do.”
“I’ve reviewed your check-in history. You seem to be stuck on the part where you need the facial recognition software to give you a probability on ethnic identity.”
Kyra gazed at Dr. Stober’s silhouette in the door to her office, back-lit by the hall lights. “There’s no API for that.”
“I know, but you’re resisting the need to roll your own.”
“It seems . . . wrong.”
Dr. Stober came in and sat down in the chair on the other side of her desk. “I learned something interesting recently. During World War II, the U.S. Army trained dogs for warfare. They would act as sentries, guards, or maybe even as shock troops in an island invasion.”
Kyra looked at him, waiting.
“The dogs had to be trained to tell allies apart from enemies. So they used Japanese-American volunteers to teach the dogs to profile, to attack those with certain kinds of faces. I’ve always wondered how those volunteers felt. It was repugnant, and yet it was also necessary.”
“They didn’t use German-American or Italian-American volunteers, did they?”
“No, not that I’m aware of. I’m telling you this not to dismiss the problematic nature of your work, but to show you that the problem you’re trying to solve isn’t entirely new. The point of war is to prefer the lives of one group over the lives of another group. And short of being able to read everyone’s minds, you must go with shortcuts and snap heuristics to tell apart those who must die from those who must be saved.”
Kyra thought about this. She could not exempt herself from Dr. Stober’s logic. After all, she had lamented her father’s death for years, but she had never shed a tear for the thousands he had killed, no matter how many might have been innocent. His life was more valuable to her than all of them added together. His suffering meant more. It was why she was here.
“Our machines can do a better job than people. Attributes like appearance and language and facial expressions are but one aspect of the input. Your algorithm can integrate the footage from citywide surveillance by thousands of other cameras, the metadata of phone calls and social visits, individualized suspicions built upon data too massive for any one person to handle. Once the programming is done, the robots will make their decisions consistently, without bias, always supported by the evidence.”
Kyra nodded. Fighting with robots meant that no one had to feel responsible for killing.
Kyra’s algorithm had to be specified exactly and submitted to the government for approval. Sometimes the proposals came back marked with questions and changes.
She imagined some general (advised, perhaps, by a few military lawyers) looking through her pseudocode line by line.
A target’s attributes would be evaluated and assigned numbers. Is the target a man? Increase his suspect score by thirty points. Is the target a child? Decrease his suspect score by twenty-five points. Does the target’s face match any of the suspected insurgents with at least a fifty-percent probability? Increase his suspect score by five hundred points.
And then there was the value to be assigned to the possible collateral damage around the target. Those who could be identified as Americans or had a reasonable probability of being Americans had the highest value. Then came native militia forces and groups who were allied with U.S. forces and the local elites. Those who looked poor and desperate were given the lowest values. The algorithm had to formalize anticipated fallout from media coverage and politics.
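The scoring pseudocode the story describes might be sketched like this. The point values come straight from the passage above; every function and field name here is invented for illustration and does not represent any real system:

```python
# Illustrative sketch of the suspect-scoring heuristic described in the
# story. Point values are the ones named in the text; all identifiers
# are hypothetical.

def suspect_score(target):
    """Sum weighted evidence about a target into a single score."""
    score = 0
    if target.get("is_man"):
        # "Is the target a man? Increase his suspect score by thirty points."
        score += 30
    if target.get("is_child"):
        # "Is the target a child? Decrease his suspect score by twenty-five points."
        score -= 25
    # "Does the target's face match any of the suspected insurgents
    #  with at least a fifty-percent probability?"
    if target.get("face_match_probability", 0.0) >= 0.5:
        score += 500
    return score

# Example: an adult man whose face matches a watch list at 60 percent.
print(suspect_score({"is_man": True, "face_match_probability": 0.6}))  # 530
```

Reduced to a dozen lines, the passage's point about "the ugliness that could remain hidden in the ambiguity of the human mind" becomes literal: each judgment is a bare integer added to a running total.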
Kyra was getting used to the process. After the specifications had gone back and forth a few times, her task didn’t seem so difficult.
Kyra looked at the number on the check. It was large.
“It’s a small token of the company’s appreciation for your efforts,” said Dr. Stober. “I know how hard you’ve been working. We got the official word on the trial period from the government today. They’re very pleased. Collateral damage has been reduced by more than eighty percent since they started using the Guardians, with zero erroneous targets identified.” Kyra nodded. She didn’t know if the eighty percent was based on the number of lives lost or the total amount of points assigned to the lives. She wasn’t sure she wanted to think too hard about it. The decisions had already been made.
“We should have a team celebration after work.”
And so, for the first time in months, Kyra went out with the rest of the team. They had a nice meal, some good drinks, sang karaoke. And Kyra laughed and enjoyed hearing Alex’s stories about his exploits in war games.
“Am I being punished?” Kyra asked.
“No, no, of course not,” Dr. Stober said, avoiding her gaze. “It’s just administrative leave until . . . the investigation completes. Payroll will still make bi-weekly deposits and your health insurance will continue, of course. I don’t want you to think you’re being scapegoated. It’s just that you did most of the work on the ethical governor. The Senate Armed Services Committee is really pushing for our methodology, and I’ve been told that the first round of subpoenas is coming down next week. You won’t be called up, but we’ll likely have to name you.”
Kyra had seen the video only once, and once was enough. Someone in the market had taken it with a cellphone, so it was shaky and blurry. No doubt the actual footage from the Guardians would be much clearer, but she wasn’t going to get to see that. It would be classified at a level beyond her clearance.
The market was busy, the bustling crowd trying to take advantage of the cool air in the morning. It looked, if you squinted a bit, like the farmer’s market that Kyra sometimes went to, to get her groceries. A young American man, dressed in the distinctive protective vest that expat reconstruction advisors and technicians wore over there, was arguing with a merchant about something, maybe the price of the fruits he wanted to buy.
Reporters had interviewed him afterwards, and his words echoed in Kyra’s mind: “All of a sudden, I heard the sounds made by the Guardians patrolling the market change. They stopped to hover over me, and I knew something was wrong.”
In the video, the crowd was dispersing around him, pushing, jostling with each other to get out of the way. The person who took the video ran, too, and the screen was a chaotic blur.
When the video stabilized, the vantage point was much further. Two black robots about the size of small trucks hovered in the air above the kiosk. They looked like predatory raptors. Metal monsters.
Even in the cellphone video, it was possible to make out the recorded warning in the local language the robots projected via loudspeakers. Kyra didn’t know what the warnings said.
A young boy, seemingly oblivious to the hovering machines above him, was running at the American man, laughing and screaming, his arms opened wide as if he wanted to embrace the man.
“I just froze. I thought, oh God, I’m going to die. I’m going to die because this kid has a bomb on him.”
The militants had tried to adapt to the algorithms governing the robots by exploiting certain weaknesses. Because they realized that children were assigned a relatively high value for collateral damage purposes and a relatively low value for targeting purposes, they began to use more children for their missions. Kyra had had to tweak the algorithm and the table of values to account for these new tactics.
“All of your changes were done at the request of the Army and approved by them,” said Dr. Stober. “Your programming followed the updated rules of engagement and field practices governing actual soldiers. Nothing you’ve done was wrong. The Senate investigation will be just a formality.”