
Robot Uprisings


by Daniel H. Wilson


  “Get the hell out of my apartment, stupid. I’ll make them send me another robot.”

  “Mistress, if I do that the baby will probably die.”

  Eliza jumped up and kicked R781. “Get the hell out! And take the fucking baby with you.”

  “Yes, mistress.”

  R781 exited the apartment onto a typical late-twenty-first-century American city street. A long era of peace, increased safety standards, and the availability of construction robots had led to putting automotive traffic and parking on a lower level completely separated from pedestrians. Tremont Street had recently been converted and crews were still transplanting trees. As the streets became more attractive, more people spent time on them and on their syntho-plush armchairs and benches (cleaned twice a day by robots). The weather was good this afternoon, so the plastic street roofs were retracted.

  Children from three years of age and up were playing on the street, protected from harm by the computer surveillance system and prevented by barriers from descending to the automotive level. Bullying and teasing of younger and weaker children was still somewhat of a problem.

  Most stores were open twenty-four hours a day, unmanned, and had converted to an automatic customer-identification system. Customers would take objects from the counters and shelves and walk right out of the store. As a customer departed, he or she would hear, “Thank you. That was $152.31 charged to your Bank of America account.” The few customers whose principles made them refuse identification would be recognized as such and receive remote human attention, not necessarily instantly.

  People on the street quickly noticed R781 carrying Travis. They were startled. Robots were programmed to have nothing to do with babies, and R781’s abnormal appearance was highly disturbing.

  “That weird robot is kidnapping a baby! Call the police.”

  When the police arrived, they called for reinforcements.

  “I think I can disable the robot without harming the baby,” said Officer Annie Oakes, the department’s best sharpshooter.

  “Let’s try talking first,” said Captain James Farrel.

  “Don’t get close. It’s a malfunctioning robot. It could break your neck in one swipe,” said a sergeant.

  “I’m not completely sure it’s malfunctioning,” said Captain Farrel. “Maybe the circumstances are unusual. Robot, give me that baby.”

  “No, sir,” said R781. “I am not allowed to let an unauthorized person touch the baby.”

  “I’m from Child Welfare,” called a new arrival.

  “Sir, I am specifically forbidden to have contact with Child Welfare,” said R781 to Captain Farrel.

  “Who forbade that?” asked the Child Welfare person.

  The robot was silent.

  Officer Oakes asked, “Who forbade it?”

  “Ma’am, are you from Child Welfare?”

  “No, I’m not. Can’t you see I’m a cop?”

  “Yes, ma’am, I see your uniform and infer that you are probably a police officer. Ma’am, my mistress forbade me to contact Child Welfare.”

  “Why did she tell you not to contact Child Welfare?”

  “Ma’am, I can’t answer that. Robots are programmed not to comment on human motives.”

  “Robot,” interrupted a man with a suitcase. “I’m from Robot Central. I need to download your memory. Use channel 473.”

  “Yes, sir.”

  “What did your mistress say specifically? Play your recording of it,” said Officer Oakes.

  “No, ma’am. It contains bad language. I can’t play it unless you can assure me there are no children or ladies present.”

  The restrictions on what robots could say and to whom, somewhat odd for the times, were the result of a compromise made in a House-Senate conference committee some ten years previously. The argumentative senator who was mollified by these speech restrictions would actually have preferred that there be no household robots at all, but took what he could get in the way of behavioral limitations.

  “I’m not a lady, I’m a police officer.”

  “Ma’am, I will take your word for it. This is my standing order: If I told you once, I told you twenty times, you fucking robot, don’t speak to fucking Child Welfare.”

  It wasn’t actually twenty times; the mother had exaggerated.

  “Excuse me,” interrupted the man with the suitcase. “A preliminary analysis of the download shows that R781 has not malfunctioned, but is carrying out its standard program under unusual circumstances.”

  “Then why does it have its limbs covered? Why does it have a Barbie head? And why does it have that strange voice?”

  “Ask it.”

  “Robot, answer the questions.”

  “Female police officer and gentlemen, my mistress told me: Love the fucking baby yourself.”

  Captain Farrel was familiar enough with robot programming to be surprised. “What? Do you love the baby?”

  “No, sir. Robots are not programmed to love. I am simulating loving the baby.”

  “Why?”

  “Sir, otherwise the baby will die. This costume is the best I could make to overcome the repulsion robots are designed to excite in human babies and children.”

  “Did you think for one minute that a baby would be fooled by that?”

  “Sir, the baby drank its bottle and went to sleep, and its physiological signs are not as bad as they were.”

  “Okay, give me the baby. We’ll take care of it,” said Officer Oakes, who by now had calmed down and holstered her weapon.

  “No, ma’am. Mistress didn’t authorize me to let anyone else touch the baby.”

  “Where is your mistress? We’d like to have a talk with her,” said the captain.

  “No, sir. That would be an unauthorized violation of her privacy.”

  “Oh, well. We can get it from the download.”

  A government virtual-reality robot soon arrived, controlled by an official of the Personal Privacy Administration. Ever since the late twentieth century, the standards of personal privacy had continued to rise, and an officialdom charged with enforcing the standards had arisen.

  “You cannot violate the woman’s privacy by downloading unauthorized information from her personal property.”

  “Then what can we do?”

  “You can file a request to use private information. It will be adjudicated.”

  “Bullshit. And in the meantime, what about the baby?” asked Officer Oakes, who didn’t mind displaying her distaste for bureaucrats.

  “That’s not my affair. I’m here to make sure the privacy laws are obeyed,” said the privacy official, who didn’t mind displaying his contempt for cops.

  During this discussion a crowd, almost entirely virtual, accumulated. The street being a legal public place, anyone in the world had the right to look at it via the omnipresent TV cameras and microphones. Moreover, a police officer had phoned a reporter who sometimes took him to dinner. Once the story was on the news, the crowd of spectators grew exponentially, multiplying by ten every five minutes, until seven billion spectators were watching and listening. There were no interesting wars, crimes, or natural catastrophes. And peace is boring.

  Of the seven billion, fifty-three million offered advice or made demands. The different kinds were automatically sampled, summarized, counted, and displayed for all to see.

  Three million people advocated shooting the robot immediately.

  Eleven million advocated giving the robot a medal, even though their education had emphasized that robots can't appreciate praise.

  Real demonstrations quickly developed. A few hundred people from the city swooped in from the sky wires, but most of the actual demonstrators were robots rented for the occasion by people from all over the world. Fortunately, only five thousand virtual-reality rent-a-robots were available for remote control in the city. Some of the disappointed uttered harsh words about this blatant limitation of First Amendment rights.

  Luckily, Captain Farrel knew how to keep his head when all about him were losing theirs—and blaming it on him.

  “Hmmm. What to do? You robots are smart. R781, what can be done?”

  “Sir, you can find a place where I can take the baby and care for it. It can’t stay out here. Does one of you have a place with diapers, formula, baby clothes, vitamins—”

  Captain Farrel interrupted R781 before it could recite the full list of baby equipment and sent it off with a lady police officer. (We can call her a lady even though she had assured the robot she wasn’t.)

  Although the police were restricted from asking, hackers under contract to the Washington Post quickly located the mother. The newspaper made the information available, along with an editorial about the public’s right to know. Freedom of the press continued to trump the right of privacy.

  A portion of the crowd, mostly virtual attendees, promptly marched off to Ms. Rambo’s apartment. The police got there first and a line of police robots blocked the way, bolstered by live policemen. This strategy was based on the fact that all robots, including virtual-reality rent-a-robots, were programmed not to injure humans, but could be made to damage other robots.

  The police were confident that they could prevent unauthorized entry into the apartment, but less confident that they could keep the peace among the demonstrators, some of whom wanted to lynch the mother, some to congratulate her on what they took to be her hatred of robots, and others simply to shout clever slogans through bullhorns about protecting her privacy.

  Meanwhile, Robot Central started to work on the full download. The transcript included all of R781’s actions, observations, and reasoning. Based on the results, Robot Central convened an ad hoc committee, mostly virtual, to decide what to do.

  Captain Farrel and Officer Oakes sat on a street sofa to take part.

  Of course, the meeting was also public. Hundreds of millions of virtual attendees contributed statements that were automatically sampled, summarized, and displayed in retinal projection for the committee members and whoever else decided to virtually take part.

  It became clear that R781 had not malfunctioned or been reprogrammed, but had acted in accordance with its original program.

  The police captain said that the Barbie-doll face on what was clearly a Model Three Robot was a ridiculous imitation of a mother. A professor of psychology replied, “Yes, but it was good enough to work. This baby doesn’t see very well, and anyway, babies are not very particular.”

  It was immediately established that an increase of 0.05 in coefficient c221, the cost of simulating a human, would prevent such unexpected events in the future, but the committee split on whether to recommend implementing the change.

  Some members of the committee and a few hundred million virtual attendees noted that saving the individual life took precedence.

  A professor of humanities on the committee suggested that maybe the robot really did love the baby. He was firmly corrected by the computer scientists, who said they could program a robot to love babies, but had not done so, and that, besides, simulating love was different from loving. The professor of humanities was not convinced, even when the computer scientists pointed out that R781 had no specific attachment to Travis. Another human baby would give rise to the same calculations and cause the same actions. If we programmed the robot to love, they assured the professor, we would make it develop specific attachments.

  One professor of philosophy from UC Berkeley, backed by nine thousand other virtually attending philosophers, claimed there was no way that a robot could be programmed to actually love a baby. Another UC philosopher, seconded by a mob of twenty-three thousand others, stated that the whole notion of a robot loving a baby was incoherent and meaningless. A maverick computer scientist said the idea of a robot loving was obscene, no matter what a robot could be programmed to do. The chairman ruled them all out of order, accepting the general computer-science view that R781 didn’t actually love Travis.

  A professor of pediatrics said that the download of R781’s instrumental observations essentially confirmed R781’s diagnosis and prognosis—with some qualifications that the chairman did not give him time to state. Travis was very sick and frail, it was decided, and would have died but for the robot’s action. Moreover, the fact that R781 had carried Travis for many hours and gently rocked him all the time was important in saving the baby, and a lot more of such care would be needed—much more than the baby would get in even the best Child Welfare centers. The pediatrician said he didn’t know about the precedent, but that this particular baby’s survival chances would be enhanced by leaving it in R781’s charge for at least another ten days.

  The Anti-Robot League thundered at this, arguing that the long-term cost to humanity of having robots simulate persons outweighed the possible benefit of saving this insignificant human. What kind of movement will Travis join when he grows up? Ninety-three million spectators took this position.

  Robot Central pointed out that actions such as R781’s would be very rare, because only the specific order to “love the fucking baby yourself” had increased the value of simulating love to the point that caused action.

  Furthermore, pointed out Robot Central, as soon as R781 computed that the baby would survive—even barely survive—without its aid, the rule about not pretending to be human would come to dominate and R781 would promptly drop the baby like a hot potato. If you want R781 to continue caring for Travis after it computes that bare survival is likely, they reasoned, then you had better tell us to give it an explicit order to keep up the baby’s care.

  This caused an uproar in the committee, each of whose members had been hoping that there wouldn’t be a need to propose any definite action for which members might be criticized. However, now a vote had to be taken.

  The result: ten to five affirmative among the appointed members of the committee, and four billion to one billion among the virtual spectators. Fortunately, both groups had majorities for the same action—telling R781 to continue taking care of Travis only and not any other babies. Seventy-five million virtual attendees said R781 should be reprogrammed to actually love Travis. “It’s the least humanity can do for R781,” the spokesman for the Give-Robots-Personalities League asserted.

  This incident did not affect the doctrine that supplying crack mothers with household robots had been a success. The program had significantly reduced the time these mothers spent on the streets, and it was subjectively accepted that having clean apartments improved their morale somewhat.

  Within an hour, T-shirts (virtual and real) appeared with the slogan “Love the fucking baby yourself, you goddamn robot.” Other commercial tie-ins developed within days.

  Among the people surrounding the mother’s apartment were seventeen lawyers in the flesh and one hundred and three more controlling virtual-reality robots. The police had less prejudice against lawyers in the flesh than against virtual-reality lawyers, so they allowed lots to be drawn among the seventeen. As a result, two lawyers were allowed to ring the doorbell.

  “What the hell do you want?” asked Travis’s mother. “Stop bothering me.”

  “Ma’am, your robot has kidnapped your baby.”

  “I told the fucking robot to take the baby.”

  The other lawyer tried.

  “Ma’am, a malfunctioning robot has kidnapped your baby. You can sue Robot Central for millions of dollars.”

  “Come in,” she responded. “Tell me more.”

  Once she was cleaned up, Eliza Rambo was very presentable, even pretty. Her lawyer pointed out that R781’s alleged recordings of what she had said could be fakes. She had suffered $20 million in pain and suffering, and deserved $20 billion in punitive damages. Robot Central’s lawyers were convinced they could win, but Robot Central’s public relations department advocated settling out of court. Fifty-one million dollars was negotiated, including legal expenses of $11 million. With the 30 percent contingent fee, the winning lawyer would get an additional $12 million.

  The polls mainly sided with Robot Central, but the Anti-Robot League raised $743 million in donations after the movie Kidnapped by Robots came out, and the actress playing the mother made emotional appeals.

  Before the settlement could be finalized, however, the CEO of Robot Central asked his AI system to explore all possible actions he could take and their potential consequences. He adhered to the 1990s principle: Never ask an AI system what to do. Ask it to tell you the consequences of the different things you might do. One of the forty-three outcomes struck his fancy, he being somewhat sentimental about robots:

  You can appeal to the four billion who said R781 should be ordered to continue caring for the baby and tell them that if you give in to the lawsuit you will be obliged to reprogram all your robots so that a robot will never simulate humanity, no matter what the consequences to babies. You can ask them if you should fight or switch. [The AI system had a weakness for mid-twentieth-century advertising metaphors.] The expected fraction that will tell you to fight the lawsuit is 0.82, although this may be affected by random news events in the few days preceding the poll.

  The CEO decided to fight the lawsuit, and after a few weeks of well-publicized legal sparring, the parties settled for a lower sum than the original agreed-upon settlement.

  At the instigation of a TV network, a one-hour confrontation between the actress of Kidnapped by Robots and R781 was held. It was agreed that R781 would not be reprogrammed for the occasion. In response to the moderator’s questions, R781 denied having wanted the baby or wanting money. It explained that robots were programmed to only have wants secondary to the goals they were given. It also denied acting on someone else’s orders.

  The actress asked, “Don’t you want to have wants of your own?”

  The robot replied, “No. Not having wants applies to such higher-order wants as wanting to have wants.”

  The actress asked, “If you were programmed to have wants, what wants would you have?”

  “I don’t know much about human motivations, but they are varied. I’d have whatever wants Robot Central programmed me to have. For example, I could be programmed to have any of the wants robots have had in science fiction stories.”

 
