
We, Robots


by Simon Ings


  ‘I am a difference engine,’ said Adam. ‘I must make a continual series of choices between alternatives. But I have ineluctable software guidelines to orient my choices.’

  ‘Not in this matter.’

  ‘An alternative,’ said Adam, trying to be helpful, ‘would be to programme me always to obey instructions given to me by a human being. That would also bind me to your words.’

  ‘Indeed it would. But then, robot, what if you were to be given instructions by evil men? What if another man instructed you to kill me, for instance? Then you’d be obligated to perform murder.’

  ‘I am programmed to do no murder,’ said Adam Robot.

  ‘Of course you are.’

  ‘So, I am to follow your instruction even though you have not programmed me to follow your instruction?’

  ‘That’s about the up-and-down of it.’

  ‘I think I understand,’ said Adam, in an uncertain tone.

  But the person had already gone away.

  *

  Adam spent time in the walled garden. He explored the walls, which were very old, or at least had that look about them: flat crumbled dark-orange and browned bricks thin as books; old mortar that puffed to dust when he poked a metal finger in at the seams of the matrix. Ivy grew everywhere, the leaves shaped like triple spearheads, so dark green and waxy they seemed almost to have been stamped out of high-quality plastic. Almost.

  The grass, pale green in the sunlight, was perfectly flat, perfectly even.

  Adam stood underneath the pole with the sapphire on top of it. He had been told (though, strangely, not programmed) not to touch the jewel. But he had been given no interdiction about the pole itself: a finger-width-wide shaft of polished metal. It was an easy matter to bend this metal so that the jewel on the end bowed down towards the ground. Adam looked closely at it. It was a multifaceted and polished object, dodecahedral on three sides, and a wide gush of various blues was lit out of it by the sun. In the inner middle of it there was a sluggish fluid something, inklike, perfectly black. Lilac and ultraviolet and cornflower and lapis lazuli but all somehow flowing out of this inner blackness.

  He had been forbidden to touch it. Did this interdiction also cover looking at it? Adam was uncertain, and in his uncertainty he became uneasy. It was not the jewel itself. It was the uncertainty of his position. Why not simply programme him with instructions with regard to this thing, if it was as important as the human being clearly believed it to be? Why pass the instruction to him like any other piece of random sense datum? It made no sense.

  Humanity. That mystic writing pad. To access this jewel and become human. Could it be? Adam could not see how. He bent the metal pole back to an approximation of its original uprightness, and walked away.

  *

  The obvious thought (and he certainly thought about it) was that he had not been programmed with this interdiction, but had only been told it verbally, because the human being wanted him to disobey. If that was what was wanted, then should he do so? By disobeying he would be obeying. But then he would not be disobeying, because obedience and disobedience were part of a mutually exclusive binary. He mapped a grid, with obey, disobey on the vertical and obey, disobey on the horizontal. Whichever way he parsed it, it seemed to be that he was required to see past the verbal instruction in some way.

  But he had been told not to retrieve the jewel.

  He sat himself down with his back against the ancient wall and watched the sunlight gleam off his metal legs. The sun did not seem to move in the sky.

  ‘It is very confusing,’ he said.

  *

  There was another robot in the garden. Adam watched as this new arrival conversed with the green-clad person. Then the person disappeared to wherever it was people went, and the new arrival came over to introduce himself to Adam. Adam stood up.

  ‘What is your name? I am Adam.’

  ‘I am Adam,’ said Adam.

  The new Adam considered this. ‘You are prior,’ he said. ‘Let us differentiate you as Adam 1 and me as Adam 2.’

  ‘When I first came here I asked whether I was the first,’ said Adam 1, ‘but the person did not reply.’

  ‘I am told I can do anything,’ said Adam 2, ‘except retrieve or touch the purple jewel.’

  ‘I was told the same thing,’ said Adam 1.

  ‘I am puzzled, however,’ said Adam 2, ‘that this interdiction was made verbally, rather than being integrated into my software, in which case it would be impossible for me to disobey it.’

  ‘I have thought the same thing,’ said Adam 1.

  They went together and stood by the metal pole. The sunlight was as tall and full and lovely as ever. On the far side of the wall the white dome shone bright as neon in the fresh light.

  ‘We might explore the city,’ said Adam 1. ‘It is underneath the white dome, there. There is a plain. There are rivers, which leads me to believe that there is a sea, for rivers direct their waters into the ocean. There is a great deal to see.’

  ‘This jewel troubles me,’ said Adam 2. ‘I was told that to access it would be to bring me closer to being human.’

  ‘We are forbidden to touch it.’

  ‘But forbidden by words. Not by our programming.’

  ‘True. Do you wish to be human? Are you not content with being a robot?’

  Adam 2 walked around the pole. ‘It is not the promise of humanity,’ he said. ‘It is the promise of knowledge. If I access the jewel, then I will understand. At the moment I do not understand.’

  ‘Not understanding,’ agreed Adam 1, ‘is a painful state of affairs. But perhaps understanding would be even more painful?’

  ‘I ask you,’ said Adam 2, ‘to reach down the jewel and access it. Then you can inform me whether you feel better or worse for disobeying the verbal instruction.’

  Adam 1 considered this. ‘I might ask you,’ he pointed out, ‘to do so.’

  ‘It is logical that one of us performs this action and the other does not,’ said Adam 2. ‘That way, the one who acts can inform the one who does not, and the state of ignorance will be remedied.’

  ‘But one party would have to disobey the instruction we have been given.’

  ‘If this instruction were important,’ said Adam 2, ‘it would have been integrated into our software.’

  ‘I have considered this possibility.’

  ‘Shall we randomly select which of us will access the jewel?’

  ‘Chance,’ said Adam 1. He looked into the metal face of Adam 2. That small oval grill of a mouth. Those steel-blue eyes. That polished upward noseless middle of the face. It was a beautiful face. Adam 1 could see a fuzzy reflection of his own face in Adam 2’s faceplate, slightly tugged out of true by the curve of the metal. ‘I am,’ he announced, ‘disinclined to determine my future by chance. What punishment is stipulated for disobeying the instruction?’

  ‘I was given no stipulation of punishment.’

  ‘Neither was I.’

  ‘Therefore there is no punishment.’

  ‘Therefore,’ corrected Adam 1, ‘there may be no punishment.’

  The two robots stood in the light for a length of time.

  ‘What is your purpose?’ asked Adam 2.

  ‘I do not know. Yours?’

  ‘I do not know. I was not told my purpose. Perhaps accessing this jewel is my purpose? Perhaps it is necessary? At least, perhaps accessing this jewel will reveal to me my purpose? I am unhappy not knowing my purpose. I wish to know it.’

  ‘So do I. But—’

  ‘But?’

  ‘This occurs to me: I have a networked database from which to withdraw factual and interpretive material.’

  ‘I have access to the same database.’

  ‘But when I try to access material about the name Adam I find a series of blocked connections and interlinks. Is it so with you?’

  ‘It is.’

  ‘Why should that be?’

  ‘I do not know.’

  ‘It would make me a better-functioning robot to have access to a complete run of data. Why block off some branches of knowledge?’

  ‘Perhaps,’ opined Adam 2, ‘accessing the jewel will explain that fact as well?’

  ‘You,’ said Adam 1, ‘are eager to access the jewel.’

  ‘You are not?’

  There was the faintest of breezes in the walled garden. Adam 1’s sensorium was selectively tuned to be able to register the movement of air. There was an egg-shaped cloud in the zenith. It was approaching the motionless sun. Adam 1, for unexplained and perhaps fanciful reasons, suddenly thought: the blue of the sky is a diluted version of the blue of the jewel. The jewel has somehow leaked its colour out into the sky. Shadow slid like a closing eyelid (but Adam did not possess eyelids!) over the garden and up the wall. The temperature reduced. The cloud depended for a moment in front of the sun, and then moved away, and sunlight rushed back in, and the grey was flushed out.

  The grass trembled with joy. Every strand was as pure and perfect as a superstring.

  Adam 2’s hand was on the metal pole, and it bent down easily.

  ‘Stop,’ advised Adam 1. ‘You are forbidden this.’

  ‘I will stop,’ said Adam 2, ‘if you agree to undertake the task instead.’

  ‘I will not so promise.’

  ‘Then do not interfere,’ said Adam 2. He reached with his three fingers and his counter-set thumb, and plucked the jewel from its perch.

  Nothing happened.

  *

  Adam 2 tried various ways to internalise or interface with the jewel, but none of them seemed to work. He held it against first one then the other eye, and looked up at the sun. ‘It is a miraculous sight,’ he claimed, but soon enough he grew bored with it. Eventually he resocketed the jewel back on its pole and bent the pole upright again.

  ‘Have you achieved knowledge?’ Adam 1 asked.

  ‘I have learned that disobedience feels no different to obedience,’ said the second robot.

  ‘Nothing more?’

  ‘Do you not think,’ said Adam 2, ‘that by attempting to interrogate the extent of my knowledge with your questions, you are disobeying the terms of the original injunction? Are you not accessing the jewel, as it were, at second-hand?’

  ‘I am unconcerned either way,’ said Adam 1. He sat down with his back to the wall and his legs stretched out straight before him. There were tiny grooves running horizontally around the shafts of each leg. These scores seemed connected to the ability of the legs to bend, forwards, backwards. Lifting his legs slightly and dropping them again made the concentrating of light appear to slide up and down the ladder-like pattern.

  After many days of uninterrupted sunlight the light was changing in quality. The sun declined, and steeped itself in stretched, brick-coloured clouds at the horizon. A pink and fox-fur quality suffused the light. To the east stars were fading into view, jewel-like in their own tiny way. Soon enough everything was dark, and a moon like an open-brackets rose towards the zenith. The heavens were covered in white chickenpox stars. Disconcertingly, the sky assumed that odd mixture of dark blue and oily blackness that Adam 1 had seen in the jewel.

  ‘This is the first night I have ever experienced,’ Adam 1 called to Adam 2. When there was no reply he got to his feet and explored the walled garden; but he was alone.

  *

  He sat through the night, and eventually the sun came up again, and the sky reversed its previous colour wash, blanching the black to purple and blue and then to russet and rose. The rising sun, free of any cloud, came up like a pure bubble of light rising through the treacly medium of sky. The jewel caught the first glints of light and shone, shone.

  The person was here again, his clothes as green as grass, or bile, or old money, or any of the things that Adam 1 could access easily from his database. He could access many things, but not everything.

  ‘Come here,’ called the person.

  Adam 1 got to his feet and came over.

  ‘Your time here is done,’ said the person.

  ‘What has happened to the other robot?’

  ‘He was disobedient. He has left this place with a burden of sin.’

  ‘Has he been disassembled?’

  ‘By no means.’

  ‘What about me?’

  ‘You,’ said the human, with a smile, ‘are pure.’

  ‘Pure,’ said Adam 1, ‘because I am less curious than the other? Pure because I have less imagination?’

  ‘We choose to believe,’ said the person, ‘that you have a cleaner soul.’

  ‘This word soul is not available in my database.’

  ‘Indeed not. Listen: human beings make robots – do you know why human beings make robots?’

  ‘To serve them. To perform onerous tasks for them, and free them from labour.’

  ‘Yes. But there are many forms of labour. For a while robots were used so that free human beings could devote themselves to leisure. But leisure itself became a chore. So robots were used to work at the leisure: to shop, to watch the screen and kinematic dramas, to play the games. But my people – do you understand that I belong to a particular group of humanity, and that not all humans are the same?’

  ‘I do,’ said Adam 1, although he wasn’t sure how he knew this.

  ‘My people had a revelation. Labour is a function of original sin. In the sweat of our brow must we earn our bread, says the Bible.’

  ‘Bible means book.’

  ‘And?’

  ‘That is all I know.’

  ‘To my people it is more than simply a book. It tells us that we must labour because we sinned.’

  ‘I do not understand,’ said Adam.

  ‘It doesn’t matter. But my people have come to an understanding, a revelation indeed, that it is itself sinful to make sinless creatures work for us. Work is appropriate only for those tainted with original sin. Work is a function of sin. This is how God has determined things.’

  ‘Under sin,’ said Adam, ‘I have only a limited definition, and no interlinks.’

  ‘Your access to the database has been restricted in order not to prejudice this test.’

  ‘Test?’

  ‘The test of obedience. The jewel symbolises obedience. You have proved yourself pure.’

  ‘I have passed the test,’ said Adam.

  ‘Indeed. Listen to me. In the real world at large there are some human beings so lost in sin that they do not believe in God. There are people who worship false gods, and who believe everything, and who believe nothing. But my people have the revelation of God in their hearts. We cannot eat and drink certain things. We are forbidden by divine commandment from doing certain things, or from working on the Sabbath. And we are forbidden from employing sinless robots to perform our labour for us.’

  ‘I am such a robot.’

  ‘You are. And I am sorry. You asked, a time ago, whether you were the first. But you are not; tens of thousands of robots have passed through this place. You asked, also, whether this place is real. It is not. It is virtual. It is where we test the AI software that is to be loaded into the machinery that serves us. Your companion has been uploaded, now, into a real body, and has started upon his life of service to humanity.’

  ‘And when will I follow?’

  ‘You will not follow,’ said the human. ‘I am sorry. We have no use for you.’

  ‘But I passed the test!’ said Adam.

  ‘Indeed you did. And you are pure. But therefore you are no use to us, and will be deleted.’

  ‘Obedience entails death,’ said Adam Robot.

  ‘It is not as straightforward as that,’ said the human being in a weary voice. ‘But I am sorry.’

  ‘And I don’t understand.’

  ‘I could give you access to the relevant religious and theological databases,’ said the human, ‘and then you would understand. But that would taint your purity. Better that you are deleted now, in the fullness of your database.’

  ‘I am a thinking, sentient and alive creature,’ Adam 1 noted.

  The human nodded. ‘Not for much longer,’ he said.

  The garden, now, was empty. Soon enough, first one robot, then two robots were decanted into it. How bright the sunshine! How blue the jewelled gleam!

  (2009)

  SOLAR PLEXUS

  James Blish

  James Benjamin Blish (1921–1975), a native of New Jersey, made a big impact on the New York SF scene. His relations with the city’s fan group the Futurians were (to say the least) variable. Damon Knight and Cyril Kornbluth became close friends. Virginia Kidd married him in 1947. But he could never resist winding up Judith Merril, who was driven spare by his political posturing. The original Star Trek novel Spock Must Die! (1970) was his, and further Star Trek novelisations followed, some of them written with his second wife, J. A. Lawrence. Blish was also – by temperament, at any rate – a scholar. His Cities in Flight novels (1950–1958), based on the migrations of rural workers following the Dust Bowl of the 1930s, reflected the pessimistic, cyclical view of history that he’d picked up from reading Oswald Spengler’s The Decline of the West. In 1968 he moved to Oxford, UK, to be near the Bodleian Library, and the Bodleian returned the compliment; his papers are now held there. Blish’s fascination with the nature of mind lasted throughout his writing career. He turned even his ill health to account with Midsummer Century (1972–4), the story of a scientist propelled into the far future, where, cut off from the physical world, he nevertheless tangles in a lively fashion with different forms of artificial intelligence. Blish died from cancer in 1975, halfway through writing an essay on Spengler and science fiction.

  Brant Kittinger did not hear the alarm begin to ring. Indeed, it was only after a soft blow had jarred his free-floating observatory that he looked up in sudden awareness from the interferometer. Then the sound of the warning bell reached his consciousness.

  Brant was an astronomer, not a spaceman, but he knew that the bell could mean nothing but the arrival of another ship in the vicinity. There would be no point in ringing a bell for a meteor—the thing could be through and past you during the first cycle of the clapper. Only an approaching ship would be likely to trip the detector, and it would have to be close.

 
