Steel, Titanium and Guilt: Just Hunter Books I to III


by Robin Craig


  She looked at the burning point of her cigarette reflected in the glass, her slender form dimly visible behind it. It looked like some avatar of the city, the fire of its thought tracing the network of relationships threading the city, the loves, hates, fears and motivations that were the cause and result of all that happened within it. It reminded her of a quote she had liked, from a classic novel, about how the burning point of a cigarette was an expression of the spot of fire alive in a thinking mind. I should read that book one day, she thought idly. If I ever find the time. She looked at the city, letting her mind relax and wander where it would. What other dreams and nightmares were playing out behind those light and dark windows, what joys and sorrows were flowing through the city?

  She had some idea about that.

  The city was in an uproar over the violent escape of the robot the media had dubbed “Frankensteel,” a name as obvious as it was prejudicial. She had been surprised to be assigned to the case, since as far as she was aware no crime had been committed, let alone a serious one. Sure, there had been destruction and three men were lying injured, but in any legal sense there was no criminal involved and therefore no crime. Still, the people, stoked with fears fanned by the media and those damnable Imagists, were baying for blood, if blood were the appropriate metaphor in this case. And nothing concentrated the mind of the mayor more than public hysteria. So assigned she was.

  Then two armed Imagist vigilantes who had gone hunting after Steel were reported missing, their van found abandoned and empty in an alley. Whether they had disappeared deliberately, met with an accident or met with Steel, nobody knew. But there was no presumption of innocence here, let alone a right of self-defense, and Steel was damned regardless.

  She thought about her interrogation of Dr Beldan. A very sharp mind there, she could tell, sharp and precise as the machines he designed. She had briefly wondered if he had set the whole thing up for publicity or perhaps in a quixotic attempt to save his creation. The failure of his failsafe was odd, and there had been no prior reason to suspect that the machine had enough self-awareness to appreciate the danger or enough intelligence to circumvent it. But a mind like Beldan’s could surely have come up with a more subtle plan that did not involve himself lying in a hospital bed with concussion and enough bruises to make her wince just at the memory.

  He could not account for the robot’s actions, he had said. Its brain was not a precise machine: it would be impossible with current technology to equal even the brain of a fly by exact engineering. Their technology grew the brain organically, via processes only loosely controlled to multiply and connect fibers of metal and doped carbon nanotubes. This produced a dense network whose complexity, like that of a human brain, defied exact analysis and could only be predicted and understood by approximate simulations.

  It was not surprising, he told her, that such an approximate method had worked only approximately. Depending on what they looked at and when, the behavior of that artificial brain was disappointingly obtuse or so beyond expectations that his scientists couldn’t be sure whether they represented malfunctions or depths more profound than they could believe. Overall, the reports said, its functions were within the average range of what their models predicted, but that average was a smooth mask stretched thinly over a spiny variability. Like a hedgehog in a condom was how he had put it; though he did have concussion at the time, she allowed with a faint smile.

  But other than those tantalizing flashes of the profound, the robot had shown no real sign of what could be called consciousness: no indication that it knew it was an individual entity existing in a world outside itself, no indication that for all the data poured into its head it actually knew anything at all.

  Until that moment when he had asked for his life.

  “You said ‘he’, Dr Beldan,” she had said, somewhat surprised: yet not really surprised at all, she realized.

  He had smiled faintly in self-mockery. They had debated what form the robot should take at some length, he explained. They had thought a human form would be less threatening as well as more impressive than something more machinelike. They had also thought that a female form would appear less threatening than a male one, and it was a close call; but finally they decided that they might lose more by the impression of creating a mechanical female slave than they would gain. So the robot was given a man’s body shape.

  For all the dangers of personifying the machine, Dr Beldan said, it was hard not to when the thing had pleaded with him for its life.

  If indeed it had. For all he knew, for all anyone knew, its startling request was just an optimum tactic returned by predictive algorithms in its electronic brain, with no more conscious thought involved than in a fly avoiding a newspaper. It may well have been so, given that the startlement its maiden words caused certainly aided its escape.

  She and he had looked at each other, each tracing the implications in their mind, much like running their own predictive algorithms, neither willing to give voice to what other meanings the robot’s actions could have, or what those meanings might say about their own actions. Anyway, there was no way to answer those questions. Leave them to the pundits to argue about – and she was sure the pundits would be only too willing. Her job was not to decide on the definition of life or even the nature of this one particular machine; her job was just to find it and stop it, whether its plans were conscious plots or mindless if unfathomable algorithms.

  “Let’s leave that for the philosophers, assuming they can answer the question any better than they’ve ever answered any others,” she had sighed, pulling her mind away from the fascinating but ultimately fruitless speculations that beckoned it. “You may not know what its motives are, if it has any, but do you have any idea where it might have gone?”

  “I am sure he will still be in the city,” he had answered confidently, once more slipping unnoticed into thinking of it as a person not a thing. “He isn’t some science fiction fantasy with a fusion reactor in his chest, he has a limited range.”

  “Can you tell me exactly what I’m dealing with here? What kind of power does it have and what does it need? If I know what it needs to keep moving, maybe that will tell me where it will go.”

  “He can plug himself into a power point and run pretty much indefinitely, but of course then he can’t leave the room. He also has a small amount of internal electrical storage. And that black hair of his is made of high efficiency solar fibers that absorb 98% of the light falling on them and feed power into his internal systems. But even at that efficiency, they are just enough for emergency power. Most of his internal power comes from advanced fuel cells running on methanol. So he’s pretty much like you and me in that way: he has to take in fuel and breathe air. If you’re looking for something he needs regularly, that’s about it.”

  Not much to go on, she thought, staring at the reflection of her cigarette in the glass of her window. Methanol was a pretty common industrial chemical, available from many locations, often stored for long times without much supervision or security. But it was something.

  She had no way to know its motives, what it wanted or what it would do. But its actions when threatened gave her one point of certainty, one rock to stand on amid the sands of doubt and speculation: it wanted to survive, and so it would seek a supply of methanol.

  But she could not escape the other, larger questions. Though she had earlier dismissed them as beyond answer, her dream showed they would not be so easily dismissed. She knew it, too. It was her love of justice that kept her in her job; without it, her work was just action without purpose, not a goal that gave her life meaning. And what if the unthinkable was true, if this metal man was alive: not alive as she was in flesh, but equally alive in its mind? The thought would have been staggering, thrilling, exciting, under other circumstances. But now she was sent to hunt it down and destroy it. Those were her orders, as unbendable as those on the piece of paper Beldan had held in his hands that had set this thing in motion.

  She was not a philosopher. She had no time for the quibbling and polysyllabic blathering that characterized that breed; she dealt with hard reality and what she had to do to handle it. But if anything was a philosophical question, this was, and who was there to answer it for her but herself? If justice was her aim, and Frankensteel was alive, how could she hunt it down? Could she have held her head high, borne her own life with pride, or at all, had she served the Nazis of the last century: obeyed their orders to murder the innocent, just because they gave the orders and she evaded any need to question them? Or would every breath she took thereafter have been a reproach eating away her soul? But did justice even apply to a machine, could she fathom its purposes and meet its mind in any meaningful way, could something of steel not flesh even have a mind? And even if it did, why should she be concerned with the fate of something so different from herself?

  Her cigarette was exhausted and so, now, was her capacity for further thought. She buried herself back in her colorful sheets as a ward against the grey uncertainties prowling the edges of her mind, and slept.

  Chapter 5 – Philosopher

  Dr David Samuels looked out at his undergraduate class. It was interesting, he mused, interesting and part of the joy of teaching, to see young minds grow from unformed but questioning, to more powerful, wiser but still ever questioning. To know their limits, while knowing there were no limits.

  An idealized view, he supposed, and the more jaded of his colleagues would probably scoff. And yes, he knew, some took his classes just for idle curiosity or grade points, and the ideas they encountered never penetrated beneath the surface of their minds. And many others, in the routine or turbulence of their daily lives as they grew older, would let the fires of knowledge and passion and joy slip away into the ashes of the unreached, and wonder at an occasional sadness that something they no longer remembered had been lost. But even then, most would live those years better than they might otherwise have, and that at least could not be lost. And some made what they learned part of themselves and they, and not coincidentally the world, were happier for what he had given them. No, not what he had given them. He had merely helped show them the way to what they had found and given themselves.

  This was his third-year philosophy class, and he had just finished a lecture on the nature of consciousness. In some ways it was the simplest thing of all, something everyone experienced every day from the moment they woke to the moment they fell asleep. But in other ways, its nature had puzzled philosophers since the dawn of thought: a dichotomy that provided fertile ground for thinking and debate. He had gone through the various theories of consciousness, not only philosophic but scientific, and of course alluded to some of the arguments he himself had made in a recent article in Time magazine. But this was a class for thinkers, so he never merely lectured. Now, as was his custom, he opened up the class to questions and discussion.

  “Dr Samuels,” asked a girl in the third row, “In your essay on machine consciousness, you argued that a computer could not be conscious. Yet you did not mention Gödel’s Theorem, which would have supported your case. How come?”

  “Who can summarize Gödel’s Theorem for us, and explain why it would support my case?” he asked.

  Several hands went up and he nodded to a boy near the back, a boy of solid mind though perhaps one more pedantic than inspired. “Gödel’s Theorem proves that a formal mathematical system cannot be both complete and consistent. As computers are basically mathematical calculators, many people believe this means they cannot think or be conscious like we are.”

  “A good summary,” replied Samuels. “Well, the main reason I did not use Gödel’s theorem is that while it would support my thesis, the purpose of a philosophical argument is not to win, but to discover the truth. And I do not believe that arguments from Gödel’s Theorem are valid in this context.”

  “But why not?” insisted the girl. “I mean, what’s wrong with Gödel’s Theorem? Hasn’t it been proved?”

  “Yes it has,” replied Samuels. “But when considering whether something has been proved, one must consider exactly what has been proved and how. Who knows what the basis of the proof is?”

  After a brief pause, the boy at the back replied. “Gödel showed that any mathematical system rich enough to be complete, by definition must include statements about itself, some of which must be paradoxical. So then it couldn’t be consistent. And conversely, to be consistent it must omit those paradoxes and thus be incomplete.”

  Samuels smiled. “Yes, and there’s the key. When you think about what that means, all it is saying is that anything that can talk about itself can utter self-referential paradoxes, of the general form ‘this statement is false.’ Such a statement is paradoxical because if it is true, it is false; but if it is false, it is true. So if you think of it that way, can you see an obvious reason why I should discount it in this debate?”

  Some students looked puzzled, some thoughtful, some nodded slowly. The girl’s face lit up. “Why, we do it too! The same is true of us!” she said.

  “Exactly,” replied Samuels. “The same applies to us, and our ability to say ‘I am lying’ does not alter the plain fact that we are conscious. If Gödel’s Theorem were expressed in less grandiose language – more honest language, perhaps? – maybe all it would say is ‘a formal system complex enough to be complete can generate self-referential paradoxes.’ To which I say: so what? It doesn’t even prove that a formal system can’t be complete – in every way that matters, in its description of the external world – let alone that a computer can’t think.”

  He let the class chew over that for a moment. “OK, I think that’s enough for tonight. Your assignment for next week is an essay on the relationship between thinking, consciousness and free will, and whether machines can qualify for any of them. Feel free to attempt to disprove my own essay on the topic!”

  He sat on the table at the front as the class filed out. Some nodded at him or called out, “Good night, Dr Samuels,” including the girl in the third row. He smiled in response. When they had gone, he turned out the lights and followed them into an unsettled night.

  Chapter 6 – Student

  Samuels drove home. His windscreen wipers swept back and forth across the city and suburbs sliding past his car, clearing off the light rain. He thought of going out somewhere but the weather discouraged him. Good weather for a good book at home, listening to some classical music, he thought.

  His fiancée was away at a conference, so he was alone for the evening. He picked up some pizza and white wine at the local mall and headed home.

  He put his pizza in the kitchen and opened the wine. He headed to the lounge room to turn on the light there but stopped dead. There was a man, sitting in the darkness, as if waiting. Then slanting light from passing headlights through the blinds cast silvery reflections off the man’s skin, and he knew this was no man.

  He thought of his gun in the bedroom, the subject of a long-running if friendly debate with Jenny ranging over the topics of wisdom, guns and bedrooms. Here at last was a use for it, but there it was, snuggled in their bedroom: the only way to it lay past a renegade robot.

  He thought briefly of running, but from what he’d read this robot was faster and stronger than he – and he had locked his front door upon entering. So if he could not escape anyway, it was better to face the danger without fear. Well, not without fear – too late for that – but at least facing what he feared like a man, not running like a panicked rat. If he was to die, at least he would do himself and his species that much credit. And perhaps he would not die.

  The robot still had not moved, still had not spoken. It just sat there, studying him. What was its game, he wondered? The robot’s presence here could be a coincidence, but he didn’t believe it. Not so soon after his article disputing artificial consciousness, in precisely the context of this machine. But why would it care? Was it here to kill him for daring to write of it? That made no sense if it wasn’t conscious – and even less sense if it was, in which case it might have a better chance of survival if its true nature were cast into doubt.

  He studied the robot. He wondered if it had simply broken down, or run out of power. But its eyes followed him when he moved, and he saw it had plugged itself into a power outlet while it waited for him. He didn’t fool himself into thinking that would allow him to outrun it. He wondered why it didn’t speak: the rumors that it had gained the power of speech had already flitted through the net like a flock of startled birds. It must have some purpose in being here, but it was acting as if it waited to discover his purpose. Perhaps it was merely waiting to take his measure. Well, enough of this game, he thought, and stepped into the room.

  “Good evening. To what do I owe this visit?” he said, as if merely greeting an old acquaintance who had unexpectedly dropped in.

  To Samuels’ surprise, the robot smiled. A remarkably natural smile, he thought, then reminded himself that this was a simulacrum, not a man, and the smile might well mean nothing. “Good evening, Dr Samuels,” the robot responded. “I hope you will forgive my uninvited intrusion into your home. You will understand that a menace to society such as myself must exercise uncommon caution.”

  Gained the power of speech, indeed! thought Samuels, impressed. “I assume you are here because of my recent article on consciousness,” he said. “Though I am at a loss to know exactly why. In any event, a robot that reads magazines is surprising enough. I imagine the vendor you acquired it from was even more surprised.”

 
