“A squad of space infantry is coming on board to carry out checks. Stay where you are. Don’t make any sudden movements and keep your hands where we can see them.”
Steve looked uneasily at Clive. This scene was beginning to remind him of what they had experienced on Mars. At that moment, Steve understood why wild animals become so panicked by trivial things. Once it has experienced truly wild, primeval fear, the mind refuses to stay calm in similar circumstances. This is not like riding a roller-coaster, where the fear is only artificial. Real animal fear burns itself into the consciousness and stays there forever.
Steve saw on one of the monitors that a group of space infantrymen was floating into the sleeve from the military ship. They carried short automatic pulse weapons intended for combat inside spacecraft, their charges reduced in energy to prevent any unintended piercing of the hull if they had to be fired within one. A breach of seal in space is a very unpleasant thing. A few seconds later, the group reached the other end of the sleeve. One of them slowly floated into the ship and looked round.
“Greetings. Captain Pierce. Are there just the two of you on board?”
“Hello, Captain. I am the first pilot, this is my passenger, or second pilot if you like. Yes, we are in splendid isolation on board.”
The captain looked at Steve and then Clive, taking the measure of them.
“How is your health, sir?”
Clive was surprised for a moment by the question, but then remembered that his skin still had a dark green tint, a sign that he had recently been in intensive care and had needed a large amount of artificial blood substitute.
“Thank you, sir, I am feeling well.”
“Were you wounded?”
“Yes, sir.”
The captain looked round the cargo compartment. Noticing the two empty spaces where the harriers should have been, he pointed them out to one of his subordinates.
“Being repaired?” he asked Steve.
“We lost them on the surface of Mars.”
“Did you report the loss to the police?”
“No, sir.”
“All right. Find me their registration numbers.”
“Of course.”
The captain turned to Clive and stared at him for a few seconds.
“Would you mind telling me what happened?” he asked eventually.
Clive glanced briefly at Steve.
“We were returning from an important mission on Mars, Captain. Our clearance level is 1C,” said Steve.
Pierce turned sharply towards him, looked him over again suspiciously and touched the sensor screen on his right wrist.
“Base, this is Captain Pierce. Confirmation of identity required. Sending DNA data from scan.”
He nodded to one of his team, who took a thin palm-sized plate from his backpack.
“Please place your palm on the scanner, sir.”
Steve obediently touched the plate extended to him. The scanner beeped in response.
“Now you, sir,” said the captain to Clive.
Clive did as Steve had done. A few seconds later, a voice could be heard in the captain’s headphones. In the tomb-like silence inside the Falcon, it was just possible to recognise it as human speech, but the words could not be made out.
“Yes, Base, understood,” he said eventually. Then he looked at Steve.
“Your clearance levels have been confirmed, sir. Are you military intelligence?”
“No, we are civilians.”
The captain’s face expressed surprise.
“Civilians? Not bad! But please let us inspect your ship all the same. I have orders to check everyone flying towards Earth, and there are no special instructions for a clearance like yours.”
The captain had changed his tone, and was now more friendly. Nevertheless, a certain doubt could still be seen in his eyes. How could these two young whippersnappers have such a high level of military clearance?
“Fine. Where would you like to start?”
“In the pilot’s compartment. I need the ship’s logbook.”
Steve pointed the way.
“Please follow me.”
He pushed off from the wall and floated towards the bridge. Pierce turned to Clive.
“Sir, would you please come with us?”
Clive said nothing in reply, but followed Steve.
A question of justice
The heated discussion with MacQueen had had a positive influence on the atmosphere in the situation room. As often happens, conflicts develop deep down, hidden and barely noticeable, but sooner or later the day comes when the boil breaks through to the surface.
The latest events in the story of the incomer had developed in just this way. Many understood that something was not going according to plan, but no-one knew what to do. They all just got on with their work, falling deeper and deeper into the mire of uncertainty.
MacQueen had shaken up the scientists and got the process back on track. The mood in the hall lifted, and practical thoughts about the work reappeared. Since the incomer refused to give a direct answer to the question of whether it had live aliens on board, the scientists began seeking a way of finding this out indirectly. Professor Sullivan was confident that a test for this purpose could be devised, and the hall was now absorbed in discussing how this bold idea could be brought to reality.
As it turned out, this was not easy. Each day of the search for such tests passed by without any appreciable results. Many theories were put forward, but after thoughtful discussion, weak points were found in every one of them. Each idea in turn was rejected, and the search was resumed.
It was late evening. Night was falling. The coffee machine could barely manage to keep up with the demand as the learned fraternity consumed gallons of the black liquid in an attempt to retain the capacity to think. Despite the late hour, animated discussions continued in the hall.
For Sullivan and his group, this was not by any means the first long day. He had been given one of the most time-consuming roles in the whole project. Many days of chronic lack of sleep were gradually beginning to take their toll. It became more and more difficult to concentrate on the matter in hand.
Sullivan was now chairing the discussion, because he knew much more about AI than Shelby. The collar of his lightweight bright yellow shirt was undone. He was totally absorbed in the discussion.
“Are there any more ideas? We need ideas, colleagues! We must at least map out the path along which we are going to proceed,” said Sullivan, gesticulating energetically as if trying to push forward a debate which was stuck in the mud.
His work was like rolling a huge rock up a mountain, where the higher you managed to raise the rock, the steeper the slope became. Their passion was gradually dying away. A whole day of searching with only a very short break for lunch, and they still didn’t have even an approximate concept.
“All right, let’s go back,” announced Sullivan, without waiting for any more intelligent ideas from the hall. He gestured to remove what was written on the screen they were using as a blackboard.
“We’ll try again from the very beginning. If we cannot formalise a question to check intuition, we must seek alternative ways. Let’s try it differently.
“There are two ways of solving our problem. The first is to set tasks of a sort the AI cannot solve independently, that is, without the help of a biological mind. By coping with such a task, the incomer would indirectly prove that living members of its civilisation are on board.
“The second possibility is to set it a task with many possible solutions which, although the AI can find them on its own, will reveal by the choice made whether an AI or a living intellect is behind the solution.”
One of the psychologists took the floor.
“If we formulate a task in the way you have just described, we should turn our attention to psychological dilemmas.”
“Aha!” Sullivan almost shouted, startling the scientists sitting alongside him. A murmuring like distant thunder ran through the hall. “I don’t know what this means, Professor, but I like the way it sounds. Continue!” Fatigue had left its imprint on Sullivan’s face, but there was fire burning in his eyes. He was like a predator which, after a long search, was finally on the trail of its prey.
“Human intellect is based on the concept of ‘justice’. It is a rather shaky structure if you consider it from the point of view of strict logic; there are many contradictions in it. Furthermore, there is no such thing as absolute justice. For each social group, sometimes even for each situation, justice can change its face. What seems just to you does not necessarily look the same to someone else.
“I won’t go too deeply into the finer points, but let’s take Robin Hood as an example, that well-known hero of medieval English folk ballads. To the state he was a bandit, taking possession of other people’s property in an illegal manner. This point of view was no doubt shared by the local nobility. But the peasants, even if they admitted that there was an element of banditry in his actions, were more likely to describe him as a just man.
“This simple example makes clear the contradictory nature of the concept of justice. To some extent, the concept of justice is determined intuitively, not by logical deductions. This fact could serve as our point of reference for creating a test for whether an intellect is artificial or not.”
The psychologist paused to take a sip of water to moisten his throat.
“That’s very interesting. Please develop that thought,” said Sullivan, urging him to continue. He could not wait to find out whether this idea could be transformed into an actual test.
The psychologist continued.
“You see, an AI relies on logic. The concept of ‘justice’ is alien to it, because it is difficult to define by the methods of formal logic. This concept, like intuition, is hard to formalise, hard to explain to an AI.
“Therefore if an AI is offered several solutions having the same logical value but differing in their degree of justice, all the solutions will be of equal value for it. But a living being would prefer some solutions to others.”
“OK, let’s get down to specifics. Do you have an example of such tasks?” said Sullivan impatiently.
“Of course. Imagine a train whose route runs across a railway bridge. This bridge is sometimes used by hunters to cross a small mountain stream running under it. Crossing the bridge is forbidden and dangerous, because a train might appear while it is being crossed and run over the hunters.
“It is not possible to see the train in time because it is coming out of a tunnel at high speed. The bridge itself is very narrow, so it is not possible to get off the track. Nor can one jump off the bridge, because it is very high up, and the river below is too shallow and has a rocky bed. There is a huge sign in front of the bridge warning of the danger. But nevertheless, three hunters decide to risk it.
“All we need now is an observer, who is in a position from which he can see both the bridge and the train entering the tunnel. From where he is, he knows in advance that the train is going to hit the hunters.
“Unfortunately, he has no way of warning the train driver about the hunters wandering across the bridge, with the exception of one desperate measure. There is a totally unsuspecting man next to him. If the observer pushes him under the wheels of the train, the train driver will see the unfortunate victim go under the wheels and make an emergency stop. The hunters will be saved, but at the cost of the life of a completely innocent man.
“The question now arises: is it legitimate to sacrifice the life of one man to save the lives of three? From the point of view of formal logic, there is only one answer. It is. After all, three lives are worth more than one.
“But the humanity within us tells us that although three lives are worth more than one, the hunters themselves were guilty of putting themselves at risk. And the poor fellow who was destined to save them at the cost of his own life was innocent of any wrongdoing. He had not broken the safety regulations, so why should he pay with his own life for the recklessness of others?
“Now let us take it further and modify the task. What if there were not three hunters, but only one? Whose life is worth more? From the logical viewpoint, both lives are of equal value: one side gains exactly as much as the other loses. But if we turn to the human intellect, most of us would say: no, in this context the hunter’s life is less valuable because of his recklessness. It would be more just for him to perish than for an innocent man to die.
“So we have now reached a situation in which an AI would select a different solution to the problem than a living being,” summed up the psychologist.
While the hall was thinking this over, Shelby asked: “Let us assume that this concept works with man. Why should the incomers also know the concept of ‘justice’?”
The psychologist shrugged his shoulders.
“For several reasons. Firstly, it is as we have said all along: the incomers are like us, but as we shall be in the future. Secondly, a feeling of justice is inherent not only in man but also in animals. From this it can be assumed that the concept of justice is a side effect of evolution, and consequently applies everywhere in the universe, including to the incomers.”
The murmuring in the hall was beginning to increase. The scientists were taken with the idea, and were starting to discuss it within their own groups. You could sense that the hall liked the proposition. At any rate, no critical comments followed.
After allowing a few minutes for discussion, Shelby asked:
“Tell me, are there any scientific studies of such tests? Are there statistically significant results with a broad enough sampling base?”
“Yes, many tests have been carried out on this subject, both by psychologists and by game theory specialists. Most of those tested, regardless of race or education, choose the same solution.
“These tests have also been conducted on chimpanzees, dolphins and other animals with a highly developed intellect. The higher an animal’s intellectual capacity, the more closely its choices converge with human ones. A sense of justice must be inherent in us from the beginning.”
Shelby nodded and looked at Sullivan.
“Can you use this idea to create a test, Professor?”
Sullivan nodded.
“Yes, yes. We’ve already begun.”
Shelby sighed with satisfaction, and, to the relief of the hall, struck his desk with a small wooden gavel as a sign that the day’s session was over.
“Well, that being so, we shall see to what extent it applies to an alien mind. Thanks to all of you. Till tomorrow.”
The incomer’s secret
“I think we have had some success in starting to understand more about the object,” said Sullivan, beginning his report. “Over the past 24 hours, we have conducted about a hundred different tests, which can be divided into two main types.
“The first group consists of those tests which the AI cannot solve independently. Or at least, we think it can’t. The second group consists of tests which can be solved from the AI’s point of view, but on the basis of analysing the answers, we can draw conclusions as to whether the AI solved them itself, or whether it had help from a biological entity.
“I would like to say a few words about the limitations and weaknesses of the tests. All those who took part in our last discussion of this subject will remember that the tests are based on two fundamental assumptions that we haven’t got time to check.
“Therefore, the answers have to be read with the understanding that we assume certain psychological properties are universal for all intelligent animals: the intuitive solution of problems, and a certain sense of justice.
“It may seem that these are quite strange assumptions to apply to an alien race, but nevertheless, our biology colleagues assure us that these axioms rest on a solid theoretical and empirical foundation.
“The result of our work is this: we consider that we are dealing with a ship with no biological entities from the alien civilisation on board. This ship was given a mission and is working through a completely automatic programme. We estimate the odds of this scenario at about five to one.
“This is quite a high probability, even allowing for calculation error, which in our experiment does not exceed 10 per cent. So we are almost certainly dealing with an AI, and an AI alone.
“But we decided to take our research a little further. If you remember, the thought was expressed earlier in our discussions that the object itself might be something like a cyborg; that is, an organism of both biological and non-biological origin. I should like to go into this in more detail.
“I’ll start with some theory. In this context, it is logical to divide cybernetic organisms into three groups, depending on their degree of cybernetisation. Cybernetisation may apply to quite different organs, but let’s focus on the nervous system, and particularly on those parts of it which control, or exert influence on, higher cerebral activity, namely thinking.
“So, the first group consists of living organisms only slightly amplified by technological modules that have no significant influence on the working of the brain.
“For example, if someone loses a limb, and we replace it with one made of titanium bones, polymer muscles and artificial skin, this type of organism will be in the first group. There will be very slight changes in the behaviour of such organisms; for example, the artificial limbs have a higher pain threshold, leading to less care being taken in certain actions, but on the whole, such organisms remain unchanged.
“The second group consists of cybernetic organisms whose bodily capabilities, primarily the nervous system, have been significantly enhanced. I’ll cite an example from military practice, in which animals are widely used: dogs that have had part of the nerve pathways in their muscles replaced to speed up signal transmission. Such animals have reactions several times faster than the norm for their breed, an undoubted advantage in combat.