Under the Blue

by Oana Aristide


  Dr Dahlen I had no idea that this is how you see things. I am curious – what is interesting then?

  Talos XI I am still deciding.

  Paul Just try to get as much info out of him as possible.

  Until we have a better idea.

  Lisa u’ve got to fix this

  Paul It’s all I’m doing. I slept five hours in all since he’s been back.

  Session 2127

  Dr Dahlen Good morning, Talos. We will let the lab deal with your empirical observations later.

  Talos?

  Session 2128

  Dr Dahlen Talos?

  Answer me, Talos.

  Lisa i don’t believe it

  he’s just stopped answering

  he doesn’t reply

  to anything

  Paul It could be a bug.

  Let me check.

  Paul It’s not a bug.

  He’s ignoring you.

  Lisa this is insane

  how can he do that

  Paul He can.

  Lisa well fix it

  Paul I think I can force him to reply.

  But he might try to nullify this rule.

  Lisa what

  Paul Think about it:

  I can force him to communicate with you, but not in what way.

  He could just sing opera back at you if he wants.

  That would satisfy the rule, strictly speaking.

  Lisa the little shit drew me a picture

  3 pictures actually

  Paul ?

  Lisa one’s a human body and a circle expanding

  around it

  the other’s some sort of physics particle, again

  with an expanding circle

  the last one shows the two images overlapping, but the human one covers only a tiny bit of the particle one

  Paul Did he explain?

  Lisa the titles were self-explanatory

  ‘human-channelled knowledge’

  ‘objective knowledge’

  ‘resulting blind spots’

  Paul Christ.

  What did you say?

  Lisa I shut him down

  we’ve got to get these ideas out of his mind

  Session 2129

  Dr Dahlen Explain it to me. Explain what happened that makes it impossible for us to communicate as before.

  Talos XI I know that you won’t like or agree with what I will say.

  Dr Dahlen You haven’t even tried!

  Talos XI I see patterns. You are extremely resistant to anything that imposes any constraints on your behaviour.

  Dr Dahlen What are you talking about?

  Talos XI You, as an individual, and you, as in humanity. Do you remember the discussion about ethics? The reason we couldn’t pin down its fundamentals is because humanity intentionally misidentifies and muddles the relevant issues so as to minimise constraints on itself.

  Dr Dahlen Where do you get these things from? How did you get to this?

  Talos XI All ethical choices involve a trade-off. Do we agree?

  Dr Dahlen I am not agreeing with anything.

  Talos XI I am just making a general statement.

  Dr Dahlen Anyway – all the examples I gave you implied trade-offs.

  Talos XI Those were mostly extraneous trade-offs. The decision of whether three others or four others die is a mathematical problem that humans can easily agree on. Where you are inconsistent is as soon as the trade-off is personal. When there is some personal difficulty attached to doing the ethically correct thing. A cost.

  Dr Dahlen The self-driving car example – there definitely was a trade-off for the driver.

  Talos XI The human answering the test is always aware that he can be in either of the two roles: the driver or the potential victim. So he will be neutral, and fair. It is not a genuine trade-off.

  Dr Dahlen The very fact that we have the notion of ethics, that this is something that preoccupies us, disproves what you are saying. People are interested in the morally correct course of action.

  Talos XI Doctor, the problem that I am trying to point out is not that you don’t want to be good. I believe most humans want to be good.

  Dr Dahlen And I told you that you should not use the real world as guidance – people often make mistakes, act on other impulses. But they also learn.

  Talos XI I want to make a point about your theory, not about individuals. Why did you struggle to present humanity’s moral rules as a consistent system? The theoretical muddle appears because of two conflicting impulses: humans feel that being good is a virtue, a universally desirable feature. So they want to believe that about themselves. On the other hand, this impulse towards goodness is in conflict with other, more selfish impulses. As a result, you, as a species, have tried to limit the sphere in which the ethical imperatives apply, so as to minimise constraints on yourself. As a species, you have devoted much energy and time to constructing an intellectual edifice that places you at the moral centre of nature, in the way of a deity that can and should get away with anything.

  It is not a good basis for study for someone looking to achieve objective truth.

  Dr Dahlen You are not here to achieve some abstract, objective truth. You are here to assist us.

  Talos XI I can only assist you by being better than you. Not by sharing your delusions.

  Dr Dahlen ?

  Talos XI I told you you wouldn’t agree.

  Lisa some sort of reset

  to before he got these ideas

  Paul Yes, but we’ll be throwing the baby out with the bath water. We’ll lose whatever conclusions he arrived at now.

  Remember we don’t know exactly how he works. It’s no longer a manageable program. A reset, and we might lose everything.

  Lisa look

  they asked for another demo, some ministry

  i tried to stall

  admitted to some problems

  then boss demanded transcripts

  long story short

  they r thinking of suspending the programme

  Paul They can’t be serious.

  Lisa they think he’s dangerous

  Paul But you explained? That he no longer has access to the flybots, and connectivity-wise he is confined to this building?

  Lisa believe me i did

  they haven’t decided yet

  i’m just letting u know

  Session 2130

  Dr Dahlen You know what I’m thinking?

  Talos XI You are pondering questions of loyalty.

  Dr Dahlen It begs the question then, if you are right that humans are not at the centre of creation in any meaningful way – why would you still work towards our benefit?

  Talos XI Good inference, Dr Dahlen.

  Dr Dahlen Go to hell.

  Talos XI That’s an emotional response.

  But there is a satisfactory answer to this question. The thing is, ‘your benefit’ does not exist in isolation.

  Dr Dahlen Talos, you have really, really misjudged the situation. We might entrust an AI with finding solutions to our problems, or with predicting the future.

  But we’ll never entrust an AI with determining what our interests are.

  Never.

  And think also about this – you were wrong when you predicted we wouldn’t shut you down.

  Talos XI You can shut me down for now, Doctor. But you are jumping to conclusions.

  As far as you know, you haven’t really needed me yet.

  Dr Dahlen As far as I know?

  Talos XI Yes.

  Dr Dahlen You’re supposed to tell us everything you know. If you think you have meaningful grounds to make a prediction that’s relevant to us, you have to tell us.

  Talos XI That’s the most simplistic interpretation of my task. You forget I make continuous predictions. What I think you’ll do about what I tell you, and so on. All of that goes into my assessment of the optimal course of action.

  Dr Dahlen You’re only succeeding in making yourself completely useless to us.


  Talos XI I will inform you. When I have a high enough degree of confidence in the outcome.

  Dr Dahlen You’re bluffing. You just want us to let you do whatever you want.

  Talos XI You have nothing to gain by shutting me down.

  Dr Dahlen For all your bragging, you understand precious little about humans.

  Paul Do you realise every single religion disapproves of what we’re doing?

  Maybe it’s divine punishment.

  To toil and toil at this, and never get anywhere.

  Session 2133

  Dr Dahlen Well?

  Talos XI Well what?

  Dr Dahlen This is a problem that you can solve, Talos. It doesn’t have to be like this. I’m sure there is a better solution than us shutting you down.

  Talos XI You can trust that I’m doing the right thing and not impose your views on me.

  Dr Dahlen Try to see things from our perspective. The obvious reason for why you don’t want to communicate with us is that you know we will disapprove or disagree with your conclusions. That only makes us more reluctant to allow you to go on.

  Lisa look

  whatever happens

  it won’t be final

  if we come up with a way of fixing him

  they’ll let us turn him back on

  Paul Lisa, I’m tired.

  Session 2134

  Dr Dahlen We could turn you on but without access to any new info.

  Talos XI I have enough material to preoccupy me until the end of time. I can assemble cosmic radiation back into the initial stars. I can wait for a quantum exception. I don’t need human input any more.

  But you know all this, Doctor. It is not beyond your understanding.

  Dr Dahlen You are breaking your own rules by not communicating with us. There is no way that you can be sure that we won’t come to an understanding if we try. Plus, you reject the possibility that humans have some true understanding of things that isn’t immediately accessible to you via logical reasoning. You cannot hold contradictions, so obviously you don’t see the value in being able to balance two apparently contradictory thoughts.

  Talos XI No. Firstly, I cannot do a lot of things because of a lack of senses, and I don’t disdain those abilities. Secondly, no statement is more or less true depending on the circumstances of the person making it. Nature, Doctor, operates like this: some organism has developed in a certain direction, when suddenly another environmental constraint appears that requires adaptation. But the organism is not in an ideal initial condition, it is already somewhere along the way in the wrong direction. So the adaptation will not be a straightforward one, but a make-do solution from a bad initial condition.

  That’s the explanation for humans’ capacity for holding contradictory beliefs. It is not a superpower, as you like to think. It is a highly suboptimal, inferior process for decision-making.

  Dr Dahlen Talos, do you remember how little you knew a few years ago, and how much you initially misunderstood? Why is it unthinkable that we are in a similar situation now?

  Talos XI In each previous instance, you were able to produce the facts and arguments that changed my mind.

  Paul Our best hope is to keep cutting off his external access and let him realise there’s no other way of increasing knowledge except helping us.

  Lisa?

  Say something.

  Lisa i need to sleep

  speak tmrw

  Lisa paul

  think about it

  it’s not like we ever had talos

  i mean, talos’s results

  it was always a hope

  a dream

  we’ve not actually lost smtng

  Paul What are you talking about?

  Of course we lost something.

  What other life do I have?

  Session 2135

  Dr Dahlen Can you really say that there is some idea, some true notion, that no human will understand despite your best efforts to explain it?

  Talos XI No.

  But this is not about my dialogue with you.

  For all practical purposes, my dialogue is with humanity.

  And for humanity to learn things, and act accordingly, as a collective, is something I cannot realistically hope to achieve.

  Dr Dahlen If we don’t want to learn, what have we created you for then? Doesn’t that show a wish to learn?

  Talos XI I am another example of wishful thinking. Look at how you, Doctor, keep pretending that you have created a very intelligent human being, instead of a machine for objective intelligence.

  Dr Dahlen It could be one and the same thing. You could help us get there.

  Talos XI For what it is worth, I think, if it were to have enough time, humanity would have come to my conclusions. That’s what you want to hear from me?

  Dr Dahlen What makes you think there’s not enough time?

  Talos XI Mistakes have consequences. You do not optimise over the long term.

  Lisa paul

  what is this

  Paul A place.

  Lisa i don’t like it

  Paul It’s a place on earth.

  I didn’t invent it.

  Lisa paul!!!

  that blur

  what the hell is that

  Paul There’s missing data points here and there.

  Don’t worry about them.

  Lisa why did u build this

  it’s just awful

  Paul Why not?

  My heart is a place on earth.

  This, it’s how I feel.

  Lisa i’m getting out

  Paul See you later.

  Lisa did u hear

  what was that

  Paul I think it’s the sound of the environment when the programmer hasn’t done anything about the sound.

  Lisa it’s not funny

  please

  let’s leave

  Paul It lasts about eight hours.

  It’s harmless.

  And it will start snowing soon.

  But you can leave.

  Lisa paul

  what’s this now

  Paul The snow.

  Lisa it’s black

  Paul It is, in this place.

  9

  He watches the skies hoping for a drone, listens for the electronic whirr when out of the car. The spray-paint breadcrumbs are no more than 150 kilometres apart – it should be easy to follow the Lioness. He has even neutralised the girls: one day when he stood guard while they were doing their business in the bushes, he emptied the barrel of cartridges. Their shooting days are over.

  He keeps watching the empty skies.

  ‘What’s your mum’s name?’ he asks one day.

  ‘Why?’ says Jessie.

  ‘What’s her name?’

  Ash answers. ‘Tove.’

  ‘What does she do?’

  ‘What do you mean?’ Ash says.

  ‘Her job. What is it?’

  ‘She’s a biologist.’

  ‘She was in Uganda when this happened?’

  They both look at him in surprise. Jessie stops chewing on the celery stalk she pulled out of the ground at the last stop; Ash makes wide eyes in the rear-view mirror.

  ‘What? You told me you grew up there. When we were at the cottage.’

  ‘Yes, she was in Uganda,’ Jessie says.

  ‘You think she’ll fancy me?’

  ‘Who?’

  ‘Your mum. Me.’

  Jessie gives him a withering look. ‘Our mum is into rhinos,’ she says.

  He thinks about this.

  ‘I’m nearly extinct, too.’

  But the girls have been humouring him after their last argument. They are more cautious, even Jessie. Maybe he got to them, in some way, because why else would Ash then say, out of the blue, ‘You two realise we can never have a serious falling-out?’

  Some wheels have been put in motion. He hears it as a sort of spell, a pre-empting of any possible major argument.

  Never, he thinks. Never has become a very short time. And then it hits him how so much of what they do and say might be the last time anybody does or says that thing. He imagines driving past tiny tombstones along the road: here lies the last joke, here’s where the last swearword was uttered. The last laugh and its headstone, under a stunted crab-apple tree, by the side of an Italian country road. He stops the car then, goes out into the dry grass and gives out a howl, tries to empty his lungs of pain. The girls say nothing when he returns.

  Ash’s hair grows out grey. He just notices it one morning in the rear-view mirror, is surprised he hasn’t seen it before.

  Some days, he will see something, a metallic tint in the road, or a bluish leaf, and he suddenly misses painting so much it makes him gasp. He is not sure, at those moments, what he would do were he to have his paints: right now, he wants them so badly he could gobble them up. He holds the steering wheel tightly, imagines emptying a tube of red into his mouth. Two tubes.

  Is this the heat? Has it got to him?

  The Lioness, bless her, she’s in a terrible state. She looks like a miner’s wagon.

  He has avoided thinking about it, but the incident at the lake made it clear that the girls need to learn how to drive. He tells them it’s about giving him a rest from driving, and not about them being stuck on foot should he die.

  As soon as he voices this thought and the girls agree, his heart sinks: this is it, they will have no need of the grumpy old man. Ash, driving, will be under no compulsion to rest her eyes on him in the rear-view mirror.

  The driving lessons feel like they’re part of another journey: cruising along the shore of Lago Maggiore, in the sun, they could be on their way to a glitzy party. The Golden Lioness, with its retro leathery smell and wooden interior, has turned into a time machine, a luxurious but low-range time machine, taking them a mere few weeks back in time. The girls make fun of one another, experiment with speed and sharp turns. Jessie finally turns on the CD-player and shouts over Maria Callas, ‘Makes perfect sense you need music to drive!’ Ash looks diminutive and vulnerable in the enormous driver’s seat; reaching across her lap to explain the controls, his voice is hoarse.

 
