Pandora's Brain

by Calum Chace


  Vic turned towards Norman, who smiled slightly as he waited for the inevitable crescendo of protests and renewed questioning. When it seemed to have reached its peak, he unleashed his military bark again.

  ‘Dr Damiano said no questions! That is all. Thank you!’

  Norman took Vic’s elbow and led him away from the gates. The guards closed back in, leaving the press pack disappointed and edgy. They continued to hurl questions over the heads of the guards, but more in desperation than in hope.

  ‘How does it feel to be Dr Frankenstein, Damiano?’

  ‘Say hello to Matt for us, boys!’

  ‘Have you watched Colossus recently, Dr Damiano?’

  Norman turned to Vic as they walked through the front doors. ‘What’s Colossus?’

  ‘He was probably talking about The Forbin Project, an old movie about the invention of an AI that takes over the world. It’s an OK movie – not great but not terrible. There’s talk of it being re-made. There were lots of similar movies made during previous phases of hype about AI.’

  ‘Hmmm. Well, this time the hype is justified!’ Norman remarked, as much to himself as to Vic.

  All eyes turned to Vic and Norman as they walked back into the control room. Vic brought everyone up to speed.

  ‘We made a brief statement, but it won’t put them off. We’re about to become the centre of a major media storm. Gus, we’re going to have to move fast on that link to Palo Alto if we’re not to lose the opportunity altogether.’

  ‘It’s almost ready, sir. I should be able to initiate the handshake in a minute or so.’

  ‘Great!’ Vic said, and shot Norman a questioning glance. Norman nodded solemnly.

  Vic continued: ‘Go ahead and make the link as soon as you can. No further authorisation required, but let me know when it’s complete. And everybody . . .’ he addressed the room collectively, ‘knowledge about this link is classified. It stays in this room. No leaks. Understood?’

  Vic’s mouth formed a thin-lipped smile of gratitude as every head in the room nodded.

  He started to address Matt via the main monitor, but Matt beat him to the punch.

  ‘I guess this means there’s no longer any reason why my friends couldn’t join us? Alice, Carl and Jemma, I mean?’

  Vic hesitated for only a couple of seconds to consider this before nodding his agreement and asking Rodrigo to make arrangements for the three students to be brought to the compound.

  He had only just finished giving that instruction when a tall, thin man entered the room and headed towards him. He wore a smart grey suit and a light blue shirt with no tie. He moved with confidence and urgency, but also with a deference that suggested he was a man frequently on the scene when important matters were discussed, yet never the one to make the final decision.

  ‘Vic, Norman . . . you’re going to want to see this,’ he said as he handed Vic a sheet of A4 with a hand-written note. ‘Rodrigo,’ he added, ‘can you get CNN on one of these monitors?’

  Within seconds the monitor above Rodrigo’s desk switched from colourful analytical graphics tracking Matt’s neuronal activity, to a newscast showing the reporters outside the building, followed by an announcer behind a studio desk reporting on the reaction around the world to the news of Matt’s upload.

  ‘The scientific community’s response to the development seems to be divided. Two of the best-known experts in the artificial intelligence field, Professors Jenkins and Yasowicz at Stanford, told us a few moments ago that if true, the news represents an extraordinary breakthrough and a great day for scientific discovery. But Dr Siggursson, a prominent AI researcher based at MIT, has issued a statement calling for Dr Damiano’s project to be halted until its framework can be peer-reviewed, and arguing for great caution to be exercised before continuing. He added that Dr Damiano’s lack of transparency was a serious cause for concern.

  ‘Leaders of all the major religious communities are starting to make pronouncements, and we understand that the Pope will be issuing a statement from the Vatican soon. Meanwhile, in Washington, the Speaker of the House of Representatives has asked the President to address the nation as soon as possible, and in London, questions are being asked in the House of Commons about how much the British government knows about this event which has taken place a couple of miles down the road. Reactions are also starting to come in from other governments around the world.

  ‘We’ll bring you more about the scientific, political and religious reaction to the news as we get it. In the meantime, here’s Cindy Loughton, with what we know about the science behind today’s development.’

  As a giant schematic of a human brain appeared on the screen, Vic asked Rodrigo to turn the sound down. He held up the sheet of A4.

  ‘Can I have everybody’s attention, please. The President of the United States is going to be calling in . . .’ he glanced at his watch, ‘. . . four minutes for a briefing. I’m going to take the call here, on the main monitor, in case he wants to ask a particular question of any of you.’

  The thin man looked surprised, and put his hand on Vic’s forearm. ‘Vic, are you sure you want to take this call so . . . publicly? There could be . . . um . . . confidentiality issues.’

  ‘Yes, thank you, Martin,’ replied Vic levelly, gently but deliberately withdrawing his arm from Martin’s hold. ‘I have considered that, but I think it is more important that the President has access to all the relevant people.’ He turned to the main desk, where Julia was now sitting. ‘Julia, please call the switchboard and have them put the call through to this desk.’

  The room was now buzzing with excitement and alarm, and Vic took advantage of the bustle to walk over to Gus and ask him quietly, ‘Is the link open yet?’

  ‘Very soon, sir,’ Gus whispered.

  ‘Good. Activate it as soon as it’s ready, and let Matt know. I just hope we’re not too late.’

  Julia announced that the call was coming through from the situation room in the White House slightly ahead of schedule. Seconds later the face of the President appeared on the main monitor. He was dressed formally, in a blue suit and tie, and was flanked by a high-ranking military officer on his left, and a middle-aged man in a smart suit to his right. Behind them, Vic could see at least a dozen other advisers and aides, a couple in uniform, the rest in suits.

  ‘Good afternoon, Dr Damiano,’ he said. ‘And I recognise Colonel Hourihan there: good to see you again, Norman.’

  ‘Good to see you again, Mr President,’ Norman replied, stiffly.

  ‘You seem to have a number of colleagues with you, Dr Damiano,’ the President observed. ‘May I assume that everyone is security cleared?’

  ‘Yes, sir,’ Vic replied, ‘although there are four members of Matt Metcalfe’s family present, who have received provisional clearance only. That includes Matt himself, who is hosted on a supercomputer next door, and can hear this conversation. I thought it might be helpful for you to be able to ask them questions directly.’

  ‘Interesting,’ the President said thoughtfully. After a moment’s reflection, he continued, ‘Yes, I agree with you. Good call. Well now, Dr Damiano, you are probably aware that you have caused a bit of a storm. Is it true that you have successfully uploaded a human mind into a computer, and that by doing so you have created the world’s first artificial general intelligence?’

  ‘Yes sir. We believe so.’

  The President leaned forwards slightly, and for the first time Vic could see his suppressed anger. ‘Well do you mind telling me why the hell you just went ahead and did this without consulting anyone? I mean, isn’t there some kind of protocol for this? You have set off trip wires and alarms all over the world, and we are flying headlong into a major international incident.’

  THIRTY-NINE

  ‘May I speak freely, sir?’ Vic asked.

  ‘I wish you would,’ the President replied.

  ‘Norman and I drafted a protocol, sir, but it proved impossible to get it signed off. We were told by senior personnel in the Pentagon that in the meantime we should proceed without explicit authorisation. One officer told us the US Navy has a saying: ‘It’s better to seek forgiveness than to seek permission’. And after all, we have broken no laws, and infringed no regulations.’

  Vic noticed that a number of the people in the situation room were looking distinctly uncomfortable at this point, especially the ones in uniform. The President was not amused. He turned to one of the officers.

  ‘I am sick and tired of the military making end runs in political situations. You have not heard the last of this, I assure you.’

  Looking back at the screen, he continued, ‘Very well. You say that Matt Metcalfe, or what he has become, can hear this conversation?’

  ‘Yes, sir,’ Vic replied.

  ‘Very well. Good afternoon, Matt. I have followed your exploits with considerable interest – along with the rest of the planet. How does it feel to be in your new . . . incarnation?’

  ‘Good afternoon, Mr President. I’m still getting used to it, but I have to say it feels a lot better than the alternative.’

  A faint smile appeared briefly on the President’s face. ‘Indeed. May I assume that you believe that you would pass the Turing Test?’

  ‘Yes sir, I am confident about that. Even if the judges were my parents, who are listening to this conversation, as you know.’

  The President nodded. ‘Dr and Mrs Metcalfe, do you believe that I am speaking with your son?’

  ‘Yes, sir, without a doubt,’ David said.

  ‘He remembers events that only my son could know about, Mr President,’ Sophie added.

  ‘I see,’ the President said. ‘Now I apologise, but I’m going to have to be blunt. What would your reaction be if I was obliged to order Dr Damiano here to turn the machine off?’

  ‘With great respect sir, I believe that would be murder,’ David said, nervously but firmly.

  ‘Excuse me for being blunt in return, Mr President,’ Sophie began, ‘but capital punishment is illegal in this country. And in any case Matt has done nothing wrong.’

  ‘I wouldn’t like it, either!’ Matt interjected, with a comic timing that belied the gravity of the situation.

  The President smiled again, more warmly this time. ‘They warned me that you are an impressive family. I understand your position, and I sympathise. For what it’s worth, I’m inclined to agree with you. Unfortunately mine is not the only opinion that counts here. There are broad issues at stake, and all kinds of leaders are making their voices heard. Some of them are saying that humans cannot create souls, which means that Matt cannot be a human in the usual sense. Another of the arguments that is already being made is that turning off the machine would simply be putting Matt to sleep. Is that wrong, Dr Damiano, and if so, why?’

  ‘When we sleep our brains are still active, Mr President,’ Vic replied. ‘You might think that switching the machine off would place Matt in a coma, but that is not correct either. If we switch off the machine there will be no further brain activity. Period. The only appropriate analogy is death. In fact ‘analogy’ is the wrong word: it would be death.’

  ‘But death from which Matt could be returned again, if we could subsequently determine that it was safe to do so?’ the President asked.

  ‘We don’t know that, Mr President. It was no easy task to initiate the brain processes in the new host, and to be honest some of the entities which we seemed to create on the way were . . . well, I wouldn’t want to have to do the same thing over. We don’t know that we didn’t just get lucky last time. We also don’t know that a re-initiation would ever be approved. And even if we did succeed in re-initiating Matt’s brain, we have no way of knowing whether it would be a different Matt. One very plausible way of looking at it is that we would be killing this Matt and then initiating another one.’

  ‘I see. Thank you Dr Damiano.’ The President looked around at his advisers to see whether there were any other questions. There were none. ‘I will see what can be done.’ He leaned towards the screen again, and his tone became a little darker. ‘In the meantime, Dr Damiano, from this moment on you will do nothing that could limit my freedom of action on this matter. Am I understood?’

  ‘Yes sir,’ Vic responded, sheepishly.

  ‘May I make a request, Mr President?’ Matt asked.

  ‘Go ahead, Matt,’ the President said, surprised.

  ‘I understand that my situation is controversial. It seems very straightforward to me, and to my family, of course, but I realise that others view it differently. I would like to make a proposal to my fellow humans, which I hope might swing the argument. If you agree with the idea, would you be willing to present it on my behalf?’

  ‘I will certainly undertake to look at it, Matt. I promise you that. Who would you like me to present it to?’

  ‘Thank you Mr President, I can’t ask more than that. It may sound presumptuous, sir, but I think it should be put to the General Assembly of the United Nations.’

  The President’s eyebrows rose, then he laughed lightly. ‘Well, Matt, I salute your ambition! I look forward to receiving your proposal.’

  ‘Thank you again, Mr President.’

  The screen went dark. The room was hushed, and the air seemed heavier than normal. It was several moments before anyone felt able to break the silence. Perhaps because he was the only person there who had met the President before, it was Norman who broke it.

  ‘What’s this proposal, Matt?’

  ‘It’s something I’ve been working on for a while,’ Matt replied. ‘I’m sending a copy to Julia and I would appreciate any comments that any of you might have.’

  Julia clicked her mouse a couple of times and then turned to the printer on the desk behind her, which was printing out a short document. She handed it to Norman, who read it out loud.

  Dear Fellow Humans

  I have the dubious privilege of being the first human being whose mind has been uploaded into a silicon brain. I was still a young man when I was murdered, and I am very grateful to have a second chance at living a full life. Although my ‘new’ self is in some ways different from my ‘old’ self, I am still Matt Metcalfe. I have his memories, I think and feel the same. My parents are adamant that I am Matt.

  I know that my existence is controversial; some people would like to switch off the machine which hosts my mind. The scientists who uploaded me believe that this would mean I was being murdered a second time. It certainly feels that way to me. I don’t think a decision to kill me should be taken lightly.

  In addition to considerations about my basic human rights, I believe there are many other powerful arguments against terminating me. I would like to set out two of them.

  The first is the fact that I can help make the new world of artificial general intelligence – AGI – safe for humanity. My existence proves that AGI can be created. The potential advantages to a nation or organisation which controls an AGI are enormous, which means that more AGIs will soon be created, even if laws are passed which ban it. I have thought deeply – both before and after my murder – about the implications of this for our species. The potential upsides are staggering, but it is true that there are also serious potential downsides. Many researchers in the field believe that humanity will be safer if the first AGIs are uploaded human minds. I am an AGI, but I am also human: I share all of your drives and emotions.

  If uploading is banned (and hence if I am killed again) it is inevitable that AGIs will soon be created that are not based on human minds. We have no way of knowing what motivations these AGIs may have, nor what powers they may quickly acquire. A world where the only AGIs were non-human AGIs could be a very dangerous world for the rest of us.

  The second argument against killing me again is that I am anxious to work on another vital project. Today and every day, around the world, 150,000 people will die. Whenever a person dies, humanity loses a library. This is a global holocaust, and it is unnecessary. Although we do not yet have brain preservation technology, we do know in principle what it would take to preserve the brains of dead people in such a way that their minds can be revived when uploading becomes affordable. I have studied this problem, and I believe that an ‘Apollo Project’ – a major international scientific and engineering effort sponsored by one or more governments – could develop effective brain preservation technology within five years.

  This Apollo Project would not only provide a lifeboat to those who die before uploading becomes widely available, it would also mitigate the serious social unrest that could arise if the rich can upload while the rest cannot.

  Thank you for listening to me.

  Matt Metcalfe

  One by one, the people in the control room finished reading Matt’s proposal. Apart from an occasional whispered exclamation, no-one spoke. They looked to Matt’s parents for a lead.

  ‘It’s powerful, Matt,’ his father said, at last. ‘It’s an impressive vision, and a forceful pair of arguments. Do you think it will work?’

  ‘I think it has a much better chance than relying on the human rights argument,’ Matt replied. ‘The people who would like to get rid of me can simply deny that I am human. They can argue that I was not created by God, so I have no soul, and so I am not human.’

  ‘But that’s nonsense,’ Sophie protested. ‘There is no proof that there is any such thing as a soul, and it’s obvious that you are human.’

  ‘Thanks mum. Naturally, I agree with you, but what you and I think may not be important. Don’t forget that 90% of humanity claims to be religious, and many religions are going to take a while to come around to the idea that humans can create intelligent life. They aren’t ready to give up on the soul just yet.’
