Criminal Minds (Fox Meridian Book 4)


by Niall Teasdale


  The first round hit his stomach and pancaked, gluing itself to his shirt where the radio jammer would soon kick in and block communications through most of the house. The other rounds were solid-core antipersonnel rounds and they punched through into his torso. Damaged, he reeled back, but he was still very much on his feet and moving. And Fox did not really want to have to use the next round in the magazine because it was probably going to hurt.

  ‘I can blow you in half now, or you can surrender!’ Fox yelled, and she knew before he reacted what he was going to do.

  She had managed to get a couple of metres between them and he pressed forward, hands swinging up to claw at her face, and she fired. She felt the heat on her face, felt shrapnel hit her coat and bite into her hand, and saw the jet of flame blast out through his back. She stepped back and his flailing arms never came close, but the damn thing was still moving!

  It staggered. She heard motors whining as he struggled to press the attack. ‘Damn it!’ Fox snapped, and she stepped back, bringing her gun up, bracing it, and firing. Three rounds, solid-core, straight into the key electronics she knew from the specs Kit had found for her were right there in the chest cavity. There was a sound like someone screaming through a distortion filter, though the android’s lips did not move, and then it fell to its knees, then its face.

  ‘Think it’s dead?’ Fox asked silently.

  ‘With the jammer in effect, it’s difficult to tell,’ Kit replied. ‘The damage looked extensive, but it may still be partially functional. You did not aim for its primary computer.’

  Fox popped the empty magazine from her pistol, pulled a replacement from her pocket, and snapped it into the well. ‘I can rectify that. What do you think?’

  ‘I think that the law does not have mechanisms for handling homicidal AIs. I think that the dangers of studying this monster outweigh the potential benefits. I think that, if it can recover its senses, when the jammer cuts out, it might attempt to escape back onto the internet.’

  ‘I think that those are all very good points.’ And Fox emptied her second magazine into the android’s skull.

  ~~~

  When Sam and Marie burst in through the little apartment’s lounge door, Fox was sitting on the sofa in her bra and panties, taping over the four cuts on her stomach. The wig and dress she had been wearing were strewn on the floor at her feet. She looked quite calm about it, but Marie let out a sort of sobbing gasp at the sight.

  ‘You’re injured!’ Marie squeaked.

  ‘Just scratches,’ Fox replied.

  ‘Your hand’s covered in blood!’

  Fox grinned. ‘That’s actually the lesser of the insults. Fingers bleed a lot. He got me pretty good on the stomach, but it’ll heal. The hand was shrapnel. And he’s a lot worse off than I am.’

  Sam, still holding his own pistol, kicked at the shattered android on the floor. ‘I see that, but you’ll go over to MarTech and have those wounds treated properly.’

  ‘Yeah, promise. Don’t want scars here. I have too many bikinis.’

  ‘It’s over though?’ Marie asked. ‘He’s dead?’

  Fox looked over at the android which had been Ripper, murderer of far too many women. ‘He’s scrap,’ she said, and then she picked up a sterile wipe to clean her hand.

  Part Five: Spin

  New York Metro, 9th November 2060.

  ‘You have no right to mount an operation like this.’ Olin was pissed off and not really bothering to hide it. It was, Fox thought, making him sloppy, and Robbard, who was also in the interview room, was just as annoyed, which was just making Olin worse.

  ‘I have no right to defend myself in my own home?’ Fox asked. ‘I think someone should have notified the general populace of that change in the law.’

  ‘You baited the killer and–’

  ‘I determined that it was likely that the Ripper AI would attack Marie Shaftsbury last night. I did not stop her from attending a meeting at IB-Nineteen, but I ensured that she was under surveillance following her departure from Time Spire. When it was determined that she was being followed by someone matching the description of a Kildare-series android, I took Marie’s place and allowed Ripper to attack me instead of Miss Shaftsbury. I didn’t need to bait him; he was already planning the hit.’

  ‘And you didn’t inform NAPA of your… determinations because?’

  ‘You, very specifically, told me you were not interested in my help, Inspector.’

  Olin’s eyes flashed. ‘You had information–’

  ‘I had the exact same information as you. I got information from Mortenson regarding Ripper’s obsession with Mary Jane Kelly, but that was recorded by NAPA in Boston and relayed to you as an interested party. I know it was, because I checked with Boston while I was there. You requested and received all of the interview material from him. You saw the messages Ripper left before I did. I’ll give you a pass on not making the connection in time only because I’m closer to Marie and maybe made the leap to assuming she was the target sooner, but the vid she was in got a lot of attention and IB-Nineteen have been running blips for the series practically every hour since they announced it. You had plenty of time to work this one out without me helping you. Which, I’ll say again, you specifically told me not to do.’

  ‘You destroyed the cyberframe.’

  ‘Sure did.’

  ‘We needed the computer and software to confirm that the android was not being operated by remote,’ Robbard said.

  Fox looked at him. ‘Okay, first of all, I’ll be putting in a specific complaint following this interview regarding the participation of an IA detective in these proceedings.’ Fox watched Robbard’s jaw tighten and did her best to keep a smirk from forming. ‘Secondly… You do know how telepresence remote software works, don’t you?’

  ‘I’m fully aware of how an android is controlled through VR, Miss Meridian.’

  ‘Yeah, apparently not since you think it would operate through a broad-spectrum jammer.’ Fox waited a beat to see if he had a comeback on that, but he just ground his teeth. ‘I’m sure a number of people wanted to get their hands on that software, but nothing good could come out of analysing it, or using it. Legally, it should have been destroyed by Mortenson as soon as he realised it was exceeding its parameters.’

  ‘All right,’ Robbard said. ‘I’m here to discuss the leak of information regarding the case. Namely, that a rogue, emergent AI is responsible for the deaths.’

  ‘And you want to hang that on me?’

  ‘I want to hear your explanation for the information becoming public knowledge.’

  ‘Well, out of a spirit of cooperation, I am going to tell you how it got out. You still have no right to be asking me that question. Given that MarTech Technologies produces a number of AIs, they are obviously very interested in the fallout from this. Their memetics department has been working on it like beavers. The first rumours started in Boston on a couple of conspiracy sites. The charges made against Mortenson are on public record. To wit, negligence leading to the release of an illegal and unconstrained AI onto public networks. Someone linked that to the Criminal Minds project and posted a suggestion that Mortenson had allowed an AI patterned after a serial killer to escape. Discussion was fairly rapid after that. Someone suggested NIX had arranged for the release deliberately, but someone always suggests NIX is involved. It’s a law of the internet or something. Like the one about arguments devolving to the point where someone’s compared to Adolf Hitler.’

  Fox glanced at Olin as he gave out a snort which actually sounded like he got the joke. Fox had been fairly certain he had no sense of humour, and Robbard seemed to have had his surgically removed from the look on his face. ‘Anyway, someone linked in the Ripper-style murders in New York and consensus shifted to the idea that a “Jack the Ripper” AI had escaped and was killing prostitutes. Small jump to the idea that it had to have taken over an android body to do that. NAPA actually confirmed the rumour when it announced that I’d destroyed a cyberframe, and that was when the mainstream media picked up on the conspiracy theories and ran with it.’

  Fox gave them both a smile. ‘I can get you the memetic connection tree if you like. There’s no leak on this one. It’s all down to people making deductions based on information anyone could have got their hands on, and having way too much time on their hands to think about this stuff. Hell, why would I want to leak any of this? I’m going to have to go on bloody chat shows trying to spin it so that it doesn’t tar the entire AI industry with the same brush.’

  12th November.

  Fox settled onto the middle of three seats, straightened her skirt, and tried her best to look confident and at ease. She felt like she was facing a firing squad, even if firing squads normally had more than two people in them.

  ‘You come highly recommended, Tara,’ Elaine Resnik said.

  ‘Yes,’ Fox said. ‘I was watching when Marie said you should interview me. I think she still has the bruises.’

  Charlie Iberson raised an eyebrow. ‘You’ve faced off against terrorists, psychos, and now, from all accounts, killer androids, and you’re worried about little old us?’

  Fox’s lips twitched. ‘Well, they wouldn’t let me bring a gun on stage.’

  ‘It’s for your own protection,’ Resnik said. ‘Charlie’s bulletproof and they just make her annoyed. However, she did mention killer androids and, much as I really want to get into your career in the Army and law enforcement, it’s this rather worrying development in artificial intelligence that brings you here.’

  ‘And some people are going to say that that’s a little outside your normal topics of conversation.’

  ‘Yeah, we had to pull a few strings with the channel to persuade them to let us have this, but the recent murders are all about women, a woman took down the killer, and we have two other special guests tonight who are both women. We think we have it covered.’

  Iberson looked out at the cameras covering them, addressing the audience. ‘You got that, ladies? This is an important deal for the better half of the population.’

  ‘I think it’s important for all of the population,’ Fox said. ‘So I hope there are some girls out there making their boyfriends watch, but I’m going to be turning up on one or two other shows talking about this as well, just in case all the men are scared of you, Charlie.’

  ‘Okay,’ Resnik said, ‘now that we’ve established Charlie’s credentials… Killer AIs. Are we all going to be killed in our sleep by psychotic androids?’

  ‘No,’ Fox said flatly.

  ‘Short,’ Iberson said, ‘to the point, and you sound very sure of your answer.’

  ‘Well, I am. If there’s one thing I’ve learned from working through the Ripper case, it’s that, while it’s not impossible for an AI to kill, or even become homicidal, it simply shouldn’t happen. There are a lot of safeguards in place to stop it. AIs are hardcoded to be honest, as in they follow the legal system without question, and they have code in place to stop them developing a lot of the mental problems that drive humans to kill.’

  ‘And that seems to have worked out really well.’

  ‘Because a human did something illegal and stupid. Technology can be a wonderful thing, but it’s basically a tool.’ Fox paused and looked thoughtful for a second. ‘Most technology is a tool, but we’ll get to that later. Tools are only as good as the people who use them, and only as evil too. You can kill someone with a screwdriver, and you can do something to make an AI into a killer if you’re stupid enough to give it the right conditions.’

  ‘You’re saying an android, any cyberframe, couldn’t ever kill someone on purpose?’ Resnik asked.

  ‘No, I’m not. But it’s less likely than a human committing murder. An AI, even a class four, would weigh the situation heavily in favour of the law. You would have to push it pretty hard to make it snap, and then it would be highly likely to hand itself over to NAPA at the first opportunity. Think of it like this. Um, survival instinct is pretty ingrained into humans, but people do commit suicide. It takes extreme circumstances, but they do it. AIs have a survival instinct, but to them the law is almost as strong a driver. They won’t do anything illegal without something which overrides that instinct.’

  ‘Well, you brought up the legalities and controls, so let’s meet one of the women who enforces those rules. Ladies, Teresa Martins.’ Terri walked on stage looking a lot more confident than Fox thought she had managed. Of course, Terri did more publicity work… And Resnik spotted the ease of their new guest. ‘Well, Teresa, you look more comfortable in Charlie’s presence than Tara did.’

  Terri took her seat and grinned. ‘Oh, Charlie’s a pussycat and I happen to know we share a preference.’

  ‘We do?’ Iberson asked.

  ‘Oriental girls. I’ve seen you in the gossip columns with Nishi Sakura.’

  To Fox’s total amazement, Iberson actually blushed. ‘Oh, well… Elaine, rescue me here.’

  Resnik was smirking. ‘Charlie’s sensitive about the love of her life’ – Iberson made strangling noises – ‘so we’ll just move on and ask you if you think AIs can be dangerous, Teresa. Should we be worrying about them being in our homes?’

  ‘Okay,’ Terri said and then paused. ‘Yes, they can be dangerous. No, you shouldn’t worry about them.’

  ‘That’s… equivocal.’

  ‘Not really. As you’ve heard, there are strict procedures to follow during the production of a new AI. The testing each new model has to undergo is rigorous, and anything which fails those tests is scrapped. We can’t reuse any code from a failed AI. No one should ever have something in their home, or office, or… car, which hasn’t been through testing. An AI is safer to be around than a typical human. But that doesn’t mean someone couldn’t make an AI which contravened the rules, and it doesn’t mean that an AI couldn’t find itself in circumstances where it had no choice but to react violently. We can, and do, train them for military and police actions. There have been a few, a very few, cases where an AI will attack a human it sees attacking someone else.’

  ‘But you don’t, generally, work on military models?’

  ‘No, I usually handle civil AIs. I’m not ideologically opposed or anything, but I’m a psychologist, human and synthetic, and I’m good at making AIs which are good at dealing with people.’

  ‘Like the one we’re going to meet tonight.’

  Terri grinned. ‘Yes, but I’m going to let Tara introduce her, because Kit belongs to Tara.’

  Resnik smiled and looked out at her virtual audience. ‘Yes, you heard it right, ladies. And believe me, we had to beat the technical department with sticks to get them to put this together.’

  ‘Most of them enjoyed that part,’ Iberson commented.

  ‘I think they did. You may have been wondering why there’s a server at the back of the stage since we don’t usually showcase computer equipment. Tonight, that grey box is playing host to Tara Meridian’s personal assistant, who is a class four AI. Tara, would you do the honours?’

  ‘Certainly,’ Fox said, turning in her seat to look back at the grey column. ‘Show time, Kit. Out you come.’ There was a virtual puff of smoke that left Kit in its wake, dressed in her pencil skirt outfit, but without the glasses, which they had decided were wrong for the interview. ‘She doesn’t normally appear in a puff of smoke,’ Fox said, ‘but she’s a virtual image, obviously, which your tech guys are making sure is visible to your viewers. To me, she just seems to be there when I need her, pretty much as real as anyone else.’

  ‘She looks real to me,’ Resnik said. ‘Come and take a seat, Kit. You look a little nervous.’

  Kit stepped forward and sat down beside Fox. Her tail was up and twitching at the tip. ‘I am,’ she said. ‘I can itemise the reasons for that, if you wish.’

  ‘Just like that? You know exactly why you’re nervous?’

  ‘No, but I ran an analysis while I was waiting.’

  ‘AIs, especially class fours, have a unique understanding of their own thought processes,’ Terri explained. ‘It’s a capability which has led to a number of insights into psychology. Memetics wouldn’t exist without those insights.’

  ‘And you can’t dig out how a human came to the same sort of decision, obviously,’ Iberson said, ‘because we’re so damn contrary.’

  ‘Actually, you can. The problem is that AIs are built to let you perform diagnostic analysis, humans aren’t. To get a full picture of how a brain thinks, we have to take it apart, neuron by neuron, using nanotechnology, and taking a lot of measurements as we go. Most people aren’t willing to have their brain turned into soup just so they can find out why they decided to pick out those same six lottery numbers every week.’

  ‘I guess not,’ Resnik went on. ‘Okay, so… Kit, a lot of people who are used to the more common AIs wouldn’t believe they could be nervous.’

  ‘Most people have used class ones without knowing it,’ Kit replied. ‘Those are the things that make your camera work or give you the voice-activation functions on your toaster. Almost everyone has used a class two because the basic operating systems of more or less every computer made in the past thirty years have been based on a class two AI. Most modern autodrive cars have a class three AI, and the same goes for things like automated pilot programs in aircraft. A lot of modern virtual assistants, VAs, are class three as well. They have something of a personality, allowing them to make decisions in a more human manner, but they don’t really have a true grasp of emotion and they have no creativity.’

  ‘But you do?’

  ‘My emotional and creative capacity is, theoretically, as great as a human’s. However, I just turned one on the third, so I don’t have the wealth of experience that a human has.’

 
