by Dave Eggers
The first student, Faisal, looked no more than twenty. His skin glowed like lacquered wood, and his proposal was exceedingly simple: instead of having endless mini-battles over whether or not a given person’s spending activity could or could not be tracked, why not make a deal with them? For highly desirable consumers, if they agreed to use CircleMoney for all their purchases, and agreed to make their spending habits and preferences accessible to CirclePartners, then the Circle would give them discounts, points, and rebates at the end of each month. It would be like getting frequent flyer miles for using the same credit card.
Mae knew she would personally sign up for such a plan, and assumed that, by extension, so would millions more.
“Very intriguing,” Stenton said, and Mae would later learn that when he said “very intriguing” he meant that he would purchase that idea and hire its inventor.
The second notion came from an African-American woman of about twenty-two. Her name was Belinda and her idea, she said, would eliminate racial profiling by police and airport security officers. Mae began nodding; this was what she loved about her generation—the ability to see the social-justice applications to the Circle and address them surgically. Belinda brought up a video feed of a busy urban street with a few hundred people visible and walking to and from the camera, unaware they were being watched.
“Every day, police pull over people for what’s known as ‘driving while black’ or ‘driving while brown,’ ” Belinda said evenly. “And every day, young African-American men are stopped in the street, thrown against a wall, frisked, stripped of their rights and dignity.”
And for a moment Mae thought of Mercer, and wished he could be hearing this. Yes, sometimes some of the applications of the internet could be a bit crass and commercial, but for every one commercial application, there were three like this, proactive applications that used the power of the technology to improve humanity.
Belinda continued: “These practices only create more animosity between people of color and the police. See this crowd? It’s mostly young men of color, right? A police cruiser goes by an area like this, and they’re all suspects, right? Every one of these men might be stopped, searched, disrespected. But it doesn’t have to be that way.”
Now, on-screen, amid the crowd, three of the men in the picture were glowing orange and red. They continued to walk, to act normally, but now they were bathed in color, as if a spotlight, with colored gels, were singling them out.
“The three men you see in orange and red are repeat offenders. Orange indicates a low-level criminal—a guy convicted of petty thefts, drug possession, nonviolent and largely victimless crimes.” There were two men in the frame who had been colored orange. Walking closer to the camera, though, was an innocuous-enough seeming man of about fifty, glowing red from head to toe. “The man signaling red, though, has been convicted of violent crimes. This man has been found guilty of armed robbery, attempted rape, repeated assaults.”
Mae turned to find Stenton’s face rapt, his mouth slightly open.
Belinda continued. “We’re seeing what an officer would see if he were equipped with SeeYou. It’s a simple enough system that works through any retinal. He doesn’t have to do a thing. He scans any crowd, and he immediately sees all the people with prior convictions. Imagine if you’re a cop in New York. Suddenly a city of eight million becomes infinitely more manageable when you know where to focus your energies.”
Stenton spoke. “How do they know? Some kind of chip?”
“Maybe,” Belinda said. “It could be a chip, if we could get that to happen. Or else, even easier would be to attach a bracelet. They’ve been using ankle bracelets for decades now. So you modify it so the bracelet can be read by the retinals, and provides the tracking capability. Of course,” she said, looking to Mae with a warm smile, “you could also apply Francis’s technology, and make it a chip. But that would take some legal doing, I expect.”
Stenton leaned back. “Maybe, maybe not.”
“Well, obviously that would be ideal,” Belinda said. “And it would be permanent. You’d always know who the offenders were, whereas the bracelet is still subject to some tampering and removal. And then there are those who might say it should be removed after a certain period. The violators expunged.”
“I hate that notion,” Stenton said. “It’s the community’s right to know who’s committed crimes. It just makes sense. This is how they’ve been handling sex offenders for decades. You commit sexual offenses, you become part of a registry. Your address becomes public, you have to walk the neighborhood, introduce yourself, all that, because people have a right to know who lives in their midst.”
Belinda was nodding. “Right, right. Of course. And so, for lack of a better word, you tag the convicts, and from then on, if you’re a police officer, instead of driving down the street, shaking down anyone who happens to be black or brown or wearing baggy pants, imagine instead you were using a retinal app that saw career criminals in distinct colors—yellow for low-level offenders, orange for nonviolent but slightly more dangerous offenders, and red for the truly violent.”
Now Stenton was leaning forward. “Take it a step further. Intelligence agencies can instantly create a web of all of a suspect’s contacts, co-conspirators. It takes seconds. I wonder if there could be variations on the color scheme, to take into account those who might be known associates of a criminal, even if they haven’t personally been arrested or convicted yet. As you know, a lot of mob bosses are never convicted of anything.”
Belinda was nodding vigorously. “Yes. Absolutely,” she said. “And in those cases, you’d be using a mobile device to tag that person, given you wouldn’t have the benefit of a conviction to ensure the mandatory chip or bracelet.”
“Right. Right,” Stenton said. “There are possibilities there, though. Good things to think about. I’m intrigued.”
Belinda glowed, sat down, feigned nonchalance by smiling at Gareth, the next aspirant, who stood up, nervous and blinking. He was a tall man with cantaloupe-colored hair, and now that he had the room’s attention, he grinned shyly, crookedly.
“Well, for better or worse, my idea was similar to Belinda’s. Once we realized we were working on similar notions, we collaborated a bit. The main commonality is that we’re both interested in safety. My plan, I think, would eliminate crime block by block, neighborhood by neighborhood.”
He stood before the screen, and revealed a rendering of a small neighborhood of four blocks, twenty-five houses. Bright green lines denoted the buildings, allowing viewers to see inside; it reminded Mae of heat-reading visual displays.
“It’s based on the neighborhood watch model, where groups of neighbors look out for each other, and report any anomalous behavior. With NeighborWatch—that’s my name for this, though it could be changed of course—we leverage the power of SeeChange specifically, and the Circle generally, to make the committing of a crime, any crime, extremely difficult in a fully participating neighborhood.”
He pushed a button and now the houses were full of figures, two or three or four in each building, all of them colored blue. They moved around in their digital kitchens, bedrooms, and backyards.
“Okay, as you can see, here are the residents of the neighborhood, all going about their business. They’re rendered blue here, because they’ve all registered with NeighborWatch, and their prints, retinas, phones and even body profile have been memorized by the system.”
“This is the view any resident can see?” Stenton asked.
“Exactly. This is their home display.”
“Impressive,” Stenton said. “I’m already intrigued.”
“So as you can see, all is well in the neighborhood. Everyone who’s there is supposed to be there. But now we see what happens when an unknown person arrives.”
A figure, colored red, appeared, and walked up to the door of one of the houses. Gareth turned to the audience and raised his eyebrows.
“The system doesn’t know this man, so he’s red. Any new person entering the neighborhood would automatically trigger the computer. All the neighbors would receive a notice on their home and mobile devices that a visitor was in the neighborhood. Usually it’s no big deal. Someone’s friend or uncle is dropping by. But either way, you can see there’s a new person, and where he is.”
Stenton was sitting back, as if he knew the rest of the story but wanted to help it speed along. “I’m assuming, then, there’s a way to neutralize him.”
“Yes. The people he’s visiting can send a message to the system saying he’s with them, IDing him, vouching for him: ‘That’s Uncle George.’ Or they could do that ahead of time. So then he’s tagged blue again.”
Now Uncle George, the figure on the screen, went from red to blue, and entered the house.
“So all is well in the neighborhood again.”
“Unless there’s a real intruder,” Stenton prodded.
“Right. On the rare occasion when it’s truly someone with ill-intent …” Now the screen featured a red figure stalking outside the house, peering in the windows. “Well, then the whole neighborhood would know it. They’d know where he was, and could either stay away, call the police, confront him, whatever it is they want to do.”
“Very good. Very nice,” Stenton said.
Gareth beamed. “Thank you. And Belinda made me think that, you know, any ex-cons living in the neighborhood would register as red or orange in any display. Or some other color, where you’d know they were residents of the neighborhood, but you’d also know they were convicts or whatever.”
Stenton nodded. “It’s your right to know.”
“Absolutely,” Gareth said.
“Seems like this solves one of the problems of SeeChange,” Stenton said, “which is that even when there are cameras everywhere, not everyone can watch everything. If a crime is committed at three a.m., who’s watching camera 982, right?”
“Right,” Gareth said. “See, this way the cameras are just part of it. The color-tagging tells you who’s anomalous, so you only have to pay attention to that particular anomaly. Of course, the catch is whether or not this violates any privacy laws.”
“Well, I don’t think that’s a problem,” Stenton said. “You have a right to know who lives on your street. What’s the difference between this and simply introducing yourself to everyone on the street? This is just a more advanced and thorough version of ‘good fences make good neighbors.’ I would imagine this would eliminate pretty much all crime committed by strangers to any given community.”
Mae glanced at her bracelet. She couldn’t count them all, but hundreds of watchers were now insisting on Belinda’s and Gareth’s products. They asked Where? When? How much?
Now Bailey’s voice popped through. “The one unanswered question, though, is, what if the crime is committed by someone inside the neighborhood? Inside the house?”
Belinda and Gareth looked to a well-dressed woman with very short black hair and stylish glasses. “I guess that’s my cue.” She stood and straightened her black skirt.
“My name is Finnegan, and my issue was violence against children in the home. I myself was a victim of domestic violence when I was young,” she said, taking a second to let that register. “And this crime, among all others, seems like the most difficult thing to prevent, given the perpetrators are ostensibly part of the family, right? But then I realized that all the necessary tools already exist. First, most people already have one or another monitor that can track when their anger rises to a dangerous level. Now, if we couple that tool with standard motion sensors, then we can know immediately when something bad is happening, or is about to happen. Let me give you an example. Here’s a motion sensor installed in the kitchen. These are often used in factories and even restaurant kitchens to sense whether the chef or worker is completing a given task in a standard way. I understand the Circle uses these to ensure regularity in many departments.”
“We do indeed,” Bailey said, provoking some distant laughter from the room where he was sitting.
Stenton explained: “We own the patent for that particular technology. Did you know that?”
Finnegan’s face flushed, and she seemed to be deciding whether or not to lie. Could she say she did know?
“I was not aware of that,” she said, “but I’m very glad to know that now.”
Stenton seemed impressed with her composure.
“As you know,” she continued, “in workplaces, any irregularity of movement or in the order of operations, and the computer either reminds you of what you might have forgotten, or it logs the mistake for management. So I thought, why not use the same motion sensor technology in the home, especially high-risk homes, to record any behavior outside the norm?”
“Like a smoke detector for humans,” Stenton said.
“Right. A smoke detector will go off if it senses even the slightest increase in carbon dioxide. So this is the same idea. I’ve installed a sensor here in this room, actually, and want to show you how it sees.”
On the screen behind her, a figure appeared, the size and shape of Finnegan, though featureless—a blue-shadow version of herself, mirroring her movements.
“Okay, this is me. Now watch my motions. If I walk around, then the sensors see that as within the norm.”
Behind her, her form remained blue.
“If I cut some tomatoes,” Finnegan said, miming the cutting of imaginary tomatoes, “same thing. It’s normal.”
The figure behind her, her blue shadow, mimicked her.
“But now see what happens if I do something violent.”
Finnegan raised her arms quickly and brought them down in front of her, as if hitting a child beneath her. Immediately, onscreen, her figure turned orange, and a loud alarm went off.
The alarm was a rapid rhythmic screeching. It was, Mae realized, far too loud for a demonstration. She looked to Stenton, whose eyes were round and white.
“Turn it off,” he said, barely controlling his rage.
Finnegan hadn’t heard him, and was going about her presentation as if this were part of it, an acceptable part of it. “That’s the alarm of course and—”
“Turn it off!” Stenton yelled, and this time, Finnegan heard. She flailed on her tablet, looking for the right button.
Stenton was looking at the ceiling. “Where is that sound coming from? How is it so loud?”
The screeching continued. Half the room was holding their ears.
“Turn it off or we walk out of here,” Stenton said, standing, his mouth small and furious.
Finally Finnegan found the right button and the alarm went silent.
“That was a mistake,” Stenton said. “You don’t punish the people you’re pitching. Do you understand that?”
Finnegan’s eyes were wild, vibrating, filling with tears. “Yes, I do.”
“You could have simply said an alarm goes off. No need to have the alarm go off. That’s my business lesson for today.”
“Thank you, sir,” she said, her knuckles white and entwined in front of her. “Should I go on?”
“I don’t know,” Stenton said, still furious.
“Go ahead, Finnegan,” Bailey said. “Just make it quick.”
“Okay,” she said, her voice shaking, “the essence is that the sensors would be installed in every room and would be programmed to know what was within the normal boundaries, and what was anomalous. Something anomalous happens, the alarm goes off, and ideally the alarm alone stops or slows whatever’s happening in the room. Meanwhile, the authorities have been notified. You could hook it up so neighbors would be alerted, too, given they’d be the closest and most likely to be able to step in immediately and help.”
“Okay. I get it,” Stenton said. “Let’s move on.” Stenton meant move on to the next presenter, but Finnegan, showing admirable resolve, continued.
“Of course, if you combine all these technologies, you’re able to quickly ensure behavioral norms in any context. Think of prisons and schools. I mean, I went to a high school with four thousand students, and only twenty kids were troublemakers. I could imagine if teachers were wearing retinals, and could see the red-coded students from a mile away—I mean, that would eliminate most trouble. And then the sensors would pinpoint any antisocial behavior.”
Now Stenton was leaning back in his chair, his thumbs in his belt loops. He’d relaxed again. “It occurs to me that so much crime and trouble is committed because we have too much to track, right? Too many places, too many people. If we can concentrate more on isolating the outliers, and being able to better tag them and follow them, then we save endless amounts of time and distraction.”
“Exactly, sir,” Finnegan said.
Stenton softened, and, looking down at his tablet, seemed to be seeing what Mae was seeing on her wrist: Finnegan, and her program, were immensely popular. The dominant messages were coming from victims of various crimes: women and children who had been abused in their homes, saying the obvious: If only this had been around ten years ago, fifteen years ago. At least, they all said in one way or another, this kind of thing will never happen again.
When Mae returned to her desk, there was a note, on paper, from Annie. “Can you see me? Just text ‘now’ when you can, and I’ll meet you in the bathroom.”
Ten minutes later Mae was sitting in her usual stall, and heard Annie enter the one next door. Mae was relieved that Annie had reached out to her, thrilled at having her so close again. Mae could right all wrongs now, and was determined to do so.
“Are we alone?” Annie asked.
“Audio’s off for three minutes. What’s wrong?”
“Nothing. It’s just this PastPerfect thing. They’re starting to dole out the results to me, and it’s already pretty disturbing. And tomorrow it goes public, and I’m assuming it’ll get even worse.”
“Wait. What did they find? I thought they were starting in the Middle Ages or something.”