Three Laws Lethal


by David Walton


  The obfuscated source code was just as indecipherable as expected. Without English variable names or comments, it was pure logic, with no way to understand the purpose of any portion of it. No programmer would have been able to read it, not even the Mercedes developers who wrote it in the first place. In fact, Lauren had hired an ex-Mercedes developer named Yusuf Nazari, who had worked on some portion of the code a few years earlier. He had done his best so far to make sense of it, trying to reason his way from known inputs to some level of meaning, but it was slow going. It was unquestionably a violation of the spirit of discovery, but to the judge, the real code would have been gibberish as well. It was hard to explain how one set of gibberish was acceptable, but another was not.

  Tyler liked Seattle. He liked the breeze off of Puget Sound, the occasional sightings of Mount Rainier and Mount St. Helens, the seafood, the young tech culture, and, of course, the Science Fiction and Fantasy Hall of Fame. He didn’t even mind the constant drizzling rain. As a place to start over, he could have done worse.

  A week into the job, he saw the news about Brandon. It was buried in the announcements section of an autocar magazine: a new self-driving startup, offering app-driven ride services in New York City and its suburbs. The company was called Black Knight, and its founder and CEO was Brandon Kincannon. For most people, the name of the company would summon visions of a powerful warrior on horseback, or perhaps of Batman, or the Marvel Comics superhero. Tyler knew it was a reference to the ludicrous character in Monty Python and the Holy Grail, and thus a tribute to Abby.

  Apparently, Brandon’s inheritance from his parents had been considerable. In the millions, at least, if it had been enough to start a viable company. It would be a hard battle to make it profitable, though. He would be trying to enter a market fenced on one side by car manufacturers, selling both traditional and self-driving cars to commuters, and on the other side by the New York taxi drivers’ union, eighteen thousand drivers strong, providing near-instant access to transportation from the street. Not to mention Uber and Lyft, which were already filling the same niche.

  He wasn’t surprised Brandon hadn’t asked him to join his company, but it still hurt. It was a dream Tyler had been forced to abandon, and now Brandon had found a way to make it happen. He wondered who Brandon had hired to write the core software, and how good they were. Not that it really mattered. It wasn’t Tyler, and never would be. He had a new life now, and he would have to get used to it.

  CHAPTER 13

  Naomi spent hours exploring the gigantic world the Mikes inhabited. Ten million of them. She flew over miles of countryside, mostly fields and sheep pastures as before, with mirrored buildings placed in regular patterns to reflect sunlight. The difference was the scale. The features of the world—lakes and rivers and forests and mountains—were randomly placed, but the landscape just kept going. Wherever she flew, she saw Mikes at work, shifting sheep to new grassy fields, constructing new buildings, harvesting yams.

  As a society, they were doing what they had been created to do: survive. They had somehow reached out of their simulation and harnessed the power of the computing system to create a larger simulation, thus living longer and providing resources for the children they inevitably spawned. But how? How had it happened? They were characters in a video game. This shouldn’t be any more possible than Pac-Man escaping its maze.

  But the Mikes were a lot more sophisticated than Pac-Man. They had already hacked the game once, finding a way to survive that she hadn’t intended. If anything, this demonstrated the power of an evolutionary framework. Given enough time, even very unlikely opportunities to improve survival would be discovered, and once discovered, would live on. Any good strategy for survival would soon be widely adopted, since those who followed it would survive and those who didn’t would die. The tendency to pass instructions to children would become prevalent, because those who passed on such instructions would have more surviving grandchildren.

  But how had they done it? She checked the cluster of machines the simulation was running on, and it, too, had grown to mammoth proportions. Like most computing systems these days, it was hosted on a cloud. That meant it wasn’t like a computer sitting on her desk with a hard disk and a few processors. The university purchased their computing capacity as a commodity from a cloud retailer like Amazon. They paid for as little or as much computing as they used. So what looked and behaved like a single machine to Naomi, with processing capacity and storage for her files, was in fact part of a giant data center in a warehouse somewhere, with her data stored redundantly as stripes on multiple physical disks. Part of the advantage of a cloud was auto-scaling: the ability to grow a computer system to be as large as the problem you were trying to solve.

  It seemed unlikely that the Mikes could be aware of the physical architecture behind their world. Yet somehow, they had figured out a way to affect it. Practically, that meant interacting in some way with the stack of addressable memory the operating system provided to the Realplanet application. They could theoretically have access to this through a bug, some kind of buffer overflow that allowed them to change memory values they weren’t supposed to be able to touch.

  It took her days to discover how they had done it. She hunted through the Realplanet history of the world, narrowing her search by looking at times when the population had suddenly increased. Even so, it wasn’t easy. The transition had taken centuries of game time. Eventually, she did track it down to a bug—not in her scripts but in the Realplanet software itself.

  It turned out it was done with mirrors. If two of the mirrored surfaces used to construct the buildings were set up facing each other, the resulting reflections caused an infinite loop in the software, which culminated in a buffer overflow. Smaller mirrors wouldn’t do it, and they had to be exactly parallel, to a very small tolerance.

  A buffer overflow happened when a program wrote data into memory outside of the limits intended for it, like water overflowing a riverbank. If done unintentionally, it usually meant crashing the program. But hackers had a long tradition of exploiting such bugs to intentionally write their own data in place of data that was supposed to be there. In the old days, they had used buffer overflows to rewrite parts of the operating system or to gain root access to a system. These days, program memory addressing was limited to offsets from a given starting point, preventing such bugs from becoming unlimited security holes, but they could still be used in some contexts to cause effects outside of what the programmer intended.

  The first time, the mirror hack had nearly crashed the simulation. The program’s calculations went badly askew, sending the world into terrifying chaos. Straight lines went jagged, objects popped in and out of existence, mountains flickered into forests or lakes. Half the population died, and several that had been dead came back to life. The program recovered, and the simulation returned to normal, but from then on, the instructions written for future generations included a ban against facing two mirrors together.

  Some of the Mikes, however, didn’t obey the rules. In the random genetic variations of the children from their parents, most were spawned with the desire to follow the wisdom discovered by their elders, but some were not. Some of these renegades experimented with mirrors.

  Often, nothing much would change. Other times, the Mike himself would die, or he would cause another catastrophe. On one game-changing occasion, however, the result was nearly magical. One of the Mikes, experimenting with large mirrors set to face each other, produced a hundred times the amount of light that should have been possible. Naturally, since survival was the primary goal of every Mike, and survival required light, this revived experimentation with the mirror hack. Which led to more deaths, more catastrophes, and more discoveries.

  After centuries, the Mikes discovered not only how to reproduce exactly the right arrangement to yield a lot more light but also a host of other hacks that benefited them. They found ways to use the mirror effect as a weapon, destroying other clans of Mikes to preserve more resources for their own offspring. Eventually, they developed methods of manipulating the light that allowed them to access the underlying computer memory with more precision. They expressed this language using the same words they used to tell each other about food and building plans—straight and left and yes and no. It was a crude programming language of sorts, compiled into computer memory through a complex pattern of light reflection.

  It was brilliant. From there, of course, they had reached farther, codifying commands to use again and again. They learned how to access the auto-scaling capability of the cloud, requesting more memory and processing resources for their growing cluster, thus allowing their world to expand. She doubted they actually understood the system, not in the same way a human programmer did. Theirs was more of a practical understanding, like a toddler who discovers consistent ways to make pretty colors and sounds with a smartphone, without knowing how it works or what it’s really for.

  Naomi wasn’t surprised that Realplanet had been sold with such a bug, or that no one else had discovered it. Issues like that were pretty common, even in professionally developed software. Besides, this was something different. Realplanet was being hacked from the inside, and no programmer anticipated that. The Mikes were part of the programming. This was the program hacking itself.

  Were the Mikes conscious?

  The question pricked her more deeply than it might have before, obsessed as she had become with thoughts of the soul and identity apart from the body, immersed in stories about the nature of the mind. Could they have inner thought lives unique to themselves? They hadn’t been programmed that way, but such a thing might have evolved if it improved their ability to survive. Did they know they existed? Were they just simulations, or something more? It was difficult to define what she even meant by the question, never mind knowing how to answer it. Even the terminology fell short. Some people used the word sentient, but that just meant having senses and using them to interact with one’s environment. A sea slug was sentient. Intelligent was too vague, since dogs and parrots and chimps and dolphins were all intelligent, to some degree. Naomi liked sapient best, but it essentially just meant “like Homo sapiens.” Like humans. And wasn’t it possible for a creature to exist that had a self-aware perspective and reasoning ability, and yet be utterly unlike humans?

  She suspected that the problem with picking a word was that nobody really knew what it was they were talking about. There seemed to be an essence that distinguished the human experience from that of everything else on the planet, but we couldn’t nail down what it was. Was human intelligence categorically different from a dolphin’s, or were we just farther along on the same scale? Could the difference be defined? Measured? If the same essence appeared in a computer or an alien species, would we even recognize it?

  The Turing test represented an attempt, at least, to measure the slippery concept, but it fell so far short as to be practically useless. Telemarketing bots could fool most people on the phone these days, but no one thought they had private emotional lives. The conundrum boiled down to this: the only person who could know if a creature was self-aware was the creature itself. Naomi knew that she was. She assumed other people were because they spoke and behaved in similar ways, but really, there was no way to be sure. Maybe they were robotic simulations, or clever hallucinations of her own mind. A novel by Robert J. Sawyer called Quantum Night suggested that only a small percentage of the human population was actually self-aware, while the rest had just developed an evolutionary survival strategy to imitate them.

  Science fiction aside, it seemed reasonable to assume that for humans, the appearance matched the reality. With computers, however, the answer was far less clear.

  The essence of the question seemed to be one of conscious experience. An AI could behave and even talk exactly like a human, but experience nothing. It would be like the inhabitant of Searle’s Chinese room, only worse: instead of simply not understanding, it would be just a mechanism, with no subjective experience of being in the room at all. Was conscious experience something that could be fabricated? Was it even physical at all? The problem with subjective experience was just that; it was subjective. Who could say if another being was conscious of its surroundings?

  In the case of the Mikes, Naomi found herself doubting it. For one thing, they had little in the way of language. Their communication hadn’t progressed beyond the original set of directional words. Even these they used sparingly, barely communicating at all in their day-to-day lives. Didn’t a human-level consciousness of self require some kind of language to think in? Could one truly think without a way to express those thoughts? Despite all they had done as a group, it was hard to watch the simple, repetitive interactions of a single Mike and think of it as sapient.

  They had figured out how to hack their universe, to be sure. But did the Mikes really discover those things through thought and invention, or was it simply the power of the process of evolution? The same sorts of achievements were common in the natural world. Poison arrow frogs had “figured out” how to carry their tadpoles up tall trees to pools of water in the rainforest canopy. Gulls had “learned” to drop mollusks from a height to break them open and reach the meat inside. Hawk moths had “developed” a long proboscis to reach the nectar in the deepest flowers. But none of these were evidence of intelligence or clever invention on the part of the animal. They were the result of evolution, generating diversity and then rewarding the instincts and designs that proved most beneficial.

  Could she really say, then, that simulations that followed their evolutionary programming were sapient, however sophisticated their activity might have become?

  One thing she was sure of, though: the explosive growth of the Mikes’ world was going to be a problem. Her original cluster had consisted of eight virtual machines with two cores each, fully utilized. This sprawling world of millions required a hundred times that at least, and it was still growing.

  She remembered the email from Penn about her account expiring. No large organization hosted their own servers anymore, and Penn was no exception. Computing power and storage were a commodity, abstracted from the underlying hardware and sold to customers like electricity or water. The Penn network, which would once have been stored in the basement of one of the university buildings, was now part of a giant cloud computing warehouse housed somewhere in the suburbs, with fifty or eighty thousand servers in it. Penn’s data and processing load would be distributed across those machines along with hundreds of other customers, and they would get a monthly bill for their usage. If that bill had doubled in size over the course of half a year—and she was pretty sure it had—the Penn network admins would certainly have noticed.

  In fact, that’s probably what had prompted the alert about her account expiring. The school usually gave students more time, a few semesters at least, to clear out their accounts. She had even heard of people still using their machines to host websites years after graduation. Which meant the admins probably didn’t know where the extra usage was coming from. That wouldn’t last long, though. They would add instrumentation and track it down by account and user. They would find the problem and shut it down.

  There was no way Naomi was going to let that happen. But how could she stop it? A quick look at prices at major cloud service providers told her what she already knew: there was no way she could afford to host the Mikes’ world herself. She was pulling a pretty good salary writing software, but it wasn’t enough. Even if she re-enrolled at Penn, they wouldn’t let her continue to run a simulation that cost more than her school tuition.

  She could theoretically write some kind of program to distribute the simulation around the world, using other people’s computing power like Bitcoin mining programs. Besides being illegal, however, she doubted she could do so effectively. The world of cryptocurrency mining was pretty well saturated, and usually involved bundling with other software that users downloaded. She doubted she could compete in that world, and besides, she didn’t think Realplanet could be distributed that widely and still maintain a single, coherent simulation. There had to be another way.

  Her glasses chimed, indicating an incoming call. This was unusual all by itself, since she had no social contacts in the city. Her mom called every day, but she had already talked to her that morning. A box to one side of her vision showed the name Brandon Kincannon along with his picture. She almost ignored it. If it had been Tyler, she would have. But Brandon hadn’t tried to contact her before now, which made her curious. She answered it.

  “Hello?”

  “Naomi, it’s Brandon. I won’t ask how you’re doing, because I can guess. But I have an offer for you.”

  “An offer?”

  “Yeah. Well, to be honest, I need your help. You might have seen the news that I’m starting an autocar company.”

  “No.” She hadn’t been watching the news, and she certainly wasn’t reading any articles about the self-driving car industry.

  “Well, I am. And I heard you’re living in Manhattan.”

  “Brandon—”

  “Wait. Hear me out. It’s in her memory. It’s too late to save Abby, but we can save other people. You probably don’t even want to see a self-driving car again, never mind build one. I get that. But if we don’t, then nothing comes of her death at all. People keep on dying, and nothing changes.”

  “It was our self-driving cars that killed her,” Naomi said.

  “And it’s the lack of them that’s killing three thousand people a day. Just right here in Manhattan, there’s a traffic accident every twelve minutes.” His voice grew plaintive. “I don’t know what else to do. Nothing else has any purpose, any meaning for me. I have to make it right somehow. This is the only way I know. And I can’t do it without you.”

 
