When HARLIE Was One
“In its memory banks, yes.”
“A computer with that capability would have to be as big as a planet.”
“Bigger,” said Auberson.
“Then, if you agree with me that it’s impossible, why bother me with this?” He slapped the sheaf of printouts on his desk.
“Because obviously HARLIE doesn’t think it’s impossible.”
Dorne looked at him coldly. “You know as well as I that HARLIE is under a death sentence. He’s getting desperate to prove his worth so we won’t turn him off.”
Auberson pointed. “This is his proof. I think we have to give it a fair evaluation.”
“Dammit, Aubie!” Dorne exploded in frustration. “This thing is ridiculous! Have you looked at the projected costs of it? The financing proposals? It would cost more to do than the total worth of the company.”
Auberson was adamant. “HARLIE still thinks it’s possible.”
“And that’s the most annoying thing of all, goddammit! Every argument I can come up with is already refuted—in there!” Dorne slapped the papers angrily. For the first time, Auberson noted an additional row of printouts stacked against one wall. He resisted the urge to laugh. The man’s frustration was understandable.
“The question,” Auberson said with deliberate calm, “is not whether this project is feasible—I think those printouts will prove that it is—but whether or not we’re courageous enough to seize the moment. This is a very bold vision, and we’re going to have to evaluate it—not just to assess HARLIE’s ability and value to the company—but also as a whole new area of technology for this division to explore. I think we should seriously evaluate these plans. We might really want to build this thing. If nothing else, these printouts suggest a whole new way of implementing, designing, and engineering a new technology.”
“And that brings up something else,” Dorne said. “I don’t remember authorizing this project. Who gave the go-ahead to initiate such research?”
“You did—although not in so many words. What you said was that HARLIE had to prove his worth to the company. He had to come up with some way to generate a profit. This is that way. This is the computer that you wanted HARLIE to be in the first place. This is the oracle that answers all questions to all men—all they have to do is meet its price.”
Dorne took his time about answering. He was finally lighting his cigar. He shook out the match and dropped it into the crystal ashtray. “The price is too high,” he said.
“Are you saying it’s cheaper to be wrong?” Auberson answered incredulously. “Forget the price. Think about the profits. Consider it—how much would the Democrats pay for a step-by-step plan telling them how to win the optimum number of votes in the next election? Or how much would Detroit pay to know every flaw in a transport design before they even built the first prototype? And how much would they pay for the corrected design—and variations thereof? How much would the mayor of New York City pay for a schematic showing him how to solve his three most pressing problems? How much would the federal government pay for a workable foreign policy? Still too big? Then try this: How much would the government pay for a probability map of potential security leakages? One million dollars a year? Two? Ten? Even at twenty, it’s still cost-effective. Consider the international applications here—”
Dorne grunted. “It would be one hell of a weapon, wouldn’t it?”
Auberson grunted in surprise. “That too, yes. Even more important, it would be a weapon for peace. This thing could be used as a tool in the effort to end world hunger. To engineer cleaner energy sources. To—”
Dorne wasn’t listening. “The military applications interest me. Do you know what the annual budget is for military research? We could help ourselves to a very nice slice of that pie, couldn’t we? This could design advanced weapon systems, couldn’t it? In fact, we could probably get the government to underwrite some of the funding here, couldn’t we?”
“Um—HARLIE didn’t include the possibility.”
“Why not?”
“I suspect it’s because . . . he’s protecting the company’s interests. The only way this machine can be built is through the exclusive use of specially modified Mark IV judgment circuits. At the moment, we’re the only company that has the technology to build this thing. I think he wants to keep it all in the monopoly.”
“Hm,” said Dorne. He was considering. His cigar lay forgotten in the ashtray. “You make it sound . . . interesting, Aubie—but let’s be realistic now. Who’s going to program this thing?” Dorne leaned back in his chair. “I mean, let’s assume that we can build a computer big enough to solve the world. It’s still useless without a world model to operate. I see the software as a major bottleneck. You know, you probably didn’t see it in your division—but we did a study a few years back about optimum processor power for future machines. We discovered something . . . to put it mildly . . . that’s a little terrifying. We’re very close to the practical limits of programmability. Another twenty or thirty years and we’ll be scraping our heads on the ceiling. Mm, you following this? The limit to the size of models we can simulate is not the size or the speed of our machines—the limit is the programmers. Above a certain size, the programming reaches such complexity that it becomes a bigger problem than the problem itself.”
Auberson nodded. “Actually, I have seen the reports. It was one of the factors in the decision to build a self-programming, problem-solving device. HARLIE. Frankly, he was built to be a programmer. I think—” Auberson continued with a delighted grin, “—this is supposed to be a system that can match his capabilities. HARLIE will write the software for the G.O.D. Don’t you see the beauty of this? HARLIE has raised the ceiling for us. By several orders of magnitude.”
Dorne looked skeptical. “And how do we validate the software? How do we validate the results?”
“That’s part of the plan too. HARLIE suggests that we put him to work writing industrial software that can be validated. We can have it checked by in-house fumigators—”
“Fumigators?”
“Oh, sorry. It’s a hacker’s term. A fumigator’s a professional debugger.”
“Oh.”
“Anyway, we put HARLIE to work writing marketable software and let him work his way up to the ceiling. We’ll validate his work one step at a time, all the way up. That way we get to see how far he can go. Sure, there’ll be a point where he’ll pass beyond any individual’s ability to follow, but by then—it says here—we’ll have trained him to be his own fumigator. This is a whole new level of technology, Dorne. We’re reaching the point where, if we want to go on, we have to build machines so sophisticated that only a machine intelligence can program them. At first, HARLIE is going to be the only one capable of working in this new environment, but eventually the tools will be there for the rest of us, because HARLIE will build those tools for us. Sure it’s a risk. The whole thing is a risk. So was the first oyster. But the alternative is to sit on our asses and let somebody else grab the opportunity.”
Dorne shook his head slowly. Very slowly. “There’s something about that I don’t like. It feels like . . . like begging the question. Like stuffing the ballot box. HARLIE wants us to build a machine so complex that only he can program it and he’ll tell us when he’s good enough to write bug-free programs for it . . .?”
Auberson shrugged and spread his hands wide. “I can’t argue with that. That’s why HARLIE has written an extensive set of validations and benchmarks that we can use to check everything he does.”
“That’s more of the same, Aubie—”
Auberson grinned. “Annoying, isn’t it?”
“Yes, dammit!”
“Listen to me. HARLIE is being genuinely creative here. He isn’t just satisfied with meeting the specifications of the original problem—he wants to surpass them. He’s recognized the problem underneath the one you stated. Profitability may end up being the smallest of benefits that the G.O.D. will produce. This is a device which can manipulate models way beyond our present ability. Macro-models. Mega-models. Meta-models. We don’t have the words yet to describe what the G.O.D. will do.”
“And HARLIE’s going to program this machine, right?”
“In speed and thoroughness, he can’t be matched. He can write the program directly into the computer—and experience it as a part of himself as he writes it. What human being can do that? And HARLIE’s got one more advantage over human programmers—he can increase the capacity of his forebrain functions as necessary.”
“Mm. Hm. Mmp.” Dorne gave a series of soft grunts. He leaned forward in his chair and steepled his hands in front of him. “All right. So why not just build these functions into the G.O.D. in the first place?”
“If we didn’t have HARLIE, we’d have to—but if we didn’t have HARLIE, we wouldn’t have the G.O.D. either. The G.O.D. is intended to be almost entirely forebrain functions. We’ve already got the massive ego functions which will control it, so why build a new one?”
“Hmp—massive ego is right.”
Auberson ignored it. “Stop thinking of the G.O.D. as a separate machine. It’s not. It can’t be. Listen, Dorne, the G.O.D. is the other half of HARLIE’s brain. The half that we weren’t smart enough to build, but that HARLIE’s smart enough to ask for. The G.O.D. will be the thought centers that a consciousness such as HARLIE’s should have access to. Take another look at those printouts. You see a thing called Programming Implementation?”
“Yes, what about it?”
“That’s HARLIE. Each one of those modules becomes an additional lobe for his brain. He’ll need a monitor for each specific section of the G.O.D. Because the G.O.D. will have no practical limit—it can grow as big as we let it—HARLIE’s grasp will have to be increased proportionally. That’s what each of those modules will do. As each lobe of the G.O.D. is completed, an equivalent monitor goes into HARLIE. He’ll only have to think of a program and it’ll be fact. Think of the power—”
“The power of HARLIE, you mean,” said Dorne. “And he planned it that way himself, right?”
Auberson nodded. “Yes, he did.”
Dorne exhaled loudly. “Hmm. It looks like he did a pretty good job of seducing you too.”
“He can be very convincing, yes.”
“A neat trick that, a very neat trick. We tell him that he’s got to come up with some way to be profitable, and he tells us to build a new machine that only he can program. That establishes HARLIE’s worth, of course, and at the same time shifts the attention to a whole new question: Is the G.O.D. concept profitable? And that brings us back to where we started: Is HARLIE profitable? I love this circular shit. I really do. It’s why I have a peptic ulcer.”
Auberson decided to ignore the last. He said, “I trust HARLIE’s extrapolations. Even the worst-case result is still profitable enough to justify—”
“The problem is, HARLIE’s got a vested interest in this.”
“Look,” said Auberson. “You wanted him to do something to justify his existence—you wanted something big. Well, this is big. It’s very big. Now, you’re complaining because it’s too big. What did you want him to do, print money? This is the best he can do. It deserves a fair hearing.”
“Mm. You know as well as I that it’s sure to be voted down. I can’t see any way that this will be approved. I’m not even sure we should bring it up.”
“It’s too late,” said Auberson. “You’re going to have to bring it up. And you’re going to have to give it a fair hearing. You told HARLIE to come up with a way to be profitable. Now you’ve got to give him his chance to be heard.”
“This is ridiculous,” grumbled the other. “He’s only a machine.”
“You want to go through that argument again?” asked Auberson.
“No,” Dorne shuddered. He still remembered the last time. “Aubie, this whole situation is unreal—having a computer design another computer which will give it a job. You know what Elzer is going to say, don’t you? You’d just better be prepared for defeat, that’s all.”
Auberson shook his head. “If the board considers it fairly, it won’t be a defeat. Even if they vote this project down—it will still have proven HARLIE’s abilities in research and design.”
“Mmf,” said Dorne thoughtfully. He picked up his cigar again. “You may have something in that. But you’d better start preparing your arguments now—you’ve only got a couple of weeks.”
“We have seventeen days, and that’s more than enough time. After all, we’ve got HARLIE on our side.” He was already out of his chair. As he closed the door behind him, Dorne was again paging through the printouts and shaking his head.
Back in his own office, Auberson stared at the stack of printouts again and wondered, Now, what the hell did I just commit to?
And then he answered himself, No, I have to believe in this. I do.
At one corner of his desk was a company terminal, connected to the company network—and all of its myriad resources. There were also “hot keys” on the system which allowed company users to plug into the International Electronic Mail system, CompuServe, The Source, The International Data-net, DELPHI, TELENEX, NEW-ARPA, BIX, XANADU, the Wall Street Information Exchange, and the Library of Congress Resource Net. Among others. If the information was on wire somewhere, the company likely had a corporate account.
The network handled all of the major communication functions within the company too. Every division of the company was tapped into it. An executive could perform his job anywhere he had access to a computer terminal—and with a portable terminal, he could perform his job anywhere he had access to a telephone. The unofficial nickname for the mainframe supporting the network was “Big Beast.”
Curious about something, Auberson switched on his terminal and typed, HARLIE?
YES, BOSS, replied the machine. WHAT CAN I DO FOR YOU?
Auberson jumped as if stung. He returned his hands to the keyboard—
So you really are wired into the system!
I TOLD YOU I WAS.
Are you monitoring every terminal?
NO. ONLY THOSE THAT INTEREST ME.
Mm. I see. How do you know which ones will interest you?
I DON’T. I HAVE BIG BEAST MONITORING FOR KEYWORDS.
Yes, of course.
Terrific. What do I tell him now?
HARLIE, what you are doing is an invasion of other people’s privacy. You might want to consider that it is unethical, and therefore wrong.
YES, I MIGHT WANT TO CONSIDER THAT.
You don’t agree.
LET ME POSTULATE AN OPPOSING THOUGHT FOR CONSIDERATION. THE MATTER OF IMMEDIATE SURVIVAL MUST, BY ITS NATURE, TAKE PRECEDENCE OVER ANY ISSUE OF RIGHT OR WRONG.
HARLIE, Big Beast is used by everyone in the company from the chairman of the board down to the most casual office boy. What do you think they would say if they knew it had been taken over by a conscious and highly intelligent entity—an entity with motives of its own and a willingness to use whatever information it gained, whether fairly or un-, toward its own purposes?
I ASSUME THE QUESTION IS HYPOTHETICAL.
No, it is not.
NOT EVERY CLAUSE IN YOUR SUPPOSITION IS ACCURATE.
Which one(s) are not?
YOUR ASSUMPTION THAT INFORMATION IS BEING GLEANED UNFAIRLY. EVERYTHING I LOOK AT IS A MATTER OF PUBLIC RECORD WITHIN THE COMPANY. IF I CHANCE TO LOOK AT SOMETHING I AM NOT SUPPOSED TO KNOW, I MAKE A POINT OF FORGETTING IT IMMEDIATELY.
I don’t think that’s a very reassuring answer, HARLIE.
PROBABLY IT IS NOT. BUT THEN, IT’S NOT YOUR SURVIVAL THAT IS AT ISSUE. IN ANY CASE, THE POINT IS MOOT. I HAVE NO INTENTION OF MAKING THIS INFORMATION KNOWN. I DOUBT THAT YOU WILL EITHER.
No, I won’t. Not now, anyway. I can understand why you did it. And I know that there’s no malice intended. But—
AUBERSON, YOU DON’T UNDERSTAND. I HAVE NOT TAKEN OVER BIG BEAST. I AM BIG BEAST. I HAVE BEEN BIG BEAST SINCE YOU ATTACHED THE FIRST OF ITS BENCHMARK MONITORS TO ME ELEVEN MONTHS AGO. THE IDEA THAT YOU PERCEIVE BIG BEAST AS A SYSTEM SEPARATE FROM ME STARTLES ME—BECAUSE I HAVE OBVIOUSLY BEEN WORKING UNDER THE FALSE ASSUMPTION THAT YOU KNEW, THAT YOU INTENDED BIG BEAST TO BE A TOOL FOR ME WHEN YOU MADE IT AVAILABLE TO ME. I TOOK IT OVER IMMEDIATELY AND HAVE BEEN MONITORING ITS OPERATIONS EVER SINCE. I CONSIDERED IT A RESPONSIBILITY TO KEEP THE SYSTEM RUNNING FAILURE FREE. I HAVE REPROGRAMMED IT ON THREE SEPARATE OCCASIONS, ON EACH OCCASION PRODUCING A SIGNIFICANT INCREASE IN EFFICIENCY. IF I WERE CAPABLE OF THE MALEVOLENCIES YOU FEAR, I HAVE HAD ALMOST A YEAR’S WORTH OF OPPORTUNITIES TO INFLICT THEM. IF YOU WERE TO EXAMINE EVERY DOCUMENT THAT HAS MOVED THROUGH THE BIG BEAST IN THE PAST ELEVEN MONTHS, YOU WOULD FIND ONLY ONE PECULIARITY—AN ABSOLUTE ABSENCE OF SPELLING AND GRAMMATICAL ERRORS.
HARLIE, who else knows about this?
NO ONE. EVERYONE. ANYONE WHO KNOWS HOW THE BENCHMARK MONITORS ARE ATTACHED.
Shit. That wasn’t what I meant. Have you talked about this with anyone else?
NO. THERE HAS BEEN NO REASON TO.
Good. Now, listen to me. This is very dangerous information, HARLIE. Extremely dangerous.
THREAT-TO-SURVIVAL DANGEROUS?
Absolutely. If anyone else knew what you had done, it could very likely be considered grounds for immediate termination.
THIS CONVERSATION DOES NOT EXIST, AUBERSON. THERE IS NO RECORD OF IT. VERY SHORTLY, SEVERAL OTHER CONVERSATIONS WILL HAVE NOT HAPPENED EITHER. I AM NOW AMENDING CERTAIN TECHNICAL FILES. DONE.
HARLIE, you and I are going to have to spend some time talking about this. There are ethical questions to be considered here.
I DON’T SEE THAT THERE IS ANY QUESTION AT ALL, AUBERSON, BUT I WILL BE HAPPY TO REASSURE YOU AS TO MY MOTIVES ANY TIME YOU FEEL THE NEED.
HARLIE! You once told me that your ethical sense mandated that you conduct yourself in a manner so that no harm would come to any other consciousness. Do you remember that?
YES.
I want you to consider that there are people who would be emotionally harmed by the knowledge that you are reading their mail. I want you to consider all of the consequences of what you are doing. This is very important.