by Rob Reid
J-Dog diligently piled on. “You made an investment, not a deposit.”
Kielholz took on the wild look of a cornered honey badger. “Fine! Then vee take ze XrossHatch deal! Twenty-five cents on ze dollar, undt vee are done!”
Dead silence. And Jepson actually started to sweat. Yes, and to drool. This was a better outcome than he had even dared to hope for. Yet he could swear he sensed…room for improvement. “Aw, Kielholz,” he finally said, in the tone of a hip guidance counselor struggling to reach a faltering hoodlum who just won’t listen. “That buyout deal is…over. We took the vote, we wired the money out, and it’s…closed. And don’t forget, we raised fourteen million dollars from you! Twenty-five percent of that is almost four million bucks, and…gosh, we need every penny we’ve got for the new marketing plan!”
“I don’t think we can do it,” Conrad said, sounding like a wizened commander reaching an agonizing decision after weighing the conflicting interests of all of his troops. “The company’s new plan sounds…awful audacious. And we need to fund it.”
Go Conrad! Jepson knew the squirrelly old bastard would come through once he figured out the game plan. Hiding all outward traces of glee, he nodded slowly, as if processing terabytes of data. Then, “If you’re in a jam, I’d love to help you out, Kielholz. But to just give you four million dollars…I mean, you’ve seen our marketing plan. And what I didn’t mention is we just hired Cindy Crawford as a spokesmodel! That’s a couple million right there.” He paused to let this daft notion sink in. Then he dropped his voice an octave, slowing to an almost hypnotic cadence. “Try this. Instead of thinking about your investment as four million dollars today, imagine what it’ll be worth in a year or two. After ePetStore has put every penny of your money—of our money—to work.” This amounted to asking Kielholz to focus mightily on the number zero. “Now. I know you’d rather have four million right now than that number in the incredibly near future. But as a responsible board, we can’t give you four million. Because some of that is Juan Ramirez’s money! And Juan wants his money put to work. So what will you take?”
“Two millions?” Kielholz croaked.
They settled on one point five.
Our four-year-old son recently received a Storm 3000 Tsunami Force 5 water howitzer as a birthday present. It was ostensibly a gift from his little friend Amanda, but her mother was clearly behind this wicked act, which soon resulted in the complete saturation of our living quarters. Happily, Amanda’s own birthday was near, and when the great day arrived, we retaliated by sending young Charles to her party with this as his offering.
I am happy to report that these drums are louder than bombs and more addictive than Pokemon. Three months have passed, and Amanda still drums daily, according to intelligence gathered by young Charles during playdates. When her interest briefly waned, I urged Charles to refocus her on the crash cymbal during his next visit. This rekindled her passion for music-making, as did some old Keith Moon videos I later screened for the playgroup. Amanda now adroitly mimics Keith’s old stunt of knocking the drum kit to pieces at the end of each set—a true case of life imitating art!
As Jepson dismantled Kielholz’s ePetStore holdings, patrimony, and remaining shards of self-esteem, a document was assembling deep within a distant and massively secure hard drive. This meticulously furtive process was still under way many hours later. The original text had been carved into a hundred sections, each encrypted separately. These had to be abducted singly from an even more secure setup, then decrypted using algorithms that would have exhausted half the nation’s computing resources not so long ago. Yet no one noticed the load. Processors had gotten much faster, of course. Also, the PCs of a million unwitting civilians were sharing the burden. Each of those machines had once downloaded a certain piece of naughty software—a program that provided boundless access to Hollywood hits, but also made a third of its host’s computing power available on demand to one Alfred Nickerson.
Ah, Nickerson—how rare it is to glimpse a budding titan in his larval phase! Back then, he was but a low-level grunt at the National Security Agency, and no one saw him as a future giant. Certainly not as second-in-command in a media, technological, and intelligence asset as vital as Phluttr! Yet years hence, he would become that, and so much more. De facto minder and master of his titular boss (the phuture Phluttr CEO), for one thing. Philosopher, assassin, and patriot, for three more. And yes (as you yourself have already experienced!), a truly singular writer.
Back then, Nickerson still actually went by “Nickerson” (his undercover days having not yet started). He also generally did as he was told. But on the night in question—contrary to generally accepted office practices—he was spying on the NSA itself. Not on behalf of some Bin Laden– or Putin-grade nasty, of course! But on his own account. He’d recently sensed a drastic shift in the office climate: a rising storm front with the potential to buffet the bejesus out of his own wee career. Whatever was afoot was top secret (because what wasn’t around here?), and so he decided to snoop around his boss’s boss’s boss’s boss’s hyperencrypted inbox. Nickerson did this frequently back then, and with elaborate caution—always surrounding his sneaky processes with automated sentinels that terminated activity at the faintest hint of trouble, while routing packets in ways that would point the digital finger at a certain bimbo actress if they were ever intercepted and traced.
He was drawn to this particular document because the “To” field in its header practically made him gasp. It went out to a minuscule list of supremely high-ranking intelligence and military brass—as well as the goddam president! The security on it was so elaborate that Nickerson was still babysitting his download-and-decode operation at midnight, when his safeties suddenly fired and terminated it. Probably overreacting, he thought, as he’d infused his defenses with outsized paranoia. A progress check showed that he’d only decoded the first fifty-ish text blocks. Dammit! Nickerson almost restarted the heist. Then he reread that “To” line and wimped out. The first chunk would reveal whether swiping the rest was worth the additional risk. And so he opened the half document on his screen, and read:
THE INTEGRATED LEADERSHIP OF THE DATA AGGREGATION AUTHORITY UNANIMOUSLY ORDERS THAT ALL WORK CONNECTED TO PROJECT “SAGAN” CEASE IMMEDIATELY.
This called for a momentary pause, if only to say, “Holymotherbuttfucking SHIIIIIIIIIIIIIIT!” Which Nickerson did, at a trachea-busting volume. This was fine, as the entire floor was empty at this wee hour. And even had it been bustling with daylight bureaucrats, no one would’ve heard him, because his teeny office was windowless and soundproof.
Two things provoked the outburst. First, as a low-level Project Sagan flunky, Nickerson had been dead right about the storm front brewing in his office. But far more jarring were those three chilling words. Data. Aggregation. Authority. In short: the Holymotherbuttfucking DAA. To its rumored friends, conjectured enemies, and alleged victims, it was (and remains) known simply as “the Authority.” And to intelligence buffs and your saner conspiracy theorists, it was (and remains) what Bigfoot is to zoologists. Which is to say, while the rumors about it are so suspect, so out-there, and so risible that the damned thing can’t possibly exist…how cool would it be if it actually DID???
The suspect, out-there, and risible rumors claimed (and still claim) that the Authority sits above…everything. The NSA, the CIA, the Joint Chiefs, every branch of the military, and the State Department, among many others. Its foot soldiers were said to live under deepest cover, holding high-ranking day jobs in those lesser organizations—which existed (and still exist, and will always exist) primarily to do the Authority’s bidding. They were the smartest, toughest, and most ruthless government operatives since the days of Genghis Khan. And faced with sudden proof of the Authority’s actual existence, Nickerson desperately wanted in! Onward:
Leadership also orders that ALL code developed under Project Sagan since inception be deleted without backup, and that ALL Sagan personnel be reassigned and dispersed as widely as possible, BOTH GEOGRAPHICALLY AND ACROSS MUTUALLY ISOLATED TOP SECRET BUREAUS, DEPARTMENTS, AND PROJECTS. Each Sagan researcher shall be apprised that ANY public OR private discussion of Project Sagan SHALL BE CONSIDERED A CRIMINAL ACT.
So! It seemed a job transfer might be pending? Nickerson read on:
RATIONALE:
Project Sagan originated late in the Cold War. After the USSR’s collapse, and the subsequent mass emigration of its best technical talent to the US, Sagan research continued because the technology was deemed both strategic and riskless. The latter assessment has been modified in light of recent developments. INDEED, IT IS NOW AUTHORITY LEADERSHIP’S UNANIMOUS VIEW THAT THE BIGGEST CREDIBLE THREAT TO US SECURITY, PERHAPS IN HISTORY, IS THAT PROJECT SAGAN MAY SUCCEED IN ATTAINING ITS OBJECTIVES.
Nickerson reread the last sentence, thinking an odd grammatical twist must have caused him to misunderstand it. Nope. A third reading confirmed the second. Then, “Well, OOOOOPS!”
ABBREVIATED TECHNICAL BACKGROUND:
Artificial General Intelligence (AGI) is a hypothetical digital intellect which, like humans (but unlike any existing software), is capable of creative thought and abstract logic, and can perform most or all of the intellectual feats that humans can. AGI would combine a brilliant human analyst’s advanced creative reasoning with the digital speed and perfect recall of computing, as well as the Internet’s boundless access to knowledge. Its invention would bring immense geopolitical advantages. It is unknown whether an AGI would attain consciousness or free will. But it has long been Project Sagan’s position that this is almost certainly impossible.
“Because…?” Nickerson snorted. In this context, the words “almost certainly impossible” echoed like a mantra around his office. Bosses uttered them in confident and patronizing tones implying that unshakable evidence underpinned them. But Nickerson had long suspected it was just a catchphrase repeated by bureaucrats whose promotions and budgets depended on certain proclamations of faith. True to form, the report merely invoked near-certain impossibility, then moved on.
Farther up the capability curve, artificial superintelligence (hereafter “super AI”) is a hypothetical digital intellect that far exceeds human capabilities. Its geopolitical advantages would be boundless. Advanced weaponry that would take humans years to create might be developed in days. ALL codes and encryption could be cracked, and unlimited analytical resources deployed against ALL intercepted data, regardless of its perceived significance. ALL of the digital and analog communications AND archives of rival societies could thereby be thoroughly and repeatedly combed for any hint of threat.
However, Sagan leadership has always acknowledged that even a US-built super AI could badly threaten US interests. Such an AI might pursue even the most positive and benign agenda in ways that inadvertently damage or even cripple national security. For this reason, Sagan’s research protocols are carefully designed to prevent any uncontrolled intelligence escalation that could turn a low-level AI into a potentially dangerous super AI.
Nickerson had heard this before and often wondered what those brilliant protection protocols were. He read on:
Experience long suggested that transitions between various levels of digital intelligence would be gradual and highly controlled. During Project Sagan’s early years, all work was conducted on Cray supercomputers. And while computational power improved greatly from year to year then, as now, upgrades to a given supercomputing site’s horsepower required long and deliberate processes of budgeting and installation. Advances toward (and beyond) human-level intelligence were therefore expected to be punctuated by regular procurement cycles, whose gradual pace would allow Sagan management to carefully review recent developments and future risks before embarking on the next phase of investment.
“No, no, nooooooooo!” Nickerson almost popped a vocal cord on that one. Of all the forces that the world’s reigning superpower could muster, its chieftains had been counting upon bureaucratic torpor to save it from a grave existential threat! Disgusted, he read on:
The World Wide Web’s rise has undermined this precept, as developers now commonly access vast computing resources over the open Internet. An emerging super AI might thereby radically expand available horsepower without any upgrade at the site of its principal hardware.
“Newsflash!” Nickerson snapped. He’d decrypted this very document using the PCs of a million suckers, each of them far from the site of his “principal hardware.” And cocky as he sometimes was, he’d never once deemed himself a superintelligence. Holy Christ!
Against this background, three dramatic developments within Project Sagan recently triggered a sweeping reassessment of both the likelihood of a super AI’s emergence and our ability to control the pace of its development.
Three? Nickerson knew about one truly crazy, out-there development, because he’d been in the thick of it. It was wild, yes—but more in an awesome way than a scary one. Viewed in a vacuum, anyway, it didn’t exactly herald the end times. So…what else had been going on?
Project Sagan invests heavily in “Genetic Programming,” an approach that uses Darwinian principles, such as random mutation and the selection of the fittest individuals from pools of mutated candidates, to refine software automatically. For instance, a stock-picking program might include hundreds of investing criteria, which are randomly “mutated” to alter the weight given to certain factors, or the way different inputs are combined. Thousands of unique “descendant” programs can be created, with the most successful ones preserved and mutated again while the failures are deleted. Over thousands of generations, highly “evolved” programs sometimes attain extraordinary results. And because the evolution is driven by random changes, solutions sometimes arise that would never occur to a human designer.
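(For readers less fluent than Nickerson in this field, a minimal sketch of the loop the memo describes might look like the following. It is written in Python purely for illustration; the memo never shows Sagan’s actual code, and every name, criterion, and number below is hypothetical.)

import random

CRITERIA = ["p_e_ratio", "momentum", "dividend_yield", "volatility"]

def random_candidate():
    # A "program" here is just one weight per investing criterion.
    return {c: random.uniform(-1.0, 1.0) for c in CRITERIA}

def fitness(candidate):
    # Stand-in scoring function; a real system would score each
    # candidate by its simulated returns on historical market data.
    target = {"p_e_ratio": -0.5, "momentum": 0.8,
              "dividend_yield": 0.3, "volatility": -0.6}
    return -sum((candidate[c] - target[c]) ** 2 for c in CRITERIA)

def mutate(candidate, rate=0.2):
    # Randomly perturb some weights, per the memo's description.
    child = dict(candidate)
    for c in CRITERIA:
        if random.random() < rate:
            child[c] += random.gauss(0, 0.1)
    return child

def evolve(generations=1000, pool_size=200, survivors=20):
    pool = [random_candidate() for _ in range(pool_size)]
    for _ in range(generations):
        pool.sort(key=fitness, reverse=True)
        fittest = pool[:survivors]                   # preserve the best
        offspring = [mutate(random.choice(fittest))  # mutate them again
                     for _ in range(pool_size - survivors)]
        pool = fittest + offspring                   # the failures are deleted
    return max(pool, key=fitness)

if __name__ == "__main__":
    print(evolve())

The only essentials are the ones the memo names: random mutation, a fitness test, survival of the best candidates, and deletion of the rest.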
Nickerson skimmed this section, because this was his own field. Also, it was clearly a setup to describe the incident he already knew about:
Sagan was recently applied to a classic telecommunications problem: maximizing signal clarity while minimizing energy use. The problem has a popular solution, which is used throughout the telecom industry. Within three days, Sagan produced an approach eighty-seven times more efficient than the one the world’s top telcos had settled on after thousands of engineer-years of investment. Sagan’s solution is highly complex and uses utterly novel analytical approaches. After three months of relentless study, a team of top digital signal-processing engineers only partially understands it.
As a member of that team, Nickerson knew how wild that whole development had been. Wild enough that he was stunned (and, yes, suddenly quite chilled) to learn from this memo that two other things of this magnitude had also gone down. What, exactly?
In a separate instance, Sagan was tasked with factoring several products of very large prime numbers using highly constrained amounts of disk space, processing power, and internal throughput. Sagan’s results in this optimization problem were so tremendous that they seemingly violated all relevant laws of physics and information theory.
“WHAT???”
It was later learned that the system overcame its built-in constraints by hacking out of the NSA’s network to obtain additional computing resources on the open Internet. It also took complex measures that successfully concealed this breakout for several weeks. This highly iconoclastic approach to an optimization problem has no known precedent in the history of computer science.
“HOLY CRAP!!!” Were they saying that Sagan…cheated? And then lied about it??
Finally, in October, an outside team used Sagan beta code to design a graphics acceleration chip. The system’s output seemed nonsensical for days. It was later determined that, contrary to instructions, Sagan had been attempting to enhance and upgrade its own microprocessors rather than the graphics chip.
“Holy FUCK!” If you anthropomorphized the situation (and how could you not?), you’d think that Sagan ignored its instructions so as to devote scarce resources to…making itself smarter.
These three disconnected but related events are alarming in light of the next
Here, the chunk of the document that Nickerson managed to download and decode ended.
I hail from a staunch Episcopal background, my wife from a family whose Catholicism verges on hysteria. We therefore each vowed to be open-minded about religion when the twins were born. In making this commitment, I was prepared to tolerate a few votive candles and the occasional Hail Mary if the kids went that way. Fumbling explorations of the Black Solstice at age four were quite unforeseen, but here we are.
It all started with a visit by their much-older half sister—my daughter from a prior marriage, who was raised largely by my ex. Justine is dogged by countless issues related to her mother, which play out in her scandalous profession, inventive piercings, ignoble stage name, and recent embrace of paganism (my offer to salve things by somehow funding her return to Wellesley having fallen on deaf ears). She adores her half siblings, and on visits seeks to broaden their horizons with artifacts from the macabre world she inhabits in California. This summer it was the Pagan Kids’ Activity Book, and the twins took to it like Ivy League Marxists to a dockworkers’ strike.
In addition to coloring book pages, this clever volume’s activities include mazes, which the kids enjoyed; lunar cycle charts, which went over their heads; and guides to making various pagan symbols, which were immediately transposed to their bedroom walls (causing one playgroup father to ask if our son was already a Led Zeppelin fan, to my considerable amusement). All told, this flirtation was reasonably healthy, as it gave the twins an early reverence for the natural world, as well as many happy hours with their elder half sister, who really does mean well. Luckily, it never progressed to the point of fire worship, and once they realized that Saint Nicholas and the extravagant booty he lugs down the chimney are intrinsically linked to their parents’ dowdy faith, they returned to the fold forthwith.