Mindfuck


by Christopher Wylie


  But as CA rapidly grew after Mercer’s investment, I hadn’t fully grasped the scale of the race projects we were involved in. The new managers that Nix and Bannon hired started excluding me, and I stopped being automatically invited to project planning meetings. I thought this was another power trip by Nix, so I simply felt annoyed rather than suspicious. But one of the psychologists on the team started coming to me to show me some of the new race projects. He showed me the master document of research questions being fielded in America, and my stomach dropped when I started reading. We were testing how to use cognitive biases as a gateway to move people’s perceptions of racial out-groups. We were using questions and images clearly designed to elicit racism in our subjects. As I watched a video of a participant in one of the field experiments, a man who’d been provoked by a CA researcher’s guided questioning into spasms of rage, racist insults flying from his mouth, I started to confront what I was helping to build.

  In our invasion of America, we were purposefully activating the worst in people, from paranoia to racism. I immediately wondered if this was what Stanley Milgram felt like watching his research subjects. We were doing it in service to men whose values were in total opposition to mine. Bannon and Mercer were more than happy to hire the very people they sought to oppress—queers, immigrants, women, Jews, Muslims, and people of color—so that they could weaponize our insights and experiences to advance their causes. I was no longer working at a firm that fought against radical extremists who shackled women, brutalized nonbelievers, and tortured gays; I was now working for extremists who wanted to build their very own dystopia in America and Europe. Nix knew this and didn’t even care. For the cheap thrill of sealing another deal, he had begun entertaining bigots and homophobes, expecting his staff not only to look the other way but to betray our own people.

  In the end, we were creating a machine to contaminate America with hate and cultish paranoia, and I could no longer ignore the immorality and illegality of it all. I did not want to be a collaborator.

  Then, in August 2014, something terrible happened. A veteran SCL staffer, a longtime friend and confidant of Nix’s, returned from Africa severely ill with malaria. He came into the office red-eyed and sweating profusely, slurring his words and talking nonsense. After Nix shouted at him for being late, the rest of us urged him to go to the hospital. But before he could be seen at the hospital, he collapsed and tumbled down a flight of stairs, smashing his head hard on the concrete. He slipped into a coma. His brain swelled and part of his skull was removed. His doctors worried that his cognitive functioning might never be the same.

  After Nix returned from visiting the hospital, he asked HR for guidance on liability insurance and how long he had to keep paying his loyal friend, still in a coma and missing part of his skull. This seemed callous in the extreme. It was in that moment that I realized Nix was a monster. Worse, I knew he wasn’t alone.

  Bannon was also a monster. And soon enough, were I to stay, I worried that I would become a monster, too.

  The social and cultural research I’d been enjoying only a few months before had given birth to this thing—and it was terrifying. It is hard to explain what the atmosphere was like, but it was as if everyone had become detached from the realities of what we were doing. But I had snapped out of the daze and was now watching a revolting idea become real. My head cleared, and the real-world consequences of Nix’s evil dreams began to haunt me. Late into the evening, unable to sleep, I would stare at the ceiling, my thoughts stalled between agony and bewilderment. One night, I called my parents in Canada, at 3 A.M. their time, to ask for advice. “Read the signs,” they said. “If you can’t sleep—if you’re making calls at all hours in a panic for answers—then you know what you should do.”

  I told Nix that I was leaving. I wanted to get away from his psychopathic vision—and that of Bannon—as fast as I could. Otherwise I risked catching the same disease of mind and spirit.

  Nix countered by appealing to my sense of loyalty. He made me think that I would be a bad person if I abandoned my friends at the firm. I was the one who had recruited people to work on Bannon’s project. They trusted me, and I didn’t want to betray them.

  “Chris, you cannot leave me alone here with Nix,” said Mark Gettleson, who had joined the company in large part to work with me. “If you go, I go.”

  I didn’t like the idea of walking out on my friends and colleagues, but I hated what Cambridge Analytica had become and what it was doing in the world. I told Nix that we could discuss how I would be phased out, but that I was definitely leaving. He did what came naturally—he took me to lunch.

  The restaurant was in Green Park, not far from Buckingham Palace. As soon as we sat down, Nix said, “All right, then. I was expecting we would have this conversation eventually. How much do you want?”

  I told him it wasn’t about the money.

  “Come on,” he said. “I’ve run this firm long enough to know it’s always about the money.”

  He mentioned that I’d never asked for a raise, unlike some of my colleagues, despite how little he’d been paying me. And it was true: I had one of the lower salaries in the office, about half of what others were making, whereas recruits for Project Ripon were taking home triple to quadruple that. When I shook my head, Nix said, “Fine. I’ll just double your salary. That should do it.”

  “Alexander,” I said, “this is not some game I’m playing. I am leaving. I don’t want to work here anymore. I’m done with whatever this is.” My tone deepened, and he seemed to finally realize I meant it, because then he leaned toward me and said, “But Chris, this is your baby. And I know you. You wouldn’t abandon your baby out in the streets, would you?” He must have sensed an opening, because he took the idea and ran with it. “It’s just been born. Don’t you want to see it grow up? To know what school it goes to? If we can get it into Eton? To see what it accomplishes in life?”

  He seemed pleased with the metaphorical flourish, but I wasn’t the least bit moved. I told him that I felt less like a father than a sperm donor, with no power to keep the baby from growing into a hateful child. Nix quickly pivoted, suggesting we set up a Cambridge Analytica “fashion division.”

  “Jesus Christ, Alexander. Are you serious? Psychological warfare, the Tea Party…and fucking fashion trends? No, Alexander. That’s ridiculous.”

  Finally, he got angry. “You’ll end up being the fifth Beatle,” he said.

  The fifth beetle? I thought. Was this some kind of Egyptian parable? Something to do with scarabs? What in the world was he talking about? Not until later did I realize he was talking about the band that had formed three decades before I was born.

  Even after I met him halfway, agreeing to stay on until the midterms in early November, Nix continued to insist that I was making a mistake.

  “You don’t even understand the enormity of what you have created here, Chris,” he said. “You’re only going to understand it when we’re all sitting in the White House—every single one of us, except for you.”

  Seriously? Even for Nix, this was grandiose. I could have had a nameplate in the West Wing, he told me. I was too stupid to realize what I was giving up.

  “If you leave, that’s it,” he said. “Do not come back.”

  I stayed for less than a year after Bannon took over and unleashed chaos. But looking back, I struggle to understand how I could have stayed even that long. Every day, I overlooked, ignored, or explained away warning signs. With so much intellectual freedom, and with scholars from the world’s leading universities telling me we were on the cusp of “revolutionizing” social science, I had gotten greedy, ignoring the dark side of what we were doing. Many of my friends did the same. I tried to convince Kogan to leave, too, and even when he conceded that the project could become an ethical quagmire, he decided to continue collaborating with Cambridge Analytica after I left. When I found out Kogan was staying, I refused to help him acquire more data sets for his projects, as I was worried that any new data I got for him could end up in the hands of Nix, Bannon, and Mercer. What in my mind was meant to become an academic institute was becoming just another player in Cambridge Analytica’s expanding web of partners. When I refused to continue helping Kogan, he demanded that I get rid of any data I had received from him, which I did. But this came at a huge cost for me personally, as Kogan had specifically added fashion and music questions to the panels so I could incorporate the survey responses into my Ph.D. thesis on trend forecasting. With the basis of my academic work now gone, I knew I would have to give up my Ph.D., which had become the only thing that was keeping me going. But what bothered me most was how I had let Nix dominate me. I let him pick away at every insecurity and vulnerability I had, and then, in service to him, I picked away at the insecurities and vulnerabilities of a nation. My actions were inexcusable, and I will always live with the shame.

  * * *

  —

  JUST BEFORE I LEFT Cambridge Analytica, the firm was planning more election work in Nigeria. As Nix had explained to Lukoil in his presentation about rumor campaigns, the African nation was familiar territory. Cambridge Analytica knew that numerous foreign interests had a hand in African elections, making it unlikely that anyone would care what the firm was up to—it’s Africa, after all. Following the frenzy of decolonization in the 1960s, many Western powers still felt entitled to interfere in their former African territories; the only difference now was the need for a measure of discretion. Europe had been built on African oil, rubber, minerals, and labor, and the mere fact of a former colony’s political independence was not going to change that.

  With the Nigeria project, Cambridge Analytica pushed itself even deeper into psychologically abusive experiments. At the same hotel where Cambridge Analytica set up camp, Israeli, Russian, British, and French “civic engagement” projects operated behind fig-leaf cover stories. The unspoken belief shared by all: Foreign interference in elections does not matter if those elections are African.

  The company was working nominally in support of Goodluck Jonathan, who was running for reelection as the president of Nigeria. Jonathan, a Christian, was running against Muhammadu Buhari, who was a moderate Muslim. Cambridge Analytica had been hired by a group of Nigerian billionaires who were worried that if Buhari won the election, he would revoke their oil and mineral exploration rights, decimating a major source of their income.

  True to form, Cambridge Analytica focused not on how to promote Goodluck Jonathan’s candidacy but on how to destroy Buhari’s. The billionaires did not really care who won, so long as the victor understood loud and clear what they were capable of, and what they were willing to do. In December, Cambridge Analytica had hired a woman named Brittany Kaiser to become “director of business development.” Kaiser had the kind of pedigree that Nix drooled over. She had grown up in a wealthy area outside of Chicago and attended Phillips Academy, an exclusive private school in Massachusetts (alma mater of both Presidents Bush). She went to the University of Edinburgh and afterward got involved in projects in Libya. Once there, she met a barrister named John Jones, who represented not only Saif Qaddafi, Muammar Qaddafi’s son, but also Julian Assange of WikiLeaks. Jones was a well-respected member of the British bar. Kaiser started consulting for him and, as a result, became acquainted with Assange. In their first meeting, Nix flirted with Kaiser, saying to her, “Let me get you drunk and steal all your secrets.” She started working at Cambridge Analytica toward the end of 2014, just as I was leaving.

  Cambridge Analytica created a two-pronged approach to swaying the Nigerian election. First, they would seek out damaging information—kompromat—on Buhari. Second, they would produce a video designed to terrify people out of voting for him. Kaiser traveled to Israel, where, according to her, her contacts introduced her to some consultants. According to internal correspondence I saw about the Nigeria project, Cambridge Analytica also engaged former intelligence agents from a handful of countries. It is uncertain who, if anyone, at Cambridge Analytica knowingly procured the services of hackers, but what is clear is that highly sensitive material about political opponents—which may have been hacked or stolen—somehow ended up in the company’s possession. By gaining access to opposition email accounts, databases, and even private medical records, the firm discovered that Buhari likely had cancer, which was not public knowledge at the time. The use of hacked material was not unique to Nigeria; Cambridge Analytica also procured kompromat on the opposition leader of St. Kitts and Nevis, an island nation in the Caribbean.

  The hacking of private medical information and emails was disturbing enough, but the propaganda videos Cambridge Analytica produced were much worse. The ads, which were placed on mainstream networks, including Google, were targeted to areas of Nigeria where the population leaned pro-Buhari. A Nigerian surfing the news would encounter an ordinary-looking clickbait ad—a gossipy headline or a photo of a sexy woman. When the person clicked on the link, he or she would be taken to a blank screen with a video box in the middle.

  The videos were short—just over a minute long—and they usually started with a voice-over. “Coming to Nigeria on February 15, 2015,” intoned a man’s voice. “Dark. Scary. Very uncertain.” “What would Nigeria look like if sharia were imposed as Buhari has committed to do?” The answer, according to the video, was the most gruesome, horrifying carnage imaginable. Suddenly the video cut to a scene of a man slowly sawing a blunt machete back and forth across another man’s throat. As blood spurted from the victim’s neck, he was thrown into a ditch to die. The earth around him was stained red. In another scene, a group of men tied up a woman, then drenched her in gasoline and set her on fire as she screamed in agony. These were not actors—this was actual footage of torture and murder.

  A number of people left CA right after I quit, reasoning that if the firm had become too sketchy for me, the guy who knew all the secrets, then it was too sketchy, period. The Nigeria project, a new low, set off another round of departures. By March 2015, everyone I cared about—Jucikas, Clickard, Gettleson, and several others—had left Cambridge Analytica. But many others found a reason to stay. Kaiser stayed on until 2018, coming forward publicly only once the firm was already sinking under the weight of the evidence I had provided to the media and authorities. She later claimed not to have known CA was hiring hackers, telling a British parliamentary inquiry that she just thought they were good at “intelligence gathering” and using “different types of data software to trace transfers between bank accounts….I don’t really know how that works.”

  * * *

  —

  AS I LOOK BACK at my time at Cambridge Analytica, some things make a lot more sense than they did in the moment, when I became conditioned to the weirdness of the place. There were always strange people coming and going—shady characters in dark suits; African leaders wearing oversize military hats the size of dinner platters; Bannon—so if every unusual event tripped you up, you wouldn’t have lasted long.

  I know now that Lukoil has a formal cooperation agreement with the Russian Federal Security Service (FSB)—the successor to the Soviet KGB. And a member of the House Intelligence Committee later informed me that Lukoil often served as a front for the FSB, conducting intelligence gathering on its behalf. Lukoil executives had also been caught conducting influence operations in other countries, including the Czech Republic. In 2015, Ukrainian security services accused Lukoil of financing pro-Russian insurgencies in Donetsk and Luhansk. “I have only one task connected with politics, to help the country and the company,” Lukoil’s CEO, Vagit Alekperov, said of his role in geopolitics.

  In fact, this is likely the primary reason they would have been interested in SCL. SCL had a long history in Eastern Europe, and in 2014 it was in discussions for another NATO project on counter-Russian propaganda. SCL had previously worked on campaigns in the Baltics that blamed Russians for political problems. “In essence, Russians were blamed for unemployment and other problems affecting the economy,” said one old report on the project. But beyond all that, just as Lukoil was funding pro-Russian insurgencies in Donetsk, SCL’s defense division was beginning countermeasure work to “collect population data, conduct analytics, and deliver a data-driven strategy for the Ukrainian government in pursuit of their goal to win back control of Donetsk.” This project was designed to “erode and weaken the Donetsk People’s Republic (DPR)” and would have made the firm a significant target for Russian intelligence gathering, which was known to operate through Lukoil in Europe.

  In reality, when Nix and I met with these “Lukoil executives,” we were almost certainly speaking to Russian intelligence. They likely were interested in finding out more about this firm that was also working for NATO forces. That’s likely also why they wanted to know so much about our American data, and Nix probably struck them as someone who could be flattered into saying pretty much anything. It’s entirely possible that Nix did not know to whom he was speaking, just as I did not. What made these contacts all the more concerning was that they wouldn’t have needed to hack Cambridge Analytica to access the Facebook data. Nix had told them where it could be accessed: in Russia, with Kogan.

  This is not to say that Kogan would have even known about this, but gaining access to the Facebook data would have been as simple as keylogging his computer on one of his lecture trips to Russia. In 2018, after the U.K. authorities seized Cambridge Analytica’s servers, the Information Commissioner’s Office stated that “some of the systems linked to the investigation were accessed from IP addresses that resolve to Russia and other areas of the CIS.”

 
