
Mindfuck


by Christopher Wylie


  He told Bannon that while SCL had London offices, we were based primarily out of Cambridge because of our close partnership with the university. This was a total falsehood, made up on the fly. But for Nix, truth was whatever he deemed true in the moment. As soon as he’d said we had a Cambridge office, he started referring to it all the time, urging Bannon to stop by.

  “Alexander, we don’t have a Cambridge office,” I said, exasperated with his insanity. “What the fuck are you talking about?”

  “Oh, yes we do, it’s just not open at the moment,” he said.

  A couple of days before Bannon’s next visit to the U.K., Nix had the London office staff set up a fake office in Cambridge, complete with rented furniture and computers. On the day Bannon was scheduled to arrive, he said, “Okay, everyone, we’re working out of our Cambridge office today!” And we all packed up to go out there and work. Nix also hired a handful of temps and several scantily clad young women to staff the would-be office for Bannon’s visit.

  The whole thing felt ludicrous. Gettleson and I messaged each other, sharing links about Potemkin villages, the fake Russian towns supposedly erected in tsarist Russia to impress Catherine the Great during her tour in 1787. We christened the office the Potemkin Site and made relentless fun of Nix for coming up with such a stupid idea. But when I walked around the fake office with Bannon, two months after I first met him in a Cambridge hotel, I could see the light in his eyes. He was buying it and loving every moment of it. Fortunately, he never noticed that some of the computers weren’t actually plugged in or that some of the hired girls didn’t speak English.

  Nix set up the Potemkin Site every time Bannon came to town. Bannon never caught on that it was fake. Or if he did, he didn’t mind. It fit the vision. And when it came time to name the new entity the Mercers were funding, Bannon chose Cambridge Analytica—because that was where we were based, he said. So Cambridge Analytica’s first target was Bannon himself. The Potemkin Site perfectly encapsulated the heart and soul of Cambridge Analytica, which perfected the art of showing people what they want to see, whether real or not, to mold their behavior—a strategy that was so effective, even a man like Steve Bannon could be fooled by someone like Alexander Nix.

  CHAPTER 6

  TROJAN HORSES

  -

  “You know DARPA funds some of their work,” Brent Clickard told me on a train ride from London to Cambridge. “If you want to expand your team, these are the ones you want.” As one of SCL’s psychologists, he floated between the company and his ongoing academic work inside one of the psychology labs at the University of Cambridge. Like me, Clickard was becoming enamored with the possibilities of what our research could yield, which is why he was so willing to provide the firm access to the world’s leading research psychologists. The psychology department at Cambridge had spearheaded several breakthroughs in using social-media data for psychological profiling, which in turn prompted interest from government research agencies. What Cambridge Analytica eventually became depended in large part on the academic research published at the university it was named after.

  Cambridge Analytica was a company that took large amounts of data and used it to design and deliver targeted content capable of moving public opinion at scale. None of this would have been possible, though, without access to the psychological profiles of the target population—and those, it turned out, were surprisingly easy to acquire through Facebook, thanks to its loosely supervised permissioning procedures. The story of how this came to be started in my early days at SCL, before Cambridge Analytica was created as a spin-off American brand. Brent Clickard had shown me around the Psychometrics Centre at Cambridge. Having read through many of his papers, and those of his colleagues at the Centre, I was intrigued by their novel tactic of integrating machine learning with psychometric testing. It seemed they were working on almost the same research questions we were at SCL, albeit with a slightly different purpose—or so I thought.

  Research into using social data to infer the psychological disposition of individuals was published in some of psychology’s top academic journals, such as the Proceedings of the National Academy of Sciences (PNAS), Psychological Science, and the Journal of Personality and Social Psychology, among many others. The evidence was clear: The patterns of a social media user’s likes, status updates, groups, follows, and clicks all serve as discrete clues that, compiled together, can accurately reveal a person’s personality profile. Facebook frequently supported this psychological research into its users and provided academic researchers with privileged access to users’ private data. In 2012, Facebook filed for a U.S. patent for “Determining user personality characteristics from social networking system communications and characteristics.” The application explained Facebook’s interest in psychological profiling: “inferred personality characteristics are stored in connection with the user’s profile, and may be used for targeting, ranking, selecting versions of products, and various other purposes.” So while DARPA was interested in psychological profiling for military information operations, Facebook was interested in using it to sell more online advertising.

  As we approached the Downing Site building, I spotted a small plaque that read PSYCHOLOGICAL LABORATORY. Inside, the air was stale, and the décor hadn’t been updated since at least the 1970s. We walked up a few flights of stairs and then to the last office at the end of a narrow corridor, where Clickard introduced me to Dr. Aleksandr Kogan, a professor at the University of Cambridge who specialized in computational modeling of psychological traits. Kogan was a boyish-looking man and dressed as awkwardly as his manner. He stood with a simpering grin in the middle of the room, which was filled with stacks of papers and random decorations from his time studying in Hong Kong.

  At first, I had no idea about Kogan’s background, as he spoke English with a perfect American accent, albeit with an exaggerated prosody. I later learned he was born in the Moldavian SSR during the final years of the Soviet Union and spent part of his childhood in Moscow. Not long after the Soviet Union collapsed, in 1991, his family emigrated to the United States, where he studied at the University of California, Berkeley, before completing his Ph.D. in psychology in Hong Kong and joining the faculty at the University of Cambridge.

  Clickard introduced me to Kogan because he knew the work Kogan was doing in his lab at Cambridge could be extremely useful for SCL. But, knowing Nix’s preferred style of venue, Clickard decided that the introduction should happen over canapés and wine. Nix was fickle, and he could completely write people off because he didn’t like their tie or choice of restaurant. So we all met at a table Clickard had booked at an upstairs bar inside the Great Northern Hotel, beside King’s Cross train station. Kogan was visiting London for the day and had made time to tell us about his work before heading back to Cambridge. It was common enough for Nix to drink too much wine on a night out, but I’d never seen him intoxicated by a voice other than his own. The topic was social media.

  “Facebook knows more about you than any other person in your life, even your wife,” Kogan told us.

  Nix snapped out of his trance, reverting to his usual embarrassing self. “Sometimes it’s best wives don’t know certain details,” he quipped, sipping his wine. “Why would I ever need or want a computer to remind me—or her?”

  “You might not want it,” the professor answered, “but advertisers do.”

  “He’s interesting, but he doesn’t sound like a Cambridge man to me,” mumbled Nix, drinking more wine while Kogan was in the restroom.

  “Because he’s not from Cambridge, Alexander. Jesus…He just teaches there!”

  Clickard rolled his eyes. Nix was a distraction from more pressing concerns. After the firm looked at Kogan’s research, Nix was eager to put him to work. SCL had just secured the financing from Mercer and was in the process of setting up a new American entity. But before Nix would let Kogan near his new prize project in America, he would have to prove himself in the Caribbean first. At the time, in early 2014, Kogan was working with researchers based at St. Petersburg State University on a psychological profiling project funded by the Russian state through a public research grant. Kogan advised a team in St. Petersburg that was pulling swaths of social media profile data and using it to analyze online trolling behavior. Given that this Russian social media research focused on maladaptive and antisocial traits, SCL thought it could be applied to the Trinidad project, as Ministry of National Security staff there were interested in experimenting with predictive modeling of Trinidadian citizens’ propensity to commit crimes.

  In an email to Trinidad’s security ministry and its National Security Council about “criminal psychographic profiling via [data] intercepts,” one SCL staffer said that “we may want to either loop in or find out a bit more about the interesting work Alex Kogan has been doing for the Russians and see how / if it applies.”

  Kogan eventually signed up to assist SCL on the Trinidad project, where he offered advice on how to model a set of psychological constructs that past research had identified as related to antisocial or deviant behavior. Kogan wanted data in exchange for helping to plan the project, and he started discussions with SCL about accessing its data set of 1.3 million Trinidadians for his own research. What I liked about Kogan was that he wanted to work fast and to get stuff done, which was not common for professors accustomed to the glacial pace of academic life. And he came across as honest, ambitious, and upfront, if a little bit naïve, in his excitement for ideas and intellectual ambition.

  I got along quite well with Kogan in the beginning. He shared my interest in the emerging fields of computational psychology and computational sociology. We would talk for hours about the promise of behavioral simulation, and when we discussed SCL, he was palpably excited. At the same time, Kogan was slightly odd, and I noticed that his colleagues would make snide remarks about him when he wasn’t around. But it wasn’t as if this bothered me. If anything, it made me relate to him more—after all, I’d been on the receiving end of plenty of snide remarks myself. Besides, you had to be a bit weird to work at SCL.

  When Kogan joined the Trinidad initiative in January 2014, we were just launching the early trial phases of the America project with Bannon. Based on our qualitative studies, we had some theories we wanted to test, but the available data was insufficient for psychological profiling. Consumer information—from sources like airline memberships, media companies, and big-box stores—didn’t produce a strong enough signal to predict the psychological attributes we were exploring. This wasn’t surprising, because shopping at Walmart, for example, doesn’t define who you are as a person. We could infer demographic or financial attributes, but not personality—extroverts and introverts both shop at Walmart. We needed data sets that didn’t just cover a large percentage of the American population but also contained data that was significantly related to psychological attributes. We suspected we needed the kind of social data we had used on other projects in other parts of the world, such as clickstreams or the types of variables observed in a census record, a need Kogan had picked up on.

  Kogan started on Trinidad, but he was far more intrigued by SCL’s work in the United States. He told me that if he was brought on to the American job, we could work with his team at the Psychometrics Centre to fill gaps in the variables and data categories in order to create more reliable models. He started asking to access some of our data sets to see what might be missing in the training set, which is the sample data one uses to “train” a model to identify patterns. But that wasn’t quite the problem. Clickard told him that we’d done preliminary modeling and had training sets but that we needed data at scale. We couldn’t find data sets that both contained variables we knew helped predict psychological traits and covered a wide population. It was becoming a major stumbling block. Kogan said that he could solve the problem for us—as long as he could use the data for his research too. When he said that if he was brought onto the America project, we could set up the first global institute for computational social psychology at the University of Cambridge, I was instantly on board. One of the challenges for social sciences like psychology, anthropology, and sociology is a relative lack of numerical data, since it’s extremely hard to measure and quantify the abstract cultural or social dynamics of an entire society. That is, unless you can throw a virtual clone of everyone into a computer and observe their dynamics. It felt like we were holding the keys to unlock a new way of studying society. How could I say no to that?

  In the spring of 2014, Kogan introduced me to a couple of other professors at the Psychometrics Centre. Dr. David Stillwell and Dr. Michal Kosinski were working with a massive data set they’d harvested legally from Facebook. They were pioneers in social-media-enabled psychological profiling. In 2007, Stillwell had set up an application called myPersonality, which offered users a personality profile in return for joining the app. After giving the user a result, the app would harvest the profile and store it for use in research.

  The professors’ first paper on Facebook was published in 2012, and it quickly caught the attention of academics. After Kogan connected us, Kosinski and Stillwell told me about the huge Facebook data sets they’d acquired in their years of research. The U.S. military’s research agency, DARPA, was one of the funders of their research, they said, making them well suited to work with a military contractor. Stillwell was typically muted in our interactions, but Kosinski was clearly ambitious and tended to push Stillwell into keeping the conversation going. Kosinski knew this data could be extremely valuable, but he needed Stillwell to agree to any data transfers.

  “How did you get it?” I asked.

  They told me, essentially, that Facebook simply let them take it, through apps the professors had created. Facebook wants people to do research on its platform. The more it learns about its users, the more it can monetize them. It became clear when they explained how they collected data that Facebook’s permissions and controls were incredibly lax. When a person used their app, Stillwell and Kosinski could receive not only that person’s Facebook data, but the data of all of their friends as well. Facebook did not require express consent for apps to collect data from an app user’s friends, as it viewed being a user of Facebook as enough consent to take their data—even if the friends had no idea the app was harvesting their private data. The average Facebook user has somewhere between 150 and 300 friends. My mind turned to Bannon and Mercer, as I knew they would love this idea—and Nix would simply love that they loved it.

  “Let me get this straight,” I said. “If I create a Facebook app, and a thousand people use it, I’ll get…like 150,000 profiles? Really? Facebook actually lets you do that?”

  That’s right, they said. And if a couple million people downloaded the app, then we’d get 300 million profiles, minus the overlapping mutual friends. This would be an astonishingly huge data set. Up to that point, the largest data set I had worked on was Trinidad, which I thought was quite large, with profiles of one million people. But this set was on an entirely different level. In other countries, we had to get special access to data or spend months scraping and harvesting for populations several orders of magnitude smaller.

  “So how do you get people to download this app?” I asked.

  “We just pay them.”

  “How much?”

  “A dollar. Sometimes two.”

  Now, remember, I’ve got a potential $20 million burning a hole in our firm’s pocket. And these profs have just told me that I can get tens of millions of Facebook profiles for…a million dollars, give or take. This was a no-brainer.

  I asked Stillwell if I could run some tests on their data. I wanted to see if we could replicate our results from Trinidad, where we had access to similar types of Internet browsing data. If the Facebook profiles proved as valuable as I hoped, we would not only be able to fulfill Robert Mercer’s desire to create a powerful tool—what was even cooler was that we could mainstream a whole new field of academia: computational psychology. We were standing at the frontier of a new science of behavioral simulation, and I was bursting with excitement at the prospect.

  * * *

  —

  FACEBOOK LAUNCHED IN 2004 as a platform to connect students and peers in college. In a few years, the site grew to become the largest social network in the world—a place where almost everyone, even your parents, shared photos, posted innocuous status updates, and organized parties. On Facebook, you could “like” things—pages of brands or topics, along with the posts of friends. The purpose of liking was to allow users a chance to curate their personas and follow updates from their favorite brands, bands, or celebrities. Facebook considers this phenomenon of liking and sharing the basis of what it calls a “community.” Of course, it also considers this the basis of its revenue model, where advertisers can optimize their targeting using Facebook data. The site also launched an API (application programming interface) to allow users to join apps on Facebook, which would then ingest their profile data for a “better user experience.”

  In the early 2010s, researchers quickly caught on that entire populations were organizing data about themselves in one place. A Facebook page contains data on “natural” behavior in the home environment, minus the fingerprints of a researcher. Every scroll is tracked, every movement is tracked, every like is tracked. It’s all there—nuance, interests, dislikes—and it’s all quantifiable. This means data from Facebook has greater ecological validity, in that it is not prompted by a researcher’s questions, which inevitably inject some kind of bias. In other words, many of the benefits of the passive qualitative observation traditionally used in anthropology or sociology could be maintained, and because so many social and cultural interactions were now captured in digital data, we could add the generalizability one achieves in quantitative research. Previously, the only way to acquire such data would have been from a bank or phone company, which are strictly regulated to prevent access to that sort of private information. But unlike a bank or telecom company, social media operated with virtually no laws governing its access to extremely granular personal data.

 
