Mindfuck

by Christopher Wylie


  Ongoing jokes and memes would be shared about resisting their life sentences and waging a Beta Rebellion or Beta Uprising to fight for the redistribution of sex for the betas. But lurking behind the strange humor was the rage of a life of rejection. In scrolling through these narratives of victimhood, my mind turned back to the narratives of extreme jihadist recruitment media, with the same naïve romanticism of oppressed men breaking the shackles of a vapid society to transform themselves into glorified heroes of rebellion. Likewise, these incels were perversely attracted to society’s “winners,” like Donald Trump and Milo Yiannopoulos, who in their warped view represented the epitome of the same hypercompetitive alphas who brutalized them, to lead the charge. Many of these seething young men were ready to burn society to the ground. Bannon sought to give them an outlet via Breitbart, but his ambition didn’t stop there. He saw these young men as the early recruits in his future insurgency.

  When Cambridge Analytica launched, in the summer of 2014, Bannon’s goal was to change politics by changing culture; Facebook data, algorithms, and narratives were his weapons. First we used focus groups and qualitative observation to unpack the perceptions of a given population and learn what people cared about—term limits, the deep state, draining the swamp, guns, and the concept of walls to keep out immigrants were all explored in 2014, several years before the Trump campaign. We then came up with hypotheses for how to sway opinions. CA tested these hypotheses with target segments in online panels or experiments to see whether they performed as the team expected, based on the data. We also pulled Facebook profiles, looking for patterns in order to build a neural network algorithm that would help us make predictions.

  A select minority of people exhibit traits of narcissism (extreme self-centeredness), Machiavellianism (ruthless self-interest), and psychopathy (emotional detachment). In contrast to the Big Five traits found in everyone to some degree as part of normal psychology—openness, conscientiousness, extroversion, agreeableness, and neuroticism—these “dark triad” traits are maladaptive, meaning that those who exhibit them are generally more prone to antisocial behavior, including criminal acts. From the data CA collected, the team was able to identify people who exhibited neuroticism and dark-triad traits, and those who were more prone to impulsive anger or conspiratorial thinking than average citizens. Cambridge Analytica would target them, introducing narratives via Facebook groups, ads, or articles that the firm knew from internal testing were likely to inflame the very narrow segments of people with these traits. CA wanted to provoke people, to get them to engage.

  Cambridge Analytica did this because of a specific feature of Facebook’s algorithm at the time. When someone follows pages of generic brands like Walmart or some prime-time sitcom, nothing much changes in his newsfeed. But liking an extreme group, such as the Proud Boys or the Incel Liberation Army, marks the user as distinct from others in such a way that a recommendation engine will prioritize these topics for personalization. Which means the site’s algorithm will start to funnel the user similar stories and pages—all to increase engagement. For Facebook, rising engagement is the only metric that matters, as more engagement means more screen time to be exposed to advertisements.

  This is the darker side of Silicon Valley’s much celebrated metric of “user engagement.” By focusing so heavily on greater engagement, social media tends to parasitize our brain’s adaptive mechanisms. As it happens, the most engaging content on social media is often horrible or enraging. According to evolutionary psychologists, in order to survive in premodern times, humans developed a disproportionate attentiveness toward potential threats. The reason we instinctually pay more attention to the blood and gore of a rotting corpse on the ground than to the beautiful sky above is that the former was what helped us survive. In other words, we evolved to pay keen attention to potential threats. There’s a good reason you can’t turn away from grisly videos: You’re human.

  Social media platforms also use designs that activate “ludic loops” and “variable reinforcement schedules” in our brains. These are patterns of frequent but irregular rewards that create anticipation, but where the end reward is too unpredictable and fleeting to plan around. This establishes a self-reinforcing cycle of uncertainty, anticipation, and feedback. The randomness of a slot machine prevents the player from being able to strategize or plan, so the only way to get a reward is to keep playing. The rewards are designed to be just frequent enough to reengage you after a losing streak and keep you going. In gambling, a casino makes money from the number of turns a player takes. On social media, a platform makes money from the number of clicks a user performs. This is why there are infinite scrolls on newsfeeds—there is very little difference between a user endlessly swiping for more content and a gambler pulling the slot machine lever over and over.

  * * *

  —

  IN THE SUMMER OF 2014, Cambridge Analytica began developing fake pages on Facebook and other platforms that looked like real forums, groups, and news sources. This was an extremely common tactic that Cambridge Analytica’s parent firm SCL had used throughout its counterinsurgency operations in other parts of the world. It is unclear who inside the firm actually gave the final order to set up these disinformation operations, but for many of the old guard who had spent years working on projects around the world, none of this seemed unusual. They were simply treating the American population in the exact same way they would treat the Pakistani or Yemeni populations on projects for American or British clients. The firm did this at the local level, creating right-wing pages with vague names like Smith County Patriots or I Love My Country. Because of the way Facebook’s recommendation algorithm worked, these pages would pop up in the feeds of people who had already liked similar content. When users joined CA’s fake groups, it would post videos and articles that would further provoke and inflame them. Conversations would rage on the group page, with people commiserating about how terrible or unfair something was. CA broke down social barriers, cultivating relationships across groups. And all the while it was testing and refining messages, to achieve maximum engagement.

  Now CA had users who (1) self-identified as part of an extreme group, (2) were a captive audience, and (3) could be manipulated with data. Lots of reporting on Cambridge Analytica gave the impression that everyone was targeted. In fact, not that many people were targeted at all. CA didn’t need to create a big target universe, because most elections are zero-sum games: If you get one more vote than the other guy or girl, you win the election. Cambridge Analytica needed to infect only a narrow sliver of the population, and then it could watch the narrative spread.

  Once a group reached a certain number of members, CA would set up a physical event. CA teams would choose small venues—a coffee shop or bar—to make the crowd feel larger. Let’s say you have a thousand people in a group, which is modest in Facebook terms. Even if only a small fraction shows up, that’s still a few dozen people. A group of forty makes for a huge crowd in the local coffee shop. People would show up and find a fellowship of anger and paranoia. This naturally led them to feel like they were part of a giant movement, and it allowed them to further feed off one another’s paranoia and fears of conspiracy. Sometimes a Cambridge Analytica staffer would act as a “confederate”—a tactic commonly used by militaries to stir up anxieties in target groups. But most of the time, these situations unfolded organically. The invitees were selected because of their traits, so Cambridge Analytica knew generally how they would react to one another. The meetings took place in counties all across the United States, starting with the early Republican primary states, and people would get more and more fired up at what they saw as “us vs. them.” What began as their digital fantasy, sitting alone in their bedrooms late at night clicking on links, was becoming their new reality. The narrative was right in front of them, talking to them, live in the flesh. Whether or not it was real no longer mattered; that it felt real was enough.

  Cambridge Analytica ultimately became a digitized, scaled, and automated version of a tactic the United States and its allies have used in other countries. When I first started at SCL, the firm had been working on counter-narcotics programs in a South American country. The strategy was, in part, to identify targets to disrupt narcotics organizations from within. The first thing the firm would do was find the lowest-hanging fruit, meaning people who its psychologists reasoned would be more likely to become more erratic or paranoid. Then the firm would work on suggesting ideas to them: “The bosses are stealing from you” or “They’re going to let you take the fall.” The goal was to turn them against the organization, and sometimes, if a person hears something enough times, they come to believe it.

  Once those initial individuals were sufficiently exposed to these new narratives, it would be time to have them meet one another so that they could form a group which could then organize. They would share rumors, working one another into deeper paranoia. That was when you introduced the next tier: people whose initial resistance to rumors had started to weaken. And this is how you gradually destabilize an organization from the inside. CA wanted to do the same to America, using social media as the spearhead. Once a county-based group begins self-organizing, you introduce them to a similar group in the next county over. Then you do it again. In time, you’ve created a statewide movement of neurotic, conspiratorial citizens. The alt-right.

  Internal tests also showed that the digital and social ad content being piloted by CA was effective at garnering online engagement. Those being targeted online with test advertisements had their social profiles matched to their voting records, so the firm knew their names and “real world” identities. The firm then began to use numbers on the engagement rates of these ads to explore the potential impact on voter turnout. One internal memo highlighted the results from an experiment involving registered voters who had not voted in the two previous elections. CA estimated that if only 25 percent of the infrequent voters who began clicking on this new CA content eventually turned out to vote, they could increase statewide turnout for the Republicans in several key states by around 1 percent, which is often the margin of victory in tight races. Steve Bannon loved this. But he wanted CA to go further—and darker. He wanted to test the malleability of the American psyche. He urged us to include what were in effect racially biased questions in our research, to see just how far we could push people. The firm started testing questions about black people—whether they were capable of succeeding in America without the help of whites, for example, or whether they were genetically predetermined to fail. Bannon believed that the civil rights movement had limited “free thinking” in America. He was determined to liberate people by revealing what he saw as the forbidden truths about race.

  Bannon suspected that there were swaths of Americans who felt silenced by the threat of being labeled “racist.” Cambridge Analytica’s findings confirmed his suspicion: America is filled with racists who remain silent for fear of social shunning. But Bannon wasn’t just focused on his emerging alt-right movement; he also had Democrats in mind.

  While “typical Democrats” talk a good game when it comes to supporting racial minorities, Bannon detected an underlying paternalism that betrayed their professed wokeness. The party, he felt, was full of “limousine liberals”—a term coined in the New York mayoral race of 1969 and instantly seized on by populists to denigrate do-gooder Democrats. These were the white Democrats who supported school busing but sent their own kids to majority-white private schools, or who professed to care about the inner city but lived in gated communities. “The Dems always treat blacks like children,” Bannon said on one call. “They put them in projects…give them welfare…affirmative action…send white kids to hand out food in Africa. But Dems are always afraid to ask the question: Why do those people need so much babysitting?”

  What he meant was that white Democrats revealed their prejudices against minorities without realizing it. He posited that although these Democrats think that they like African Americans, they do not respect African Americans, and that many Democratic policies stemmed from an implicit acknowledgment that those people cannot help themselves. Speechwriter Michael Gerson perfectly encapsulated this idea with a phrase he coined for then–presidential candidate George W. Bush in 1999: “the soft bigotry of low expectations.” According to this argument, Democrats were hand-holders, enablers of bad behavior and poor testing results because they didn’t actually believe that minority students could do as well as their non-minority peers.

  Bannon had a starker, more aggressive take on this idea: He believed the Democrats were simply using American minorities for their own political ends. He was convinced that the social compact that emerged after the civil rights movement, where Democrats benefited from African American votes in exchange for government aid, was not born out of any moral enlightenment, but instead out of shrewd calculation. In his framing, the only way the Democrats could defend what he saw as the inconvenient truths of this social compact was through political correctness. Democrats subjected “rationalists” to social shame when they spoke out about this “race reality.”

  “Race realism” is the most recent spin on age-old tropes and theories that certain ethnic groups are genetically superior to others. Race realists believe, for example, that black Americans score lower on standardized tests not because the tests are skewed, or because of the long history of oppression and prejudice that blacks must overcome, but because they’re inherently less intelligent than white Americans. It’s a pseudoscientific notion, embraced by white supremacists, with roots in the centuries-old “scientific racism” that underlies, among other disasters of human history, slavery, apartheid, and the Holocaust. The alt-right, led by Bannon and Breitbart, adopted race realism as a cornerstone philosophy.

  If Bannon were to succeed in his quest to liberate his “free thinkers,” he needed a way of inoculating people against political correctness. Cambridge Analytica began studying not only overt racism but racism in its many other incarnations. When we think about racism, we often think of overt hatred. But racism can persist in different ways. Racism can be aversive, where a person consciously or subconsciously avoids a racial group (e.g., gated communities, sexual and romantic avoidance, etc.), and racism can be symbolic, where a person holds negative evaluations of a racial group (e.g., stereotypes, double standards, etc.). However, because the label “racism” can hold such social stigma in modern America, we found that white people often ignore or discount their internalized prejudices and react strongly to any inference that they hold such beliefs.

  This is what is known as “white fragility”: White people in North American society enjoy environments insulated from racial disadvantages, which fosters an expectation among white people of racial comfort while lowering their ability to tolerate racial stress. In our research, we saw that white fragility prevented people from confronting their latent prejudices. This cognitive dissonance also meant that subjects would often amplify their responses expressing positive statements toward minorities in an effort to satiate their self-concept of “not being racist.” For example, when presented with a series of hypothetical biographies with photos, some respondents who scored higher in prior implicit racial bias testing would rate minority biographies higher than identical white biographies. See? I scored the black person higher, because I am not racist.

  This cognitive dissonance created an opening: Many respondents were reacting to their own racism not out of concern about how they may be contributing to structural oppression, but rather to protect their own social status. For Bannon, this was enough to convince him that his theory about Democrats was true—that they just pay lip service to minorities, but deep down they are just as racist as anyone else in America. The difference was who was living in what “reality.”

  * * *

  —

  BANNON ENVISIONED A VEHICLE to help white racists move past all this and become liberated “free thinkers.” In 2005, when Bannon started at IGE, the Hong Kong–based gaming company, the firm employed a factory of low-wage Chinese gamers to play World of Warcraft in order to win items in the game. Instead of trading them, or selling them through the game’s interface, which was allowed, IGE would sell the digital assets to Western players for a profit. This activity was largely seen by other players as cheating, and a civil suit and backlash online against the firm ensued. It’s possible this was Bannon’s early exposure to the rage of online communities; some of the commentary was reportedly “anti-Chinese vitriol.” Bannon became a regular reader of Reddit and 4chan and began to see the hidden anger that comes out when people are anonymous online. To him, they were revealing their true selves, unfiltered by a “political correctness” that was preventing them from speaking these “truths” in public. It was through the process of reading these forums that Bannon realized he could harness them and their anonymous swarms of resentment and harassment.

  This was especially true after Gamergate, in the late summer of 2014, right before Bannon was introduced to SCL. In many ways, Gamergate created a conceptual framework for Bannon’s alt-right movement, as he knew there was an undercurrent populated by millions of intense and angry young men. Trolling and cyberbullying became key tools of the alt-right. But Bannon went deeper and had Cambridge Analytica scale and deploy many of the same tactics that domestic abusers and bullies use to erode stress resilience in their victims. Bannon transformed CA into a tool for automated bullying and scaled psychological abuse. The firm started this journey by identifying a series of cognitive biases that it hypothesized would interact with latent racial bias. Over the course of many experiments, we concocted an arsenal of psychological tools that could be deployed systematically via social media, blogs, groups, and forums.

 
