Data Versus Democracy


by Kris Shaffer


least enthusiasm—among those same potential Clinton supporters on her left and in the middle. As Jeff Stein wrote for Vox:

None of the Podesta emails has so far actually broken any fresh scandals about the woman on track to be the next president. Instead, they’ve mostly revealed an underbelly of ugliness to the multiple Clinton controversies that we’ve already known about: the questionable relationship between the Clinton Foundation and its donors, Clinton’s ease with powerful interests on Wall Street, her ties to wealthy campaign contributors.31

The Clinton Foundation controversies were likely particularly damaging to Clinton’s potential support among centrists turned off by Trump. Many of them had voted for Republicans in the past and were dissatisfied with the GOP’s nominee. But narratives of corruption around her, combined with Republican prejudices against the Clinton name, likely made it difficult for a number of moderates and dejected Republicans to vote Democrat in 2016. The ties to Wall Street and wealthy donors likely hurt Clinton’s chances with former Sanders supporters, who had been calling for Clinton to release the content of the private speeches she had been paid to give to bankers and other wealthy Americans in the past. Some of that content was awkward or embarrassing, but some of it also undercut Clinton’s ability to garner support among those to the left of her campaign’s policy platform.

30. Ibid.

31. Jeff Stein, “What 20,000 pages of hacked WikiLeaks emails teach us about Hillary Clinton,” Vox, published October 20, 2016, www.vox.com/policy-and-politics/2016/10/20/13308108/wikileaks-podesta-hillary-clinton.


  Chapter 5 | Democracy Hacked, Part 1

I also think it is likely that the months-long obsession with “Clinton’s emails” compounded the impact of FBI Director James Comey’s announcement on October 28, 2016, that he was reopening the investigation into Clinton’s use of a private email server while Secretary of State. That investigation was reopened due to evidence that surfaced during the investigation of former Congressman Anthony Weiner, whose estranged wife, Huma Abedin, was a top Clinton aide.32 Comey was investigating the possibility that Clinton’s private email server may have contained classified information, not the content of emails published by Russia or WikiLeaks, but a fast-moving news cycle replete with vague references to “Clinton’s emails” likely meant that these news stories compounded each other’s impact on voters.

Perhaps most significant, though, is that the GRU’s release of kompromat on Clinton and the DNC meant that Russia got to set a significant portion of the agenda of the presidential election. Journalists pored over leaked content looking for scoops, which they reported. Those same journalists posed questions about the content of these documents in debates. And the candidates faced questions about these documents all along the campaign trail, from both journalists and voters. Every question about this Russian kompromat—however legitimate a concern that question represented—was a question that didn’t address the policy platforms set forth by the campaigns. The hours spent chasing down security concerns and prepping answers to these questions were hours not spent by the campaigns building and defending their own platform. To a nontrivial extent, Russian hackers determined the issues that American voters would devote their limited attention to, and on which they would determine how to cast their votes on November 8. My colleagues and I call this manipulation of public messaging “weaponized truth.” The contents of Fancy Bear’s data dumps appeared authentic to security researchers—they were not “fake news.” But they were meant to manipulate public discourse and individuals’ behavior. Whether or not we call that “disinformation” (I do, because disingenuous manipulation is at the core of the operation), it is certainly a component of information warfare—and because of its basis in facts, it can be an incredibly effective one.

  Project Lakhta

Of course, Russia did not just manipulate American public discourse via the release of hacked and leaked documents. Their Internet Research Agency (IRA) also conducted information warfare against the United States through an expert social media manipulation campaign. The IRA’s extensive operations are detailed primarily in indictments issued in 2018 by the U.S. Department of Justice against key Russian individuals and LLCs, and in two reports prepared for the U.S. Senate Select Committee on Intelligence (SSCI) and released in December 2018, which provide a detailed analysis of data supplied to SSCI by Twitter, Facebook (including Instagram), and Google (YouTube).33

32. Adam Goldman and Alan Rappeport, “Emails in Anthony Weiner Inquiry Jolt Hillary Clinton’s Campaign,” The New York Times, published October 28, 2016, www.nytimes.com/2016/10/29/us/politics/fbi-hillary-clinton-email.html.

The picture portrayed in these documents—which is likely still incomplete—is one of large-scale, expert manipulation of public attention through a combination of “weaponized truth,” partial truths, flat-out lies, and voter suppression narratives, aimed at electing Donald Trump president of the United States and at discrediting both the potential presidency of Hillary Clinton and the democratic process in general. Further, after the election, the IRA continued to attempt to manipulate and destabilize American society, even increasing their activity aimed at certain American communities on platforms like Instagram. And though the data currently available suggests that government and platform activities have significantly hindered the IRA’s ability to wage social media–based information warfare, it is also clear that Russian groups are still attempting to manipulate public opinion and discredit their critics via U.S.-targeted online media.

While Russian influence operations go back much further, even in the United States, the now-famous operation aimed at manipulating the 2016 election ratcheted up in April 2014 with the formation of the IRA’s “translator project,” aimed at studying U.S. social and political groups on online platforms, including YouTube, Facebook, Instagram, and Twitter. In May 2014, the strategy for this operation, known internally as “Project Lakhta,” was set: “spreading distrust towards the candidates and the political system in general”34 with the specific objective of interfering in the 2016 U.S. election.35 By June 2014, IRA operatives were already conducting in-person intelligence gathering in the United States (with a subsequent trip in November 2014).36 By September 2016, Project Lakhta’s monthly budget was approximately $1.25 million.37

According to the IRA indictment:

Defendants and their co-conspirators, through fraud and deceit, created hundreds of social media accounts and used them to develop certain fictitious U.S. personas into “leader[s] of public opinion” in the United States.

33. Full disclosure: I coauthored one of those reports.

34. United States of America v. Internet Research Agency, LLC, et al., www.justice.gov/opa/press-release/file/1035562/download.

35. Ibid., p. 12.

36. Ibid., p. 13.

37. Ibid., p. 7.


ORGANIZATION employees, referred to as “specialists,” were tasked to create social media accounts that appeared to be operated by U.S. persons. The specialists were divided into day-shift and night-shift hours and instructed to make posts in accordance with the appropriate U.S. time zone. The ORGANIZATION also circulated lists of U.S. holidays so that specialists could develop and post appropriate account activity. Specialists were instructed to write about topics germane to the United States such as U.S. foreign policy and U.S. economic issues. Specialists were directed to create “political intensity through supporting radical groups, users dissatisfied with [the] social and economic situation and oppositional social movements.”38

This included focusing messaging around themes like immigration, Black Lives Matter and police brutality, Blue Lives Matter, religion, and regional secession, among others. These topics were laid out in internal documents provided to IRA “specialists” as the basis for their content, and in September 2016, one internal memo stressed that “it is imperative to intensify criticizing Hillary Clinton” in advance of the November election.39

The internal documents made public by the Department of Justice in their indictments of key IRA officials and shell companies provide only a small window into the IRA’s actual operations targeting Americans, though. To really see what they did and how their content spread, the Senate Select Committee on Intelligence commissioned two groups to analyze private data provided to the Senate by Twitter, Facebook, and Google and report their findings. Several of my colleagues and I had the honor of contributing to one of those reports. While those datasets were all missing key metadata (and, I believe, further examples of IRA and other Russian agencies’ U.S.-directed propaganda), they reveal a massive and professional operation that far exceeds the initial statements made by platform executives. It is impossible to quantify how many votes this campaign may have changed, or at least influenced, but it is impossible to deny that this operation, in conjunction with Fancy Bear’s work, was a significant factor in the tone of the election, the issues that took center stage in public discourse, and the media coverage around the election. All of this together certainly influenced some votes, and it has since cast doubt on the legitimacy of the 2016 election and stoked fears of foreign influence in previous and subsequent elections.

  But just what did their operation look like? And how pervasive was it?

38. Ibid., p. 14.

39. Ibid., p. 17.


To start with, IRA influence operations around the 2016 U.S. election hit every major platform, and even some minor ones. In addition to Facebook, Instagram, Twitter, and YouTube, evidence of IRA operations has surfaced on Google Plus, Vine, Gab, Meetup, VKontakte, LiveJournal, Reddit, Tumblr, Pinterest, Medium, and even Pokémon Go.40 That’s to say nothing of the web itself, where the IRA (and other branches or contractors of the Russian government) have web sites, blogs, and pro-Kremlin “think-tank” journals. This network of IRA web assets was “run like a sophisticated marketing agency,” with dozens of real people posting, sharing, retweeting, and commenting on each other’s memes, blogs, and tweets. As my colleagues and I wrote in our report for SSCI, “it was far more than only $100,000 of Facebook ads, as originally asserted by Facebook executives. The ad engagements were a minor factor in a much broader, organically-driven influence operation.”41 The overall budget for Project Lakhta exceeded $25 million,42 which primarily went to paying employees to create not ads but organic content: tweets, posts, memes, videos, events, all shared from user accounts and pages belonging to fake personas and groups carefully crafted by IRA “specialists.” Overall, IRA content pre- and post-election reached an estimated 126 million Americans on Facebook, 20 million on Instagram, and 1.4 million on Twitter. This was no tiny operation.

It’s also important to note that the IRA, by and large, did not operate a network of automated accounts, known as a botnet. IRA employees were expected to meet daily quotas of organic posts, comments, shares, and likes. These were mainly human-operated accounts that sought to “weave propaganda seamlessly into what appeared to be the nonpolitical musings of an everyday person.”43 Thus, they had employee shifts that lined up with U.S. time zones (see the DoJ indictment previously discussed) and a system of bonuses and fines that encouraged individual “specialists” to produce high-engagement content.44 To appear even more like real Americans, IRA “specialists” played hashtag games45 and posted a high volume of nonpolitical content.

40. Renee DiResta, Kris Shaffer, Becky Ruppel, David Sullivan, Robert Matney, Ryan Fox, Jonathan Albright, and Ben Johnson, “The Tactics and Tropes of the Internet Research Agency,” New Knowledge, published December 17, 2018, https://disinformationreport.blob.core.windows.net/disinformation-report/NewKnowledge-Disinformation-Report-Whitepaper-121718.pdf, p. 5.

41. Ibid., p. 33.

42. Ibid., p. 6.

43. Adrien Chen, “The Agency,” The New York Times Magazine, published June 2, 2015, www.nytimes.com/2015/06/07/magazine/the-agency.html.

44. Ibid.

45. “The Tactics and Tropes of the Internet Research Agency,” p. 13.

Project Lakhta involved far more than pro-Trump and anti-Clinton messages blasted into the ether in the hopes of reaching an audience. In fact, election-related posts accounted for only 7% of IRA Facebook content, 18% of Instagram content, and 6% of Twitter content.46 Rather, the IRA created what my colleagues and I call a “media mirage”—a false, interconnected, multiplatform media landscape, targeting multiple different communities with deceptive, manipulative messaging.47 This “mirage” included a significant portion of apolitical content, and where the content was political, it was often focused on current divisive social issues that energized (or de-energized) members of different communities, rather than specific candidates. This mirage targeted three general communities—right-leaning Americans, left-leaning Americans, and African Americans—as well as more hyper-targeted subcommunities like pro-secessionist Texans, democratic socialists, evangelical Christians, etc. And it targeted them with real news, fake news, disingenuous conversation, and—likely most significant—meme warfare.

The IRA had done their homework—both online and on the ground in the United States—when it came to targeting American communities. (And they constantly retooled their messaging based on user engagement stats, just as one would expect from a digital marketing firm.) In many cases, they targeted communities with specific messages tailored for that community, which, of course, fit the Kremlin’s agenda. For example, they targeted right-wing Americans with narratives that would get them energized to come out and vote against Democratic or moderate candidates—fearmongering narratives about immigration and gun rights, inspirational Christian-themed narratives, and warnings about Clinton’s alleged corruption. They targeted left-leaning Americans with narratives that would de-energize them, turning their Clinton support lukewarm or encouraging a third-party or write-in protest vote—narratives about Clinton’s alleged corruption, the DNC’s undemocratic primary process that denied Bernie Sanders a fair shot, and feminist and intersectional narratives that labeled Clinton a bad feminist. And they targeted African Americans with even more pointed voter-suppression narratives about police brutality or the racist tendencies of both parties, intended to turn them off from the democratic process in general. The ultimate goal was to encourage votes for Trump and—where flipping a leftist to vote Trump wasn’t possible—to depress turnout among Democrats and among demographics that tended to vote Democrat. And they pursued this goal more through general social and political narratives than through posts that referenced the candidates specifically.

46. Ibid., p. 76.

47. Ibid., p. 45.

Notice, though, that most of these narratives are about distrust and fear—not positive American virtues that happened to be consistent with Russian aims. Yes, there were some high-engagement patriotic accounts and pro-Christianity accounts, but overall, even these accounts were about creating an insider/outsider framework that could be operationalized in other contexts with narratives of fear or anger about outsiders. Many of the pro-

 
