
Data Versus Democracy


by Kris Shaffer


  …Americans who held views that, no matter how real and personal, happened to
  align with the Kremlin’s interests. By reinforcing existing social biases and
  personal interests early on, the IRA was able to “hack” the model so that the
  users most likely to engage with Kremlin-friendly messages saw more of them,
  even if the sources were legitimate.
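
  To make that mechanism concrete, here is a toy sketch in Python. It is not any
  platform’s actual ranking code, and every name and data point in it is
  hypothetical; it only illustrates the feedback loop described above: seed
  engagement among a receptive audience, and an engagement-optimized model
  raises the content’s rank for everyone who resembles that audience.

      # Toy engagement-based ranking (hypothetical, illustrative only): a post's
      # score for a user grows with every prior engagement from users whose
      # interests overlap that user's own.

      def score(post_engagers, user_interests, all_users):
          return sum(len(all_users[u] & user_interests) for u in post_engagers)

      all_users = {
          "seeded_1": {"guns", "faith"},       # receptive users courted early on
          "seeded_2": {"guns"},
          "bystander": {"guns", "gardening"},  # never targeted directly
      }

      # Early, organic-looking engagement from the seeded users...
      post_engagers = ["seeded_1", "seeded_2"]

      # ...now boosts the post for anyone who shares their interests.
      print(score(post_engagers, all_users["bystander"], all_users))  # -> 2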

  This is a situation where the contrast between disinformation as manipulative
  behavior and disinformation as deceptive messages comes into sharp relief. By
  researching and weaponizing existing social biases, the IRA was able to wield
  those earnestly held beliefs in its efforts to manipulate unwitting Americans
  into taking actions they may or may not have otherwise taken. And that was
  true regardless of whether or not the posts and memes were factual.

  No amount of individual fact-checking will solve this problem. This is first
  and foremost a behavioral issue that can only be seen and solved with the
  analysis of data on a large scale: cross-community, cross-platform, over long
  stretches of time. But if platforms, governments, and/or third-party
  researchers conduct that large-scale analysis, and accounts are suspended for
  this coordinated, inorganic, manipulative behavior, then those more
  sophisticated state actors lose many of their best tools for manipulating
  communities into unwittingly doing their bidding. And in the process, we all
  regain the integrity of our information space.
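
  As a minimal sketch of what one such behavioral signal might look like,
  consider the Python snippet below. It is not any platform’s production system
  (real detection combines many more signals over months of cross-platform
  data); it simply flags pairs of accounts that repeatedly post identical text
  within seconds of each other, one common signature of coordinated, inorganic
  behavior.

      # Hypothetical, simplified coordination detector: flag account pairs that
      # repeatedly post the same text nearly simultaneously.
      from collections import defaultdict
      from itertools import combinations

      def coordinated_pairs(posts, window_secs=60, min_hits=3):
          """posts: iterable of (account, unix_time, text) tuples."""
          by_text = defaultdict(list)
          for account, t, text in posts:
              by_text[text].append((account, t))
          hits = defaultdict(int)
          for copies in by_text.values():
              for (a1, t1), (a2, t2) in combinations(copies, 2):
                  if a1 != a2 and abs(t1 - t2) <= window_secs:
                      hits[frozenset((a1, a2))] += 1
          return {pair for pair, n in hits.items() if n >= min_hits}

      # Two accounts echoing each other three times within seconds get flagged.
      posts = [(acct, t + delay, f"msg {i}")
               for i, t in enumerate([0, 100, 200])
               for acct, delay in [("acct_a", 0), ("acct_b", 5)]]
      print(coordinated_pairs(posts))  # -> {frozenset({'acct_a', 'acct_b'})}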

  That integrity doesn’t come easy, especially when we are facing influence
  operations from sophisticated actors like a military or a state-sponsored
  firm. As Russia’s operations in particular demonstrate, disinformation is
  international and cross-platform, and it weaponizes fact and fiction alike.
  As a result, no one entity has all of the necessary data or expertise to
  solve the problem. Platforms don’t have each other’s data, nor the data held
  by intelligence or law enforcement agencies. Nor do they have the cultural
  and linguistic experts that the intelligence agencies, universities, and
  nonprofit think tanks do. Governments don’t have the platforms’ data (and can
  only collect data under certain legal circumstances), nor do they have many
  employees with significant engineering experience. And while third-party
  researchers have, at times, a healthier mix of experts in different areas,
  they also have less direct access to platform data. As the U.S. government
  realized after the September 11 attacks, information sharing and the
  collaborative use of a variety of areas of expertise will be key to
  addressing the most sophisticated disinformation campaigns. And given the
  legal constraints Western governments rightly face when it comes to
  surveilling their own citizens, and the competitive relationship the tech
  companies have with each other, it is likely that carefully structured
  public-private partnerships will be required to address the disinformation
  defense needs that societies around the globe are increasingly encountering.

  But What Can I Do?

  It’s clear that disinformation is a large-scale problem that requires
  large-scale solutions. Platforms, governments, and researchers must all do
  their part to address this rapidly evolving problem collaboratively. But is
  there anything that individuals can do?

  We’ve already addressed a few ways that individuals can alter the inputs that
  big data algorithms use to determine the content that we see on our favorite
  platforms. Mindfulness about what we consume, what we engage with, what we
  share, and what we create is absolutely essential, both to fight “fake news”
  and to fight the amplification of social bias and polarizing messaging.

  Similarly, we can limit the data about us that individual algorithms have
  access to. Consuming some content offline, registering for different services
  with different email addresses, using different browsers for different
  services, even paying cash and avoiding buyer rewards programs can keep our
  personal data in different, smaller piles rather than one big one that many
  platforms can access. The result may be less relevant ads (is that really a
  problem?!), but also less effective hyper-targeting with messages designed to
  manipulate.
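
  The “smaller piles” point is easy to see in code. The sketch below is a
  hedged illustration with hypothetical data and field names: records from
  different services can only be merged into one profile when they share an
  identifier, so reusing one email address everywhere hands aggregators the
  join key, while separate addresses keep the piles apart.

      # Hypothetical illustration: shared identifiers are what let separate
      # data piles be joined into one rich profile.

      records = [
          {"service": "social",   "email": "me@example.com",  "signal": "likes"},
          {"service": "shopping", "email": "me@example.com",  "signal": "purchases"},
          {"service": "news",     "email": "alt@example.com", "signal": "reading"},
      ]

      def merge_by_email(records):
          piles = {}
          for r in records:
              piles.setdefault(r["email"], []).append((r["service"], r["signal"]))
          return piles

      # One reused address joins social + shopping into a single profile; the
      # alternate address keeps the news history in its own smaller pile.
      for email, pile in merge_by_email(records).items():
          print(email, pile)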

  But ultimately, some networked activism and collective political action will
  be necessary. Elected leaders, regulators, and platform executives need to be
  involved in solving the propaganda problem, and to do that they need to be
  motivated: electorally, legally, financially. But for that to work, we don’t
  just need governments and corporations working together; all of us need to
  work together.

  Democracy depends on the free flow of information, both to inform and to
  afford all of us the chance to deliberate and persuade each other. If we
  can’t trust the integrity of the information we consume, we can’t trust the
  democratic process.

  I opened Chapter 6 with a quote from Zeynep Tufekci that bears repeating:
  “Technologies alter our ability to preserve and circulate ideas and stories,
  the ways in which we connect and converse, the people with whom we can
  interact, the things that we can see, and the structures of power that
  oversee the means of contact.”³ Technologies alter the structures of power.
  That’s not necessarily a bad thing (remember Ferguson), nor is it necessarily
  a good thing (remember GamerGate), but it’s absolutely not a neutral thing.
  It’s up to us to invent, implement, and rein in technologies in ways that
  bring our vision of what the world could be to fruition.

  ³ Zeynep Tufekci, Twitter and Tear Gas: The Power and Fragility of Networked
  Protest (New Haven: Yale University Press, 2017), 5.

  Too often we are passive, going with the flow, letting the affordances of the
  technology (and its manipulation by more motivated, and often less ethical,
  actors) determine the future we are building. If we want to make a positive
  mark on the universe, we need mindfulness, deliberateness, and collaboration
  as we step into the next chapter of our history. The problem likely won’t go
  away anytime soon. But that shouldn’t stop us from working to minimize its
  effects and harness the power of new technologies for good.

  Index

  A
  4chan, 57, 58, 60–64
  8chan, 57–60
  Abundance, 4–6
  Activists, 48, 49, 51–53, 55, 58, 61, 64
  Adaptations, 4
  Advertising, 20, 27–29
  Advertising, targeted, 110
  Algorithmic neutrality, 34
  Algorithmic recommendation, 12–15
  Algorithmic timeline, 54
  Amnesty International, 54
  Apple, 27
  Application programming interface (API), 96
  Arab Spring, 93, 94, 101
  Attentional blink, 109
  Attention economy, 10–12

  B
  Big data algorithms, 114
  Black Lives Matter movement, 47–49, 53, 61
  Botnet, 85
  Bots, 48, 58, 59, 95–97
  Brazil, 95–97
  Breitbart, 60, 62
  Bronco Bots, 104

  C
  Capitalism, 7, 9
  Carbohydrates, 4, 5
  Clickbait, 19
  Clinton, Hillary, 67–69, 75, 77, 80, 81, 84, 86
  Cognitive system
    clickbait, 19
    human brain, 21
    memory, 21
    processor, 21
    RAM, 20
    STM and LTM, 22
    triggers, attention, 22, 23
    unconscious memory
      characters, 25
      emotions, 23, 24
      familiar and unfamiliar things, 25
      iPod, 27
      learn patterns, 26
      life-or-death evolutionary history, 24
      marketers, 28
      perceptual fluency, 24
      physical limitations, 24
      pleasurable panting, 23
      psychological principle, 27
      social media and big data analytics, 28
      web technology, 29
  Collaborative filtering, 36–40, 43, 110
  Comey, James, 68, 82
  Confirmation bias, 109
  Content recommendation systems
    bias amplifier
      collaborative filtering, 39, 40
      feedback loop, 38–40
      polarization, 41
    online advertising, 32
      algorithmic neutrality, 34
      collaborative filtering, 43
      Google’s search engine, 32–34
      information consumption, 43
      machine learning model, 34
      optimal engagement, 42
      promote engagement, 42
      stream works, 35–38
  Crap detector, 6
  Crash override network, 59
  Crimea, 70, 71
  Currency, 11, 12

  D
  Democratic National Committee (DNC), 69, 76, 80–82, 86
  Depression Quest, 56, 58
  Digital community center, 52
  Digital revolution
    automobile, 92
    Facebook, 94
    frictionless design, 94
    industrial technology, 92
    social structures, 94
  Duterte, Rodrigo, 97

  E
  Economy, 6–9, 12
  Egypt, 93–95
  Election, 2016 U.S. presidential, 67, 76
  Emotional preference, 4
  Evolutionary psychology, 4

  F
  Facebook, 47, 50, 54, 63, 96
  Fake news, 5
  Fancy Bear (APT28), 75
  Feedback loop, 38, 39, 41
  Feudalism, 7, 9
  Food scarcity, 6

  G
  Gamelan, 26
  GamerGate, 55, 59
  Genetic code, 5
  Global watchdog agencies, 99
  Google, 32–35, 42

  H
  Homo sapiens, 6
  Human rights, 101
  Human Rights Watch group, 101

  I, J, K
  Ice Bucket Challenge, 50
  Industrial revolution, 6
  Information laundering, 16
  Information warfare, 67, 69, 70, 72, 74, 82, 83
  Internet Research Agency (IRA), 75, 76, 82–85, 87
    Project Lakhta
      botnet, 85
      IRA operations, 87
      organization, 84

  L
  Latin American politics
    account profiles, 104
    botnets, 104
    network of bots or fake accounts, 104
    peace deal, 105
    peer-to-peer messaging, 105, 106
    threat, 103
    violence, 104
  LGBTQ community, 57, 60
  Long-range acoustic devices (LRADs), 47, 50, 51
  Long-term memory (LTM), 21

  M
  Mainstream media vs. social media, 51
  Market-based capitalism, 7
  Market-driven economy, 7
  McKesson, DeRay, 48, 49, 52, 53, 64
  Media, 6, 9–13, 15–18
  Media strategy, 98
  Mere exposure effect, 24
  Microtargeting, 29
  Misinformation, 16
  Models, 32, 34–38
  Moore’s law, 8
  Myanmar, 95, 99, 102
    automation, 102
    digital technology, 100
    ethnic cleansing, 102
    Facebook, 101
    internet, 100, 101
    local rumor mills, 101
    mass media technology, 100
    mobile phone, 101
    violence, 99
    vulnerability, 102

  N, O
  Natural selection, 5
  Negative emotions, 4

  P
  Peer-to-peer disinformation, 107
  Perceptual fluency, 24
  Philippines, 97, 98
  Pizzagate conspiracy theory, 61
  Pleistocene period, 4
  Pluralistic ignorance, 93, 110
  Polarization, 35, 41, 44
  Police, 49–52, 54, 59–61
  Political campaigns, 95
  Priming, 23
  Product placement, 27
  Professional development, 92
  Project Lakhta, 82–87
  Propaganda, defined, 15–17
  Propaganda problem
    big data algorithms, 114
    collaborative filtering, 110
    community mirage, 112
    confirmation bias, 109
    disinformation, 110, 113, 114
    individual algorithms, 114
    media mirage, 112
    pluralistic ignorance, 110
    priming, 109
    public-private partnerships, 114
    social stereotypes, 111
    vicious cycle, 111
  Psychological warfare, 101, 102, 108

  Q
  Quinn, Zoë, 56–59, 64

  R
  Random access memory (RAM), 20
  Recommendation engine, 13, 17
  Reddit, 57, 60, 63, 64
  Responsibility, moral/ethical, 64
  Rohingya, 99–102
  Rousseff, Dilma, 95, 96
  Russia, 69–75, 77, 80, 82
  Russian disinformation operations
    active measures in Baltic, 72–74
    Crimea, 70, 71
    election, 2016 U.S. presidential, 67
    information warfare, 67, 69, 70
    Russian [Twitter] bots, 69
    Ukrainian separatists, 70–72
  Russian military intelligence (GRU), 75
    APT28
      DNC, 69, 86
      hacks of Democratic Party, 76, 77

  S
  Said, Khaled, 93
  Scarcity, 17
  Short-term memory (STM), 22
  Social-justice warrior (SJWs), 55
  Social media, 91, 95
  Social platforms, impact, 64
  Sockpuppets, 48, 58, 63
  Stream, 6
  Subreddits, 57, 58, 63
  Supply and demand, 5–10, 12
  Sweden, 72–74

  T
  Tahrir Square, 93
  Trolls, 47, 56–61
  Trump, Donald, 68, 69, 75, 80, 81, 83, 86, 87
  Tufekci, Zeynep, 92, 93
  Twitter, 49–51, 96, 104

  U
  Ukraine, 69–72
  U.S. Senate Select Committee on Intelligence (SSCI), 82, 83

  V
  Verificado 2018, 106, 108
  Vine, 47, 49, 53
  VKontakte, 71, 85

  W, X
  Weak ties, 110
  WhatsApp, 105–107

  Y, Z
  Yiannopoulos, Milo, 60, 62

