
Data versus Democracy


by Kris Shaffer



Chapter 2

Cog in the System

How the Limits of Our Brains Leave Us Vulnerable to Cognitive Hacking

In an attention economy, understanding cognitive psychology gives an informer (or a disinformer) a major advantage in influencing opinion. What attracts attention? How do you hold attention? And how, over time, do you manipulate attention in ways that serve your purposes? These are key questions that we need to find answers to before we can fully grasp the role technology plays in influencing opinion.

Clickbait: You Won’t Believe What Happens Next!

Don’t you hate clickbait? Those online titles like “This Father Found a Rattlesnake in his Infant’s Crib, and You Won’t Believe What Happened Next!” or “27 New Dinner Plans that You Need to Try Before You Die!” As cheap and as cheesy as they sound, these titles are highly engineered, based on research into the clicking habits of millions of users online. Search something like “how to write a viral headline,” and you’ll find no limit of marketing how-to’s telling you to use odd numbers and phrases that create a sense of urgency.


Why have these titles (and, for that matter, the articles attached to them) become so ubiquitous? Clickbait represents, perhaps better than anything else, life on the modern web. The abundance of free content and the limits of the attention economy have media outlets competing for clicks and the advertising dollars they represent. But because even high-quality information can be readily found with little effort, or money, many media outlets no longer compete to have the best content, but to have the most attention-grabbing content.

That’s because in an attention economy, those who thrive aren’t those who control information—the ones with access to the biggest libraries or the best content. In an attention economy, those who thrive are those who control attention—those who can master their own attention and those who can attract, and hold, the attention of others.

But just what is attention? And why is it that the one, two, or three things we’re aware of at any given moment can so readily dominate the massive store of memories we have tucked away in our brains?

Mapping the Cognitive System

Think of your computer. It stores data in several different places—the hard drive, RAM, the processor’s cache—each with its own function, and with its characteristic strengths and weaknesses. The hard drive is the largest data store. Even my 13-inch laptop can hold a terabyte of data on its hard drive. But compared to other data stores, it is also the slowest and least efficient when it comes to accessing, writing, and erasing data.

RAM (random access memory), on the other hand, is considerably smaller, but also faster and more flexible. Because of its power, and its limited size, applications are constantly vying for access to it, and memory management is a key element of optimizing the performance of any app or algorithm.

Even smaller—and more powerful—is the cache memory that lives on the processor itself. This is where the action happens. But it is severely limited by what the processor itself is capable of. That’s because data storage means nothing if you can’t do something with it. The computer’s data processing capability ultimately sets the limits for what we can do with data.

This hard drive, RAM, cache, processor model is a helpful analog for the human cognitive system. It’s an oversimplification, of course, but it’s a helpful starting point, especially if we want to know how the brain interacts with (big) data.
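To make the computer side of that analogy concrete, here is a minimal Python sketch of a three-tier storage hierarchy. It is only an illustration of the tiers described above: the class name, capacities, and access costs are invented placeholders, not figures from this chapter.

    from collections import OrderedDict

    # Made-up, order-of-magnitude access costs and capacities for each tier.
    ACCESS_COST_NS = {"cache": 1, "ram": 100, "disk": 10_000_000}
    CAPACITY = {"cache": 4, "ram": 64}  # items; the "disk" tier is unbounded here

    class MemoryHierarchy:
        def __init__(self):
            self.cache = OrderedDict()  # tiny, fastest: the processor's cache
            self.ram = OrderedDict()    # larger, slower: working storage
            self.disk = {}              # huge, slowest: long-term storage

        def store(self, key, value):
            self.disk[key] = value      # everything ultimately lives on disk

        def load(self, key):
            """Look in cache, then RAM, then disk; promote the item upward
            so that repeated access becomes cheaper."""
            if key in self.cache:
                self.cache.move_to_end(key)
                return self.cache[key], ACCESS_COST_NS["cache"]
            if key in self.ram:
                value, cost = self.ram[key], ACCESS_COST_NS["ram"]
            else:
                value, cost = self.disk[key], ACCESS_COST_NS["disk"]
            self._promote(self.ram, key, value, CAPACITY["ram"])
            self._promote(self.cache, key, value, CAPACITY["cache"])
            return value, cost

        @staticmethod
        def _promote(tier, key, value, capacity):
            tier[key] = value
            tier.move_to_end(key)
            if len(tier) > capacity:
                tier.popitem(last=False)  # evict the least recently used item

    mem = MemoryHierarchy()
    mem.store("vacation photos", "...")
    print(mem.load("vacation photos"))  # first access: a slow trip to "disk"
    print(mem.load("vacation photos"))  # second access: served from "cache"

The point of the sketch is simply that the smaller, faster tiers can hold only what has been used recently, so where a piece of data currently sits determines how cheaply it can be reached.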


In many ways, the brain is one big hard drive. There’s a reason that we refer to a computer’s data storage as “memory,” after all. Both the brain and the hard drive store information in small, hierarchically organized units, which are activated, transferred, and copied via electrical signals. Some cognitive scientists even use the language of “bits” to describe the information our brains process.1

But rather than having separate organs for large/slow storage, fast/efficient storage, and data processing, the brain does it all in one massively complex, and flexible, organ. However, the brain’s capabilities are separated analogously to how a computer works. (Well, technically, it’s the computer that works like the brain.) The largest share of the brain’s storage capability is less efficient and not directly accessible to the “processor.” This is typically called long-term memory (LTM)—where we keep memories of life events (episodic memory), physical skills and processes (procedural memory), and important information like the identity of friends or the meaning of words (semantic memory).2 It’s massive. Each human brain contains the informational complexity of the entire universe of stars, planets, and galaxies.3 But we can’t access it all at once. That’s just too much for the brain to handle. We need something to help us manage all of that data.

The Limits of Conscious Attention

That’s where working memory comes in. Working memory is the combination of what scientists call the “central executive” (the brain’s CPU) and several independent resources for short-term, high-efficiency storage, often called short-term memory (STM—the brain’s RAM).4

Think of it this way. Every memory ever formed—every event you’ve experienced, every word you know, every friend and family member, every skill you’ve formed—is stored somewhere in your long-term memory (LTM). Because it takes energy (in the form of electrical signals) to access these memories, only some of those memories are “activated,” like an application launched and ready to use, or books taken off the shelf and ready to be read.

1. This is based on the “Shannon–Weaver equation,” developed by Claude Shannon and Warren Weaver in The Mathematical Theory of Communication, Urbana, Ill.: University of Illinois Press (1949).
2. Alan D. Baddeley, Human Memory: Theory and Practice, East Sussex: Psychology Press (1997), p. 29ff.
3. Christof Koch and Patricia Kuhl, “Decoding ‘the Most Complex Object in the Universe’,” interview by Ira Flatow, Talk of the Nation, NPR, June 14, 2013, www.npr.org/2013/06/14/191614360/decoding-the-most-complex-object-in-the-universe.
4. Alan D. Baddeley, Human Memory: Theory and Practice, East Sussex: Psychology Press (1997), p. 29ff.


This is short-term memory (STM). When something is activated and placed in STM, it’s only there for a short amount of time. That is, unless we do extra work to keep it alive. (Cognitive musicologists call that work “rehearsal,” but it’s actually a lot like renewing a library book before it’s due.) When memories are in STM, we can do more with them—put them in order, forge relationships between them, and build higher-level groupings of them (called schemas). But only some of what we’ve called into STM forms conscious awareness—the small amount of information being processed right now. That’s the information that makes up the focus of our attention.

The Triggers of Attention

Attention is incredibly expensive. Such a high level of neural activation, not to mention the processing that goes on, requires large amounts of energy. That energy comes at a premium. And so the brain keeps all but a tiny window of time and data at a lower state of activation, away from the costly power of consciousness.

Of course, such an efficient and orderly system requires structure—rules that decide what we pay attention to, and when. When it comes to external stimuli—sights, sounds, smells—there are many rules of prioritization based on millennia of natural selection.

As we’ve already discussed, our genetic code was largely written by our ancient ancestors as they fought for survival.5 Thus, things that meant life or death on the savannah 150,000 years ago are more likely to command our attention today. If someone sneaks up behind you and makes a loud, sudden noise, you’ll jump, suck in extra oxygen (gasp), your hair will stand on end (the vestige of our ancestors making themselves look bigger and fiercer than they are), and your heart will speed up (preparing to deliver that extra oxygen to your muscles, whether they need it for fight or for flight). Perhaps surprisingly, if they sneak up on you again, but this time they tell you first, some of those reflexes will still kick in, even though you knew it was coming. Some of our ancestors survived because these reflexes were so fast and reliable that the brain could never turn them off. (In fact, some of these reflexes, called “spinal reflexes,”6 are directed by the spinal cord, because the round trip of our nerve impulses all the way to the brain and back simply takes too long!) And those ancestors who survived passed their saber-tooth-tiger-evading genes on to us.

5. Jerome H. Barkow, Leda Cosmides, and John Tooby (eds.), The Adapted Mind: Evolutionary Psychology and the Generation of Culture, New York, NY: Oxford University Press (1992).
6. James Knierim, “Spinal Reflexes and Descending Motor Pathways,” in Neuroscience Online, University of Texas McGovern Medical School, accessed February 8, 2019, https://nba.uth.tmc.edu/neuroscience/s3/chapter02.html.


Other memories—those less critical to our ancestors’ survival—can be recalled on purpose. Who is that? What’s their phone number? I came into this room for something, what was it?

Still others feel more natural, more automatic. We may have to actively recall our new colleague’s name on their second day of work, but it doesn’t take any work at all to remember our spouse’s, partner’s, or child’s name. That’s because inside our “big data” brain lies the world’s most advanced predictive analytics engine. Data we access often remains more “activated” than data seldom accessed, like the spices that always end up in the front and center of the cabinet, despite our attempts to alphabetize. Memories that are related to other memories we’ve already brought to a higher state of activation will likewise rise in activation. This is why we might not remember that password or that phone number out of the blue, but when we sit at the keyboard or pick up the phone, it just comes to us. Cognitive scientists call this priming.

Priming is an essential part of how our brain manages our memories. It anticipates our needs, delivers what we need in time to respond quickly (just in case), and it does so while keeping energy usage low. But because priming is such a key element of how we manage our own attention, it can also be used by others, in combination with hard-wired evolutionary rules for processing stimuli, to command our attention and to control, at least in part, the way we think about the world.
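As a rough way to picture how priming operates, here is a toy “spreading activation” sketch in Python. The memories, associations, numbers, and decay rule are all invented for illustration; this is not a model taken from the chapter.

    from collections import defaultdict

    DECAY = 0.5             # how much activation fades with each "time step"
    SPREAD = 0.4            # fraction of a boost that spills onto related memories
    RECALL_THRESHOLD = 0.6  # activation needed for something to "come to mind"

    activation = defaultdict(float)
    associations = {
        "keyboard": ["password", "email"],
        "phone": ["phone number"],
    }

    def tick():
        """Time passes: every memory drifts back toward a low resting state."""
        for memory in activation:
            activation[memory] *= DECAY

    def access(memory):
        """Bringing a memory to mind boosts it and primes its associates."""
        activation[memory] += 1.0
        for related in associations.get(memory, []):
            activation[related] += SPREAD

    def comes_to_mind(memory):
        return activation[memory] >= RECALL_THRESHOLD

    access("keyboard")                    # sit down at the keyboard...
    tick()                                # ...a moment passes...
    access("keyboard")                    # ...and start typing
    print(comes_to_mind("password"))      # True: "password" has been primed
    print(comes_to_mind("phone number"))  # False: nothing has primed it

In this toy version, accessing “keyboard” twice leaves “password” sitting just above the recall threshold, which is the sense in which a primed memory “just comes to us.”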

Familiarity Breeds Believability: The Role of Unconscious Memory

We are constantly evaluating everything we perceive. Is it safe or dangerous? Is it good or bad? Is it exciting or boring? Did we expect it, or was it a surprise? These evaluations feed into what we experience as emotions. When something is surprising and dangerous, the result is fear. If that surprising and dangerous perception holds our attention for a long time, the result is terror. If, on the other hand, that surprising thing turns out to be not very dangerous at all, the result might be laughter—or as cognitive scientist David Huron calls it, “pleasurable panting” (what you do with all that oxygen your immediate fear responses sucked in).7 If something is predictable and harmless, the result might be boredom, especially if that something keeps going on for a long time (think of a concert or a lecture you found uninteresting, but had no way to escape). On the other hand, if something is difficult to predict, but also harmless, the result can be disorientation, and if it goes on for a long time, frustration.8 This is the core of the creepy-crawly feeling some people get when listening to avant-garde music, or the repulsion they feel when they experience other avant-garde art forms. It’s just music, or a painting, or a sculpture, but the fact that it arrests our attention so strongly, for so long, while the brain doesn’t know how to make sense of it, is a very bad thing, evolutionarily speaking. And it makes perfect sense that our ancestors would have evolved an emotional response to such situations, one that would cause them to try to avoid them in the future.

7. David Huron, Sweet Anticipation: Music and the Psychology of Expectation, Cambridge, Mass.: The MIT Press (2006), p. 26.

Emotion can be a very complex thing. Entire books have been written about the cognitive science of emotion. But for our purposes in understanding how humans interact with online media, we’ll cover just a few key elements: things that affect our ability (or inability) to recognize disinformation, misinformation, and “fake news” online, as well as things that promote social polarization. We’ll start with something simple, yet powerful: the mere exposure effect.

Because of our evolutionary past, we are always assessing whether something we encounter or perceive—a stimulus—is positive or negative. (Scientists call the positivity or negativity of a stimulus its valence.) Our life-or-death evolutionary history means that we make some of these assessments very quickly. These snap judgments can save our lives, but if they happen unconsciously in certain social settings, they can form the stuff of overgeneralizations and stereotypes, and can even lead to sexism and racism.

One factor that contributes to a positive affect when we encounter a stimulus is the ease with which our brain can process what it perceives—called perceptual fluency. If the brain can process, understand, and evaluate the stimulus quickly and without difficulty, the result is a positive affect or emotion. This is why, for example, notes right next to each other on the piano (and the cochlea) are “dissonant” rather than forming a pleasant chord.9 Things that are harder for the brain to process—similar colors, sounds, or appearances—are harder to distinguish and place into categories, the way the brain likes.

Differences in color, musical pitch, and the like are, for the most part, hard-wired differences. We may be able through artistic and musical training to make finer judgments than the average person, but not much finer. There are physical limitations in our inner ear and on our retina that determine how finely we can develop our perceptual abilities.

But other differences are learned. We learn language, bodily gestures, faces, and skills through repeated exposure. Because the brain conserves energy by preactivating parts of memory that it anticipates us needing, and because it’s more likely to anticipate something that we encounter regularly than something we encounter rarely (let alone something entirely new), the brain preactivates the resources needed to parse familiar things far more readily than the resources needed to parse unfamiliar things. As a result, the brain processes familiar things faster and more easily than it processes unfamiliar things. Finally, since fast, easy processing is associated with a positive reaction, familiar things tend to lead to more positive emotions than unfamiliar things, all else being equal.

8. Patrick Colm Hogan, Cognitive Science, Literature, and the Arts, New York: Routledge (2003), pp. 9–11.
9. R. Plomp and J. M. Levelt, “Tonal Consonance and Critical Bandwidth,” in Journal of the Acoustical Society of America 37 (1965), pp. 548–60.

This makes sense. When a piece of music ends the way we might expect, that feels good. No surprises. Everything is in its right place. And that’s true even when we don’t know enough music theory to explain what “should” happen

 
