But Russia is not the only player. The far-right, populist groups that Russia has
bolstered in the United States and Europe are real. The refugee crisis triggering
the anger and activism of those anti-immigrant populists is also real—though
Russia’s activity in Syria and elsewhere certainly fuels that crisis. These groups
were already having an impact within their own countries before Russia
stepped in and lent them a hand—if they even needed to—and they continue
to have a real influence, irrespective of any help they might get from Russia or
their allies.
And there are other nations using online propaganda to further their
geopolitical aims. Seven authoritarian nations have a budget for influence
operations and propaganda,7 and Twitter recently announced the discovery of
an Iranian operation aimed at Western users of its platform. Private companies
from the United States, the United Kingdom, and Israel are certainly major
players. And the GamerGate trolls haven’t gone away, either.
In what follows, we’ll survey this new theater of battle, getting the lay of the
land in this global information war: who the players are, what battles have
been fought, how platforms and governments have responded, and how we
got to where we ended up in 2016, when Russian influence operations played
a significant, and possibly decisive, role in the election of the president of the
United States.
Ukrainian “Separatists”
There’s no better place to start exploring Russian online disinformation than
Ukraine. According to an exhaustive RAND study on Russian influence
operations in Ukraine, “The annexation of Crimea in 2014 kicked off the
debut of online Russian propaganda on the world stage.”8 In coordination with
military operations in Ukraine and worldwide diplomatic operations aimed at
global recognition of the invasion, Russia conducted information warfare both
within Ukraine and without, in order to reinforce the idea of Crimea’s
“Russianness” to Crimeans, Ukrainians, and the larger global community.
The RAND study found several important strains in Russia’s influence
operations around the Crimean invasion. First, they targeted ethnic Russians
in Crimea and eastern Ukraine, many of whom were Soviet-era transplants
and their descendants. Russian-language media still dominates the Ukrainian
information landscape, and Russia took advantage of this inroad to promote
unity between Russia and ethnically Russian Ukrainians. This had the triple
advantage of promoting the Russianness of Crimea, encouraging pro-Russia
political activity within Ukraine more broadly, and supporting pro-Russia
Ukrainian separatist movements (in conjunction with military support of
those same separatists). Ukraine responded, in part, by blocking Russian
media access in 2014, but as of 2017, the Russian social media platform
VKontakte was still the most popular in Ukraine, followed by American-
owned Facebook.9

7Philip Howard in: “Foreign Influence on Social Media Platforms: Perspectives from Third-Party Social Media Experts,” U.S. Senate Select Committee on Intelligence, Open Hearing, August 1, 2018, www.intelligence.senate.gov/hearings/open-hearing-foreign-influence-operations'-use-social-media-platforms-third-party-expert.
8Todd C. Helmus, Elizabeth Bodine-Baron, Andrew Radin, Madeline Magnuson, Joshua Mendelsohn, William Marcellino, Andriy Bega, and Zev Winkelman, Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe (Santa Monica: RAND Corporation, 2018), DOI: 10.7249/RR2237, p. 15.
Russia also targeted Ukrainians with its now standard Euro-skeptic narratives,
lest the EU and/or NATO expand to encompass one of Russia’s chief trade
partners and line itself up on Russia’s southwestern border. Paired with this
Euro-skepticism came narratives of corruption around Ukrainian political
leadership. Russian outlets often portrayed Ukraine as being governed by
white supremacist fascists.10 Russia, of course, was Ukraine’s stable neighbor
and friend, a stalwart of traditional, moral, Christian values.
In the “far abroad”—more distant members of the world community, not
sharing a border with Russia or one of its historical Warsaw Pact vassals—
Russia advanced narratives in a variety of languages echoing many of the
pro-Russia and anti-Ukraine messages it shared more locally. While the
goal locally was to bring Ukraine back into Russia’s sphere of influence—if
not its direct control—the goal globally was primarily one of non-
intervention. By advancing narratives of Ukrainian corruption, Russian
virtue, and Crimea’s historical and ethnic Russianness, by promoting fear of
escalated global conflict, and by sowing distrust of the United States, the EU,
and NATO, Russia encouraged the global community to accept Crimea’s
annexation, or at least not to intervene.
Perhaps the starkest example of Russia’s anti-Ukrainian influence operations
involves the false narratives around the tragedy that befell Malaysia
Airlines Flight 17 in July 2014. While Western intelligence and the Dutch-led
Joint Investigation Team have concluded that Russia and/or Russian-backed
Ukrainian separatists are to blame,11 Russia used the tragedy to smear
Ukrainian forces as immoral and incompetent and to portray Russia itself as
an important stabilizing force. Along the way, though, multiple contradictory
explanations
were advanced by Russian propaganda outlets. Because, for all the value of
people believing the anti-Ukrainian narrative, Russia had to know they would
be found out. However, by advancing multiple narratives through different
outlets to different audiences, Russia sowed confusion, gave the investigators
more to investigate, and prolonged the general sense of disorientation. And for
many, that sense of “I don’t know whom or what to believe”—“paralysis through
propaganda”12—will last longer than knowledge of the ultimate findings of the
investigators, especially if those findings are delayed by Russian obfuscation
until new stories dominate the news cycle.

9Mariia Zhdanova and Dariya Orlova, “Ukraine: External Threats and Internal Challenges,” in Computational Propaganda, ed. Samuel Woolley and Philip N. Howard (Oxford: Oxford University Press, 2018), 47.
10Russian Social Media Influence, p. 104.
11Zhdanova and Orlova, p. 55.
It’s important to note that, for all the talk of bots in other locales, bots appear
to have played only small roles in Ukraine. According to Ukrainian media and
propaganda researchers Mariia Zhdanova and Dariya Orlova, “automated
bots seem to be less widespread in Ukraine.” Instead, “manually maintained
fake accounts are one of the most popular instruments for online campaigns”
aimed at the Ukrainian public.13 This may be due to the relatively low
saturation in Ukraine of bot-friendly platforms like Twitter, when compared
to platforms like Facebook, where automation is more difficult and therefore
more expensive.
Though the annexation happened in 2014, and global attention has largely
been turned elsewhere, Russia’s information warfare around Ukraine is
ongoing, even in English. In my research into Russian disinformation, I
regularly encounter new (to me) web sites and Twitter and Facebook
accounts advancing anti-Ukraine narratives and promoting the right to self-
rule of the “separatists” in Donetsk, Luhansk, and Novorossiya. Once these
assets are developed, it is rare that Russia removes them voluntarily—
though it may retask them. Rather, these inexpensive, and at times effective,
operations continue until the assets are compromised or deleted by the
platforms on which they operate.
Active Measures in the Baltic
Sweden, along with Finland, has long played a significant role as a buffer
between Russia (and formerly the USSR and Warsaw Pact countries) and
NATO. Because of this, Soviet spies were incredibly active in Sweden during
the Cold War, gathering intelligence and conducting operations, in the hopes
of preventing any eastward NATO expansion.14 In the twenty-first century,
the threat to Russia is less military and likely more economic (despite Russian
statements to the contrary). To get exports, like oil, out of the Baltic Sea,
Russia has to pass between Denmark (NATO and EU member) and Sweden.
12Russian Social Media Influence, p. 9.
13Zhdanova and Orlova, p. 51.
14Kragh and Åsberg, p. 8.
Given the economic pressure, including sanctions, that the United States and
NATO have put on Russia’s oligarchy since 2009—and the even stricter
sanctions placed on their geopolitical ally, Iran—the threat of NATO
encroachment via Sweden is real: not the threat of a military invasion, but the
threat of a trade blockade in the Baltic, enforced by or through Sweden.
So when, in 2015, Sweden began considering an agreement with NATO that
would allow NATO military forces access to Swedish territory, Russia acted.
In the information space.
In early 2015, the state-run propaganda outlet Sputnik News launched a
Swedish-language web site. Through that web site, a television network run
by RT (formerly “Russia Today,” an international, Russian-state-owned media
outlet), and numerous covert channels, Russia flooded Sweden with
propaganda.15
This propaganda included standard Russian narratives: anti-NATO messaging,
fear mongering about impending nuclear war (caused, of course, by U.S.-led
NATO action), narratives about how the EU is in decline, and even anti-GMO
and anti-immigration narratives.16 This was, of course, alongside the
dissemination and amplification of false narratives surrounding the Sweden-
NATO agreement.
Researchers Martin Kragh and Sebastian Åsberg studied this Russian influence
operation extensively. They found that, as is common for Russia, “Misleading
half-truths are the norm” and “outright fabrications occur on a limited
scope.”17 While they did find a number of easily disproven fabrications and
forgeries, most Russian messaging amplified Russia-friendly narratives that
already existed in Swedish culture, particularly from groups on the fringe.
One sequence of false narratives stands out, in particular. Beginning in late
2014, there were several sightings of unidentified foreign submarines on or
near the Swedish coast. These sightings were credible and were reported in
legitimate, mainstream outlets. But this wasn’t the first time that strange
things had happened involving foreign submarines on the Swedish coast. Kragh
and Åsberg write:
When the Soviet submarine S-363 ran aground in 1981 on the south
coast of Sweden, a forged telegram soon appeared in media purportedly
written by the Swedish ambassador to Washington, Wilhelm
Wachtmeister. The telegram expresses the ambassador’s profound
disappointment over a secret agreement between Stockholm and
Washington, providing U.S. submarines access to Swedish military
bases in wartime. The telegram was immediately revealed as a Soviet
forgery, but its content continued to circulate in the Swedish debate.18

15Neil MacFarquhar, “A Powerful Russian Weapon: The Spread of False Stories,” New York Times, August 28, 2016, www.nytimes.com/2016/08/29/world/europe/russia-sweden-disinformation.html.
16Kragh and Åsberg, p. 16.
17Ibid.
Russian propagandists capitalized on this, not only spreading false rumors
about the 2014–2015 submarines but explicitly connecting them to the old,
false (yet still believed by many) rumors of secret military arrangements
between the United States/NATO and Sweden that might anger Russia
and put Sweden on the front lines of a (nuclear) World War III.
The Sweden-NATO host agreement was ratified in May 2016, but Russian
influence operations continue in Sweden to this day. According to a journalist
who allegedly worked undercover in Russia’s Internet Research Agency (the
ones responsible for Project Lakhta), Russian propagandists had their sights
set on Sweden’s 2018 national election.19 And it only makes sense. The
NATO threat that Russia perceives hasn’t gone away, and if anything, Russian
military pressure seems to be increasing in the Baltic and Scandinavia.20
There’s no reason to expect Russian information warfare in Sweden to let
up any time soon.
Fancy Bear and the Great Meme War of 2016
Russia is known or suspected to have conducted numerous other campaigns
of information warfare online, but the one that has likely received the most
detailed scrutiny is its effort to influence the 2016 U.S. presidential election.
Russia conducted at least four influence operations around the 2016 U.S.
presidential election aimed at removing Western roadblocks to Russian
geopolitical aims, in particular the Obama-era sanctions against
Russia and Russian oligarchs. All of these operations—and likely others of
which the American public is not (yet) aware—are directed from the
Presidential Administration: Vladimir Vladimirovich Putin and his closest
associates.
18Ibid., p. 9.
19Jill Bederoff, “Journalist who infiltrated Putin’s troll factory warns of Russian propaganda in the upcoming Swedish election - ‘We were forced to create fake facts and news’,” Business Insider, April 7, 2018, https://nordic.businessinsider.com/journalist-who-infiltrated-putins-troll-factory-warns-of-russian-propaganda-in-the-upcoming-swedish-election---we-were-forced-to-create-fake-facts-and-news--/.
20“Russia’s growing threat to north Europe,” The Economist, October 6, 2018, www.economist.com/europe/2018/10/06/russias-growing-threat-to-north-europe.
The first influence operation was the work of a team of hackers within Russian
military intelligence (the GRU), a team known as APT28, or Fancy Bear. Fancy
Bear hacked key Democratic targets and made public compromising material to
discredit them and hurt Hillary Clinton’s chances of winning the election. The
second public influence operation was undertaken by the Internet Research
Agency (IRA) on social media platforms like Facebook, Instagram, Twitter,
YouTube, Tumblr, Medium, and others. They created and amplified media that
supported Donald Trump, denigrated Hillary Clinton, and encouraged many
groups on the American political left to vote third party or disengage from
the electoral process. Both of these operations were aimed directly at
American citizens, and took place in public, online.
Two other more covert and more business-oriented operations were
conducted simultaneously. One involved the cultivation of human assets by
building relationships between Russian operatives like Maria Butina and
American public figures and business leaders, mostly (but not exclusively) on
the political right. In the case of Maria Butina, who was convicted of conspiracy
against the United States in 2018, the goal appears to have been to build
solidarity between Russia and American conservatives, possibly in the hopes
of convincing them to eliminate sanctions against Russia if/when they came to
power.21 The other operation saw Russian oligarchs and business leaders
seeking to cultivate financial relationships with Donald Trump, his family, and
his close associates and advisors. This is hardly different from how Putin’s
Presidential Administration relates to Russia’s oligarchs and business leaders.
Russia operates under a quid pro quo relationship between the government
and businesses, enriching those involved in this relationship, and making it
difficult for their competitors to do business, at least on a level playing field.
Where in the United States a corrupt individual may (attempt to) bribe a
public official with a payout in return for favorable legislation, in Russia it is
more common to enter into a business relationship, or joint ownership of an
asset, where the government official enacts favorable legislation or regulation
so that the business thrives, enriching both the corrupt business person and
the corrupt public official. (For a deep dive into these kinds of practices, see
Putin’s Kleptocracy: Who Owns Russia?22)
These human relationships are of the utmost importance, as they are the end