Most of the investigations into Russian interference in the US electoral campaign and the alleged collusion between Donald Trump’s presidential electoral campaign and Russian elements are now over; the debate will continue to rage in the US media, and questions will endure about how Special Counsel Robert Mueller discharged his duties. Still, the report is out, and most of the opinions now expressed fall largely along well-established partisan lines. Yet a less-discussed but nevertheless critical aspect of this episode is what it reveals about the broader details of Russia’s disinformation campaigns on social media. The Russian Internet Research Agency (IRA), which is accused of running a ‘troll farm’, generated most of the hostile activity related to the US elections. It had two goals in the run up to the 2016 US election, pursued through a wider series of operations known as ‘Project Lakhta’: a broader goal of ‘sowing discord in the U.S. political system’, and the narrower one of getting Donald Trump elected president.
Russia has long felt the West has been using the internet to influence Russia’s youth and, in Moscow’s view, the West simply got a taste of its own medicine. This operation was almost certainly seen in Moscow as defensive, notwithstanding its offensive character, being implemented to expose the hypocrisy Russia saw in Western societies and their discourse about democracy.
And the operation was long in the planning. In 2014 the Russian IRA consolidated its US ‘specialists’ into its ‘Translator’ department and started creating fictitious personas on social media networks; these were imaginary individuals with profile images created in-house at the Russian IRA.
These fictitious individuals, in turn, produced written posts about political topics, and images designed to go viral within their target political demographic; the Russians had used a similar strategy that same year during the war which led to the seizure of Crimea from Ukraine.
Though total numbers of such imaginary actors are hard to gauge, a 2018 US Senate report indicates that the Russian IRA had managed to reach extensive audiences, as ‘posts on Instagram received 187 million engagements and that Russian IRA posts on Facebook received almost 77 million engagements’.
Mueller’s report pushes these estimates higher; up to 126 million users on Facebook engaged with or actively helped spread Russian propaganda, including Donald Trump, who on 19 September 2017 tweeted from his personal account to a Russian propaganda account, @10_gop. The Russians had tweeted him ‘We love you, Mr. President!’ He replied: ‘I love you- and there is no question - TOGETHER , WE WILL MAKE AMERICA GREAT AGAIN!’.
Of course, Mueller does not suggest that President Trump knew this was a Russian propaganda account when he replied to it.
Either way, these fictitious personas established a pattern of interactions with American citizens, reaching out to and contacting them personally. Once they had started to build a relationship, they would inform US citizens of rallies they were planning to hold. These were deliberate ruses, similar to ‘spear-phishing’ attacks, a technique more typically associated with hacking: they required specialist research into, and tracking of, individuals before contact was made. The purpose was to mobilise people for various rallies and other activities, and the ruses often succeeded in attracting hundreds of participants.
In the past, the Russian IRA has set up ‘help hotlines’ for people ‘struggling with sexual behavior’ in order to use the information provided to them for blackmail. They have had some success with such traps.
In 2015, Moscow’s scope widened; apart from creating fake accounts for individuals, Russia started creating ‘grassroots’ organisations. Their goal, broadly, was to sow discord, and this applied to both the far left and far right of the US political spectrum. A series of ‘hard left’ groups, mostly centred on race relations in the US, with names such as ‘Blacktivist’, ‘Black Matters’ and ‘Don’t Shoot Us’, were promoted; so were pro-Muslim groups, such as ‘United Muslims of America’.
New Knowledge, a think tank, noted in its 2018 report that ‘some of the most sophisticated IRA efforts on Facebook and Instagram specifically targeted Black American communities’ and that the Russian IRA was able to create a sprawling mass that mixed fake and real news, to create a believable but false news system designed to exploit organic protest movements.
These groups could draw from one another; if a Russian propaganda account promoting left-wing issues posted something controversial, it could be reposted to a far-right group as supposed evidence of Black or Muslim radicalisation.
The process could be repeated indefinitely, with both political sides feeding off one another. As well as simply posting reactionary content, the Russians also generated ‘counter-protests’ to protests they had themselves set up, another example of their flexibility and flair for reading local situations.
It is likely that many of these activities were designed to show ‘proof of concept’, to justify to the political leadership of the Russian IRA that the entire effort was worthwhile, and that it was producing the expected results.
And the costs, for the advertising at least, were very low. The various Russian-sponsored groups took out approximately 3,500 adverts on Facebook and spent just over $100,000 doing so, a small outlay compared with the large media ‘footprint’ achieved.
The high rate of content production obscures the fact that much of it appears to be of low quality; many of the images were hastily put together and the copy is sometimes sloppy. But to a large extent this did not matter, as the content was amplified by a Russian-created ‘bot network’.
Unlike the specialists, who are real agents of the Russian IRA, the purpose of a bot network is not to create new content or contact American citizens; rather, it is designed to ‘repost’ or share content with a wider audience. Such networks are often easier to detect and can be removed by social media companies, but they are cheaper to maintain.
Overall, the total cost of the operation would have been relatively high, and it must have enjoyed support from the top of the political leadership in Moscow. For, while the advertising spend of $100,000 may seem low (for context, Marvel spent £153.5 million on adverts recently to promote its latest film), the overall cost of the Russian operation is likely to have been much higher, although still trifling compared with the outcome.
And there is no question that this was an operation to which Moscow devoted a great deal of attention. Russian agents came into the office every day, over a period of years, to run online accounts, create fictitious personas and engage with local issues in US politics, all to construct an entirely plausible narrative. Such work requires an ability to be both responsive to local American politics and creative in how it is spun to fit a narrative.
No space online is immune from such operations. Even ‘architecture Twitter’, where architects on Twitter discuss buildings they enjoy, has become a site of meta-political argument on both sides; classical buildings, it seems, are linked with ideas of white cultural hegemony, and brutalist architecture with, allegedly, the failings of immigration. Alt-right commentators, such as Lauren Southern, draw links between cultural and architectural traditions and anti-immigration politics, casting immigration as a disruption of cultural hegemony. The Russian IRA can enter any online space, in any discussion, and slowly pivot it towards political ideas, making links to ‘far-right identity politics’.
To a large extent, therefore, a Russia-led operation to disrupt the US elections would have happened even if Trump had not won the Republican nomination; the US would likely have seen a different disinformation campaign of equal size and scope. Russia’s interest in information warfare long predates Trump, and there will only be more of an appetite for it in 2020.
Tom Ascott is the Digital Communications Officer at RUSI.
BANNER IMAGE: Russia has used some of the most prominent social media groups on the web to create false narratives. Courtesy of Pixabay
The views expressed in this Commentary are the author’s, and do not necessarily reflect those of RUSI or any other institution.