That Uplifting Tweet You Just Shared? A Russian Troll Sent It

naptime

Well-Known Member
Here’s what Russia’s 2020 disinformation operations look like, according to two experts on social media and propaganda.
By
DARREN LINVILL
&
PATRICK WARREN

Internet trolls don’t troll. Not the professionals at least. Professional trolls don’t go on social media to antagonize liberals or belittle conservatives. They are not narrow-minded, drunk or angry. They don’t lack basic English language skills. They certainly aren’t “somebody sitting on their bed that weighs 400 pounds,” as the president once put it. Your stereotypical trolls do exist on social media, but the amateurs aren’t a threat to Western democracy.

Professional trolls, on the other hand, are the tip of the spear in the new digital, ideological battleground. To combat the threat they pose, we must first understand them — and take them seriously.

On August 22, 2019, @IamTyraJackson received almost 290,000 likes on Twitter for a single tweet. Put in perspective, the typical tweet President Trump sends to his 67 million followers gets about 100,000 likes. That viral tweet by @IamTyraJackson was innocent: an uplifting pair of images of former pro football player Warrick Dunn and a description of his inspiring charity work building houses for single mothers. For an anonymous account that had existed for only a few months, “Tyra” knew her audience well. Warrick’s former coach, Tony Dungy, retweeted it, as did the rapper and producer Chuck D. Hundreds of thousands of real users viewed Tyra’s tweet and connected with its message. For “Tyra,” however, inspiring messages like this were a tool for a very different purpose.

The purpose of the Tyra account, we believe, was not to spread heartwarming messages to Americans. Rather, the tweet about Warrick Dunn was really a Trojan horse to gain followers in a larger plan by a foreign adversary. We think this because we believe @IamTyraJackson was an account operated by the successors to Russia’s Internet Research Agency (IRA). Special Counsel Robert Mueller indicted the IRA for waging a massive information war during the 2016 U.S. election. Since then, the IRA seems to have been subsumed into Russia’s Federal News Agency, but its work continues. In the case of @IamTyraJackson, the IRA’s goal was two-fold: Grow an audience in part through heartwarming, inspiring messages, and use that following to spread messages promoting division, distrust, and doubt.



We’ve spent the past two years studying online disinformation and building a deep understanding of Russia’s strategy, tactics, and impact. Working from data Twitter has publicly released, we’ve read Russian tweets until our eyes bled. Looking at a range of behavioral signals, we have begun to develop procedures to identify disinformation campaigns and have worked with Twitter to suspend accounts. In the process we’ve shared what we’ve learned with people making a difference, both in and out of government. We have experienced a range of emotions studying what the IRA has produced, from disgust at their overt racism to amusement at their sometimes self-reflective humor. Mostly, however, we’ve been impressed.

Professional trolls are good at their job. They have studied us. They understand how to harness our biases (and hashtags) for their own purposes. They know what pressure points to push and how best to drive us to distrust our neighbors. The professionals know you catch more flies with honey. They don’t go to social media looking for a fight; they go looking for new best friends. And they have found them.

Disinformation operations aren’t typically fake news or outright lies. Disinformation is most often simply spin. Spin is hard to spot and easy to believe, especially if you are already inclined to do so. While the rest of the world learned how to conduct a modern disinformation campaign from the Russians, it is from the world of public relations and advertising that the IRA learned their craft. To appreciate the influence and potential of Russian disinformation, we need to view them less as Boris and Natasha and more as Don Draper.

As good marketers, professional trolls manipulate our emotions subtly. In fall 2018, for example, a Russian account we identified called @PoliteMelanie re-crafted an old urban legend, tweeting: “My cousin is studying sociology in university. Last week she and her classmates polled over 1,000 conservative Christians. ‘What would you do if you discovered that your child was a homo sapiens?’ 55% said they would disown them and force them to leave their home.” This tweet, which suggested conservative Christians are not only homophobic but also ignorant, was subtle enough to not feel overtly hateful, but was also aimed directly at multiple cultural stress points, driving a wedge at the point where religiosity and ideology meet. The tweet was also wildly successful, receiving more than 90,000 retweets and nearly 300,000 likes.



This tweet didn’t seek to anger conservative Christians or to provoke Trump supporters. Melanie wasn’t even talking to them. Melanie’s 20,000 followers, painstakingly built, weren’t from #MAGA America (Russia has other accounts targeting them). Rather, Melanie’s audience was made up of educated, urban, left-wing Americans harboring a touch of self-righteousness. She wasn’t selling her audience a candidate or a position — she was selling an emotion. Melanie was selling disgust. The Russians know that, in political warfare, disgust is a more powerful tool than anger. Anger drives people to the polls; disgust drives countries apart.

Accounts like @IamTyraJackson have continued @PoliteMelanie’s work. Professional disinformation isn’t spread by the account you disagree with — quite the opposite. Effective disinformation is embedded in an account you agree with. The professionals don’t push you away, they pull you toward them. While tweeting uplifting messages about Warrick Dunn’s real-life charity work, Tyra, and several accounts we associated with her, also distributed messages consistent with past Russian disinformation. Importantly, they highlighted issues of race and gender inequality. A tweet about Brock Turner’s Stanford rape case received 15,000 likes. Another about police targeting black citizens in Las Vegas was liked more than 100,000 times. Here is what makes disinformation so difficult to discuss: while these tweets point to valid issues of concern — issues that have been central to important social movements like Black Lives Matter and #MeToo — they are framed to serve Russia’s interests in undermining Americans’ trust in our institutions.

These accounts also harness the goodwill they’ve built by engaging in these communities for specific political ends. Consistent with past Russian activity, they attacked moderate politicians as a method of bolstering more polarizing candidates. Recently, former Vice President Joe Biden has been the most frequent target of this strategy, as seen in dozens of tweets such as, “Joe Biden is damaging Obama’s legacy with his racism and stupidity!” and “Joe Biden doesn’t deserve our votes!”

The quality of Russia’s work has been honed over several years and millions of social media posts. They have appeared on Instagram, Stitcher, Reddit, Google+, Tumblr, Medium, Vine, Meetup, and even Pokémon Go, demonstrating not only a nihilistic creativity, but also a ruthless efficiency in volume of production. The IRA has been called a “troll farm,” but they are undoubtedly a factory.

While personas like Melanie and Tyra were important to Russian efforts, they were ultimately just tools, interchangeable parts constructed for a specific audience. When shut down, they were quickly replaced by other free-to-create, anonymous accounts. The factory doesn’t stop. They attack issues from both sides, attempting to drive mainstream viewpoints in polar and extreme directions.

In a free society, we must accept that bad actors will try to take advantage of our openness. But we need to learn to question our own and others’ biases on social media. We need to teach — to individuals of all ages — that we shouldn’t simply believe or repost anonymous users because they used the same hashtag we did, and neither should we accuse them of being a Russian bot simply because we disagree with their perspective. We need to teach digital civility. It will not only weaken foreign efforts, but it will also help us better engage online with our neighbors, especially the ones we disagree with.



Russian disinformation is not just about President Trump or the 2016 presidential election. Did they work to get Trump elected? Yes, diligently. Our research has shown how Russia strategically employed social media to build support on the right for Trump and lower voter turnout on the left for Clinton. But the IRA was not created to collude with the Trump campaign. They existed well before Trump rode down that escalator and announced his candidacy, and we assume they will exist in some form well after he is gone. Russia’s goals are to further widen existing divisions in the American public and decrease our faith and trust in institutions that help maintain a strong democracy. If we focus only on the past or future, we will not be prepared for the present. It’s not about the 2016 or 2020 election.

The IRA generated more social media content in the year following the 2016 election than the year before it. They also moved their office into a bigger building with room to expand. Their work was never just about elections. Rather, the IRA encourages us to vilify our neighbor and amplify our differences because, if we grow incapable of compromising, there can be no meaningful democracy. Russia has dug in for a long campaign. So far, we’re helping them win.

Darren Linvill is an associate professor of communication at Clemson. His work explores state-affiliated disinformation campaigns and the strategies and tactics employed on social media. Patrick Warren is an associate professor of economics at Clemson. Dr. Warren’s research focuses on the operation of organizations in the economy such as for-profit and non-profit firms, bureaucracies, political parties, armies, and propaganda bureaus.
 

Laela

Sidestepping the "lynch mob"
^^ I think because we're the most-fractured racial group, we've become the most vulnerable and it makes it easier to exploit us. :yep:
 

sheanu

Well-Known Member
It is downright SCARY that they're able to manipulate the beliefs and trust of whole demographics like this. To the point that they can alter a whole election. Even scarier is that those with the power to sanction them want to ignore facts and pretend this isn't happening, thus allowing it to continue.
 

Tibbar

Well-Known Member
It is downright SCARY that they're able to manipulate the beliefs and trust of whole demographics like this. To the point that they can alter a whole election. Even scarier is that those with the power to sanction them want to ignore facts and pretend this isn't happening, thus allowing it to continue.
As long as it continues to benefit them, it's a non-issue!

It truly is scary how subtle and pervasive their efforts have been. The level of their malevolence is mind-boggling. We definitely have to be vigilant and mindful of what we take in and disseminate.

I really had dismissed their efforts as clumsy and readily apparent trolling that would only suck in the ignorant and easily led, but this right here... Next level.
 