Facebook users’ attitudes towards known Russian propaganda change, study suggests

Facebook users are less likely to press the ‘Like’ button on content they learn is part of a foreign propaganda campaign, a report in the US has found.

Researchers have said that Russia is using political memes to polarise Americans, particularly those at the extreme ends of the political spectrum, who typically like and share content that aligns with their political views at higher rates than others.

However, a study published by the RAND Corporation, a non-profit, non-partisan US research group, which exposed these hyper-partisan news consumers to potential interventions, suggests that most are open to reconsidering their initial response to a Russian meme after its source is revealed to them.

“Left- and right-wing audiences are particular targets of Russian propaganda efforts, so they naturally have a big reaction because the propaganda speaks to them,” said Todd Helmus, a senior behavioural scientist at RAND. “A big reaction also means a lot of room for improvement in terms of learning how to think critically about the source of a particular news item, and the consequences of passing it along to others.”

The RAND report is the third of a four-part series intended to help policymakers and the public understand – and mitigate – the threat of online foreign interference in national, state and local elections throughout the US.

The latest study used a randomised controlled trial of more than 1,500 Facebook users to understand how people react emotionally to Russian propaganda – specifically, memes that Russia used in the 2016 US election cycle – and whether media literacy content or labelling the source of a meme could help prevent the spread, and thus the influence, of Russian propaganda on social media platforms.

According to the researchers, the study may be the first to test the impact of media literacy and labelling interventions on audience reactions to actual Russian propaganda memes.

In the study, the researchers asked participants about their news consumption and categorised them into five groups. They found that two of the groups reacted in the strongest and most partisan way to Russian memes.

The first of those two groups is the ‘Partisan Left’, who lean left politically and most often receive their news from the New York Times; they are also the least likely to believe that Covid-19 is a conspiracy. The second is the ‘Partisan Right’, who lean right politically and get their news from Fox News or other far-right outlets; they are the group most likely to believe that Covid-19 is a conspiracy.

People in these two groups are also the most likely to change their minds about liking a meme if it is revealed to be from a Russian source, according to the study.

Among members of the Partisan Right group, exposure to a short media literacy video reduced the number of likes for pro-US and politically right-leaning Russian content. The video also reduced likes of pro-US-themed Russian content among all study participants, but it had no significant effect on likes associated with left-leaning Russian content.

The researchers stressed that while it is difficult to assess whether revealing the source of memes is a feasible mechanism for helping people recognise propaganda, there may be immense value in developing a third-party software plug-in that could unmask the source of state-sponsored content.

RAND researchers recommend educating Americans about the presence of Russian propaganda and encouraging them to be highly suspicious of sources and their intent. An example of a Russian meme, with directions on how to refute it, could help inoculate Americans against propaganda. Additionally, researchers point to technological media literacy interventions as a promising way to reduce the impact of Russian propaganda.

“Media literacy interventions that can be placed on phones or other devices have the potential to help people think through the way they interact with news or media content,” Helmus said.

In other news, YouTube has banned misinformation about Covid vaccinations, days after Facebook took similar action on its own platform.

The company said that with a Covid-19 vaccine potentially imminent, it was the right time to expand its pre-existing policies against medical misinformation. Examples of now-banned claims include false allegations that the vaccine would kill people or cause infertility, or that it would somehow implant microchips in recipients.

“A Covid-19 vaccine may be imminent, therefore we’re ensuring we have the right policies in place to be able to remove misinformation related to a Covid-19 vaccine from the platform,” a YouTube spokesperson said. “Any content that includes claims about Covid-19 vaccinations that contradict expert consensus from local health authorities or the World Health Organization (WHO) will be removed from YouTube.”