In the weeks before and after the 2020 US election, Facebook content from far-right sources of news and misinformation received more engagement than content from sources elsewhere on the political spectrum, a new study from New York University has revealed.
The findings suggest that far-right pages have an advantage energizing followers on the world’s biggest social network. “My takeaway is that, one way or another, far-right misinformation sources are able to engage on Facebook with their audiences much, much more than any other category,” Laura Edelson, a researcher at NYU’s Cybersecurity for Democracy initiative who helped compile the report, told CNN. “That’s probably pretty dangerous on a system that uses engagement to determine what content to promote.”
Researchers looked at some 8.6 million public posts shared by 2,973 “news and information sources” from August 10th, 2020 to January 11th, 2021, categorizing the political slant and verisimilitude of their output based on evaluations by independent outlets like NewsGuard and Media Bias/Fact Check. The study measured how often Facebook users engaged with this content — sharing, commenting, or responding with reactions.
Their findings showed that far-right sources generated the highest average number of interactions per post, followed by far-left sources, then more centrist pages. Looking specifically at far-right sources, they found that pages spreading misinformation performed best. “Far-right sources designated as spreaders of misinformation had an average of 426 interactions per thousand followers per week, while non-misinformation sources had an average of 259 weekly interactions per thousand followers,” write the researchers.
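To make the study's metric concrete, here is a minimal sketch of how "interactions per thousand followers per week" is computed. The function name and the follower and interaction counts below are illustrative assumptions, not data from the study; only the resulting rate of 426 matches the reported figure, and the roughly 22-week window corresponds to the August 10th to January 11th study period.

```python
def weekly_engagement_rate(total_interactions, followers, weeks):
    """Average interactions per 1,000 followers per week."""
    return total_interactions / (followers / 1000) / weeks

# Hypothetical example: a page with 50,000 followers that drew
# 468,600 total interactions over the ~22-week study window
# averages 426 interactions per 1,000 followers per week --
# the figure reported for far-right misinformation sources.
rate = weekly_engagement_rate(468_600, 50_000, 22)
print(round(rate))  # → 426
```

Normalizing by follower count is what lets the researchers compare pages of very different sizes on a single engagement scale.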
Notably, the study showed that while spreading misinformation meant less engagement for sources on the far-left, left, center, and right of the political spectrum, it actually seemed to be an advantage for sources on the far-right. “Being a consistent spreader of far-right misinformation appears to confer a significant advantage,” said the authors.
The study is yet more evidence against the claim made by conservative politicians that Facebook is biased against right-wing sources. It also casts doubt on the efficacy of Facebook’s efforts to limit the spread of misinformation in the run-up to the 2020 US election.
The researchers behind the study note that their findings come with limitations. Although they were able to measure and compare engagement from different sources on Facebook, they couldn’t check how many people actually saw a piece of content or how long they spent reading it. Facebook simply doesn’t provide this data, leaving an incomplete picture.
“Such information would help researchers better analyze why far-right content is more engaging,” write the researchers. “Further research is needed to determine to what extent Facebook algorithms feed into this trend, for example, and to conduct analysis across other popular platforms, such as YouTube, Twitter, and TikTok. Without greater transparency and access to data, such research questions are out of reach.”