Facebook Continues Fight Against ‘Super Spreaders’ of US Election Misinformation


The US presidential election is finished: votes cast, the transition, though delayed, begun. 

But on Facebook, the fight against election misinformation continues, thanks to “super spreaders,” accounts that disseminate rumors and fabrications, falsely spreading the idea that the 2020 election was beset by organised, extensive fraud by the Democratic Party. 

The US nonprofit Avaaz has identified 25 pages in particular, including those of Donald Trump Jr and Eric Trump, the president’s sons, White House press secretary Kayleigh McEnany and combative conservative commentators Dan Bongino, Lou Dobbs and Rush Limbaugh, along with pro-Trump organisations such as Turning Point USA.

These accounts are sowing doubt about President-elect Joe Biden’s White House win earlier this month, taking their lead from the building’s current resident, who has also taken to social media to tweet that he will not “concede” and to outline his so-far unfounded claims that the election was “stolen.”

Unproven allegations of fraud from these accounts have been “liked,” commented on, and shared more than 77 million times since November 3, according to a study from Avaaz.

And that doesn’t take into account the Facebook accounts of the “super-spreader” in chief, Donald Trump himself, nor that of his former adviser Steve Bannon, which was recently removed by the network. 

The social media giant has increased efforts to stop the spread of disinformation. 

It restricted and in some cases banned the publication of some political advertisements, highlighted reliable sources of information and tackled foreign manipulation campaigns. 

Going viral
Thanks to those measures and others, Facebook was able to avoid a repeat of the 2016 presidential campaign, when organised disinformation campaigns permeated the network ahead of Trump’s election. 

But these efforts were not enough to stop run-of-the-mill rumor circulation.

“The superspreaders in this list, with the helping hand of Facebook’s algorithm, are central to creating this flood of falsehoods that are now defining the political debate for millions across the country,” explained Fadi Quran, Avaaz campaign director.

Private Facebook groups have also contributed to the far-reaching spread of misinformation, according to Avaaz. 

Such groups, often made up of Trump supporters or those who also believe his allegation of a “stolen” vote, have exploded in the aftermath of the election, Avaaz reported, and they can be difficult to monitor and manage.

Facebook on November 5 suspended a group called #StopTheSteal, which had attracted some 350,000 members in 48 hours.

“The false rumors about election fraud continue as they are passed through these networks. So it’s less big accounts… it is more the millions of people who continue to push this narrative to one another,” said Claire Wardle, US director of the First Draft NGO.

Fact-checking
AFP works with Facebook’s fact-checking programme in almost 30 countries and nine languages. Around 60 media organisations worldwide take part in the programme.

Content rated “false” by fact-checkers is downgraded in news feeds so fewer people will see it.

If someone tries to share a post found to be misleading or false, Facebook presents them with the fact-checked article.

But Facebook has been widely criticised for its reluctance to take a more rigid stance, including by some employees, according to the US publication The Information.

According to an article published Tuesday, the site in 2018 compiled a list of 112,000 government and political candidate accounts that were to be exempt from verification efforts. The Information said it is unclear whether the list remains active, and Facebook has not confirmed its existence.

The situation led to an internal outcry in the summer of 2019, The Information reported, with employees calling for an end to the Facebook policy that exempts politicians from the fact-checking programme.

They pointed to an internal study that showed that users were more likely to believe misinformation if it came from a politician.

But Facebook says the study’s findings actually support its approach and helped it devise ways to call out politicians who share links or posts that have already been fact-checked.

That method allowed a warning to appear on a video shared by Trump showing Los Angeles election workers collecting ballots, which the president claimed showed them stealing the envelopes. The warning explained that the post was “missing context” and that “the same information was checked in another post by independent fact-checkers.”

“We don’t believe it’s appropriate for us to prevent a politician’s speech from being subject to public scrutiny,” said Facebook spokesman Joe Osborne.


