Facebook made some changes to its News Feed following the 2020 US presidential election. In response to the influx of misinformation on the platform, Facebook prioritized the appearance of major news outlets on users’ News Feeds.
Facebook’s Attempt to Avoid Post-Election Chaos
A report by The New York Times revealed that Facebook made an “emergency change” to its News Feed algorithm in the aftermath of the US election. According to the report, this meant deprioritizing sources that spread misleading content about election integrity.
Facebook decided to adjust its "news ecosystem quality" (NEQ) scores, an internal ranking system that determines how prominently certain news outlets appear on the platform. Mark Zuckerberg, Facebook's CEO, approved the adjustment after more users began engaging with potentially false content from right-wing outlets.
This change increased the scores for mainstream outlets like The New York Times, CNN, and NPR, making them show up more often in users' feeds. Conversely, it reduced the prominence of smaller, "hyperpartisan" publications such as Occupy Democrats and Breitbart.
In case you were wondering whether the changes to the News Feed algorithm are here to stay, they’re not. Guy Rosen, Facebook’s vice president of integrity, explained to The New York Times that “there has never been a plan to make these permanent.”
Facebook's goal was to make the post-election News Feed less divisive. The platform had been planning its response to potential Election Day chaos for months ahead of the election, and it seems it actually had to use one of its emergency measures.
Facebook has also taken plenty of other steps to curb election misinformation before, during, and after the presidential election. When President Trump prematurely declared victory, the platform immediately took action against his post.
How Well Did Facebook Manage the 2020 Election?
Facebook has been criticized in the past for its handling of presidential elections, as it allowed misinformation to run rampant. That backlash drove Facebook to change its approach, which is why the platform mounted such an aggressive response to the 2020 election.
Facebook isn’t the only social media platform that has employed bold measures to counteract misinformation. Twitter also rolled out an arsenal of weapons to combat misleading content throughout the election, and some of those measures are remaining permanently on the platform. As it stands, it seems that warning labels on Twitter aren’t going anywhere.
You’ll now see a warning label when you try to Like a Tweet with disputed content.