Ahead of the 2020 US presidential election in November, Facebook will enact a host of measures to ensure its platform is not used to sow chaos and spread misinformation before, during and after the poll. The company announced that it will restrict new political advertisements in the week before the election and remove posts that convey misinformation about COVID-19 and voting.
In a post shared on his official page, CEO Mark Zuckerberg wrote: "This election is not going to be business as usual. We all have a responsibility to protect our democracy. That means helping people register and vote, clearing up confusion about how this election will work, and taking steps to reduce the chances of violence and unrest."
With the nation divided, and election results potentially taking days or weeks to be finalized, there could be an "increased risk of civil unrest across the country," Zuckerberg said.
Here's what he posted:
Posts shared on the platform containing obvious misinformation about voting policies or the coronavirus pandemic will be removed, and users will be limited to forwarding articles to a maximum of five others on Facebook Messenger. The company will also work with Reuters to provide official election results, making the information available both on its platform and through push notifications.
Zuckerberg said Facebook had removed more than 100 networks worldwide engaging in such interference over the last few years. "Just this week, we took down a network of 13 accounts and two pages that were trying to mislead Americans and amplify division," he said.
Facebook and other social media companies have been scrutinized over how they handle misinformation, over false claims posted by President Donald Trump and other candidates, and over Russia's continued attempts to interfere in US politics.
Facebook, in particular, has faced criticism for not fact-checking political ads or limiting how they can be targeted at small groups of people.
After being caught off guard by Russia's efforts to interfere in the 2016 US presidential election, Facebook, Google, Twitter and other companies put safeguards in place to prevent it from happening again.

That includes taking down posts, groups and accounts that engage in "coordinated inauthentic behavior" and strengthening verification procedures for political ads. Last year, Twitter banned political ads altogether.
