Facebook said Wednesday that it will take down content that seeks to intimidate voters, including posts that encourage people to engage in poll watching. The company will also direct users to accurate election results through notifications and labels after the polls close in November.
Social networks, including Facebook, have been criticized for not doing enough to safeguard democracy after Russian trolls abused these platforms to sow discord among Americans during the 2016 US presidential election. Since then, the social network has taken more steps to prepare for the US elections, such as creating an online hub for voter information, pulling down fake accounts and displaying warning labels on posts containing misinformation. Still, politicians, celebrities and activists have been pressuring Facebook to do more to tackle misinformation. The company doesn’t send posts from politicians to third-party fact checkers, a policy that has continued to spark scrutiny this year.
“We believe we have done more than any other company over the past four years to help secure the integrity of elections,” said Guy Rosen, Facebook’s vice president of integrity, in a press conference on Wednesday. Rosen said the company has been planning for different scenarios that might happen during the US elections.
On election night, Nov. 3, Facebook will notify users on Facebook and Instagram and display labels under candidates’ posts that direct users to the social network’s Voting Information Center. If a candidate or party declares victory before a major media outlet calls a race, Facebook will let users know in a notification that votes are still being counted and a winner hasn’t been declared yet. If a candidate contests the results of the election, Facebook will show the name of the winner in notifications displayed on the main social network and Instagram. Posts from presidential candidates will also be labeled with a notice that displays the winner’s name and a link to the Voting Information Center.
Facebook will also temporarily stop running ads in the US about the election, social issues and politics after the polls close. The company will let advertisers know when this pause on ads gets lifted.
Facebook said it will remove content that aims to intimidate voters, such as posts that use militarized language like “battle” or “army” to encourage people to engage in poll watching. The policy applies to any new content but not retroactively. Donald Trump Jr. has posted videos encouraging people to join an elections security “army” for his father, President Donald Trump, who is running for re-election.
Monika Bickert, who oversees content policy at Facebook, said this type of video would be taken down moving forward. Content moderators will have to consider the context of the posts.
“For us, this is really about spotting when people are trying to discourage or stop others from voting,” she said.
The social network, which uses a mix of technology and workers to moderate content, said that between March and September it pulled down more than 120,000 US posts on Facebook and Instagram that violated its rules against voter interference. Facebook has also displayed warning notices on more than 150 million pieces of content that contained misinformation debunked by fact checkers.
“That doesn’t mean that we consider our work complete. We know that we will miss things and that our enforcement won’t be perfect,” Bickert said.