Facebook employees are staging a rare protest against the company for leaving up a post from President Donald Trump they say could incite violence. The employees, who began publicly criticizing the social network on Twitter over the weekend, have escalated their disapproval by staging a virtual walkout and symbolically changing their workplace profile pictures.
The social media giant has faced criticism, both internally and externally, about the company's mostly hands-off approach to political content. The internal disapproval, however, has reached a new boiling point because Facebook's inaction contrasts with rival Twitter's response to Trump's posts.
After labeling two of Trump's tweets about mail-in ballots as containing "potentially misleading information," Twitter placed a warning notice on another of the president's tweets for violating its rules against glorifying violence. In the tweet, which was also posted by the White House's account, Trump said "when the looting starts, the shooting starts." The president made the remarks in response to the protests that have erupted following the death of George Floyd, a black man from Minnesota who died after a white police officer pinned him down with his knee. The incident was recorded on video, and Floyd can be heard in the footage saying he couldn't breathe.
Facebook left up Trump's post after the company determined that the president's remarks didn't "cause imminent risk of specific harms or dangers," a decision that conflicts with Twitter's interpretation of the remarks. The world's largest social network doesn't have a notice like Twitter's that allows a politician's post to stay up even if it violates the platform's rules.
The decision prompted dozens of Facebook employees to reportedly stage a "virtual walkout" on Monday by requesting the day off to support protesters. Employees also added an automated message to their emails saying that they were out of the office to show that they disagreed with the company's position on Trump's posts, according to a report from The New York Times.
Facebook employees also took to Twitter to criticize the decision, an unusual public rebuke of their own company.
"Censoring information that might help people see the complete picture *is* wrong. But giving a platform to incite violence and spread disinformation is unacceptable, regardless who you are or if it's newsworthy. I disagree with Mark's position and will work to make change happen," said Andrew Crow, who heads design for Facebook's video chat device Portal, referring to CEO Mark Zuckerberg.
Crow wasn't the only Facebook employee to publicly speak out against Zuckerberg. Ryan Freitas, director of product design for Facebook's News Feed, said that Zuckerberg "is wrong" and that he "will endeavor in the loudest possible way to change his mind."
In an effort to underscore their discontent, Facebook employees reportedly changed their internal profile pictures on a workplace version of the social network to the Twitter logo, sources told Kate Klonick, an assistant professor at St. John’s University School of Law.
Facebook didn’t immediately respond to a request for comment.
The internal revolt against Facebook is unprecedented, even though the company has a long history of being criticized for the content it leaves up or pulls down. Liberals, and even Facebook's own employees, have pushed the company to change its approach to political content in both ads and regular posts, while conservatives have complained that the company censors right-wing speech.
Trump wasn't happy with Twitter's response and signed an executive order on Thursday in an attempt to curtail the legal protections social media networks get under federal law for posts created by their users. Social networks have denied that they censor conservative speech.
Late Friday, Zuckerberg defended the company’s decision to keep the president’s post up.
"Although the post had a troubling historical reference, we decided to leave it up because the National Guard references meant we read it as a warning about state action, and we think people need to know if the government is planning to deploy force," Zuckerberg said in a post. He added that the president clarified in a later post that he discourages violence.
Facebook typically doesn't send political posts or ads to its third-party fact-checkers, but the company has taken action against an ad from the Trump reelection campaign in the past. In 2018, Trump's campaign posted a controversial immigration ad that Facebook removed for violating its rules against "sensational content." The ad featured Luis Bracamontes, an undocumented immigrant who was convicted of killing two California sheriff's deputies in 2014, and falsely attempted to connect Bracamontes' crimes to the migrant caravan making its way from Mexico to the US border. The video was allowed to remain on Facebook as a regular post even though the ad was pulled.
Facebook also has a higher bar when a post comes from a politician because the company considers it direct political speech. In March, an edited video of Democratic presidential candidate and former Vice President Joe Biden made it appear as if the politician was endorsing Trump even though he had not. Facebook flagged the post as "partly false information" after it was fact-checked. The video was also shared by Trump on Facebook, and the warning notice appeared on his post. Even though the video was shared by Trump, it isn't considered direct speech because it's a clip from someone else. Twitter added a "manipulated media" label to the same clip.
Facebook is also creating an oversight board to tackle some of its most contentious content decisions, but the board won't begin operating until later this year.