
Violence, Hate Speech and Revenge Porn: Facebook's Internal Rulebook Exposed


From videos of violent deaths to revenge porn, if you've ever wondered how Facebook moderators decide what to allow on the popular social media site, those guidelines are now out in the open.

The revelation is being called the Facebook Files.

In an investigation into the social media giant, The Guardian has seen more than 100 internal training manuals, spreadsheets and flowcharts that give unprecedented insight into the blueprints Facebook has used to moderate issues such as violence, hate speech, terrorism, pornography, racism and self-harm.

The files do more than expose policies; they also reveal the concerns of those moderating content. Moderators say they're overwhelmed by the volume of material, sometimes given only a 10-second window to decide what to allow.

"Facebook cannot keep control of its content," one source told The Guardian. "It has grown too big, too quickly."

This leak comes at a time when Facebook is under pressure from several sides, with critics demanding both more censorship and less. Some are calling on the social media site to remove more hateful, violent or indecent content, while others are concerned about Facebook's power to censor.

The Guardian released several examples of guidelines given to moderators over the last few years:

  • Remarks such as "Someone shoot Trump" should be deleted, because as a head of state he is in a protected category. But it can be permissible to say: "To snap a b****'s neck, make sure to apply all your pressure to the middle of her throat," or "f*** off and die" because they are not regarded as credible threats.
     
  • Videos of violent deaths, while marked as disturbing, do not always have to be deleted because they can help create awareness of issues such as mental illness.
     
  • Some photos of non-sexual physical abuse and bullying of children do not have to be deleted or "actioned" unless there is a sadistic or celebratory element.
     
  • Photos of animal abuse can be shared, with only extremely upsetting imagery to be marked as "disturbing."
     
  • All "handmade" art showing nudity and sexual activity is allowed but digitally created art showing sexual activity is not.
     
  • Videos of abortions are allowed, as long as there is no nudity.
     
  • Facebook will allow people to livestream attempts to self-harm because it "doesn't want to censor or punish people in distress."
     
  • Anyone with more than 100,000 followers on a social media platform is designated as a public figure – which denies them the full protections given to private individuals.

In another leaked document, Facebook says, "people use violent language to express frustration online" and feel "safe to do so" on the site, according to The Guardian. 

"We should say that violent language is most often not credible until specificity of language gives us a reasonable ground to accept that there is no longer simply an expression of emotion but a transition to a plot or design," the document continues.

In a statement responding to The Guardian's story, Facebook's head of global policy management, Monika Bickert, said:

"Keeping people on Facebook safe is the most important thing we do. We work hard to make Facebook as safe as possible while enabling free speech. This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously. Mark Zuckerberg recently announced that over the next year, we'll be adding 3,000 people to our community operations team around the world – on top of the 4,500 we have today – to review the millions of reports we get every week, and improve the process for doing it quickly."

Bickert says Facebook is also investing in new technology that will help in the content review process.

"In addition to investing in more people, we're also building better tools to keep our community safe," she said. "We're going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help."

Content moderation expert Sarah T. Roberts told The Guardian that Facebook is entering a "disaster situation."

"It's one thing when you're a small online community with a group of people who share principles and values, but when you have a large percentage of the world's population and say 'share yourself,' you are going to be in quite a muddle," Roberts said.


About The Author

Caitlin Burke

Caitlin Burke serves as National Security Correspondent and a general assignment reporter for CBN News. She has also hosted the CBN News original podcast, The Daily Rundown. Some of Caitlin's recent stories have focused on the national security threat posed by China, America's military strength, and vulnerabilities in the U.S. power grid. She joined CBN News in July 2010, and over the course of her career, she has had the opportunity to cover stories both domestically and abroad. Caitlin began her news career working as a production assistant in Richmond, Virginia, for the NBC affiliate WWBT.