Facebook's internal rules on sex, violence, hate speech leaked

Terrell Bush
May 23, 2017

The Guardian hit the mother lode when the publication got its hands on more than 100 documents, containing everything from internal training manuals to spreadsheets and flowcharts.

Facebook's justification for allowing some violent statements is that they are not viewed as credible threats, unlike, say, someone calling for the assassination of the President of the United States.

"We build technology, and we feel responsible for how it's used", Monika Bickert, ā€ˇFacebook's head of global policy management, told the newspaper.

The blueprints have alarmed free speech advocates concerned about Facebook's "de facto role as the world's largest censor", the report said.

"This requires a lot of thought into detailed and often hard questions, and getting it right is something we take very seriously", Bickert said. Facebook already has a team of over 4,500 content moderators that review reported content every week, and earlier this month, Facebook CEO Mark Zuckerberg said that the company would hire an additional 3,000 moderators to review reported content. In addition to investing in more people, we're also building better tools to keep our community safe. But, "someone shoot Trump" should be deleted as a head of state comes under a protected category.


An investigation by The Guardian has exposed the company's standards for removing offensive material, including the fact that images of animal and child abuse do not have to be deleted unless the context is overtly sadistic. Extreme cases of abuse are also allowed but must be marked "disturbing".

Moderators have also voiced concerns about the inconsistent and confusing nature of Facebook's content guidelines, specifically on issues related to sexual content.

Any direct threat like "Someone shoot Trump" is out, but the platform allows details of how to murder women: a statement such as "To snap a bitch's neck, apply your pressure to the middle of her throat" does not break its rules.

"Generic" or "not credible" threats - and it was not clear how Facebook arrived at a definition of "not credible" - included "I hope someone kills you".

Moderators are told, however, that such content should be deleted once "there is no longer an opportunity to help the person", to minimise the risk of encouraging copycat behaviour, unless the footage is considered particularly newsworthy, such as footage of 9/11 showing people leaping from the Twin Towers.

With that said, videos of violent deaths are left untouched, as they might raise awareness of issues such as mental illness.

Another example relates to violent language, which Facebook deems against the rules only if the specificity of the language makes it seem like it is "no longer simply an expression of emotion but a transition to a plot or design".

The company's document showed that it allows "moderate displays of sexuality, open-mouthed kissing, clothed simulated sex, and pixelated sexual activity" involving adults.
