How Facebook decides what violent and explicit content is allowed
- Author: Joanne Flowers
- May 22, 2017, 17:34
The leaked rules land not only at the height of election season in the United Kingdom, but also after politicians of all stripes have attacked Google, Twitter, and Facebook for failing to effectively police the content posted on their ad-stuffed services. General statements like "let's beat up fat kids" (a direct quote) can remain on the site, whereas a request for a presidential assassination would be removed.
In addition to human moderators who look over potentially contentious posts, Facebook is also known to use AI-derived algorithms to review images and other information before they are posted.
Under the rules, threats of violence are removed only if they are judged credible. But this points to a bigger problem: without good data about how Facebook makes such decisions, we can't have informed conversations about what type of content we're comfortable with as a society. A similar approach is adopted for non-sexual child and animal abuse, with footage permitted as a way of raising awareness of the issues as well as drawing help to those affected.
Its user base of almost 2 billion also makes it hard to find consensus on content guidelines.
HONG KONG - Facebook is taking a lot of heat over the way it handles violent and disturbing content.
According to a leaked repository of about 11 policy documents from Facebook, the company's policy on self-harm allows such content to be livestreamed until "there's no longer an opportunity to help the person".
Facebook's head of global policy management, Monika Bickert, said it was always going to be hard to create standards when things aren't black and white.
"In most cases the reality of sharing vile and violent images of violence and child abuse simply perpetuates the humiliation and abuse of a child".
Threats against so-called "protected categories", such as President Trump, should be deleted, according to the leaked files - so "Someone shoot Trump", for example, is not acceptable to post.
A spokesman said: 'It (Facebook) needs to do more than hire an extra 3,000 moderators.'
The same slide indicates Facebook moderators would not remove a comment that says: "To snap a b--h's neck, make sure to apply all your pressure to the middle of her throat". For instance, videos of mutilations are removed no matter what, whereas photos are marked as "disturbing".
"Handmade" art showing nudity and sexual activity is allowed, the documents show. In the past few months, everyone from Hamilton cast members to the Donald Trump campaign has turned to Facebook Live to broadcast in real time.
Internal company documents, leaked to The Guardian, detail the rules for allowing or rejecting content on violence, including live-streaming self-harm, racism, hate speech, terrorism and pornography.
Anyone with more than 100,000 followers on social media is designated a public figure, meaning they don't get the same privacy protections as a private individual.