Facebook unveils secret rules on policing users
- Author: Carolyn Briggs | Apr 26, 2018, 13:35
"For instance, we may warn someone for a first breach, but if they continue to breach our policies, we may restrict their ability to post on Facebook or disable their profile."

She admits there is still a concern that terrorists or hate groups will get better at developing "workarounds" to evade Facebook's moderators, "but the benefits of being more open about what's happening behind the scenes outweigh that".

"We also recognize that this is a challenging and sensitive issue", reads the "Integrity and Authenticity" section of the community standards.

The social network took action on 1.9 million pieces of content from such groups in the first three months of the year, about twice as many as in the previous quarter. Sri Lanka, meanwhile, took a stand against social media by banning Facebook, WhatsApp, and Instagram in an attempt to prevent further violence against Muslim minorities.
Facebook says that the standards are an evolving document and that it hopes to receive feedback to improve them.
These newly expanded guidelines give photographers a clearer picture of not only the rules themselves, but some of the reasons behind them.
One of Facebook's new privacy settings will ask users whether they want to block certain features, such as facial recognition, targeted advertising, or the use of personal information. The published standards also spell out Facebook's definitions of hate speech, violent threats, sexual exploitation, and many other categories of prohibited content.
Bickert also employs high-level experts including a human rights lawyer, a rape counselor, a counterterrorism expert from West Point and a PhD researcher with expertise in European extremist organizations as part of her content review team.
Monika Bickert, Facebook's vice president of product policy, explained that the company chose to publish the internal document on "community standards", according to the online edition of Chronicle.info. The rules expand the short "community standards" guidelines Facebook has previously allowed users to see. If a removal decision is overturned on appeal, the post will be restored and the user notified. Compared with private citizens, public figures must meet a more stringent standard (actual malice) to prove damages in a libel lawsuit.
The community standards, which are posted on the company's website, detail the various words and images that the platform censors.
Facebook now employs more than 7,500 content reviewers, up more than 40 percent from the same time a year ago. For example, the Free the Nipple movement can post images of its topless protests without fear of removal, but cannot post images of nude breasts in most other contexts. Facebook says its goal is to answer every report of a possible content violation within 24 hours. Whether it's Facebook's fake-news problem or disrespectful YouTube videos, consumers have clamored for regulation, and Silicon Valley is trying to get a head start. A policy team meets every two weeks to review potential additions or edits. Teams who present are required to come with research showing each side, a list of possible solutions, and a recommendation.
Giving a broad sense of how the community standards work, Bickert said Facebook will not allow anyone to be attacked on the basis of their religion or country, but will allow criticism of those religions or countries themselves. "Even if we were at 99 percent accuracy, that's still a lot of mistakes", she said.