Facebook better at policing nudity than hate speech


Facebook stepped further into its new era of data transparency Tuesday with the release of its inaugural Community Standards Enforcement Report.

The report covers enforcement actions taken from October 2017 to March 2018.

The social media giant promised the report will be the first in a series measuring how prevalent violations of its content rules are, how much content it removes or otherwise takes action on, how much of it it finds before users flag it, and how quickly it acts on violations.

Even so, the company estimates that fake profiles still make up 3 to 4 percent of all active accounts, which means as many as 88 million fake Facebook accounts were still slipping through the cracks in its policing system through March.

Facebook said it "removed 2.5 million pieces of hate speech in Q1 2018", 38 percent of which was flagged by the social network's automated systems. By contrast, the company says more than 96 percent of the posts it removed for featuring sex, nudity or terrorism-related content were flagged by monitoring software before any users reported them.

The report also shows an increasing number of posts identified as containing graphic violence in the first quarter of 2018.


"We have a lot of work still to do to prevent abuse", Facebook Product Management vice president Guy Rosen said.

Since the fallout over political firm Cambridge Analytica obtaining millions of Facebook users' data without their permission, Facebook reiterated its commitment to being more transparent.

"We're often asked how we decide what's allowed on Facebook - and how much bad stuff is out there", the company wrote. Overall, Facebook estimates that out of every 10,000 pieces of content viewed on the platform, nine to 10 views were of content that violated its adult nudity and pornography standards.

The last statistic Facebook highlighted was hate speech; the company admitted its technology is not yet good at detecting it, so flagged posts are still sent to human review teams. It refrained from stating how prevalent terrorist propaganda and hate speech are on the platform, saying it could not reliably estimate either. Facebook is also expected to host summits in India, Singapore, and the U.S. Separately, through a two-part investigation of third-party apps, the company found around 200 apps that potentially leaked confidential data.

"In addition, in many areas - whether it's spam, porn or fake accounts - we're up against sophisticated adversaries who continually change tactics to circumvent our controls, which means we must continuously build and adapt our efforts".

A parliamentary committee has also urged Facebook boss Mark Zuckerberg to appear before it, adding that it would be open to taking evidence from the billionaire company founder via video link if he will not attend in person.
