Facebook releases report on millions of accounts it removed

Facebook released a report on its content removal processes, revealing that the tech giant took down hundreds of millions of fake accounts, spam posts, and offensive or violent content.

The social network said it took down 21 million pieces of content featuring adult nudity or sexual activity in the first three months of the year, according to its first Community Standards Enforcement Report.

The report, released Tuesday, detailed how much content was removed for violating the company's standards. It covers enforcement efforts between October 2017 and March 2018 across six categories: graphic violence; adult nudity and sexual activity; terrorist propaganda; hate speech; spam; and fake accounts.

The company estimated that for every 10,000 pieces of content viewed on Facebook, between seven and nine violated its adult nudity and sexual activity standards.

Facebook's internal technology flagged about 96% of adult nudity or sexual content before users reported it, according to the report.

But most of Facebook's removal efforts centered on spam and the fake accounts that promote it. In the first quarter, Facebook disabled about 583 million fake accounts and removed 837 million pieces of spam, the report said.

Facebook acknowledged it has work to do when it comes to properly removing hate speech. It took down 2.5 million pieces of hate speech during the period, only 38% of which was flagged by its algorithms.

"For hate speech, our technology still doesn't work that well and so it needs to be checked by our review teams," Guy Rosen, Facebook's VP of product management, said in a blog post.

Meanwhile, Facebook removed or added warning labels to about 3.5 million pieces of graphic violence content. In this case, 86% was flagged by its technology.
