Facebook on Tuesday published a community standards enforcement report for the first time, to allow users to “judge our performance” on efforts to crack down on posts that violate its standards. Among the major findings, the report said the number of posts on the social media platform depicting graphic violence increased in the first three months of 2018 when compared to the period of October to December 2017.
“This report covers our enforcement efforts between October 2017 and March 2018, and it covers six areas: graphic violence, adult nudity and sexual activity, terrorist propaganda, hate speech, spam, and fake accounts,” Facebook said. The company said the report has data on “how much content people saw that violates our standards; how much content we removed; and how much content we detected proactively using our technology — before people who use Facebook reported it.”
Facebook said it removed or placed a warning screen on 3.4 million posts containing graphic violence in the first three months of 2018, a significant rise from 1.2 million in the last three months of 2017. The report said the company had cracked down on 837 million posts for spam, 21 million pieces of content for adult nudity or sexual activity, and 1.9 million for promoting terrorism.
The social media website also said it had disabled 583 million fake accounts in the same period.
Data breach case
On April 10, Facebook Chief Executive Officer Mark Zuckerberg testified before a joint hearing of United States Senate committees about British political consulting firm Cambridge Analytica’s harvesting of the private data of 87 million Facebook users.
The firm was accused of using that data to bolster United States President Donald Trump’s campaign ahead of the 2016 election.