The content actioned for bullying and harassment by Facebook fell to 2.6 million pieces in Q1FY19 from 2.8 million in Q4FY18, while the proactive rate dropped to 14.1 percent from 21 percent over the same period.
The proactive rate measures the share of actioned content that Facebook flagged through its internal detection systems before any user reported it.
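The definition above amounts to a simple ratio. The sketch below illustrates it; the function name and the derived piece count are illustrative, and only the 2.6 million and 14.1 percent figures come from the report:

```python
def proactive_rate(proactively_detected: float, total_actioned: float) -> float:
    """Percentage of actioned content caught by internal detection
    before any user report was filed."""
    return 100 * proactively_detected / total_actioned

# Working backwards from the reported Q1FY19 bullying/harassment figures:
# 2.6 million pieces actioned at a 14.1 percent proactive rate implies
# roughly 366,600 pieces were detected before a user reported them.
implied_proactive = round(2.6e6 * 14.1 / 100)
print(implied_proactive)  # 366600
```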
In the latest Community Standards Enforcement Report (CSER) released on May 23, the tech giant said, “in areas like bullying and harassment, where context is so vital to understanding if content violates our policy, we expect that for the foreseeable future, our automated systems will not have the ability to detect this content at a scale similar to other policy violations.”
In contrast, Facebook had proactive action rates of 65.4 percent against hate speech and 95 percent against fake accounts.
Recognising that bullying can be especially harmful to minors, Facebook acknowledged that the highly personal nature of such content means other users must report it before it can be identified and removed.
The report showed that 496,000 pieces of content actioned in Q1FY19 were appealed, with 80,200 of those pieces being restored. Another 3,500 pieces of content were restored without an appeal, in cases where content was ‘incorrectly removed or when circumstances change’.
The site, however, remained vague on the prevalence metrics for bullying and harassment: “We cannot estimate this metric right now. This is because communication can be highly dependent on language and context, and often reflects the nature of personal relationships.”
On the other hand, it estimated the prevalence of fake accounts at approximately 5 percent of its total monthly active user (MAU) base between Q4FY18 and Q1FY19.
Facebook’s CSER is part of the tech giant’s move towards increased transparency. Founder and CEO Mark Zuckerberg promised to better the company’s data and security policies following increased global discontent against privacy breaches.
As part of this, Facebook also chartered an independent advisory board, the Data Transparency Advisory Group (DTAG), composed of seven experts from various disciplines who specialise in measurement and the role that metrics play in building legitimate, accountable institutions.
The DTAG, in its report on May 23, set out steps that Facebook could take to improve the metrics in the CSER and its transparency about Community Standards enforcement practices, including improving accuracy, reporting detailed prevalence rates and measures, and separating review and appeal rates, among other recommendations.
Facebook said in the CSER that it would hold itself accountable for doing better and that the company was committed to communicating more openly about its efforts in the future.
Guy Rosen, Facebook’s VP of Integrity (VP, Product Management), said, “Over the last year, we’ve taken a number of steps to be more transparent in how we develop our policies and how we measure our efforts to enforce them. We will continue to bring more transparency to our work and include more information about our efforts so people can see our progress and hold us accountable for where we need to do more.”