With 2.7 billion monthly active users, Facebook is a truly global platform. For better or worse, Facebook users around the world depend on the company’s services for information and for expression. This is especially true in countries and contexts with limited media freedom. To be responsive to the needs of our partners and the communities in which we work, Internews thinks deeply about the role and responsibilities of platforms like Facebook.
In this piece, Internews’ Platform Accountability Adviser Rafiq Copeland takes a deep dive into Facebook’s Community Standards Enforcement Report (CSER).
Read the full report: Transparency Without Accountability? Fixing Facebook’s Community Standards Enforcement Report
The CSER is a key transparency document for Facebook, in which the company lays out its global performance in moderating its platforms. How many fake Facebook accounts were removed in the last three months – and how many are still on the platform? The CSER is where Facebook shares this information, and as of this past August the report is updated quarterly.
However, while Facebook’s transparency around content moderation is considered to be amongst the best in the industry, as an accountability tool the CSER is not up to the job.
Key amongst the questions the paper asks is this: do the metrics provided in the CSER allow us to assess whether Facebook is doing an adequate job of enforcing its rules and keeping its users safe? In its current form, the answer is no. For outsiders wishing to understand and critically assess Facebook’s enforcement of its community standards – which is nominally the point of this transparency – there is little in these quarterly updates that would support a meaningful analysis.
Amongst the biggest problems with the CSER is its use of global aggregates. Every metric is provided as a single global figure. An observer wishing to understand how Facebook is enforcing its community standards in their own country or language will find no help here. If Facebook’s enforcement of its community standards were uniform across its billions of users, this might not be a problem. But enforcement is not uniform, and rather than providing transparency, the CSER helps to mask this reality. If the enforcement of community standards is to be meaningful, then these distinctions matter: the great majority of Facebook’s 2.7 billion monthly active users are not native English speakers, and almost all of its 12% year-on-year growth comes from non-English-speaking populations.
Along with language and geography, Internews assesses issues such as prevalence estimates, user reporting, automation, speed, reach, and user appeals, arriving at a set of detailed recommendations for developing the CSER into a true accountability tool.
Read the full report: Transparency Without Accountability? Fixing Facebook’s Community Standards Enforcement Report
(Banner image: courtesy of SnappyGoat/CC)