
Facts About Content Review on Facebook

Yesterday, The New York Times published an article about the way we moderate content on Facebook. We’ve been accused of being “ad hoc,” “disorganized,” “secretive,” and doing things “on the cheap.” This couldn’t be further from the truth.

We welcome debate about how to help keep Facebook a safe place where people can express their ideas. But that debate should be based on facts, not mischaracterizations. Here’s where we disagree with the Times:

Our policies are public, not “secret” or “closely held.”
For years, we’ve published our Community Standards, the overarching guide that outlines what is and isn’t allowed on Facebook. Earlier this year we went a step further and published the internal guidelines we use to enforce those standards. Anyone can view them at facebook.com/communitystandards.

Our policies are carefully considered, not “ad hoc” responses.
The Times is right that we regularly update our policies to account for ever-changing cultural and linguistic norms around the world. But the process is far from “ad hoc.” We make changes to our policies based on new trends our reviewers see, feedback from inside and outside the company, and unexpected, sometimes dramatic, changes on the ground. And we publish the changes we make every month.

What the Times refers to as a gathering “over breakfast” among “young engineers and lawyers” is, in fact, a global forum held every two weeks where we discuss potential changes to our policies. It includes experts from around the world with deep knowledge of relevant laws, online safety, counter-terrorism, operations, public policy, communications, product, and diversity. Yes, lawyers and engineers are present, but so are human rights experts, people who have studied hate and extremist organizations, former law enforcement and other public servants, and academics. As part of this process, we seek input from people outside Facebook so we can better understand multiple perspectives on safety and expression, as well as the impact of our policies on different communities.

Last month we started publishing minutes from these meetings, and early next year we plan to include a change log so that people can track updates to our Community Standards over time.

The people enforcing our policies are focused on accuracy, not quotas.
The team responsible for safety on Facebook is made up of around 30,000 people, about 15,000 of whom are content reviewers around the world, as the Times noted in an update to its story. Contrary to what the story reports, content reviewers don’t have quotas for the number of reports they must complete. Reviewers’ compensation is not based on the amount of content they review, and reviewers aren’t expected to rely on Google Translate because they are provided with training and supporting resources.

We hire reviewers for their language expertise and cultural context — we review content in over 50 languages — and we encourage them to take the time they need to review reports. They work at more than 20 sites around the world, in facilities that resemble Facebook’s own offices, and they provide 24/7 support. As the Times notes, some reviewers are based in Morocco and the Philippines, while others are based in the United States, Germany, Latvia, Spain and other locations around the world.

We’ve taken a careful approach in Myanmar based on feedback from experts.
When discussing our efforts to curb hate speech in Myanmar, the Times incorrectly claims that a paperwork error allowed an extremist group to remain on Facebook. In fact, we had designated the group – Ma Ba Tha – as a hate organization in April 2018, six months before the Times first contacted us for this story. While one outdated training deck remained in circulation, we immediately began removing content that represents, praises or supports the organization that same month – both through proactive sweeps for this content and upon receiving user reports.

This was one of several steps we’ve taken in Myanmar in 2018. Another covered in the story was our decision to remove Facebook accounts belonging to senior military officials in Myanmar without notifying the government. We did this based on guidance from international experts, who cautioned us about potential reactions from the military, the blame that could be placed on the government, and the risk to people’s safety on the ground.


We play an important role in how people communicate, and with that comes an expectation that we’ll constantly identify ways we can do better. That’s how it should be. And it’s why we continually work with experts around the world to listen to their ideas and criticism, and make changes where they’re warranted. Throughout 2018, we’ve introduced more transparency into our policies and provided data on how we enforce them. We’ve got more in store in 2019, and we look forward to people’s feedback.


