
Publishing Our Internal Enforcement Guidelines and Expanding Our Appeals Process

By Monika Bickert, Vice President of Global Policy Management

One of the questions we’re asked most often is how we decide what’s allowed on Facebook. These decisions are among the most important we make because they’re central to ensuring that Facebook is both a safe place and a place to freely discuss different points of view. For years, we’ve had Community Standards that explain what stays up and what comes down. Today we’re going one step further and publishing the internal guidelines we use to enforce those standards. And for the first time we’re giving you the right to appeal our decisions on individual posts so you can ask for a second opinion when you think we’ve made a mistake.

We decided to publish these internal guidelines for two reasons. First, the guidelines will help people understand where we draw the line on nuanced issues. Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines – and the decisions we make – over time.

The Policy Development Process

The content policy team at Facebook is responsible for developing our Community Standards. We have people in 11 offices around the world, including subject matter experts on issues such as hate speech, child safety and terrorism. Many of us worked on issues of expression and safety long before coming to Facebook. I worked on everything from child safety to counterterrorism during my years as a criminal prosecutor, and other team members include a former rape crisis counselor, an academic who has spent her career studying hate organizations, a human rights lawyer, and a teacher. Every week, our team seeks input from experts and organizations outside Facebook so we can better understand different perspectives on safety and expression, as well as the impact of our policies on different communities globally.

Based on this feedback, as well as changes in social norms and language, our standards evolve over time. What has not changed – and will not change – are the underlying principles of safety, voice and equity on which these standards are based. To start conversations and make connections, people need to know they are safe. Facebook should also be a place where people can express their opinions freely, even if some people might find those opinions objectionable. This can be challenging given the global nature of our service, which is why equity is such an important principle: we aim to apply these standards consistently and fairly to all communities and cultures. We outline these principles explicitly in the preamble to the standards, and we bring them to life by sharing the rationale behind each individual policy.

Enforcement

Our policies are only as good as the strength and accuracy of our enforcement – and our enforcement isn’t perfect.

One challenge is identifying potential violations of our standards so that we can review them. Technology can help here. We use a combination of artificial intelligence and reports from people to identify posts, pictures or other content that likely violates our Community Standards. These reports are reviewed by our Community Operations team, who work 24/7 in over 40 languages. Right now, we have more than 7,500 content reviewers – over 40% more than we had at this time last year.

Another challenge is accurately applying our policies to the content that has been flagged to us. In some cases, we make mistakes because our policies are not sufficiently clear to our content reviewers; when that’s the case, we work to fill those gaps. More often than not, however, we make mistakes because our processes involve people, and people are fallible.

Appeals

We know we need to do more. That’s why, over the coming year, we are going to build out the ability for people to appeal our decisions. As a first step, we are launching appeals for posts that were removed for nudity or sexual activity, hate speech or graphic violence.

Here’s how it works:

  • If your photo, video or post has been removed because we found that it violates our Community Standards, you will be notified and given the option to request additional review.
  • This will lead to a review by our team (always by a person), typically within 24 hours.
  • If we’ve made a mistake, we will notify you, and your post, photo or video will be restored.
[Image: an example of a post that could have been incorrectly removed and can now be appealed.]

We are working to extend this process further, by supporting more violation types, giving people the opportunity to provide more context that could help us make the right decision, and making appeals available not just for content that was taken down, but also for content that was reported and left up. We believe giving people a voice in the process is another essential component of building a fair system.

Participation and Feedback

Our efforts to improve and refine our Community Standards depend on participation and input from people around the world. In May, we will launch Facebook Forums: Community Standards, a series of public events in Germany, France, the UK, India, Singapore, the US and other countries where we’ll get people’s feedback directly. We will share more details about these initiatives as we finalize them.

As our CEO Mark Zuckerberg said at the start of the year: “we won’t prevent all mistakes or abuse, but we currently make too many errors enforcing our policies and preventing misuse of our tools.” Publishing our internal enforcement guidelines today – as well as expanding our appeals process – will create a clear path for us to improve over time. These are hard issues and we’re excited to do better going forward.



Q&A

Did you change any of your policies with this update?

What we’re sharing today isn’t new; it reflects standards that have long been in place. But, for the first time, we are publishing the internal implementation guidelines that our content reviewers use to make decisions about what’s allowed on Facebook.

It’s important to note that our standards do evolve – in some cases, changes are prompted by input we receive from external stakeholders; in others, we make changes to account for the way language is used; in still others, a change is necessitated by a gap in existing policy. This process will continue – and with it, future updates to our standards.

We will be sharing these updates publicly, and will be releasing a searchable archive so that people can track changes over time.

Are these the same guidelines that your reviewers use? Did you remove anything?

Yes. As of today, our external-facing Community Standards closely mirror our internal guidelines. You have told us that you don’t understand our policies; it’s our responsibility to provide clarity. This is an effort to explain where we draw the line when it comes to content on Facebook. We hope it’s one that invites and encourages feedback, and pushes us to further refine and improve our policies.

You say you are working with experts; who are these experts?

We work with experts around the world, including academics, non-governmental organizations, researchers, and legal practitioners. These individuals and organizations represent diversity of thought, experience and background. They provide invaluable input as we think through revisions to our policies, and help us better understand the impact of our policies.

On hate speech, for example, we have worked with Timothy Garton Ash, a professor at Oxford University who created the Free Speech Debate to look at these issues on a cross-cultural basis. Similarly, in developing our policies to help protect people from sexual exploitation, we convened over 150 safety organizations and experts in countries around the world, including the United States, Kenya, India, Ireland, Spain, Turkey, Sweden and the Netherlands.

How do you ensure consistency across your review teams and protect against human bias and error?

Our Community Standards are global and all reviewers use the guidelines we released today when making decisions. Reviewers undergo extensive training as part of their onboarding, and they are retrained and tested at regular intervals thereafter.

In developing our policies, we are extremely prescriptive: we try to write actionable policies that clearly distinguish between violating and non-violating content, so that the decision-making process is as objective as possible for reviewers. Our reviewers do not work in isolation; there are quality-control mechanisms in place, and on-site management, that reviewers can look to for guidance. We also audit the accuracy of reviewer decisions on a weekly basis. When mistakes are made, we follow up with the people on the team involved to prevent recurrence.

Even with our quality audits, we know we don’t always get it right. That’s why we have historically given people the opportunity to appeal our decisions when we’ve removed their profile, Page, or Group. Over the coming year, we are going to further build out the ability for people to appeal our decisions. As a first step, we are launching appeals for posts that were removed for nudity or sexual activity, hate speech or graphic violence, so that people can let us know when they think we’ve made a mistake.

Tell me more about Facebook Forums: Community Standards. Have you ever done anything like that before?

We do our best to get external input in different ways. Facebook Forums: Community Standards is a format that we haven’t tried before, and we’re excited to listen to and learn from our community. Event structure will vary depending on the city we’re in, and we look forward to sharing more details soon.


