Enforcing Our Community Standards

We believe in giving people a voice, but we also want everyone using Facebook to feel safe. It’s why we have Community Standards and remove anything that violates them, including hate speech that attacks or dehumanizes others. Earlier today, we removed four Pages belonging to Alex Jones for repeatedly posting, over the past several days, content that breaks those Community Standards. Here’s more detail on how we enforce our standards:

How do you deal with people and Pages who repeatedly violate your standards?
Simply removing content that violates our standards is not enough to deter repeat offenders. It’s why every time we remove something, it counts as a strike against the person who posted it. And when it comes to Pages, we hold both the entire Page and the person who posted the content accountable. Here’s a step-by-step overview of what happens when content is reported to Facebook:

- When we remove content for violating our policies, we notify the person who posted it to explain why, with some narrow exceptions to account for things like child exploitation imagery.
- If someone violates our policies multiple times, their account will be temporarily blocked; a Page that does so will be unpublished. When a person is in a temporary block, they can read things on Facebook, but they can’t like, comment or post. If that person is also the admin of a Facebook Page, the block prevents them from posting to the Page.

How many strikes does a person or Page get before you ban them?
We don’t want people to game the system, so we do not share the specific number of strikes that leads to a temporary block or permanent suspension.

If a Page is unpublished, is that different from removing it, and if so, why?
We offer Pages the opportunity to appeal in case we made a mistake. So our first step is to “unpublish” the Page so that it is no longer available on Facebook. If the Page’s admins don’t appeal, or their appeal fails, we remove the Page.

This is very complicated — why do it this way?
This approach ensures Pages (even those with multiple admins), as well as Page admins, are held accountable for the content they post. It also means that admins cannot use multiple Pages to violate our policies and avoid strikes against their personal profiles. It’s not perfect — but we believe it’s a practical way to deter repeat offenders and help keep people safe.

How do you distinguish between fake news and content that breaks your Community Standards?
People can say things on Facebook that are wrong or untrue, but we work to limit the distribution of inaccurate information. We partner with third-party fact checkers to review and rate the accuracy of articles on Facebook. When a story is rated false, it is ranked significantly lower in News Feed, which cuts its future views by more than 80%.

When it comes to our Community Standards, they’re focused on keeping people safe. If you post something that goes against our standards, which cover things like hate speech that attacks or dehumanizes others, we will remove it from Facebook.

So what happened with InfoWars? They were up on Friday and now they are down?
Last week, as a result of reports we received, we removed four videos from four Facebook Pages for violating our hate speech and bullying policies. These Pages were the Alex Jones Channel Page, the Alex Jones Page, the InfoWars Page and the Infowars Nightly News Page. In addition, one of the admins of these Pages – Alex Jones – was placed in a 30-day block for his role in posting violating content to them.

Since then, more content from the same Pages has been reported to us. Upon review, we have taken it down for glorifying violence, which violates our graphic violence policy, and for using dehumanizing language to describe people who are transgender, Muslims and immigrants, which violates our hate speech policies.

All four Pages have been unpublished for repeated violations of Community Standards and for accumulating too many strikes. Much of the discussion around Infowars has been related to false news, a serious issue that we are working to address by demoting links marked false by fact checkers and suggesting additional content. However, none of the violations that led to today’s removals were related to false news.