
Hard Questions: How Is Facebook’s Fact-Checking Program Working?

Hard Questions is a series from Facebook that addresses the impact of our products on society.

By Tessa Lyons, Product Manager

False news is a moneymaker for spammers and a weapon of state actors and agitators around the world. It has raised important questions for society and created new responsibilities for companies like Facebook.

Misinformation is bad for our community and bad for our business. It’s why we’re investing significant time and resources to fight it. As I explained in my last post, there are three main ways we’re doing this:

  • Removing accounts and content that violate our policies
  • Reducing the distribution of false news and the financial incentives to create it
  • Informing people by giving them more context on the stories they see

One part of our strategy that we get asked about a lot is our partnership with third-party fact-checking organizations. They help us identify false stories so we can stop them from spreading on Facebook. Overall, we’re making progress and have learned a lot. This year we expanded to more countries and started having fact-checkers review photos and videos, not just links. We’re also looking for more ways to be transparent about these efforts and to have independent researchers measure our results.

This program is just one part of our strategy, and we won’t be able to address this problem with human fact-checkers alone. Still, I wanted to share more on our work and the challenges ahead.

How Third-Party Fact-Checking Works
We started the third-party fact-checking program in December 2016. Now we have 25 partners in 14 countries, many with recent or upcoming elections. Our partners are independent and certified through the non-partisan International Fact-Checking Network. When fact-checkers rate an article as false, we show it lower in News Feed — reducing future views by over 80% on average.

Here’s how it works:

  • We use technology to identify potentially false stories. For example, when people on Facebook submit feedback about a story being false or comment on an article expressing disbelief, these are signals that a story should be reviewed. In the US, we can also use machine learning based on past articles that fact-checkers have reviewed. And recently we gave fact-checkers the option to proactively identify stories to rate.
  • Fact-checkers provide a rating and reference article. Independent third-party fact-checkers review the stories, rate their accuracy and write an article explaining the facts behind their rating.
  • We demote links rated false and provide more context on Facebook. If a story is rated false, we reduce its distribution in News Feed. (See more on how News Feed ranking works.) We let people who try to share the story know there’s more reporting on the subject, and we notify people who shared it earlier. We also show the fact-checker’s reference article in Related Articles immediately below the story in News Feed.
  • We take action against repeat offenders. If a Facebook Page or website repeatedly shares misinformation, we’ll reduce the overall distribution of the Page or website, not just individual false articles. We’ll also cut off their ability to make money or advertise on our services. (A simplified sketch of this pipeline follows the list.)
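Putting those steps together, the flow is: collect user signals, queue stories for fact-checkers, demote links rated false, and track strikes against repeat offenders. The Python sketch below is a minimal illustration of that flow under stated assumptions; the names (Story, Publisher, needs_review, apply_rating) and the thresholds are hypothetical stand-ins, not Facebook's actual systems. The one grounded number is the demotion itself: since rated-false links lose over 80% of future views on average, the sketch leaves them with roughly 20% of normal distribution.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical constants -- illustrative only.
DEMOTION_FACTOR = 0.2          # rated-false links keep ~20% of normal reach
REVIEW_SIGNAL_THRESHOLD = 10   # made-up: user signals before queuing review
REPEAT_OFFENDER_STRIKES = 3    # made-up: strikes before page-level penalties

@dataclass
class Publisher:
    name: str
    strikes: int = 0
    can_monetize: bool = True

@dataclass
class Story:
    url: str
    publisher: Publisher
    rating: Optional[str] = None             # e.g. "false"; None = unreviewed
    reference_article: Optional[str] = None  # fact-checker's explanation

def needs_review(false_reports: int, disbelief_comments: int) -> bool:
    """Queue a story for fact-checkers once user signals pass a threshold.

    The post describes richer signals in practice, including (in the US)
    a machine learning model trained on past fact-checker verdicts.
    """
    return false_reports + disbelief_comments >= REVIEW_SIGNAL_THRESHOLD

def apply_rating(story: Story, rating: str, reference_article: str) -> float:
    """Record a fact-checker's verdict; return a distribution multiplier."""
    story.rating = rating
    story.reference_article = reference_article  # surfaced in Related Articles
    if rating != "false":
        return 1.0                                # no demotion
    story.publisher.strikes += 1
    if story.publisher.strikes >= REPEAT_OFFENDER_STRIKES:
        story.publisher.can_monetize = False      # cut off ads and monetization
        return DEMOTION_FACTOR * DEMOTION_FACTOR  # extra page-wide penalty
    return DEMOTION_FACTOR                        # demote just this link
```

For example, a story with 7 "false" feedback reports and 5 disbelief comments would clear the (made-up) review threshold, and a "false" rating would return a 0.2 multiplier, matching the over-80% drop in views described above.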

The Limits of Fact-Checking
Over the last 18 months we’ve made good progress, but we’re also aware of the limits of this program. Fact-checkers don’t exist in all countries, and different places have different standards of journalism as well as varying levels of press freedom. Even where fact-checking organizations do exist, there aren’t enough to review all potentially false claims online. It can take hours or even days to review a single claim. And most false claims aren’t limited to one article — they spread to other sites. To make real progress, we have to keep improving our machine learning and trying other tactics that can work around the world.

There are other challenges, too, such as how to treat opinion and satire. We strongly believe that people should be able to debate different ideas, even controversial ones. We also recognize there can be a fine line between misinformation and satire or opinion. For example, sometimes people try to call their sites “satire” as cover for their true motivation — to spread fake stories. This can make it more difficult for fact-checkers to assess whether an article should be rated “false” or left alone.

Another question is what to do when publishers want to challenge a decision — especially after their article has already reached a lot of people. We allow publishers to contact fact-checkers to dispute their rating or offer a correction in order to restore their distribution in News Feed. If a fact-checker accepts the correction or changes their rating, we’ll remove the strike against the publisher. Our goal here is to prevent bad actors from exploiting loopholes without unduly punishing reputable publications that sometimes make mistakes.
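To make the dispute flow concrete, here is one way it could be expressed as a continuation of the hypothetical sketch above (it reuses Story, DEMOTION_FACTOR, and REPEAT_OFFENDER_STRIKES from that sketch). The function name and the exact restoration logic are assumptions for illustration, not Facebook's implementation.

```python
def resolve_dispute(story: Story, fact_checker_accepts: bool) -> float:
    """Handle a publisher's dispute or correction, continuing the
    hypothetical pipeline above. An accepted correction (or a changed
    rating) removes the strike and restores normal distribution."""
    if not fact_checker_accepts:
        return DEMOTION_FACTOR                # rating stands; demotion stays
    story.rating = "corrected"
    story.publisher.strikes = max(0, story.publisher.strikes - 1)
    if story.publisher.strikes < REPEAT_OFFENDER_STRIKES:
        story.publisher.can_monetize = True   # lift monetization penalty
    return 1.0                                # full distribution restored
```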

And ultimately, it’s important that people trust the fact-checkers making these calls. While we work with the International Fact-Checking Network to approve all our partners and make sure they have high standards of accuracy, fairness and transparency, we continue to face accusations of bias. This has left people asking whether, in today’s world, it’s possible to have a set of fact-checkers who are widely recognized as objective. We’ve also made changes to how we let people know that a story is disputed, so that they can learn more and come to their own conclusions.

It’s clear that even as we continue to improve this program, we need solutions beyond fact-checkers. That’s why we’re also working on removing fake accounts, which are often responsible for misinformation. And as we make it harder for fake stories to spread and prevent malicious sites and Pages from using our tools to make money, we will break the business models that give bad actors an incentive to create and share false news. We also continue to invest in news literacy programs to help people better judge the publishers and articles they see on Facebook. It’s through the combination of all these things — and by collaborating with other companies and organizations — that we’ll be able to continue to make progress on false news.


