By Dan Zigmond, Director of Analytics, News Feed

Misinformation is a real problem on the internet — including on Facebook. Unfortunately, there's no silver bullet for solving it. So we try to come at the issue from a few different angles, using both technological solutions, which can scale broadly, and human reviewers, who can understand context and nuance.

In the video above, I’ll walk you through some of our work around false news and provide an update on how I think we’re doing.

To reduce the spread of false news, we remove fake accounts and disrupt economic incentives for traffickers of misinformation. We also use various signals, including feedback from our community, to identify potential false news. In countries where we have partnerships with independent third-party fact-checkers, stories rated as false by those fact-checkers are shown lower in News Feed. If Pages or domains repeatedly create or share misinformation, we significantly reduce their distribution and remove their advertising rights.
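At a very high level, the demotion described above amounts to adjusting a story's ranking score. The sketch below is purely illustrative — the function names, scores, and thresholds are my own assumptions for the example, not Facebook's actual implementation:

```python
# Hypothetical sketch of demoting fact-checked stories in a ranked feed.
# All names, scores, and thresholds are illustrative assumptions.

DEMOTION_FACTOR = 0.2          # assumed multiplier for stories rated false
REPEAT_OFFENDER_FACTOR = 0.5   # assumed extra penalty for repeat-offender sources
REPEAT_OFFENDER_THRESHOLD = 3  # assumed strikes before distribution is reduced

def adjusted_score(base_score, rated_false, source_strikes):
    """Return a ranking score after misinformation-related demotions."""
    score = base_score
    if rated_false:
        score *= DEMOTION_FACTOR  # fact-checker rating pushes the story lower
    if source_strikes >= REPEAT_OFFENDER_THRESHOLD:
        score *= REPEAT_OFFENDER_FACTOR  # repeat offenders get reduced reach
    return score

stories = [
    {"id": "a", "base_score": 1.0, "rated_false": False, "strikes": 0},
    {"id": "b", "base_score": 0.9, "rated_false": True,  "strikes": 0},
    {"id": "c", "base_score": 0.8, "rated_false": True,  "strikes": 4},
]

ranked = sorted(
    stories,
    key=lambda s: adjusted_score(s["base_score"], s["rated_false"], s["strikes"]),
    reverse=True,
)
print([s["id"] for s in ranked])  # → ['a', 'b', 'c']: demoted stories sink
```

Note that the stories aren't removed — they are still eligible to appear, just ranked lower, which is consistent with reducing distribution rather than deleting content.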

We also want to empower people to decide for themselves what to read, trust, and share. We promote news literacy and work to inform people with more context. For example, if third-party fact-checkers write articles about a news story, we show those articles immediately below the story in the Related Articles unit. We also notify people and Page admins if they try to share a story that's been determined to be false, or if they have shared one in the past.
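The context and notification features in the paragraph above can be thought of as simple lookups against a store of fact-checker ratings. This sketch is a toy illustration under my own assumptions (the data structure, URLs, and wording are all hypothetical, not Facebook's actual system):

```python
# Hypothetical sketch of surfacing fact-check context for a story.
# The data store, URLs, and message text are illustrative assumptions.

fact_checks = {
    # story URL -> fact-checker articles reviewing it (assumed example data)
    "example.com/story": ["factchecker-a.org/review-1", "factchecker-b.org/review-2"],
}

def related_articles(story_url):
    """Fact-checker articles to show below the story, if any exist."""
    return fact_checks.get(story_url, [])

def share_warning(story_url):
    """Message shown when someone tries to share a fact-checked story."""
    if story_url in fact_checks:
        return "Independent fact-checkers have disputed this story."
    return None

print(related_articles("example.com/story"))  # two fact-check links
print(share_warning("other.com/post"))        # → None: nothing on file
```

The same lookup can drive the retroactive notification: when a story's rating arrives, past shares of that URL are matched against the store and their authors notified.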

In addition to our own efforts, we're learning from academics, scaling our partnerships with third-party fact-checkers, and talking to other organizations about how we can work together.

There’s a lot of work that’s been done and a lot more work to do. But most things worth doing are never finished. Watch this space for further updates.

See also:
The Three-Part Recipe for Cleaning up Your News Feed (video)
News Feed Ranking in Three Minutes Flat (video)
Designing New Ways to Give Context to News Stories
Helping Ensure News on Facebook Is From Trusted Sources