
Protecting the EU Elections From Misinformation and Expanding Our Fact-Checking Program to New Languages

By Antonia Woodford, Product Manager

Ahead of the European Parliament Elections in May, we have made fighting misinformation a top priority. One of the ways we reduce the spread of false news is by partnering with independent, third-party fact-checkers around the world. Today we’re announcing the expansion of this program in the EU with five new local fact-checking partners: Ellinika Hoaxes in Greece, FactCheckNI in Northern Ireland, Faktograf in Croatia, Observador in Portugal and Patikrinta 15min in Lithuania. These organizations will review and rate the accuracy of content on Facebook.

Our fact-checking partners are all accredited by the International Fact-Checking Network (IFCN), which applies standards such as non-partisanship and transparency of sources. These partners are also part of a collaborative effort led by IFCN to fact-check content related to the European Parliament elections, called FactCheckEU. Starting today, nearly all FactCheckEU participants will be able to rate and review claims on Facebook. (Updated on April 29, 2019 at 11AM PT to reflect the fact that nearly all, but not all, FactCheckEU participants will be able to rate and review claims on Facebook.)

Our program now includes 21 partners fact-checking content in 14 European languages: Croatian, Danish, Dutch, English, French, German, Greek, Italian, Lithuanian, Norwegian, Polish, Portuguese, Spanish and Swedish. When a fact-checker rates a story as false, we show it lower in News Feed, significantly reducing the number of people who see it. In our experience, once a story is rated as false, we’ve been able to reduce its distribution by 80%. Pages and domains that repeatedly share false news will also see their distribution reduced and their ability to monetize and advertise removed. This helps curb the spread of financially motivated false news.
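To make the mechanics concrete, here is a minimal sketch of how a fact-check rating could translate into a ranking demotion. It assumes a hypothetical relevance score and a flat multiplier consistent with the roughly 80% reduction cited above; the names and numbers are illustrative, not Facebook’s actual ranking code.

```python
# Illustrative sketch only: the real News Feed ranking system is far more
# complex. Story, base_score, and the multiplier below are assumptions.
from dataclasses import dataclass
from typing import Optional

# Assumed demotion multiplier matching the ~80% distribution reduction.
FALSE_RATING_DEMOTION = 0.2

@dataclass
class Story:
    id: str
    base_score: float                        # relevance score from upstream models
    fact_check_rating: Optional[str] = None  # e.g. "false", or None if unrated

def ranked_score(story: Story) -> float:
    """Apply the fact-check demotion on top of the base relevance score."""
    if story.fact_check_rating == "false":
        return story.base_score * FALSE_RATING_DEMOTION
    return story.base_score

stories = [
    Story("debunked-article", base_score=0.9, fact_check_rating="false"),
    Story("ordinary-post", base_score=0.5),
]
for story in sorted(stories, key=ranked_score, reverse=True):
    print(story.id, round(ranked_score(story), 2))
# ordinary-post 0.5
# debunked-article 0.18
```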

This program is part of our three-part framework to improve the quality and authenticity of stories in News Feed. We remove accounts and content that violate our Community Standards or ad policies, reduce the distribution of false news and inauthentic content like clickbait, and inform people by giving them more context on the posts they see. In line with this approach, we’re working on additional measures to protect elections in the EU, including:

Introducing a Click-Gap signal: Our News Feed ranking system uses many signals to ensure that people see stories that are relevant and interesting to them. We recently introduced a new signal called click-gap, which can help identify whether a website is producing low-quality content by comparing the share of clicks it gets from Facebook against its web graph, meaning the clicks and links it gets from other sources across the broader internet. Sites with a high click-gap ratio, i.e. sites that depend disproportionately on Facebook for their traffic, are likely to produce lower-quality content like misinformation.
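As a rough illustration, the signal can be thought of as a ratio. The sketch below is a toy version under stated assumptions: the real Click-Gap signal, its inputs, and its thresholds are not public, and in practice it would feed into a ranking model rather than act as a hard rule.

```python
# Hypothetical click-gap-style signal. The data sources and the cutoff
# value are assumptions made for demonstration purposes.

def click_gap_ratio(facebook_clicks: int, web_graph_clicks: int) -> float:
    """Ratio of clicks a domain gets via Facebook to clicks it gets from
    the rest of the web. A high ratio means the domain's traffic is
    disproportionately dependent on Facebook."""
    return facebook_clicks / max(web_graph_clicks, 1)

def likely_low_quality(facebook_clicks: int, web_graph_clicks: int,
                       threshold: float = 10.0) -> bool:
    # threshold is a made-up cutoff; in reality this would be one signal
    # among many, not a binary classifier.
    return click_gap_ratio(facebook_clicks, web_graph_clicks) > threshold

# A site with heavy Facebook traffic but almost no footprint in the wider
# web graph gets flagged; a site with balanced traffic does not.
print(likely_low_quality(50_000, 800))     # True
print(likely_low_quality(50_000, 40_000))  # False
```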

Expanding the context button: To help people evaluate the credibility of an article, we provide a button that displays more context about the article’s source, such as the publisher’s Wikipedia entry and where the article is being shared. We’re expanding this button to indicate whether a Page has a history of sharing misinformation, and to display “Trust Indicators”: publisher-provided links to a publication’s fact-checking principles, code of ethics, corrections policy, ownership and funding, and editorial team.

Informing publishers with new updates to the Page Quality tab: To better inform Page managers about our policies around repeatedly sharing misinformation, we’re adding information about a Page’s misinformation violations to the Page Quality tab. If a Page has repeatedly published misinformation, we demote all of that Page’s content on Facebook and revoke its ability to advertise or use monetization products. The Page Quality tab tells Page admins whether they have repeatedly shared misinformation, whether they are at risk of reaching “repeat offender” status, or whether they are in the clear. We also plan to let Page admins see if their Page is being demoted for sharing clickbait.
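The status logic described above can be sketched in a few lines. The violation counts and thresholds below are pure assumptions for illustration; Facebook has not published the actual criteria for “repeat offender” status.

```python
# Hypothetical sketch of the Page Quality tab's repeat-offender statuses.
# Threshold values are invented for demonstration only.
from enum import Enum

class PageStatus(Enum):
    IN_GOOD_STANDING = "no repeated misinformation"
    AT_RISK = "at risk of repeat-offender status"
    REPEAT_OFFENDER = "repeatedly shared misinformation"

AT_RISK_THRESHOLD = 2        # assumed value
REPEAT_OFFENDER_THRESHOLD = 4  # assumed value

def page_quality_status(misinfo_violations: int) -> PageStatus:
    """Map a Page's count of fact-checker 'false' ratings to the status
    surfaced in the Page Quality tab."""
    if misinfo_violations >= REPEAT_OFFENDER_THRESHOLD:
        return PageStatus.REPEAT_OFFENDER
    if misinfo_violations >= AT_RISK_THRESHOLD:
        return PageStatus.AT_RISK
    return PageStatus.IN_GOOD_STANDING

# Per the post, a repeat offender has all of its content demoted and loses
# access to advertising and monetization products.
print(page_quality_status(1).value)  # no repeated misinformation
print(page_quality_status(5).value)  # repeatedly shared misinformation
```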

Informing group administrators about our misinformation tools: We’ll soon let group admins know if a third-party fact-checker has rated content that was posted in their groups as false. We’ll demote group content in News Feed if a group repeatedly shares misinformation, much in the same way that Pages and domains get a demotion on all their content if they frequently share false news.

Finally, we’re improving how we identify content that repeats a claim a fact-checker has already debunked, so we can demote that content too. Publishers who share it will receive a notification that they’ve shared misinformation, just as individual people who share the claim do.
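The matching step can be illustrated with a deliberately simplified sketch. The post does not describe the actual approach, which would plausibly involve learned text and image representations; plain token overlap is used below only to show the shape of the problem, and the similarity threshold is an assumption.

```python
# Toy matcher for posts repeating an already-debunked claim. Jaccard token
# overlap stands in for whatever similarity measure is actually used.
import re

def tokens(text: str) -> set:
    return set(re.findall(r"[a-z0-9']+", text.lower()))

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

def matches_debunked_claim(post: str, debunked_claims: list,
                           threshold: float = 0.6) -> bool:
    # threshold is an assumed cutoff for demonstration.
    post_tokens = tokens(post)
    return any(jaccard(post_tokens, tokens(claim)) >= threshold
               for claim in debunked_claims)

debunked = ["drinking bleach cures the flu"]
print(matches_debunked_claim("Drinking bleach cures the flu!", debunked))  # True
print(matches_debunked_claim("Local elections set for May", debunked))     # False
```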

Misinformation is a complex and evolving problem, and we have more work to do. We’re investing heavily to get ahead because we believe in providing a space for civic discourse during elections. We’ll continue to take steps to ensure this discourse is safe, authentic, and accurate.

You can read more about how we’re maintaining the integrity of information on Facebook here.


