Reinforcing Our Commitment to Transparency

By Chris Sonderby, VP & Deputy General Counsel

We’re frequently asked about the nature and extent of the government requests we receive for user data, and how we make decisions about what content stays up or comes down on Facebook. These are important questions, which is why we issue a Transparency Report to address them. We believe this transparency helps hold governments and Facebook accountable.

For this release, we’ve updated our Transparency Report to also include a Community Standards Enforcement Report. It provides information about our enforcement efforts between October 2017 and March 2018 in six areas: graphic violence, adult nudity and sexual activity, terrorist propaganda, hate speech, spam and fake accounts. VP of Product Management Guy Rosen explains more about these numbers in his post, including that this is very much a work in progress and we will likely improve our methodology over time.

Government requests for account data increased globally by around 4% compared to the first half of 2017, from 78,890 to 82,341 requests. In the US, government requests remained roughly even at 32,742, of which 62% included a non-disclosure order prohibiting Facebook from notifying the user, up from 57% during the first half of 2017. In addition, as a result of transparency updates introduced in the 2015 USA Freedom Act, the US government lifted the non-disclosure orders on 14 National Security Letters (NSLs) we received between 2013 and 2016. These requests, along with the US government’s authorization letters, are available to download below.

We always scrutinize each government request we receive for account data — whether from an authority in the US, Europe, or elsewhere — to make sure it is legally valid. If a request appears to be deficient or overly broad, we push back and will fight in court if necessary.

During the second half of 2017, the number of pieces of content we restricted based on local law fell from 28,036 to 14,294. Last cycle’s figures were inflated primarily by content restrictions in Mexico related to the video of a tragic school shooting.

There were 46 disruptions of Facebook services in 12 countries in the second half of 2017, compared to 52 disruptions in nine countries in the first half. We continue to be deeply concerned by internet disruptions, which prevent people from communicating with family and friends and also threaten the growth of small businesses.

The report also includes data covering the volume and nature of copyright, trademark and counterfeit reports we received, as well as the amount of content affected by those reports. During this period, on Facebook and Instagram we took down 2,776,665 pieces of content based on 373,934 copyright reports, 222,226 pieces of content based on 61,172 trademark reports and 459,176 pieces of content based on 28,680 counterfeit reports.

Publishing this report reinforces our commitment to transparency, and we’re always working to improve our reporting in these areas.

Please see the full report for more information.
