Meta

Update on New Zealand

By Chris Sonderby, VP and Deputy General Counsel

Our hearts go out to the victims, their families and the community affected by the horrific terrorist attacks in Christchurch. We remain shocked and saddened by this tragedy and are committed to working with leaders in New Zealand, other governments, and across the technology industry to help counter hate speech and the threat of terrorism. We continue to work around the clock to prevent this content from appearing on our site, using a combination of technology and people.

We have been working directly with the New Zealand Police to respond to the attack and support their investigation. We removed the attacker’s video within minutes of their outreach to us, and in the aftermath, we have been providing an on-the-ground resource for law enforcement authorities. We will continue to support them in every way we can. In light of the active investigation, police have asked us not to share certain details. While we’re still reviewing this situation, we are able to provide the information below:

  • The video was viewed fewer than 200 times while it was live, and no user reported it during the broadcast. Including those live views, the video was viewed about 4,000 times in total before being removed from Facebook.
  • The first user report on the original video came in 29 minutes after the video started, and 12 minutes after the live broadcast ended.
  • Before we were alerted to the video, a user on 8chan posted a link to a copy of the video on a file-sharing site.
  • We designated both shootings as terror attacks, meaning that any praise, support, or representation of the events violates our Community Standards and is not permitted on Facebook.
  • We removed the personal accounts of the named suspect from Facebook and Instagram, and are actively identifying and removing any imposter accounts that surface.
  • We removed the original Facebook Live video and hashed it so that other shares that are visually similar to that video are then detected and automatically removed from Facebook and Instagram.
  • Some variants, such as screen recordings, were more difficult to detect, so we expanded to additional detection systems, including the use of audio technology.
  • In the first 24 hours, we removed about 1.5 million videos of the attack globally. More than 1.2 million of those videos were blocked at upload, and were therefore prevented from being seen on our services.
  • Member organizations of the Global Internet Forum to Counter Terrorism (GIFCT) coordinate regularly on terrorism and have been in close contact since the attack. We have shared more than 800 visually distinct videos related to the attack via our collective database, along with URLs and context on our enforcement approaches. This incident highlights the importance of industry cooperation in countering the range of terrorists and violent extremists operating online.
  • We identified abusive content on other social media sites in order to assess whether or how that content might migrate to one of our platforms.
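The hashing approach mentioned above, in its general form, works by reducing each piece of media to a compact fingerprint so that near-duplicates can be matched even after re-encoding. Facebook's actual matching systems are not public; the sketch below illustrates only the general idea using a simple "average hash" over an 8x8 grayscale grid, with all function names and the bit threshold being illustrative assumptions.

```python
# Minimal sketch of perceptual ("average") hashing for near-duplicate
# detection. This is NOT Facebook's implementation; it is a simplified
# illustration of the general technique. Assumes frames have already been
# downscaled to an 8x8 grayscale grid (nested lists of brightness values).

def average_hash(pixels):
    """Return a 64-bit hash: one bit per pixel, set if above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count the bits that differ between two hashes."""
    return bin(h1 ^ h2).count("1")

def is_visual_match(h1, h2, threshold=6):
    """Treat hashes within `threshold` differing bits as the same content.

    The threshold is an illustrative assumption; real systems tune it to
    balance false positives against missed re-uploads.
    """
    return hamming_distance(h1, h2) <= threshold
```

Because the hash depends on each pixel's brightness relative to the frame's mean rather than on exact bytes, a re-encoded or slightly recompressed copy produces a nearly identical hash, while genuinely different footage does not. Screen recordings, mentioned above as harder to detect, change the image more substantially (borders, UI overlays, altered framing), which is one reason purely visual fingerprints can miss them and why complementary signals such as audio matching help.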

We will continue to work around the clock on this and will provide further updates as relevant.


