Getting Our Community Help in Real Time

By Guy Rosen, VP of Product Management

When someone is expressing thoughts of suicide, it’s important to get them help as quickly as possible.

Facebook is a place where friends and family are already connected, which lets us help connect a person in distress with people who can support them. It’s part of our ongoing effort to help build a safe community on and off Facebook.

Today, we are sharing additional work we’re doing to help people who are expressing thoughts of suicide, including:

  • Using pattern recognition to detect posts or live videos where someone might be expressing thoughts of suicide, and to help respond to reports faster
  • Improving how we identify appropriate first responders
  • Dedicating more reviewers from our Community Operations team to review reports of suicide or self-harm

Over the last month, we’ve worked with first responders on over 100 wellness checks based on reports we received through our proactive detection efforts. This is in addition to reports we received from people in the Facebook community. We also use pattern recognition to help accelerate the most concerning reports. We’ve found that these accelerated reports, which we have flagged as requiring immediate attention, are escalated to local authorities twice as quickly as other reports. We are committed to continuing to invest in pattern recognition technology to better serve our community.

Expanding our use of proactive detection

  • We are starting to roll out artificial intelligence outside the US (except in the EU at present) to help identify when someone might be expressing thoughts of suicide, including on Facebook Live. This will eventually be available worldwide.
  • This approach uses pattern recognition technology to help identify posts and live streams that are likely to contain thoughts of suicide. We continue to work on this technology to increase accuracy and avoid false positives before our team reviews the content.
  • We use signals like the text used in the post and comments (for example, comments like “Are you ok?” and “Can I help?” can be strong indicators). In some instances, we have found that the technology has identified videos that may have gone unreported.
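To give a sense of how text signals like these might feed a detection system, here is a minimal, hypothetical scorer that weights concern-related phrases found in a post and its comments. The phrases, weights, and function names are invented for this sketch; Facebook’s actual pattern recognition models are not public and are far more sophisticated than keyword matching.

```python
# Hypothetical illustration only: a toy scorer for concern-related
# phrases. Real systems use trained classifiers, not fixed keywords.

CONCERN_PHRASES = {
    "are you ok": 2.0,   # example comment signal from the post above
    "can i help": 2.0,   # example comment signal from the post above
}

def concern_score(post_text: str, comments: list[str]) -> float:
    """Return a rough score; a higher score suggests the post may
    warrant human review."""
    score = 0.0
    for text in [post_text] + comments:
        lowered = text.lower()
        for phrase, weight in CONCERN_PHRASES.items():
            if phrase in lowered:
                score += weight
    return score
```

A real pipeline would pass high-scoring items to trained human reviewers rather than acting on the score directly.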

Improving how we identify first responders and dedicating more reviewers

  • Our Community Operations team includes thousands of people around the world who review reports about content on Facebook. The team includes a dedicated group of specialists who have specific training in suicide and self-harm.
  • We are also using artificial intelligence to prioritize the order in which our team reviews reported posts, videos and live streams. This ensures we can get the right resources to people in distress and, where appropriate, we can more quickly alert first responders.
  • Context is critical for our review teams, so we have developed ways to enhance our tools to get people help as quickly as possible. For example, our reviewers can quickly identify which points within a video receive increased levels of comments, reactions and reports from people on Facebook. Tools like these help reviewers understand whether someone may be in distress and get them help.
  • In addition to those tools, we’re using automation so the team can more quickly access the appropriate first responders’ contact information.
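The prioritization described above can be sketched as a simple priority queue: reports with higher concern scores reach reviewers first. This is an illustrative sketch under invented names and scores; it does not reflect Facebook’s internal systems.

```python
import heapq

def review_order(reports):
    """Order reports from most to least concerning.

    reports: list of (concern_score, report_id) tuples.
    Uses a max-heap by negating scores, since heapq is a min-heap.
    """
    heap = [(-score, report_id) for score, report_id in reports]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(len(heap))]
```

For example, `review_order([(1.0, "a"), (5.0, "b"), (3.0, "c")])` surfaces report `"b"` first, so the most serious report is reviewed before the others.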

Our ongoing commitment to suicide prevention

Already, if someone on Facebook posts something that makes you concerned about their well-being, you can reach out to them directly or report the post to us. We have teams working around the world, 24/7, who review reports that come in and prioritize the most serious ones. We provide people with a number of support options, such as the option to reach out to a friend, and even offer suggested text templates. We also suggest contacting a help line and offer other tips and resources for people to help themselves in that moment.

Facebook has been working on suicide prevention tools for more than 10 years. Our approach was developed in collaboration with mental health organizations such as Save.org, the National Suicide Prevention Lifeline, and Forefront Suicide Prevention, and with input from people who have had personal experience thinking about or attempting suicide. These tools are also available globally, with the help of over 80 local partners, in whatever language people use Facebook in.

With the help of our partners and people’s friends and family on Facebook, we hope we can continue to support those in need.