By Makda Zewde, NPHR Editor
Suicide is the 10th leading cause of death in the United States. More than 40,000 Americans lose their lives to suicide every year, at a rate that has risen steadily since 1999. Worldwide, the annual toll is close to 800,000 deaths, approximately one person every 40 seconds.
Last November, Facebook announced that its pattern recognition software, which aims to detect users with potential suicidal intent, would become available to most of its users worldwide. The artificial intelligence tool was initially tested only in the U.S.; after successful testing, the company announced plans to expand it to other countries, with the exception of the EU, where data privacy regulations are more restrictive.
With more than 2 billion monthly users, the social media platform is positioned to have a broad impact on suicide prevention. Its new technology aims to identify signs of suicidal intent using data from comments and posts as well as live videos. The platform also has tools in place for users to report concerning content. Once a report is made, the site provides users with resources such as tips and helplines, along with suggested messages to send to their friends. In addition to the software, Facebook is working to improve how it identifies first responders and plans to dedicate more reviewers on its team to examining reports of suicide.
One of the most common myths about suicide is that people who talk about it will not follow through.  In fact, people who are seriously considering suicide often make direct or indirect references to these thoughts. This may occur during casual conversations with friends, colleagues, or strangers — when people are not actively looking for these cues. With the help of artificial intelligence, expressions of suicidal thoughts on social media can be flagged and addressed in real time, providing a unique opportunity for intervention.
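Facebook has not published the details of its detection model, so any concrete example is necessarily speculative. As a rough illustration of the general idea of flagging posts for human review, a toy keyword-based screen (far simpler than a trained classifier, and with phrase list and function names invented here purely for demonstration) might look like this:

```python
# Illustrative sketch only: Facebook's actual system uses trained models
# over posts, comments, and live-video signals. This toy version flags
# posts containing phrases commonly described as verbal warning signs.
WARNING_PHRASES = [
    "want to die",
    "kill myself",
    "end my life",
    "no reason to live",
    "better off without me",
]

def flag_post(text: str) -> bool:
    """Return True if the post contains a possible expression of suicidal intent."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in WARNING_PHRASES)

def triage(posts: list[str]) -> list[str]:
    """Collect the subset of posts that would be routed to human reviewers."""
    return [post for post in posts if flag_post(post)]
```

In a real deployment, a flagged post would not trigger an automatic action by itself; as the article notes, it would be routed to trained reviewers who decide whether to surface helplines or contact first responders.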
Another misconception is that nothing can be done to stop a person who is determined to commit suicide. In many cases, suicidal people remain ambivalent about living or dying up until the last moment. Moreover, suicidal impulses are often transient, and reaching out to a person who is on the verge of harming themselves may interrupt the crisis. With these considerations in mind, there is hope that the right intervention at the right time may indeed be life-saving.
Although Facebook’s efforts are promising, it is uncertain how accurate algorithms will be in predicting self-harm. In clinical settings, suicide has been notoriously difficult to predict, and researchers studying suicidal ideation have not yet identified clinically meaningful risk factors for self-harm. Conversations with mental health professionals do not always provide better insight either: in a study examining risk factors for inpatient suicide, 78 percent of patients who died by suicide denied suicidal ideation in their last conversation with their mental health professional.
Suicide is a major public health concern that requires timely prediction in order for any intervention to be meaningful. Whether it comes from mining social media data or simply from paying closer attention to our peers and loved ones, there is an urgent need to improve our ability to reach people while we can still make a difference. Facebook’s new technology has the potential to do just that.
1. “Suicide Statistics.” American Foundation for Suicide Prevention.
2. “Suicide Rates.” National Institute of Mental Health.
3. “Suicide Data.” World Health Organization.
4. “Facebook is rolling out AI-based suicide prevention effort.” CNN Money.
5. “Stats.” Facebook Newsroom.
6. “Suicide Prevention.” Helpguide.org.
7. “How Can You Stop a Suicide?” Psychology Today.
8. Walsh, C. G., Ribeiro, J. D., & Franklin, J. C. (2017). Predicting risk of suicide attempts over time through machine learning. Clinical Psychological Science, 5(3), 457-469.
9. Busch, K. A., Fawcett, J., & Jacobs, D. G. (2003). Clinical correlates of inpatient suicide. The Journal of Clinical Psychiatry, 64(1), 14-19.