Facebook wants us to help each other, and it's using AI to identify those in need of support.
Can Facebook use all that it knows about us to help stop someone from committing suicide?
It’s been more than a rhetorical question since January, after a video, pulled from the social media platform Live.Me and shared on Facebook, showed 12-year-old Katelyn Nicole Davis taking her own life. Facebook couldn’t control the spread of the video and appeared unsure whether it even violated its own terms of service.
A month later, Facebook CEO Mark Zuckerberg’s 6,000-word global community manifesto made it clear that Facebook is ready to take on a more parental role, one that acknowledges its incredible influence and impact over nearly 2 billion people around the world. Zuckerberg wrote:
The Facebook community is in a unique position to help prevent harm, assist during a crisis, or come together to rebuild afterwards. This is because of the amount of communication across our network, our ability to quickly reach people worldwide in an emergency, and the vast scale of people’s intrinsic goodness aggregated across our community.
Now Facebook is ready, the company announced on Wednesday, to take a significant first step toward building a safer and more supportive Facebook community by strengthening its suicide prevention tools (Facebook has offered suicide reporting tools for a decade). The update includes a rather incredible claim: Facebook will use artificial intelligence to identify members contemplating suicide.
"I wrote a letter on building global community a couple weeks ago, and one of the pillars of community I discussed was keeping people safe, "Zuckerberg wrote on his personal Facebook feed on Wednesday. "These tools are an example of how we can help keep each other safe."
According to a release outlining its plans, Facebook is now testing a system that relies on pattern recognition based on posts previously reported for suicide risk.
The AI tool looks at words in the post and, especially, comments from friends — such as "Are you okay?" and "I’m here to help" — that may indicate someone is struggling.
This part of the system won’t auto-report those at risk to Facebook, but will instead make the options for reporting self-injury and suicide more prominent the next time that at-risk person logs in to Facebook.
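Facebook hasn't published details of its model, but the approach it describes can be illustrated with a deliberately simplified sketch. Everything here is invented for illustration: the phrase lists, the scoring, and the threshold are stand-ins for whatever the real pattern-recognition system learned from previously reported posts.

```python
# Illustrative sketch only -- Facebook has not disclosed its actual model.
# Phrase lists and the threshold below are invented for this example.

CONCERN_PHRASES_POST = ["no reason to live", "can't go on", "want to disappear"]
CONCERN_PHRASES_COMMENTS = ["are you okay", "i'm here to help", "please talk to someone"]

def flag_for_support(post_text, comments, threshold=2):
    """Return True when a post plus its friends' comments accumulate
    enough concern signals to surface self-injury reporting options
    the next time the person logs in (no auto-report, per the article)."""
    text = post_text.lower()
    score = sum(phrase in text for phrase in CONCERN_PHRASES_POST)
    for comment in comments:
        lowered = comment.lower()
        score += sum(phrase in lowered for phrase in CONCERN_PHRASES_COMMENTS)
    return score >= threshold

# A worrying post plus a concerned reply crosses the invented threshold.
print(flag_for_support(
    "I feel like there's no reason to live anymore",
    ["Are you okay? I'm here to help"],
))  # True
```

The real system presumably uses a trained classifier rather than keyword lists; the point of the sketch is only the flow the article describes, where signals from both the post and friends' comments combine before any prompt is shown.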
"This is really breaking new ground. Taking technology to the next level of saving people’s lives," said Dr. Daniel J. Reidenberg, Psy.D., FAPA, Executive Director of Save.org, a nonprofit, nationwide suicide prevention organization.
Reidenberg added that Save.org hopes that Facebook’s new AI-based tools can speed up the work they already do to save people’s lives.
"If people can be engaged with friends or family members on Facebook and they notice something that’s concerning or alarming and technology can pick up on some of those signals We can more rapidly intervene." That speed can make the difference and, he added, "prevent a tragedy from happening."
Facebook is also testing a pattern recognition system that will identify posts that include suicidal thoughts. Those posts will be reviewed by Facebook’s Community Team, which will decide whether to extend suicide prevention resources to that Facebook member.
In addition and perhaps in acknowledgement of the Live.Me video tragedy, Facebook is also introducing suicide prevention tools to Facebook Live posts. "People watching a live video have the option to reach out to the person directly and to report the video to us," said Facebook in a release.
Sources tell Mashable that Live.Me is working on similar AI-based self-harm detection tools for live broadcasts.
Facebook’s multi-pronged effort also includes live crisis support through Facebook Messenger and a video campaign that reminds people to reach out to those in need of emotional support. The updated system will also offer the option to connect directly with someone from several suicide prevention organizations, including Crisis Text Line and the National Suicide Prevention Lifeline.
“As a community, we cannot prevent every suicide, but we must do more to reach out to people who are struggling,” wrote Facebook COO Sheryl Sandberg in a Facebook post. “As individuals, we can be alert for the signs in ourselves and in others and act immediately. Together, we can be there for people in distress.”
While the Facebook Live tools roll out globally today, the ability to contact real-time help through one of Facebook’s partners and all of the AI pattern recognition technology are, for now, only part of a limited test in the U.S.
Additional reporting by Kerry Flynn