Ask for help (Image: zodebala/Getty)

Facebook is trialling new tools to help with suicide prevention efforts. One approach will use artificial intelligence to identify concerning posts and make it easier for other people to report them.

Facebook says it will use pattern recognition algorithms to spot posts that could indicate someone is suicidal. To help friends flag this content, the option to report posts about “suicide and self injury” will be made more prominent on posts the algorithms consider potentially concerning. The algorithms are trained on posts that have previously been reported.
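Facebook has not published details of its models, but the general approach it describes, learning from posts that users have previously reported, can be sketched as a simple text classifier. The toy data, the `train` and `score` functions, and the naive Bayes choice below are all hypothetical illustrations, not Facebook's actual system.

```python
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

def train(posts):
    """posts: list of (text, was_reported) pairs — hypothetical labelled data."""
    counts = {True: Counter(), False: Counter()}
    labels = Counter()
    for text, reported in posts:
        labels[reported] += 1
        counts[reported].update(tokenize(text))
    return counts, labels

def score(text, counts, labels):
    """P(reported | text) under naive Bayes with add-one smoothing."""
    vocab = set(counts[True]) | set(counts[False])
    logp = {}
    for label in (True, False):
        total = sum(counts[label].values())
        lp = math.log(labels[label] / sum(labels.values()))
        for tok in tokenize(text):
            lp += math.log((counts[label][tok] + 1) / (total + len(vocab)))
        logp[label] = lp
    # convert the two log-likelihoods back to a probability for the True class
    return 1 / (1 + math.exp(logp[False] - logp[True]))

# Hypothetical toy data; a real system would train on millions of labelled posts.
posts = [
    ("i feel hopeless and alone", True),
    ("i want it to end", True),
    ("great game last night", False),
    ("lunch was amazing", False),
]
counts, labels = train(posts)
```

A post scoring above some threshold would then surface the more prominent reporting option; in practice a production system would use far richer features and models than word counts.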

It will also use pattern recognition to flag posts “very likely to include thoughts of suicide” so that its community operations team can take action even if the post is not reported. The team will review posts to see if the person appears to be in need of help and provide resources directly if they deem it appropriate.

This system is currently being tested on a small number of users in the US.

The social media giant is also making it easier for people to report Facebook Live videos and to reach out directly to the person involved. People sharing live videos will see resources onscreen that offer support options such as connecting with a friend or contacting a help line.

Signpost to support

Mental health charity Mind says the principle behind the new tools is a good idea. “People in crisis may ask for help in places where the help they need is not readily available,” a spokesperson said. “Signposting people to appropriate sources of support can be a really important step in helping people to access the help they need.”

SANE, also a mental health charity, says it welcomes new online tools to identify people at risk of suicide and offer support, “but they are no substitute for human intervention”.

“What is needed is time, patience and human kindness to support people when they are going through their darkest moments,” a spokesperson told New Scientist.

Facebook is also working with organisations including the Crisis Text Line, the National Eating Disorder Association and the National Suicide Prevention Lifeline to let users directly contact people on crisis support teams over Facebook Messenger.

The company plans to expand this test over the next few months.

Need a listening ear? UK Samaritans: 116 123; US National Suicide Prevention Lifeline: 1 800 273 8255; hotlines in other countries

Read more: Can mental health apps replace human therapists?