European data laws have prevented Facebook from introducing a tool designed to spot users at risk of suicide.

The social media firm has announced that it will use artificial intelligence to spot posts and video comments that indicate someone is expressing suicidal thoughts.

However, EU data protection laws, which ban the processing of an individual's sensitive personal data without explicit consent, mean the tool will not be rolled out in EU countries.

Facebook already allows users to report posts when they believe someone is at risk; a moderator is then alerted and can offer the person help, such as helpline numbers or the option to talk to a friend. However, the company says many at-risk posts go unreported.

Mark Zuckerberg, Facebook's boss, said the company has now started to introduce "proactive detection", which automatically looks for trigger phrases in posts or comments on videos, such as "are you OK?" and "can I help?".