Can your Facebook friends predict from your status updates that you’re contemplating suicide? A new study aims to find out whether it's possible, collecting data from volunteer social media and mobile phone users to build a tool for real-time analysis of suicide risk factors.

The experiment is named for Émile Durkheim, the sociologist who conducted a broad survey and analysis of suicide statistics in 1897. In his book Suicide, Durkheim classified suicide victims into distinct types and found correlations between the lives they led and the likelihood that they would kill themselves.

The Durkheim Project’s participants are drawn from a pool of US military veterans. Data collection will rely on several apps “available on Facebook as well as for iPhone and Android devices” that will forward the actions and content of participants’ mobile activity (from tweets to LinkedIn interactions to text message content) to a medical database.

As it’s collected, the data will be analyzed by “artificial intelligence systems.” Predictive applications will then use the analyzed data to monitor content and behavior in real time for patterns that are typical of people who are considering harmful behavior.
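To make the idea concrete, here is a toy sketch of what real-time pattern monitoring over a text stream might look like. The project's actual "artificial intelligence systems" are not public; the phrases, weights, and threshold below are invented purely for illustration and have no clinical validity.

```python
# Illustrative sketch only: a toy monitor that scores incoming messages
# against a weighted list of risk-associated phrases. All phrases,
# weights, and the threshold are hypothetical, not from the study.
from dataclasses import dataclass

# Hypothetical phrase weights (assumption, not the project's model).
RISK_PHRASES = {
    "can't go on": 3,
    "no way out": 3,
    "hopeless": 2,
    "burden to everyone": 2,
}

ALERT_THRESHOLD = 4  # arbitrary cutoff for flagging a message


@dataclass
class Assessment:
    text: str
    score: int
    flagged: bool


def assess(text: str) -> Assessment:
    """Score one message by summing the weights of matched phrases."""
    lowered = text.lower()
    score = sum(w for phrase, w in RISK_PHRASES.items() if phrase in lowered)
    return Assessment(text=text, score=score, flagged=score >= ALERT_THRESHOLD)


def monitor(stream):
    """Yield an Assessment for each message in an incoming stream."""
    for message in stream:
        yield assess(message)
```

A real system would replace the keyword table with a trained statistical model and would need to handle negation, sarcasm, and context; the point here is only the shape of the pipeline: collect, score, flag.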

The scientists involved in the project have yet to develop a way to synthesize the data they collect with actual instances of self-harm. "We still need to get extensions to our authorized medical protocol to do this," said Chris Poulin, the Durkheim Project's principal investigator. "How do you ask for this 'date of death' information in such a way as to not be insensitive?"

Poulin and his team are looking to develop consent forms that would allow a participant's support network to report the loss of a loved one. The researchers also say the social media data will be cross-referenced with concussions, family stresses, PTSD, and other factors that correlate with high suicide rates in the military and among veterans.

Part of Durkheim’s theory of suicide held that social integration plays a major role in suicidal tendencies. According to the data Durkheim collected, a person who is well integrated into a supportive environment (for instance, a Catholic man with a wife and two kids) is less likely to kill himself than one who exists at the social outskirts (a single Protestant soldier).

But when social integration is so strong that a person feels custom and habit dictate they should no longer live, another type of suicide arises: a widow killing herself after her husband's death, or a soldier who has disgraced himself in military service.

While analysis of individual behavior across many volunteers will no doubt provide clues as to who is experiencing suicidal ideation and who isn’t, we wonder if there isn’t a more faithfully Durkheimian methodology lurking in here: analyzing how each person identifies him- or herself and how they fit among others in their literal online social networks.

There’s a little potential for “thought crime” here, too: if the system’s predictions aren’t accurate enough, false positives could flag users unfairly. But for now the program will be “non-interventional,” so it won't make official diagnoses or attempt to treat anyone’s mental state. Volunteers can opt in or out of which social networks are monitored, and the initial version of the application allows data to be seen only by the participant and the collection engine.

The project has already completed an experimental “phase one” that was able to predict suicidality with correlations of 65 percent by analyzing social media text. The ultimate goal of the project is to gain a better understanding of mental health risk factors and help doctors make better-informed, and better-timed, decisions to intervene.