Facebook often flaunts the vast stores of data it has on its users — last week's announcement of Graph Search is only the latest example — but some of the information posted to the site is undoubtedly darker than relationship statuses and tastes in music. The social network offers a record of its users' thoughts, and suicide researchers believe that analyzing the Facebook interactions of people who cut their own lives short could help them better identify warning signs. Facebook is now giving some of that data to independent researchers through a partnership with SAVE, a suicide prevention non-profit.

Researchers no longer have to rely on memories from friends and family, but is Facebook going too far?

To start, researchers will study data from "at least 20" people from an unspecified Minnesota county who have committed suicide. While organizations like SAVE have long focused on identifying behavior that warns that a person is considering suicide, SAVE executive director Dan Reidenberg told Bloomberg in an interview that the new research could surface new trends by looking at factors like the length of time between posts or the sort of language used in updates. In the past, researchers have had to rely on accounts from family members and friends to identify warning signs.

This isn't the first time Facebook has worked on this issue. Back in 2010, the social network partnered with the National Suicide Prevention Lifeline to let friends flag posts that were suicidal in nature; a flag would immediately send an email to the user who posted the comment, urging them to call the hotline. Facebook's work with SAVE is different, however, and it raises some (familiar) concerns that the social network isn't respecting its users' privacy. It's not clear whether Facebook intends to release more data to SAVE in the future, or whether the company worked with the victims' families to obtain permission. The partnership comes a couple of weeks after Reddit co-founder Aaron Swartz died at age 26.