People’s Facebook posts might predict whether they are suffering from depression, researchers reported on Monday.

The researchers found that the words people used seemed to indicate whether they would later be diagnosed with depression.

The findings offer a way to flag people who may be in need of help, but they also raise important questions about people’s health privacy, the team reported in the Proceedings of the National Academy of Sciences.

People who were later clinically diagnosed with depression used more “I” language, according to Johannes Eichstaedt of the University of Pennsylvania and his colleagues. They also used more words reflecting loneliness, sadness and hostility.

“We observed that users who ultimately had a diagnosis of depression used more first-person singular pronouns, suggesting a preoccupation with the self,” they wrote. That is an indicator of depression in some people.

For their study, the team recruited 683 people who visited an emergency room and asked to see their Facebook pages. Most were not depressed, but 114 had a depression diagnosis in their medical records.

The team examined posts from as long as six months before the depression diagnosis to see whether each patient’s posts might contain hints about their mental health.

“Using only the language preceding their first documentation of a diagnosis of depression, we could identify depressed patients with fair accuracy,” they wrote.

Words such as “tears,” “cry,” “pain,” “miss,” “hate” and “ugh” were more common in the posts of people later diagnosed with depression, they reported.
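The study’s actual model was far more sophisticated than a word list, but the idea of screening posts for such marker words can be illustrated with a toy sketch. The word list below comes from the article; the scoring rule and threshold are illustrative assumptions, not the researchers’ method.

```python
import re

# Marker words the article reports as more common in posts of people
# later diagnosed with depression.
DEPRESSION_MARKERS = {"tears", "cry", "pain", "miss", "hate", "ugh"}

def marker_rate(post: str) -> float:
    """Return the fraction of a post's words that are marker words."""
    words = re.findall(r"[a-z']+", post.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in DEPRESSION_MARKERS)
    return hits / len(words)

def flag_post(post: str, threshold: float = 0.05) -> bool:
    """Flag a post whose marker-word rate exceeds an arbitrary threshold."""
    return marker_rate(post) > threshold
```

A real system would weigh many language features (pronoun use, emotional tone, and more) rather than a handful of words, but the sketch shows the basic shape of text-based screening: reduce a post to numeric features, then compare against a decision rule.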

It’s not the first study to look at whether social media posts might offer clues about mental health problems. Facebook launched its own program in 2015 that lets users flag posts they believe indicate suicidal thoughts.

Other teams have noted that patterns of use on platforms such as Twitter and Instagram may point to depression.

“Previous work observed that depressed users are more likely to tweet during night hours,” Eichstaedt and colleagues wrote.

They did not see that pattern on Facebook, Eichstaedt’s team said.

Another group of researchers reported in 2017 that Instagram users might signal depression with black-and-white or otherwise muted colors in their posts.

Depression and suicide are both on the rise in the U.S., and social media offers a way to spot people who may need help. But the researchers said privacy is a big issue.

“Developers and policymakers need to address the challenge that the application of an algorithm may change social media posts into protected health information, with the corresponding expectation of privacy and the right of patients to remain autonomous in their health care decisions,” they wrote.

“Similarly, those who interpret the data need to recognize that people may change what they write based on their perceptions of how that information might be observed and used.”