In 2014, the company was embroiled in a scandal after it emerged that Facebook researchers had used its news feed feature to try to manipulate the emotions of its users. The incident resulted in an overhaul of its user research methodology. Last month, Facebook grappled with accusations of political bias and fears about how much it could influence the views of its members.

“The company really has to walk a fine line here,” said Dr. Jennifer Stuber, an associate professor at the University of Washington and the faculty director of Forefront, a suicide prevention organization. “They don’t want to be perceived as ‘Big Brother-ish,’ because people are not expecting Facebook to be monitoring their posts.”

Facebook said it had a role to play in helping its users to help one another. About a third of the posts shared on the site express some form of negative emotion, according to a study released in February by the company’s researchers. Posts with negative associations tended to receive longer, more empathetic comments from Facebook friends, the company said.

“Given that Facebook is the place you’re connected to friends and family, it seemed like a natural fit,” said Dr. Jennifer Guadagno, a researcher at Facebook who is leading the suicide prevention project. Facebook has a team of more than a dozen engineers and researchers dedicated to the project.

Facebook’s new suicide prevention tools start with a drop-down menu that lets people report posts, a feature that was previously available only to some English-speaking users. People across the world can now flag a message as one that could raise concern about suicide or self-harm; those posts will then come to the attention of Facebook’s global community operations team, a group of hundreds of people around the world who monitor flagged posts 24 hours a day, seven days a week.

Posts flagged as potential expressions of self-harm are expedited for faster review by the team members, who also examine posts that Facebook users have reported as objectionable. Community operations team members who evaluate potentially suicidal content receive special training, Facebook said.