SAN FRANCISCO — Facebook said on Thursday that future research on its 1.3 billion users would be subject to greater internal scrutiny from top managers, particularly if it focused on “deeply personal topics” or specific groups of people.

The company — which suffered a black eye this summer from a study in which it used its prominent news feed feature to manipulate the emotions of some users without telling them — also said it would train all of its engineers in research ethics.

But no outsiders will be invited to review Facebook’s research projects, and the company declined to disclose what guidelines it would use to decide whether research was appropriate. Nor did it indicate whether it would seek consent from users for projects like the emotion study, which set off a global furor when it was published in June.

In essence, Facebook’s message is the same as it has always been: Trust us, we promise to do better.