According to a new paper in the Proceedings of the National Academy of Sciences, Facebook altered the News Feeds of hundreds of thousands of users as part of a psychology experiment devised by the company's on-staff data scientists. By selectively filtering the emotional content of those feeds, the experiment sought to learn how positive and negative affect travels through social networks, ultimately concluding that "in-person interaction and nonverbal cues are not strictly necessary for emotional contagion."

"Each emotional post had between a 10 percent and 90 percent chance...of being omitted."

To test the hypothesis, the researchers identified 689,003 English-language Facebook users and began removing emotionally negative posts for one group and positive posts for another. According to the paper, "when a person loaded their News Feed, posts that contained emotional content of the relevant emotional valence, each emotional post had between a 10 percent and 90 percent chance (based on their User ID) of being omitted from their News Feed for that specific viewing." The posts could still be seen by visiting a friend's timeline directly or reloading the News Feed. The researchers also state that they did not alter any direct messages sent between users.
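The paper says each user's omission chance was fixed "based on their User ID" but re-applied at every feed load, which is why a skipped post could reappear on reload. The paper does not disclose how the ID mapped to a probability; the sketch below is purely hypothetical, using a hash of the ID to pin a stable per-user rate in the stated 10 to 90 percent band and a fresh random draw per viewing:

```python
import hashlib
import random

def omission_probability(user_id: int) -> float:
    """Hypothetical mapping from a User ID to a stable omission chance.
    The paper states only that the chance was between 10% and 90% and
    derived from the User ID; this hash-based scheme is an assumption."""
    digest = hashlib.sha256(str(user_id).encode()).digest()
    fraction = digest[0] / 255  # deterministic value in [0, 1]
    return 0.10 + 0.80 * fraction  # scale into the 10-90% band

def post_shown(user_id: int) -> bool:
    """One emotional post on one feed load: a fresh roll each viewing,
    so an omitted post can still appear on a later reload."""
    return random.random() >= omission_probability(user_id)
```

The key property this illustrates is that the filtering rate is deterministic per user (the same ID always hashes to the same probability) while the outcome for any individual post varies per viewing.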

As the researchers point out, this kind of data manipulation is covered by Facebook's Terms of Use. When users sign up for Facebook, they agree that their information may be used "for internal operations, including troubleshooting, data analysis, testing, research and service improvement." While the policy says nothing specific about altering products like the News Feed, it's unlikely Facebook stepped outside the bounds of its Terms of Use in conducting the experiment. Still, for users confused by the whims of the News Feed, the experiment stands as a reminder: there may be more than just metrics determining which posts make it onto your feed.