Facebook has tried to apologize for manipulating around 700,000 users' News Feeds as part of a social experiment in 2012. But not only is one lawmaker still angry, he's hoping Facebook will have to answer to the FTC. Today, Senator Mark Warner (D-VA) sent an open letter to the FTC commissioners, requesting that they "fully explore the potential ramifications" of Facebook's experiment.

The Facebook study, which was published with the help of Cornell University researchers in the journal PNAS, was meant to study the phenomenon of "social contagion," or the spread of emotions between people. Over the course of a week, 689,000 users saw a slightly different version of their normal feeds, with either some positive or some negative stories (identified by a language analysis tool) removed. The resulting analysis found that these users correspondingly posted — again, very slightly — more positive or more negative updates during the period. But while academic research involving human subjects must be approved by a university ethics board, Facebook was under no such restriction. As a result, it was unclear whether the experiment met accepted ethical standards, especially because a rigorous informed consent process was replaced by a single line in a much longer list of terms and conditions. In addition, the clause that specifically allowed Facebook to use information for research and testing was only added after the study took place.

"The very fact that important questions remain unanswered highlights the lack of transparency."

Facebook and others have defended the study, saying that it was reviewed internally and used minimal changes to find real results. Co-author Adam Kramer noted that his study addressed common questions about Facebook and other social networks, including whether seeing someone else's happy posts made users depressed about their own lives (a theory that he says was contradicted by the results). Cornell University said in a statement that it had passed on reviewing the study's ethical implications, since Facebook had been wholly responsible for collecting the data. And since Facebook commonly tries out new features on small groups of users, it's likely that many people have seen larger changes to their feeds over the course of standard testing.

In his letter, Warner wondered whether Facebook had adequately informed users, and whether it had assessed the potential risks and benefits of the study before going ahead with it. Though he acknowledged that the effects were barely noticeable for individual users, he expressed concern that they could grow if the practice became common — the paper itself argued that a large sample size could turn modest results into real phenomena. "The very fact that important questions remain unanswered highlights the lack of transparency around these business practices," he wrote. "While Facebook may not have been legally required to conduct an independent ethical review of this behavioral research, the experiment invites questions about whether procedures should be in place to govern this type of research."

Warner doesn't necessarily believe that the FTC should regulate the issue, but he's asked the agency to assess whether it has a role in increasing transparency and accountability in big data research, as well as whether it can help nudge companies toward creating their own best practices and internal regulations. He also wants an answer on whether simply observing data should be treated differently from actively manipulating it, as Facebook did. While Facebook has access to an especially deep well of user information, Warner wants the FTC's answers to apply across the industry. "Big data has the potential to help power economic activity and growth while serving consumers in meaningful ways," he says. "Companies like Facebook may have to perform research on a broad scale in order to improve their products. However, because of the constantly evolving nature of social media, big data, and the internet, many of these issues currently fall into uncharted territory."