Despite the public outcry, Facebook’s emotions test was essentially the same kind of algorithmic manipulation that already happens on the social network, and it is unlikely to damage the company. Officials in the U.K. are investigating the company’s study, however, and the backlash highlights concerns about the future of social media and advertising.

Facebook conducted a study on the news feeds of approximately 700,000 people during one week in 2012, in which the company’s data scientists used an algorithm to lower the number of positive or negative posts in a feed and then determine whether the change encouraged those users to make more positive or negative posts themselves.
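
The published paper describes the filtering only in broad strokes, but a toy sketch can illustrate the mechanism: classify each post with simple emotion word lists, then randomly withhold a fraction of the posts carrying the targeted emotion. Everything below, from the word lists to the suppression rate, is invented for illustration and is not Facebook’s actual code; the real study reportedly relied on the LIWC word-count software to classify posts.

```python
import random

# Invented stand-ins for the emotion word lists used in the study.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def classify(post: str) -> str:
    """Label a post positive, negative, or neutral by simple word matching."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(posts: list[str], suppress: str, rate: float = 0.5) -> list[str]:
    """Randomly withhold a fraction of posts carrying the targeted emotion,
    leaving all other posts in the feed untouched."""
    return [p for p in posts if classify(p) != suppress or random.random() > rate]

feed = ["I love this wonderful day", "traffic was awful again", "meeting moved to noon"]
print(filter_feed(feed, suppress="negative"))
```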

The findings of the study, conducted with Cornell University and the University of California, San Francisco, were published in the March issue of the Proceedings of the National Academy of Sciences. Among them: seeing happy posts online encouraged people to post more often.

Regulators are sniffing for foul play, however. The U.K. Information Commissioner’s Office will examine whether Facebook broke any privacy laws by carrying out the experiment without consent, and the company could face fines of up to $839,500 if misconduct occurred, Reuters reports.


The Federal Trade Commission in the U.S. has also held Facebook accountable in the past for failing to gain users’ consent to changes on the social network. The company told users they could keep their information on Facebook private while repeatedly allowing that information to be shared and made public, according to a 2011 settlement with the FTC.

People upset with the study do not like the thought of being lab rats in an experiment, and not without reason. The American Psychological Association’s code of conduct requires informed consent from participants before a psychological experiment is carried out. Concerns raised about the study include whether it was safe to subject some users to more negative posts, given the risk that doing so might reopen emotional wounds or trigger chronic depression.

The line between acceptable and objectionable manipulation is blurred by the now-commonplace online advertising that collects information on people and tracks their patterns to predict what they want, when they want it. Predictive advertising algorithms will become even more common as the Internet grows, says Scott Strawn, a program director analyzing the strategy of tech companies at International Data Corporation, a market research firm.

“Facebook manipulates news feeds all the time for various purposes,” Strawn says. “Advertising is the business of manipulation. We have decided that’s OK but to do it for science is wrong.”

Stories about privacy fears surrounding Facebook and its impact on users’ lives are nothing new, and the uproar about this latest study will likely be forgotten as people begin their summer vacations, Strawn predicts.

To encourage people to comment more on the social network, Facebook in April tinkered with boosting posts on news feeds that feature the word “congratulations” because they indicate a life event, according to a story in Wired.
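
Wired’s account gives few details about how that boost worked, but a keyword-triggered ranking tweak could look something like the following sketch; the scoring function and boost factor here are hypothetical, not Facebook’s actual ranking logic.

```python
def rank_score(post_text: str, base_score: float) -> float:
    """Hypothetical news feed scoring tweak: posts mentioning
    'congratulations' hint at a life event, so give them a lift."""
    boost = 1.5 if "congratulations" in post_text.lower() else 1.0  # invented factor
    return base_score * boost

print(rank_score("Congratulations on the new job!", base_score=10.0))  # 15.0
```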

Internet and social media use has boomed since Facebook overtook Myspace as the most widely used social network in 2008. Numerous articles since then have examined how that has changed society, including by making users lonelier through a false sense of connectivity or a greater need to validate social worth with online attention.

The media coverage of the news feed study about emotions has turned into Facebook-bashing, but what is needed instead is a dialogue about how to “create meaningful ethical oversight in research and practice,” wrote Danah Boyd, a fellow at the Berkman Center for Internet & Society at Harvard University, on Medium.

“We need to hold companies accountable for how they manipulate people across the board, regardless of whether or not it’s couched as research,” Boyd wrote. “If we focus too much on this study, we’ll lose track of the broader issues at stake.”