It should come as no surprise that Facebook collects personal data from its users: tons and tons of it. Almost a billion people’s phone numbers, pictures, and emails exist in its databases, along with those users’ interests, relationship statuses, and personal updates. The possibilities of what can be done with this massive amount of data are limitless. Facebook data has previously been used to estimate the number of visitors to Brazil for the World Cup, and the Wall Street Journal was even able to compile a list of the best cities for singles (if you’re single and ready to mingle, by the way, try Louisville, Kentucky, or El Paso, Texas; I don’t blame you if you’re curious). These uses seem relatively harmless, and since Facebook doesn’t simply hand personal data over to advertisers, most users feel secure with their online information. But what if Facebook itself were to stir up controversy by using that data in a way many would consider inappropriate? Well, that’s essentially what happened in a recent experiment, one in which, according to many critics, users were treated as “lab rats.”

Many Facebook users claim that viewing the exciting posts of their friends puts a damper on their own mood. If Carlos starts posting selfies from the World Cup or Alex updates his status about his skydiving trip while you’re sitting at home, you’d probably understand why. To test the validity of this phenomenon, Facebook researcher Adam Kramer designed an experiment to see how the emotional content of the news feed affected users’ own statuses. For an entire week, a computer program scanned the feeds of some 689,003 randomly selected users and omitted posts containing either positively or negatively connoted words, depending on the group each user was assigned to. The results contradicted the original claim: the group that saw less positive content actually began posting more negative statuses themselves, and vice versa. Although the experiment may have offered insight into the psychology of social media, its inflammatory nature was not lost on observers, and Kramer himself admitted, “In hindsight, the research benefits of the paper may not have justified all of this anxiety.” Indeed, in the months after the paper’s publication, many criticized Facebook for proceeding without consent and said they felt violated. Users reported feeling insecure, especially those in the negative-content group, since they had supposedly been manipulated into feeling sadder for a week. Various ethics groups, psychologists, and professors also expressed their disapproval. As one blogger asked, “Is it okay for Facebook to play mind games with us for science?”

Perhaps most surprisingly, all of the experiment’s procedures were allowed under Facebook’s user policy agreement. The resulting outrage, however, is anything but surprising, given that most people accept user agreements automatically without giving the fine print even the slightest glance. Furthermore, despite his doubts and official apology, Kramer defended the core study itself and rebutted claims that it was unethical. Pointing out that the study was anonymous and that the excluded content eventually reappeared in users’ feeds, he argued that the actual effects were minimal; the emotional measures in question shifted by less than 0.1%. Kramer’s response prompted thousands to post on his Facebook page, some voicing indifference and others criticism. One user wrote that “emotional manipulation is emotional manipulation, no matter how small.” Well, Mr. Kramer, if you really “care[d] about the emotional impact of Facebook and the people that use our product” as you claimed, perhaps there were better approaches. “Facebook didn’t do anything illegal, but they didn’t do right by their customers,” said tech analyst Brian Blau. “They keep on pushing the boundaries, and this is one of the reasons people are upset.” Why deliberately trigger negative emotions in hundreds of thousands of people when a poll or feedback question could have sufficed? Even if an experiment had to be conducted, enrolling that many people (a sample larger than the entire population of Boston) without their permission simply isn’t right. And even if the actual emotional changes were insignificant, the resulting uproar was definitely not.