Canada’s privacy commissioner plans to probe Facebook over an experiment in which nearly 700,000 users’ news feeds were tampered with in order to manipulate users’ emotions.

The experiment “raises some questions that we are following up on,” the office of newly appointed commissioner Daniel Therrien told the Globe and Mail and CBC.

“We will be contacting Facebook to seek further details related to this research and have been in touch with some of our international counterparts about the matter.”

That follows a similar move by the U.K.’s Information Commissioner’s Office, which is reportedly investigating whether the experiment violated any laws and plans to contact Irish authorities, since Facebook’s European operations are based in Ireland.

A recent academic article revealed that researchers affiliated with Facebook, Cornell University and the University of California-San Francisco manipulated the news feeds of 689,003 Facebook users, alternately removing positive or negative stories.

The idea was to see if “emotional contagion” happens online — that is, if moods can be transferred from person to person on Facebook, as they are in real life.

The researchers found that, yes, moods are contagious on Facebook. Those shown more positive stories were more likely to post positive stories of their own, and those shown more negative stories were more negative themselves.

Many academics have questioned the ethics behind the experiment.

“Probably nobody was driven to suicide,” tweeted Christian Sandvig, an associate professor of communications at the University of Michigan, adding a #jokingnotjoking hashtag.

“There is a big difference between our expectations for academic social science and our expectations for Facebook. And that difference is reasonable,” he told HuffPost UK.

“We are right to expect that psychologists are not secretly experimenting on us via our Facebook feed, but I certainly expect Facebook to experiment on its users for its own gain.”

Some observers have even suggested the experiment could have implications for democracy; mass online emotional manipulation, for instance, could influence whether people turn out to vote.

Facebook COO Sheryl Sandberg addressed the controversy on Wednesday, saying the company was sorry for poorly communicating the experiment. But she stopped short of apologizing for the experiment itself.

“This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated,” she said during an appearance in New Delhi. “And for that communication we apologize. We never meant to upset you.”

A Facebook spokesperson told HuffPost earlier that “none of the data used was associated with a specific person's Facebook account.”

The spokesperson said the research is part of efforts to make Facebook “as relevant and engaging as possible. … A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process.”
