Facebook has long been reluctant to admit that it has become a news organization -- and not just a news organization, but perhaps the most important one.

It wasn't until the very end of 2016 -- a year in which the company was forced to address the promotion of fake news stories on its platform -- that founder and CEO Mark Zuckerberg even referred to Facebook as a "media company."

In 2017, Facebook's role as a disseminator of news will face even more scrutiny.

On Wednesday, four people in Chicago bound and gagged a man with special needs, then beat and tortured him -- and they broadcast the whole thing on Facebook Live, allowing people all over the world to watch in real time.

The Facebook Live feature, which the company has been promoting in television ads and on billboards as a way to share fun or uplifting videos with friends, had now been used to broadcast torture.

The video was eventually taken down. "We do not allow people to celebrate or glorify crimes on Facebook and have removed the original video for this reason," a spokesperson told CNNMoney.

Still, Facebook acknowledges "the unique challenges of live video." Because the broadcasts happen in real time, there is no practical way to screen them before they begin.

This presents Facebook with a dilemma: Will it simply show everything, or will it acknowledge the responsibilities that come with being a media company and hire editors who can supervise the content and decide in real time what is important and newsworthy and what must be taken down as inappropriate?

Facebook's standards do not ban torture or violence outright. "In many instances... when people share this type of content, they are doing so to condemn violence or raise awareness about it," the spokesperson said. "In that case, the video would be allowed."

One instance in which Facebook Live was used to raise awareness came last July, when a Minneapolis-area woman named Diamond Reynolds streamed a live video of her fiancé, Philando Castile, being shot by a police officer during a traffic stop. Facebook initially took down that post, but republished it not long after. Days later, citizens in Dallas used Facebook Live to show police being ambushed by a domestic terrorist.

Facebook's content policies have gone beyond ethical decisions about broadcasting and into the realm of investigating its users. In the wake of the 2015 San Bernardino terrorist attack, Facebook stepped up efforts to remove users who backed terrorist groups.

Last month, Facebook, Twitter, Microsoft and YouTube said they planned to create a shared database to help them track and remove "violent terrorist imagery or terrorist recruitment videos." The database will contain the digital "fingerprints" of the images and videos, the companies said, allowing them to identify potential terrorist content more efficiently.

Those actions raise questions: Did Facebook help authorities find the suspects in this case? Will it mine the data that it has on them? And what might it do with that data?

The Facebook spokesperson said the company does not discuss specific cases, and declined to answer specific questions regarding its use of the suspects' data.

Facebook, which is fast approaching 2 billion monthly active users, is a major source of news, information and now live video for people all across the world.

But at any minute, it can also become a broadcaster of violent crimes, as well as a key witness in the investigation into those crimes. It has yet to present a coherent explanation of just how it intends to handle that role.