"Online platforms are becoming people's main gateway to information, so they have a responsibility to provide a secure environment for their users," Andrus Ansip, VP for the Digital Single Market, said in a statement. "What is illegal offline is also illegal online. While several platforms have been removing more illegal content than ever before -- showing that self-regulation can work -- we still need to react faster against terrorist propaganda and other illegal content which is a serious threat to our citizens' security, safety and fundamental rights."

In the newly published guidelines, the Commission requests that terrorist content be reviewed and removed within one hour of being reported, because such material "is the most harmful in the first hours of its appearance online." The Commission also asks social networks to implement better automated detection so that platforms don't have to rely as heavily on user reports. Additionally, for terrorism and other illegal content -- including incitement of hatred and violence, child sexual abuse material, counterfeit products and copyright infringement -- the Commission requests that more efficient detection tools be developed, that those tools be shared throughout the industry and that all platforms work more closely with law enforcement.

The Commission says that it will continue to review social networks' performance against these new guidelines and will later determine whether additional steps, such as legislation, are needed.