Facebook has been forced to restate its live video rules after footage of the aftermath of a black man being shot and killed by a police officer during a routine traffic stop in the US was viewed by millions, before being removed and restored under mysterious circumstances.

The company insists it will only remove a video of someone's death if it has been "used to mock the victim or celebrate the shooting."

Philando Castile’s death at the hands of a police officer in Minnesota on July 7, shot while reaching for his driver's licence during a traffic stop, shocked the world after his girlfriend Lavish Reynolds had the presence of mind to film the immediate aftermath and upload it to Facebook. Reynolds said the pair had been stopped for "a busted tail light." The video was online for about 10 minutes before disappearing for around an hour, in what Facebook has insisted was "a technical glitch."

The video's temporary disappearance prompted complaints from activists associated with the Black Lives Matter movement, alleging that Facebook might be colluding in a cover-up with the authorities, which the Mark Zuckerberg-run company has denied.

Facebook has now moved to clarify its position, and seems to be saying that it would not attempt to hush highly charged incidents such as Castile’s death. It has acknowledged that live video is a "powerful tool in a crisis," and that "just as it gives us a window into the best moments in people’s lives, it can also let us bear witness to the worst." Facebook added:

One of the most sensitive situations involves people sharing violent or graphic images of events taking place in the real world. In those situations, context and degree are everything. For instance, if a person witnessed a shooting, and used Facebook Live to raise awareness or find the shooter, we would allow it. However, if someone shared the same video to mock the victim or celebrate the shooting, we would remove the video.

Facebook said its live video service has a team on call around the clock which will respond quickly to reports of inappropriate content or violations of community standards. Additionally, "anyone can report content to us if they think it goes against our standards, and it only takes one report for something to be reviewed."

Once the team reviews a report of graphic content, there are three possible outcomes: the content is not deemed graphic and is left up; the content is found to violate Facebook's standards and is removed; or the content is deemed graphic but not a violation, and receives a disclaimer which viewers must click through, reading: “Warning—Graphic Video. Videos that contain graphic content can shock, offend, or upset. Are you sure you want to see this?”