One variation had been circulating since the time of the attack, and all of the videos Feinberg found were hosted on Arabic-language pages.

Facebook had removed one of the videos as of this writing, and reiterated its plans to improve its filtering technology. The company is using audio recognition to spot clips that might otherwise evade its filters, and it's researching techniques that could identify edited versions of known clips.

At the moment, though, the findings illustrate the difficulty of completely removing terrorist material. It's hard to account for every possible variation of a video, especially when posters are deliberately working to evade filters. That, in turn, raises questions about laws that would punish companies for failing to remove extremist material. Would Facebook be held responsible if authorities found videos that slipped through the cracks? While it's doubtful internet giants would face significant punishment for honest mistakes, such laws may set a bar that current technology can't clear.