"This was a first-person shooter video, one where we have someone using a GoPro helmet with a camera focused from their perspective of shooting," Neil Potts, Facebook's public policy director, told British lawmakers. As Bloomberg reports, this violence was so unprecedented that the AI didn't know what to look for.

The company has since come under fire for failing to remove the videos quickly enough, and the EU is considering legislation that could fine social media platforms that fail to remove terrorist content within one hour of notification. This admission underscores that platforms will need to bolster their AI detection systems to meet such deadlines.