According to the WSJ, the videos showed corpses being paraded through the streets, as well as ISIS fighters and women who call themselves "jihadist and proud." Some videos were set to catchy songs, and some used TikTok filters with stars and hearts. They were shared by nearly two dozen accounts.

The videos have been removed, and in a statement provided to Engadget, a TikTok spokesperson said:

"Content promoting terrorist organizations has absolutely no place on TikTok. We permanently ban any such accounts and associated devices as soon as identified, and we continuously develop ever-stronger controls to proactively detect suspicious activity. This is an industry-wide challenge complicated by bad actors who actively seek to circumvent protective measures, but we have a team dedicated to aggressively protecting against malicious behavior on TikTok."

Facebook, Twitter and YouTube have fought their own battles against terrorist content, and now that TikTok has become one of the most popular apps in the US, it will likely face similar challenges. The company has hired thousands of content moderators, and its policies prohibit terrorist and criminal organizations from using the app. As we've seen on other platforms, keeping terrorist propaganda at bay requires constant, evolving vigilance.