Child pornography is still being openly and extensively shared via the messenger app WhatsApp, despite the company banning thousands of accounts on a daily basis, researchers have claimed.

The encrypted messaging smartphone app, bought by Facebook in 2014, has been criticised after two charities in Israel were easily able to find large groups of people sharing videos and images of child sexual abuse.

The two online safety organisations, Netivei Rishet and Screensaverz, tracked the groups for months after a whistle-blower alerted them to the issue in August.

The groups, which contained up to 256 members, made clear what they were sharing by naming themselves with abbreviations such as “cp” or using explicit images as their profile pictures.

Despite the groups’ purpose being plainly visible, they went undetected by the systems WhatsApp uses to identify illegal content.

The two charities warned WhatsApp about the groups last month, but they were still active this week, according to a report in the Financial Times.

One group, called “Kids boy gay”, had members from India, Pakistan and the US sharing and requesting illegal images of children. The group has now been taken down and its participants banned after WhatsApp was contacted about it this week.

WhatsApp said it periodically monitors group names and profile photos to curb the sharing of child pornography. This has led the app to ban 130,000 accounts in the last 10 days.