We now have the answer to why the popular messaging app Telegram was pulled from the App Store last week. Telegram for iOS notably disappeared from the App Store for several hours without explanation before the service’s CEO attributed the removal to ‘inappropriate content’ appearing in the app.

According to an email shared by 9to5Mac reader Alijah that includes a response from Phil Schiller, who manages the App Store, Telegram was abruptly pulled when Apple learned that the app was serving child pornography to users.

9to5Mac has verified the authenticity of the email with Apple before publishing this story.

In the email, Schiller takes an admirable and firm position on never allowing such vile content as child pornography to be distributed through the App Store.

The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).

The response also explains what Telegram CEO Pavel Durov referenced when responding to a user last week who asked why the app was pulled:

We were alerted by Apple that inappropriate content was made available to our users and both apps were taken off the App Store. Once we have protections in place we expect the apps to be back on the App Store.

Similar to Apple’s iMessage, Telegram offers secure messaging that relies on end-to-end encryption to protect the privacy of messages sent between users. This suggests the illegal content was not simply media shared directly between users but more likely content served up through a third-party plug-in used by Telegram.

Within hours of Telegram being pulled, the secure messaging app returned to the App Store with fixes in place to prevent the illegal content from being served to users.

You can read the full email below.

The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children). The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content. Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store. We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk – child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral. I hope you appreciate the importance of our actions to not distribute apps on the App Store while they contain illegal content and to take swift action against anyone and any app involved in content that puts children at risk.

While it’s terribly unfortunate that such evil exists in the world and managed to find its way into an iOS app, it is reassuring to know that Apple will not hesitate to use its resources to stop illegal content from being distributed when possible.

