Last week, Telegram, one of the more popular messaging apps out there, was suddenly and unceremoniously pulled from the App Store.
Telegram and its slimmed-down counterpart, Telegram X, were removed due to “inappropriate content,” but beyond that general explanation, the specific reason, or reasons, for the removal weren’t known. There was plenty of speculation, of course, but nothing concrete from the developer or from Apple.
Now, according to 9to5Mac, citing an email conversation that includes Apple’s own Phil Schiller, the reason Telegram was pulled from the App Store so abruptly was indeed inappropriate content: specifically, the sharing of child pornography. According to the email, the App Store team was alerted to the illegal content within the app and quickly pulled it, because Apple won’t allow that type of content to be distributed through the App Store:
“The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).”
Last week, Telegram’s CEO confirmed that inappropriate content had been distributed within the app, and said that once protections were in place the app would more than likely return to the App Store. Sure enough, Telegram was back in the digital storefront soon after its removal, and it seems those protections were indeed put in place.
We have included the full email just below.
The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).
The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content. Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store.
We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk – child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral.
I hope you appreciate the importance of our actions to not distribute apps on the App Store while they contain illegal content and to take swift action against anyone and any app involved in content that puts children at risk.
The App Store has had plenty of moments where apps have been pulled for seemingly no reason, but in this case the reason was a solid one. It’s also good that Telegram was able to react quickly, putting in place the measures it needed to prevent this type of situation from happening again.
[via 9to5Mac]