Telegram, the popular instant messaging app widely praised for its security, is the closest alternative to WhatsApp. Last week, it was removed from Apple's App Store. In this post, we'll walk through the dramatic events that caused the incident.
The reason Apple removed it from the store was a mystery until now. On 31st January, it was reported that both the original Telegram app and Telegram X (which is still in testing) had been removed from the iOS App Store. The next day, Telegram CEO Pavel Durov tweeted that the app was removed due to the presence of "inappropriate content." And that was it — nothing else was said about the incident.
Today, we have a clear picture of what he was referring to as "inappropriate content" — the content that got the secure messaging app kicked out of Apple's store.
Phil Schiller, Apple's marketing chief, responding to an email from a 9to5Mac reader, stated that the Apple team managing the App Store was alerted to "illegal content" — a report which pointed to child pornography being shared through Telegram.
According to Schiller, the team verified the existence of the illegal content and, following Apple's strict terms of service, took the app down from the store. They also alerted the developer behind the app and notified the proper authorities that handle such cases, including the National Center for Missing & Exploited Children (NCMEC).
After some time, the Telegram app was reinstated in the App Store, once the developers had put strict protections in place to prevent the illegal content from being spread. It now complies with Apple's guidelines, which require apps to include filters that stop such content from circulating within them.
The internet we use today has its rules and regulations, and sanctions like this are how those rules get enforced. The distribution of child pornography is among the most grievous offenses on the internet, and when the authorities cannot get hold of the individuals involved, they direct their punishment at the platforms where the distribution is taking place.
These days, social networks, messaging apps, and websites where users interact directly with each other include mechanisms to detect the distribution of this illegal content and remove it immediately. But Telegram is a different story altogether.
The secure messaging app offers a host of advanced security features that let users hold private, secret conversations that are end-to-end encrypted.
Telegram was one of the first messaging platforms to feature end-to-end encryption when it launched in 2013.
While Telegram has taken pride in creating a platform where users can hold secret conversations, being among the most secure has always been its primary flaw: the activities of users and groups within the app are highly private. Anything can be shared among users within a group, and nobody outside will know, because of the highly secured nature of the platform.
In June last year, Vox reported that Telegram had faced issues in the past with terrorism and terrorist-related content being distributed on the platform, and that it had been criticized heavily by governments of different countries for creating a communication ground for terrorist organizations like ISIS.
The Verge also reported last year that the Indonesian government nearly banned Telegram over terrorist-related content.