Telegram U-turns and joins child safety scheme
After years of ignoring pleas to sign up to child protection schemes, the controversial messaging app Telegram has agreed to work with an internationally recognised body to stop the spread of child sexual abuse material (CSAM).
The Internet Watch Foundation (IWF) is used by major online services to help them detect and remove CSAM, and prevent its spread.
Telegram had repeatedly refused to engage with it or any similar scheme.
But, four months after its founder Pavel Durov was arrested in Paris for Telegram’s alleged failure to moderate extreme content, the platform has announced a U-turn.
The IWF has described Telegram’s decision as “transformational” but warned it was the first step in a “much longer journey” for the app.
“By joining the IWF, Telegram can begin deploying our world-leading tools to help make sure this material cannot be shared on the service,” said Derek Ray-Hill, Interim CEO at the IWF.
Telegram is used by around 950 million people worldwide and has previously positioned itself as an app focused on its users’ privacy rather than on the policy norms prioritised by other global social media companies.
But reporting from the BBC and other news organisations highlighted criminals using the app to advertise drugs, offer cybercrime and fraud services and, most recently, share CSAM.
It led one expert to brand it “the dark web in your pocket.”
In August, its billionaire owner was detained at an airport north of Paris.
Mr Durov is accused of failing to co-operate with law enforcement over drug trafficking, child sexual abuse content and fraud.
French judges have barred the 40-year-old from leaving France pending further investigations.
The company maintains that his arrest is unfair, and that he should not be held liable for what users do on the platform.
Nonetheless, Telegram has since announced a series of changes to the way it operates, including:
- Handing over the IP addresses and phone numbers of users who violate its rules to police in response to valid legal requests
- Disabling features such as “people nearby”, which it admitted had issues with bots and scammers
- Publishing regular transparency reports about how much content it removes – a standard industry practice it had previously declined to adopt
Mr Durov has also vowed to “turn moderation on Telegram from an area of criticism into one of praise”.
The partnership with the IWF appears to be the latest step in that process.
The IWF is one of only a few organisations in the world legally permitted to search for child sexual abuse content in order to have it removed.
Its continually updated list of known abuse imagery is used by websites to detect and block matching content and stop it spreading.
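As a rough illustration of how a service might apply such a list, the Python sketch below checks an uploaded file against a set of known digests. The digests, file contents and function names are hypothetical, and real deployments typically rely on purpose-built perceptual hashing tools supplied alongside the IWF list rather than plain cryptographic hashes.

```python
import hashlib

# Hypothetical placeholder entries standing in for a vetted hash list of
# known abuse imagery supplied by a body such as the IWF.
KNOWN_ABUSE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def should_block(upload: bytes) -> bool:
    """Return True if the uploaded file exactly matches a known-abuse digest."""
    digest = hashlib.sha256(upload).hexdigest()
    return digest in KNOWN_ABUSE_HASHES

if __name__ == "__main__":
    sample = b"example upload"
    print("blocked" if should_block(sample) else "allowed")
```

A check like this only catches exact copies of already-known files; spotting altered or re-encoded images requires more robust image fingerprinting, which is why industry tools are built around perceptual hashes rather than simple digests.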
Telegram says that, before becoming a member of the IWF, it was already removing hundreds of thousands of pieces of abuse material each month using its own systems, and that IWF membership will strengthen those mechanisms.
The app is marketed as a fully end-to-end encrypted messaging service, like WhatsApp and Signal – meaning only the sender and recipient of a message can read it.
But in fact the majority of communication on Telegram uses standard client-server encryption by default, with end-to-end encryption limited to opt-in “secret chats” – raising questions about how secure it is from hacking and interception.
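To make the distinction concrete, the sketch below uses the PyNaCl library to show what end-to-end encryption means in practice: only the two key holders can recover the message, and a relaying server sees nothing but ciphertext. It is a generic illustration, not a description of Telegram’s own MTProto protocol.

```python
from nacl.public import PrivateKey, Box

# Each party generates a key pair; the private keys never leave their devices.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# A relaying server only ever handles the ciphertext and holds no key to read it.
# Bob decrypts with his private key and Alice's public key.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```

With client-server encryption, by contrast, the operator’s servers can decrypt messages, which is what allows content to be read, moderated or handed over in a way a fully end-to-end encrypted service cannot.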
Mr Durov, who was born in Russia and now lives in Dubai, has citizenship in Russia, France, the United Arab Emirates and the Caribbean island nation of St Kitts and Nevis.
Telegram is particularly popular in Russia, Ukraine and other former Soviet states, as well as Iran.