As CoinDesk reports, Telegram has quietly edited its FAQ to remove language saying it does not moderate private and group chats. A section titled “There is illegal content on Telegram. How can I remove it?” previously stated that content in chats and group chats stays between the participants only. Now, however, the section states that “all Telegram apps have ‘Report’ buttons” that give users a way to flag illegal content to the app’s moderators. Users simply need to tap the message on Android, or press and hold it on iOS, and choose the Report option. They can also note the link to the content they want to report and send it to the service’s takedown email address (abuse@telegram.org).

The change comes after Telegram chief Pavel Durov published on his channel his first public comments since his arrest. Durov was arrested at an airport in France in late August as part of authorities’ investigation into the lack of moderation on the app and its failure to curb criminal activity. He has since been released from custody, but was charged with “complicity in distributing child pornography, illegal drugs and hacking software” on the messaging app, as well as “refusal to cooperate with an investigation into illegal activity on Telegram.”

French authorities apparently told Durov he was arrested because they had not received any response from Telegram about their investigation. This was surprising, the app’s founder explained in his post, because Telegram has an official representative in the EU and an email address publicly available to anyone. He also said French authorities have multiple ways to contact him for assistance and that he had previously helped them set up a Telegram hotline to deal with terrorism threats in the country. In addition, he called the French authorities’ decision to “accuse the CEO for crimes committed by third parties on the platform” a “misguided approach.” He argued that no innovator would ever build new tools if they knew they could be held personally responsible for the potential misuse of those tools.

Durov also talked about how Telegram protects people’s basic rights, especially where they are violated. In Russia, for example, Telegram was banned after the service refused to hand over encryption keys that would allow authorities to spy on users. He said the service removes “millions of harmful posts and channels every day”, publishes transparency reports and maintains direct hotlines with NGOs for urgent moderation requests.

However, the CEO admitted that Telegram has room for improvement. The “sudden increase in the number of users” to 950 million caused “growing pains”, making it easier for criminals to abuse its platform. Telegram aims to “significantly improve things in this regard” and has already started the process internally.

Presumably, this change in its rules is part of the messaging service’s efforts to address authorities’ allegations that it has failed to prevent criminals from using its app. Notably, the service reported earlier this year that it had 41 million users in the European Union, but authorities believe it lied about its user numbers to stay below the 45 million threshold that would subject it to stricter obligations under the Digital Services Act (DSA).
