The popular messaging app Telegram has made a significant policy change that has sparked controversy and raised concerns about user privacy and free speech online.
In a recent announcement, Telegram CEO Pavel Durov stated that the platform will now hand over users’ IP addresses and phone numbers to authorities with valid legal requests, such as search warrants. This decision, according to Durov, is aimed at deterring criminal activity on the platform and protecting the interests of its nearly one billion users.
This change comes in the wake of Durov’s recent arrest in France, where he was charged with enabling criminal activity on the platform, including the alleged spread of child abuse images and drug trafficking. Durov has denied the charges and criticized authorities for holding him responsible for crimes committed by third parties on the platform.
Critics of Telegram have long raised concerns about the platform being a breeding ground for misinformation, child pornography, and terror-related content, partly because it allows groups of up to 200,000 members. In contrast, Meta-owned WhatsApp limits group sizes to 1,000 members.
The recent policy change has raised questions about whether Telegram will now cooperate with authorities in repressive regimes, potentially putting political dissidents at risk. Cybersecurity experts have also pointed out that Telegram’s moderation of extremist and illegal content is weaker than that of other social media platforms.
While Telegram has introduced a team of moderators and artificial intelligence to address problematic content, experts like Daphne Keller at Stanford University’s Center for Internet and Society question whether these measures will be sufficient to comply with French and European laws regarding illegal content.
As the debate over user privacy and content moderation continues, many are closely watching how Telegram will navigate these challenges and balance the demands of law enforcement with the protection of user rights.