Telegram: a criminal service?
- 28 August 2024
On August 24, Telegram founder Pavel Durov was arrested in France in connection with an ongoing investigation into child sexual abuse material, drug trafficking, fraud and ransomware. We are getting a lot of questions on the subject and are listing a few of our concerns.
Durov's arrest comes at a particularly volatile time for online platforms and their users. The Digital Services Act has just entered into force, requiring companies to, among other things, do more to combat illegal material, anticipate and mitigate potential harms, and be more accessible and responsive to users and authorities. Even more critical are the ongoing, strained negotiations on the highly contested European proposal for a regulation aimed at preventing and combating child sexual abuse online, put forward by the European Commission in May 2022, in which the right to encrypted, and therefore private, communications is at stake.
Bits of Freedom worries that European member states will use this case to invigorate their crusade against encryption, misleadingly framing encryption as the problem. It will therefore be paramount to keep two issues separate: Telegram hosts illegal content and activity and needs to do more about that, and limiting the availability of end-to-end encryption is neither necessary nor the solution.
Liability
In France, web hosts and services like Telegram and Instagram are protected from liability for the information users send or store. Which is just as well, because if they were liable, they would "need" to make sure you're not getting them into trouble. They would monitor and scan every message or file you send or store for copyright infringement and a whole slew of other offenses. This not only means that none of your communications would be private, but it would undoubtedly lead to a great deal of censorship.
However, services lose this immunity once they have actual knowledge of illegal content. So when they are notified of illegal content by the police or a "trusted flagger" such as a child sexual exploitation hotline, they generally take it down pretty quickly. Some parties do not. In the world of hosting providers, these are known as "bad hosters": they know they host illegal content but structurally refuse to comply with removal requests. Or worse: they help content providers evade law enforcement by moving their content around.
In a way, Telegram resembles a bad hoster. Just this past week, the independent Dutch child sexual abuse hotline Offlimits raised the alarm about Telegram, calling on numerous authorities to take action against the company. Such a cry for help is quite uncommon in the Netherlands, a country whose internet sector has a long history of close and considerate cooperation on notice and takedown. And it isn't only hotlines and police that Telegram responds poorly to: it also structurally ignores civil society organizations reaching out about content on Telegram that endangers activists and other vulnerable groups.
"Telegram knowingly hosts illegal content, much of it out in the open, and needs to start complying with takedown requests. Limiting the availability of end-to-end encryption is neither necessary nor the solution."
The accusations
France has leveled a host of accusations against Telegram, some more worrying than others. In addition to accusations of complicity in criminal activity, probably related to its failure to respond to takedown requests, the French also seem to be accusing Telegram of essentially being an online platform for organized crime. At first glance, and with what we know now, that is a worrisome statement. Over the years, Europe has seen a slew of encrypted app services taken down by the police, such as EncroChat, Sky and Exclu. As a rule, those apps' user numbers are relatively low, they are marketed at criminal organizations, and they almost exclusively host criminal activity. Telegram is quite different. It has nearly a billion users from all over the world, and a highly diverse user base at that: governments, activists, families, organizations and yes, even criminals use the app. But that does not make the app in itself a criminal service.
In addition, France seems to be accusing Telegram of offering a "cryptology" service and tool in France without the required "prior declaration". France has rules around offering and importing products that include encryption, requiring, in some cases, that such a service be reported to the authorities. It is not yet clear whether the French accusation refers to, for example, an integrated crypto payment system, or to the end-to-end encryption Telegram offers for (some!) messages.
Speaking of encryption: it's good to keep in mind that Telegram misleadingly presents itself as a secure app. Yes, technically you can send encrypted messages, but the app doesn't make it easy. End-to-end encryption is only available for one-on-one chats, and only when you turn it on yourself. It also requires both parties to the conversation to be online at the same time. Groups and channels, where much of the illegal content on Telegram is shared, are never end-to-end encrypted. Telegram, and anyone else who gains access to its infrastructure, can read those messages in plaintext.
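To make that distinction concrete, here is a minimal sketch of what end-to-end encryption means, using the PyNaCl library. This is emphatically not Telegram's actual MTProto protocol; the names and structure are illustrative assumptions. The point is the trust boundary: the keys live only on the users' devices, so the relaying server sees nothing but ciphertext.

```python
# A minimal illustration of end-to-end encryption using PyNaCl
# (pip install pynacl). NOT Telegram's MTProto protocol; names
# and structure are hypothetical, for illustration only.
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device; the private
# keys never leave the device, so the server cannot decrypt anything.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Only the public keys are shared (for example, via the server).
alice_public = alice_private.public_key
bob_public = bob_private.public_key

# Alice encrypts for Bob: only Bob's private key can decrypt.
sending_box = Box(alice_private, bob_public)
ciphertext = sending_box.encrypt(b"meet at noon")

# The server relays `ciphertext` but sees only opaque bytes.
# In a non-end-to-end setup, such as Telegram's default cloud chats
# and all groups and channels, the operator holds the keys instead
# and can read the plaintext.

# Bob decrypts on his own device.
receiving_box = Box(bob_private, alice_public)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"meet at noon"
```

Whether or not a service offers this is an architectural choice: with end-to-end encryption the operator is technically unable to read messages, while with server-side encryption it merely promises not to.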
There is still much we don't know about the allegations. It is beyond doubt, however, that Telegram needs to get its act together. First and foremost, that means complying with takedown requests. If it continues to ignore them, it not only puts people at risk, but also gives governments more incentive to impose obligations on platforms that would violate the privacy and freedom of speech of all users.
Again: it would be misleading to frame encryption as the problem here. Telegram knowingly hosts illegal content, much of it out in the open, and needs to start complying with takedown requests. Limiting the availability of end-to-end encryption is neither necessary nor the solution.