Telegram’s Bans on Extremist Channels Aren’t Really Bans

In the wake of the October 7 Hamas attacks on Israel, people concerned about online extremism turned their attention to the encrypted messaging app Telegram, where a Hamas-aligned group posted graphic images of the group’s attacks to a channel that now has 1.9 million followers. That content was then shared widely across social media. Following public pressure on Apple and Google several weeks into the Israel-Hamas war, Telegram “restricted” two of the major channels used by Hamas. But it did not, as it may appear to some users, ban them.

A WIRED investigation reveals that rather than ban or delete Hamas channels or those run by right-wing extremist groups, Telegram hides them from users who downloaded the app through the two major app stores; the channels themselves remain live. Some of the content from restricted channels is being shared widely in unrestricted ones, despite Telegram's mechanisms for stopping such sharing. The findings show that while Telegram makes some of its most violative communities difficult to find, the people in restricted channels are still able to spread their messages, experts say, and the channels continue to function as spaces of radicalization.

WIRED and Jeff Allen, cofounder of the tech policy think tank the Integrity Institute, analyzed over 100 restricted channels and thousands of posts over more than two months. Most of these channels include content related to right-wing extremism and other forms of radicalized hate. WIRED’s analysis found that most of these channels remained active even when restricted.

“What they’re doing is making [channels] not show up in your search and discoverability, but they’re keeping them on the back end,” says Nicole Stewart, assistant professor of digital media at Texas State University.

Founded by Pavel Durov in 2013, Telegram has long been a favorite platform for extremists. Its channels have no limit on the number of subscribers who can join (groups have a 200,000-person limit), and its approach to content moderation has meant that groups and content that might violate the terms of service on platforms like Facebook, Instagram, and X (formerly Twitter) remain accessible.

Most of the channels remained restricted throughout the months WIRED monitored them. In examining the Hamas channels restricted after the October 7 attacks, WIRED found that although content views dropped significantly, by about 70 percent, the channels remained active, with the Hamas military-affiliated channel @qassambrigades continuing to share more than 20 pieces of content per day. WIRED also found that content from @qassambrigades was copied and pasted into the unrestricted channel @hpress, a popular news-focused channel with over half a million subscribers.

“It’s not surprising that in a restricted environment that you’ll start to see that same network-sharing pattern,” says Stewart. “It’s what they do to share their own ideological beliefs and spread the message.”

Experts who study extremist groups on Telegram tell WIRED that the platform typically moderates these groups’ content in response to pressure from governments, law enforcement, or entities like Google and Apple that have the ability to restrict the app’s availability.

“What we seem to know from experience is that when Google and Apple say, ‘Hey, remove this,’ that’s when Telegram comes in and starts removing things,” claims Aleksandra Urman, a researcher at the University of Zurich who has studied Telegram and the far right in Europe. That also speaks to the “power of Google and Apple,” Urman says, because both companies can exert influence over Telegram’s moderation decisions.

While Telegram is available for download on Google Play and the Apple App Store, both Google and Apple disallow apps that promote hate speech, terrorism, and violent content. For example, Google’s terms of service say it doesn’t “allow apps with content related to planning, preparing, or glorifying violence against civilians” or “apps that promote violence, or incite hatred against individuals.” Apps that violate these standards can be removed from Google Play or the App Store, significantly limiting their availability. That both app stores offer Telegram for download suggests the app is complying with the companies’ terms of service.

Apple did not respond to a request for comment. Google spokesperson Danielle Cohen tells WIRED, “Play apps that contain or feature UGC [user-generated content], including apps which are specialized browsers or clients to direct users to a UGC platform, must implement robust, effective, and ongoing UGC moderation. This includes providing an in-app system for reporting objectionable UGC and users and taking action against that UGC and/or user where appropriate.”

Telegram did not respond to WIRED’s request for comment.

In 2021, Telegram released a version of the app that could be downloaded directly onto Android, which the company billed as having “fewer restrictions.” Downloading the app directly puts it outside the scope of Google Play’s terms of service. Users who have this “sideloaded” version of Telegram can continue to see and post in restricted channels that regular users can’t access.

Cohen, the Google spokesperson, tells WIRED that “Google does not control app distribution on Android,” noting that Android’s open operating system means apps can be downloaded from other Android app stores or directly to users via an app’s website.

Telegram’s terms of service state that users may not “promote violence on publicly viewable Telegram channels, bots, etc.,” but they say nothing about how the platform responds to users who violate these rules. Telegram did not respond to questions about how it assesses whether a channel should be restricted or banned entirely, as it has done with Islamic State-affiliated channels in the past.

In a post on his public Telegram channel on October 13, Durov addressed the pressure to remove Hamas channels and content, saying, “Every day, Telegram’s moderators and AI tools remove millions of obviously harmful content from our public platform.” He also asserted that because Telegram doesn’t algorithmically amplify content, “it’s unlikely that Telegram channels can be used to significantly amplify propaganda.”

While Telegram may not be algorithmically promoting content, it is used to spread hateful ideologies, experts say. “Hate groups and designated terrorist groups who are using the platform as a way to mobilize the troops, basically, they’re functioning as influencers,” says Stewart of Texas State University. Part of that work is sharing and promoting the posts and ideas of other aligned groups and influencers, pushing users deeper into the extremist ecosystem.

When a Telegram user from a restricted channel forwards content to a nonrestricted channel, that content won’t be visible to anyone in the channel who downloaded the app from Apple’s App Store or Google Play. But a simple copy and paste easily circumvents this feature for text-based messages. WIRED and Allen found over 400 instances in which text-based content was copied word for word into both restricted and nonrestricted channels, with users often tagging the restricted channels in the unrestricted ones. The analysis did not include images and videos, which also make up a considerable share of the content.

For instance, messages sent to an unrestricted channel called OfficialTheCollective, with more than 4,000 members, included tags to restricted channels like SpecialQForces, a QAnon conspiracy channel, and often encouraged users to join.

One message posted to OfficialTheCollective even instructs users how to get around Google and Apple’s restrictions, saying, “Google and Apple [are] censoring the Telegram app downloaded via Google Play and Apple App Store. Use the web browser.” The message includes instructions about how to sideload Telegram, including a link to an instructional video on the video platform Rumble, which is popular with right-wing influencers.

In the unrestricted ExpatsPortugalEngChannel, WIRED found a message referencing a riot by Eritrean migrants in Germany. “Where are all the liberals who wanted these invaders in their country with open arms?” the post reads. “WE need you to go talk to them nicely, go first before the NAZIS take charge since YOU going to need them now.” The text is identical to a post shared in the restricted channel @InTheEndGodAlwaysWins.

In some cases, Telegram restricts channels based on local laws, as in Germany, which has strict rules against hate speech and neo-Nazism. Heidi Schulze, a researcher at Ludwig-Maximilians-Universität München, says that she was able to access channels restricted in Germany by using VPNs and Dutch SIM cards, which tricked Telegram into treating her as though she weren’t in Germany at all. Channel restrictions “from Telegram’s perspective might be a smart thing because they’re adhering to the local laws, but they’re not deleting channels,” she says, since restricting rather than deleting allows the platform to retain its claim of protecting freedom of speech.

But Urman, of the University of Zurich, says that restricting the channels can also create a space for more radicalization. “People who are likely to follow these groups even after they are restricted, who really want to see that content and are going to seek out the technical possibilities to do that, are more likely to be more radical,” she says.

Having already been “deplatformed” in some capacity can also mean that content is likely to be more extreme because channel operators are no longer factoring in a fear of content moderation, according to Urman. “These groups, even when they are restricted, are probably conducive to radicalization,” she says. “You’re not going to plan, in most cases, a certain attack out in the open. But now you’re not in the open. You are among your club.”

Rita Katz, executive director and founder of SITE Intelligence Group, a consultancy that monitors terrorist groups online, believes the company’s decision whether to ban a channel likely hinges on laws and enforcement. “The European Union has strong-armed Telegram to take action against ISIS and al Qaeda since 2015, and the company acted even more surgically against the groups’ networks in 2019,” claims Katz. “Telegram began deleting whole channels and accounts, not only of administrators or ISIS- and al Qaeda-connected accounts but also of everyone in those chat groups and channels.”

When a user forwards a message from a restricted channel to a nonrestricted one, a notice will appear to anyone who doesn’t have the “sideloaded” version of the app saying that the message can’t be displayed on that user’s device. WIRED’s analysis of thousands of restricted pieces of content found that only 36 of them appeared with the notice that they were restricted for violating Telegram’s terms of service. This suggests that the majority of restrictions are an effort to comply with the policies of Google and Apple, which can drop the app from their stores, rather than enforcement of Telegram’s own rules.

“Unless Telegram’s operations are going to be threatened somehow,” claims Stewart, “it’s probably not a priority for them.”

Source: Wired