
To make Press Center inquiries, email press@splcenter.org

New SPLC Analysis: Extremists Leverage Little-Known and Encrypted Technology To Spread Dangerous Messages and Plan Violence

WASHINGTON, D.C. – Today, the Southern Poverty Law Center (SPLC) released a new analysis titled “How an Encrypted Messaging Platform is Changing Extremist Movements.” The report details how extremist groups are migrating to lesser-known social media platforms and messaging apps like Telegram, using encrypted communications, bots, and invite-only chat rooms to plan new attacks. The use of these encrypted spaces, where groups share hate-filled imagery and memes, makes their activities difficult to track. The report also highlights the ways that Facebook, Twitter, YouTube and Instagram continue to fall short on their pledges to combat the spread of violent rhetoric and conspiracy theories like QAnon. The analysis is the final installment of SPLC’s flagship three-part Year in Hate report.
 
“For years, far-right extremist and hate groups have increasingly used social media to spread dangerous messaging and coordinate violence. Despite constant warnings to tech companies about the dangers of these groups, those companies chose to allow users to post whatever content they pleased, no matter the potential repercussions. The only goal for tech companies was profit,” said Margaret Huang, president and CEO of SPLC. “It was only when violent insurrectionists stormed the Capitol, killing and injuring many, that tech companies finally decided to change their tune and deplatform some people. But that was far too late.”
 
Some extremists are congregating in a disturbing echo chamber they call “Terrorgram,” which fuses the glorification of political violence with a distinctive hyper-stylized visual aesthetic. It consists of a small network of channels that share memes glorifying an apocalyptic race war, instructions for 3D-printing weapons, extremist literature and manifestos, and video recordings of white supremacist terror attacks that have been removed from other platforms. Many of these channels came into existence in 2019 and grew their audience significantly thereafter. 
 
Other findings from SPLC’s analysis include:

  • Telegram’s file storage feature is being used to share massive libraries of multimedia propaganda. These may include images and videos dedicated to white supremacist terrorism and encouraging violence against religious and ethnic minorities, guides for 3D-printed weaponry, white supremacist literature, hate music and more.
  • These same channels have also provided a space for livestreams of white supremacist terrorists, including the perpetrators of the 2019 terrorist attacks in Christchurch, New Zealand, and Halle, Germany. Telegram’s lax enforcement policies have allowed these videos to proliferate long after they were purged from other parts of the internet.
  • Many channel administrators have also leveraged Telegram “bots” to automatically erase content from the channel, making it harder for law enforcement agencies to monitor the platform for signs of potential impending attacks or evidence about ones that have already taken place.
  • The extreme right has weaponized youth culture by using irony, wit, and satire to present extremist and hateful ideas as edgy humor, allowing youth who get the joke to feel like powerful insiders. For example, nearly a quarter of online gamers will encounter white supremacist propaganda while playing. Teenagers regularly encounter racist memes, jokes that minimize or deny the Holocaust, and other dehumanizing and misogynistic content.
  • One channel associated with the Nationalsozialistische Deutsche Arbeiterpartei International – a nod to the official title of Hitler’s Nazi Party – grew by 2,031.87% after being “restricted” by Telegram, with its followers increasing from 775 in 2019 to 16,522 in 2020.

“With constantly evolving technology, extremist groups spread dangerous messaging across our nation at an alarming rate and coordinate vicious attacks in secret. Without immediate action to moderate these technologies and hold technology companies accountable while respecting free speech, there’s no telling the violence these extremists could unleash next,” said Susan Corke, director of SPLC’s Intelligence Project.