
Year In Hate: New SPLC analysis shows extremists leveraging little-known and encrypted technology to spread dangerous messages, plot violence

A new SPLC analysis released today details how far-right extremist groups are migrating to lesser-known social media platforms and messaging apps such as Telegram, relying on encrypted communications, bots and invite-only chat rooms to plan new attacks. The use of these encrypted spaces, where groups share hate-filled imagery and memes, makes it challenging to track their activities.

At the same time, widely used social media platforms such as Facebook, Twitter, YouTube and Instagram continue to fall short on their pledges to combat the spread of violent rhetoric and conspiracy theories like QAnon.

The analysis – How an Encrypted Messaging Platform Is Changing Extremist Movements – is the third and final installment of the SPLC’s annual Year in Hate and Extremism report.

“For years, far-right extremist and hate groups have increasingly used social media to spread dangerous messaging and coordinate violence,” said Margaret Huang, SPLC president and CEO. “Despite constant warnings to tech companies about the dangers of these groups, those companies chose to allow users to post whatever content they pleased, no matter the potential repercussions. The only goal for tech companies was profit. It was only when violent insurrectionists stormed the Capitol, killing and injuring many, that tech companies finally decided to change their tune and deplatform some people. But that was far too late.”

Some extremists are congregating in a disturbing echo chamber they call “Terrorgram,” which fuses the glorification of political violence with a distinctive, hyper-stylized visual aesthetic. It consists of a small network of channels that share memes glorifying an apocalyptic race war, instructions for 3D-printing weapons, extremist literature and manifestos, and video recordings of white supremacist terror attacks that have been removed from other platforms. Many of these channels came into existence in 2019 and grew their audiences significantly thereafter.
Other findings from SPLC’s analysis include:

  • Telegram’s file storage feature is being used to share massive libraries of multimedia propaganda. These may include images and videos dedicated to white supremacist terrorism and encouraging violence against religious and ethnic minorities, guides for 3D-printed weaponry, white supremacist literature, hate music and more.
  • These same channels have also provided a space for livestreams of white supremacist terrorists, including the perpetrators of the 2019 terrorist attacks in Christchurch, New Zealand, and Halle, Germany. Telegram’s lax enforcement policies have allowed these videos to proliferate long after they were purged from other parts of the internet.
  • Many channel administrators have also leveraged Telegram “bots” to automatically erase content from the channel, making it harder for law enforcement agencies to monitor the platform for signs of potential impending attacks or evidence about ones that have already taken place.
  • The extreme right has weaponized youth culture by using irony, wit and satire to present extremist and hateful ideas as edgy humor, allowing young people who get the joke to feel like powerful insiders. Nearly a quarter of online gamers will encounter white supremacist propaganda while playing, for example. Teenagers are likely to encounter racist memes, jokes that minimize or deny the Holocaust, and other dehumanizing and misogynistic content on a regular basis.
  • The audience of one channel associated with the Nationalsozialistische Deutsche Arbeiterpartei International – a nod to the official title of Hitler’s Nazi Party – grew by more than 2,000% after the channel was “restricted” by Telegram, rising from 775 followers in 2019 to 16,522 in 2020.

“With constantly evolving technology, extremist groups spread dangerous messaging across our nation at an alarming rate and coordinate vicious attacks in secret,” said Susan Corke, director of the SPLC’s Intelligence Project. “Without immediate action to moderate these technologies and hold technology companies accountable while respecting free speech, there’s no telling the violence these extremists could unleash next.”

Photo by Roberto Schmidt/AFP via Getty Images