A network of far-right extremists is self-censoring, and in at least one instance mass-deleting, content from several key online communities following the devastating terror attacks on mosques in Christchurch, New Zealand.
A review of 12 far-right servers on Discord, a chat application favored by video gaming communities and more recently the extreme right, reveals that even as users celebrated the horrific attacks of March 15, administrators and moderators of these online spaces deleted large amounts of content and banned posts glorifying the perpetrator.
The Southern Poverty Law Center and outside researchers affiliated with the non-profit media organization Unicorn Riot conducted the review in the weeks since the March attacks. The review found that while social media networks have been slow to remove hate content from their platforms, some extremist communities are taking down or banning content themselves out of legal concerns. Portions of those deleted conversations are available from Unicorn Riot.
“Attention all users. Considering the circumstances we find ourselves in it is very likely that this man was in any number of /k/ servers,” user “Maj. Asshole,” an administrator of a 4chan-affiliated server titled “The Pathetic Life of an Average /K/ommando,” wrote. “Considering this it is very likely we could all be, in the event the man was in the server, considered accomplices and held for a federal investigation. Seeing as that is the case, any mentioning of the recent habbening from now on us strictly verboten.”
“Maj. Asshole’s” fears seem legitimate. Federal prosecutors in Harrisburg, Pennsylvania, recently charged Corbin Kauffman, a 30-year-old resident of Lehighton, Pennsylvania, with interstate transmission of threats to injure another person for content that he posted on Minds.com, a fringe social media site.
If the suspect in the New Zealand attacks was a member of “The Pathetic Life of an Average /K/ommando,” its users may also have cause for concern. David Hyman, a law professor at Georgetown University, told Newsweek in 2018 that anonymous online users can have their identities revealed if a judge deems it relevant to a case. “Private and privileged are not the same thing,” Hyman told Newsweek.
The 12 servers examined in this investigation – chosen for how their moderators and users discussed and responded to the tragedy in Christchurch – are part of a larger network of 50 that the researchers are reviewing comprehensively. Users in this smaller cluster of chat servers posted an estimated 38,932 messages in the first 24 hours following the terror attack that left 50 dead and 50 wounded.
An even more complex picture of these obscure and chaotic online spaces emerges when considering this: As moderators of these extremist spaces undertook an unprecedented level of voluntary censorship, their users and the broader far-right were engaged in a haphazard campaign to immortalize the perpetrator and his manifesto.
The SPLC and the researchers reviewed copies of the Christchurch suspect’s manifesto and the Facebook Live video of the attack filmed by the shooter, which tech companies across the globe were scrambling to remove.
Users in these servers were also creating memes and coordinating the creation of other content, including YouTube playlists celebrating the alleged killer. Some pledged to follow in the Christchurch attacker’s footsteps.
“Wow. Just finished reading the manifesto. Truly powerful,” “Sulferix” wrote in Outer Heaven, one of the servers reviewed for this piece. “I will be starting my own contribution to the fight soon, in every way that I can. I will start a group. I will train. I will be part of this if it fucking kills me. I hope I’m not the only one.”
Statements from moderators and users show they fear Discord will remove them from its platform, and they fear prosecution for hateful and violent remarks.
The resilience of these apocalyptic communities on Discord, combined with their self-censorship, illustrates how far Silicon Valley’s policymakers and content moderators are lagging behind the far-right extremists on their platforms.
One moderator told his followers about a delay that offenders can exploit to avoid content moderation or bans from the platform.
“If someone reports the server, it takes 24 hours for discord to look into the report,” “Captain Kirk JT,” owner and administrator of the server “Outer Heaven,” wrote. “If the messages are gone by the time they look, its as if it never happened, and the report is dropped.”
Discord’s terms of service, last updated on Oct. 19, 2018, state, “The company reserves the right to remove and permanently delete your content from the service with or without notice for any reason or no reason.” According to Discord’s community guidelines, flagged content is reviewed “as it comes in as quickly as we can.”
The dizzying speed with which far-right extremists archive and redistribute propaganda signals an awareness of attempts to limit the spread of media associated with terrorism.
Governments and civil society organizations are pressuring technology companies to enforce their corporate policies governing harmful content. Human rights and technology professionals have long been concerned with archiving vulnerable online content. The far right, which until recently could create and disseminate hateful propaganda with little worry of moderation, is now demonstrating similar concerns. As content moderation improves, white supremacists and their sympathizers are changing tactics and refocusing their efforts on creating archives of their online propaganda.
For instance, the Daily Stormer published its 88th weekly digest of white supremacist content from the site in PDF, ePub and Build file format on April 28, 2019, in response to its vitriolic content being taken offline repeatedly by web hosting providers.
“We began publishing a redistributable and archival weekly magazine for two reasons,” read the April 28, 2019, edition’s introduction. “The first was that we wanted to give people the ability to spread our publication as samizdat to evade this global censorship regime. Secondly was that it is easily archivable, and there’s a nonzero chance that the publisher and staff of this website will be murdered by global Jewry before all is said and done and we want a survivable record for history’s sake of what we actually said and did.”
As violence tied to far-right extremist communities, particularly those online, intensifies, participants are becoming more aware of potential legal and reputational liability. Moderators’ efforts to remove the violent and terroristic content that defines many of these spaces are in direct conflict with many users’ attempts to glorify those who commit extremist violence and the materials that inspired them.
As these Discord servers illustrate, while extremist communities are resilient and committed to spreading violent ideologies, meaningful content moderation can change the paradigm.
Unicorn Riot (UR) has published chat logs from Outer Heaven, The Pathetic Life of an Average /K/ommando, and several other related Discord servers, commonly known as chat rooms. The release is part of UR’s efforts to expose far-right online spaces promoting the New Zealand shooting.
Photo credit: iStockphoto