Twitter temporarily suspended my account this week after I posted a tweet that opposed far-right extremism.
The incident, which has since been resolved with an apology from Twitter and the unlocking of my account, underscores the struggles Twitter has had with keeping white supremacists from weaponizing its platform to further an agenda of cruelty.
To be more specific about what happened to me: The social media giant locked me out and threatened me with a permanent suspension for posting a tweet that drew a connection between an American white supremacist leader, the terrorist who gunned down 50 worshippers, including children, at two New Zealand mosques in March, and a mysterious fire that recently ravaged a building in Tennessee linked to the civil rights movement.
Twitter highlighted no other tweets of mine as being abusive. That was it.
“Not good,” I remarked in the tweet, referring to the proliferation of an obscure fascist symbol first embraced by the Iron Guard, a Romanian group that once butchered innocent Jews alive and hung them from meat hooks.
I learned about the lockout on Wednesday evening, when I received a notice on Twitter that asked me to remove the tweet. Twitter enables users to appeal decisions like that one, and I did, noting to the company that my account had most likely been mass-reported in bad faith by white supremacists who want to shut me up.
When I woke up on Thursday morning, I received a reply from Twitter in which the company appeared to double down:
“We’re writing to let you know that your account features will remain limited for the allotted time due to violations of the Twitter Rules, specifically our rules against abusive behavior,” Twitter said. “To ensure that people feel safe expressing diverse opinions and beliefs on our platform, we do not tolerate behavior that crosses the line into abuse. This includes behavior that harasses, intimidates, or uses fear to silence another person's voice.”
It's the last part that should unnerve people who are opposed to the rising tide of far-right extremism in America and abroad. The only voices I or any of my colleagues have ever sought to diminish on social media are the same white supremacists who repeatedly sign up for account after account on Twitter to preach hatred or to bully minorities.
People who report on extremists, either for rights groups like the SPLC or for print and digital news publications, typically endure constant harassment on the site from white supremacists who don’t want us to be able to do our jobs.
If white supremacists are included in the types of voices Twitter is concerned about protecting on its website, that is an error in judgment, and one for which innocent people will eventually pay the price.
That’s because there is no respectable reason to take a balanced or nuanced view of white supremacy. It’s an ideology that gets innocent people killed and one that calls for the outright destruction of our society.
Brenton Tarrant, the alleged killer in New Zealand, engaged directly with “white genocide” propaganda that is promoted every day on Twitter. Racist mass murderer Dylann Roof killed partly in response to messaging found online. Atomwaffen Division, a neo-Nazi group that has killed at least five people since the beginning of 2017, formed online, responding to nihilistic language that is still heavily promoted on Twitter today.
Twitter has made noise about cleaning up its platform in the past, but the same hateful voices continue to abuse it to promote their cruel and dystopian worldview. White supremacist leader David Duke continues to dehumanize Jews and other minorities to an audience of over 50,000 followers on Twitter from the handle @DrDavidDuke. Richard Spencer, whose name has become synonymous with the contemporary repackaging of white supremacy, tweets to an audience of over 75,000 accounts from the handle @RichardBSpencer.
Lauren Southern and Stefan Molyneux, two internet-based, racist pundits who espouse white supremacist talking points about white genocide, are not only on Twitter, but they are also verified by the site, which is seen by some as a tacit endorsement of their value. Both pundits are rapidly climbing to half a million followers. It’s unclear what kind of audience racists like this would have if not for Twitter. Twitter does not just support the careers of people like this: It makes the careers happen in the first place.
There are, of course, swarms of other anonymous and familiar white supremacist accounts on Twitter, with audiences ranging between the single digits and the tens of thousands. They are almost impossible to count, and they return with a frequency that is often difficult to track. They tell their followers hate crimes aren’t real. They scapegoat Jews, ridicule transgender people, and attempt to deify mass murderers.
I’m grateful for Twitter’s apology to me, which would not have come without other people raising concerns publicly on the platform. But Twitter owes a more urgent apology to minorities and marginalized voices across the world for giving professional hate merchants a megaphone to promote a violent ideology of white supremacy. People who embrace their words are murdering people. That’s not speculation on my part; it’s a demonstrable fact.
Twitter and other tech companies need to Change the Terms, as well as enact and enforce comprehensive policies to protect the voices of underrepresented communities, and combat the spread of hateful ideologies.
The time to get serious about moderating Twitter is right now.
Michael Hayden is a senior investigative reporter for the SPLC’s Intelligence Project, which monitors hate groups and other extremists across the United States, and exposes their activities to the public, the media and law enforcement.