
Wikipedia wars: inside the fight against far-right editors, vandals and sock puppets

Like Facebook, Google and Twitter, Wikipedia has become a fixture of online life.

With more than five million articles, it is the world’s go-to source for all kinds of information. However, the free encyclopedia’s openness and anonymity leave it vulnerable to manipulation by neo-Nazis, white nationalists and racist academics seeking a wider audience for extreme views.

The far right has been active on Wikipedia since it first went online in 2001, but in the past two years, its presence has grown with the emergence of the alt-right and the surge in rightwing populism in Europe and North America, says administrator Doug Weller.

Throughout the years, the site has refined its processes and policies for handling disruption and fringe points of view, but as Wikipedia gets bigger, so does the challenge of ensuring its integrity and neutrality. Weller says it’s an issue of manpower. Though Wikipedia has more than 32 million registered users, only around 130,000 have used the site in the past month. The task of guarding against vandalism and bias generally falls to a much smaller core of veteran editors and admins. The sheer volume of activity — 10 edits per second and 600 new pages created daily — can easily outstrip Wikipedia’s capacity to police its content.

There’s an ever-present threat that an organized faction or a group of single-purpose editors working in concert can exploit Wikipedia’s mechanisms to tilt its point of view in favor of a fringe perspective.

Civil POV

The far right’s activity on Wikipedia is relatively easy to manage when it clearly violates one of the site’s policies against vandalism, harassment, ad hominem attacks or edit-warring, which refers to dueling edits by one or more factions outside the normal dialogue mechanisms. But it’s much harder to keep editors in check when they dance on the line between acceptable and unacceptable behavior, a practice known as “civil POV” pushing: an editor or a group of editors tries to tilt an article toward a particular point of view but remains polite and abides by site-wide norms of behavior.

Wikipedia’s policies are more oriented toward conduct than content, said Magnus Hansen, a postdoctoral fellow at the University of Copenhagen who has been editing Wikipedia for more than 10 years.

“That means that it is hard to get users blocked or restricted for consistently providing ideologically skewed content, unless it can be demonstrated that they are deliberately breaking the community’s rules for conduct or content creation,” Hansen said, citing the case of one user who repeatedly created antisemitic pages and, when called out on it, claimed he was acting in good faith.

One of the bedrock principles of Wikipedia is the assumption of good faith. Unless there is evidence to the contrary, edits are assumed to be made with the intent of improving the encyclopedia. Editors who attempt to insert their ideological bias but maintain the semblance of civility are given the benefit of the doubt until their disruption becomes apparent enough to warrant action by administrators. Civil POV-pushers can disrupt the editing process by engaging other users in tedious and frustrating debates or tie up administrators in endless rounds of mediation.  

Users who fall into this category include racialist academics and members of the human biodiversity, or HBD, blogging community. Often these are single-purpose accounts that exclusively edit on topics like race and intelligence, racial classification and bios of related researchers, like Linda Gottfredson or Helmuth Nyborg. Some have direct ties to racist journals or organizations, like Mankind Quarterly editor Gerhard Meisenberg. Emil Kirkegaard, who edits frequently under the username Deleet, is a research fellow at Richard Lynn’s Ulster Institute for Social Research and the co-founder of the online pseudojournal OpenPsych.

These users tend to maintain a moderate, non-confrontational tone and adopt a posture of academic neutrality, so they are less likely to run afoul of site-wide rules and more likely to make edits that stand.

One way a civil POV-pusher can nudge the narrative of a page while still complying with the rules is by adding information that is reliably sourced and factually accurate but nevertheless misleading. The “race and intelligence” page describes William Shockley as a “Nobel laureate” but neglects to mention that Shockley received the prize in a field irrelevant to the topic, or that he held extreme racist views. The biography of Arthur Jensen misrepresents the impact and credibility of the controversial psychologist by enthusiastically billing one of his papers as “one of — if not the most — cited papers in the history of psychological testing and intelligence research”; only a footnote clarifies that many of those citations came from works refuting his ideas or holding the paper up as an example of the controversy.

Editors can also abuse Wikipedia’s guidelines and processes. For example, the restrictions on biographies of living persons, or BLP, were used to block academic criticism of Jensen and fellow racialist academic J. Philippe Rushton while both were still alive.

Neutral POV, or NPOV, is another policy that requires nuance and is subject to abuse. It’s often difficult to tell if users who cite this rule are acting in good faith. Appeals to neutrality are sometimes used to soften language in articles and cast doubt on generally accepted facts. In the past, neo-Nazi users have challenged the use of the words “slaughter” and “murder” with reference to the Holocaust on these grounds.


An ideal Wikipedia article should be a neutral presentation of all the notable perspectives on a particular topic drawn from reliable sources. Like everything else on Wikipedia, however, there’s room for debate on what meets the standard.

Hansen explains: “Getting consensus about what sources are reliable is very hard, since the racialists have their own venues for publication — primarily Mankind Quarterly but also Intelligence, and Personality and Individual Differences — and because one has to make an assessment of every proposed source individually in a given context and among a given group of editors. No decisions are binding across contexts or across different articles or groups of editors.”

Wikipedia doesn’t have a master list of unreliable sources, but it does maintain a noticeboard for fringe sources and theories, and the discussions there override other localized debates, Weller said.

Though some sources like the white nationalist American Renaissance website are easily identified as fringe, others, like the journal Intelligence, sit on the borderline. Intelligence, a peer-reviewed academic journal, is ranked 10th among journals dealing specifically with the topic of psychometrics. Based on these attributes alone, it meets Wikipedia’s criteria for a reliable source.

However, as Hansen pointed out, the journal often serves as a platform for some questionable research. Intelligence is edited by Richard Haier, who is sympathetic to the hereditarian point of view. Haier was one of the signatories of Gottfredson’s “Mainstream Science on Intelligence” op-ed and he recently penned an article on Quillette defending Charles Murray.

One example illustrates just how complicated the process of establishing reliability can be and how sources with dubious credibility can still sometimes find their way onto Wikipedia. Not long ago, a reference to a paper by a man named Davide Piffer was added to the article “history of the race and intelligence controversy.” While it initially failed to pass peer review, Piffer’s paper on the frequency of a set of intelligence-linked genes in different populations was ultimately published in Intelligence in 2015. It was subsequently hailed on American Renaissance and a variety of HBD sites as solid evidence for racial differences in intelligence.

Though the paper was published in a source considered reliable by Wikipedia, Piffer’s credentials, affiliations and the scientific merit of the paper itself are suspect. Piffer is an associate of the Ulster Institute for Social Research. In an interview with American Renaissance, Lynn, the head of the institute, referred to him as one of the “rising stars” of intelligence research, noting that “he is from the north of Italy where the more intelligent Italians are found.”

At the time he published the paper, his highest level of education was a master’s in evolutionary anthropology, and his list of research interests includes parapsychology. Aside from his credentials, Piffer’s other work on fringe topics like “remote viewing,” a form of extrasensory perception widely dismissed as pseudoscience, also raises eyebrows.

Piffer’s low academic profile poses a dilemma, too. Because it appeared in an academic journal, his paper was deemed fit for inclusion on Wikipedia, but at the same time it’s not noteworthy enough to have drawn published responses from other academics, so there are no sources to supply criticism or contextualization. Ironically, one of the few credible academic responses, a commentary by statistical geneticist Danielle Posthuma, would likely not meet Wikipedia’s standards since it was published on the Unz Review, a platform for fringe writers.

In addition to issues of reliability, there is also the question of weight. Wikipedia’s guidelines state that academic sources should be given weight in proportion to their credibility and currency within established scholarship. This is meant to prevent a false balance in the application of NPOV guidelines, but because these determinations are made by the community as a whole, a false balance can emerge anyway.

In the article on “race and intelligence,” relatively equal weight is given to the two sides of the debate — hereditarian and environmentalist — though environmentalism is the mainstream perspective in psychology.

Researchers associated with the Pioneer Fund, an organization with eugenicist roots that is the primary benefactor of racialist research, are mentioned slightly more often than mainstream researchers. For example, Nicholas Mackintosh, author of a widely used graduate-level textbook on intelligence, is mentioned 30 times, while Rushton, the long-time president of the Pioneer Fund, is mentioned 38 times. Similarly, Jensen, who received more than $1 million from the Pioneer Fund in his lifetime, racked up 61 mentions versus 51 for renowned psychologist James Flynn.

Sock puppeting

Another common problem for Wikipedia is abuse of multiple accounts, or sock puppeting. Multiple accounts can be used to skirt bans and other administrative actions or to stage small dramas inside the discussion sections of articles. Some sock puppets are employed to create the impression that a point of view has wider support, while others serve as strawmen for the puppeteer’s argument. Editors can play both sides to get a debate started or “prime the pump.” Others use separate accounts to pull a Jekyll and Hyde routine, adopting one persona for disruption while reserving a “clean” account for civil editing.

One of the white nationalists who co-founded Rightpedia, a far-right free encyclopedia that split from Metapedia, created more than 140 accounts in the past 10 years. Administrators have gotten better at identifying sock puppets, but it’s still a frustrating, time-consuming process. A suspected sock puppet must first be reported to a “checkuser,” a special type of admin with the privileges to see the user’s IP address, which is then cross-referenced against the IP addresses of known users.

However, there are ways to avoid detection, such as using public computers, Wi-Fi hotspots, proxies or VPNs, so it’s often hard to block a sock puppet on the basis of an IP address alone. Admins have to rely on other evidence, like patterns of behavior and verbal clues. Wikipedia is working on automated solutions that use machine learning to identify certain commonly used phrases, but for now the front line of defense continues to be human editors.
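The phrase-matching idea can be illustrated in miniature. The sketch below is purely hypothetical and is not Wikipedia’s actual tooling; the function names and sample comments are invented for illustration. It compares the word-trigram (“phrase”) profiles of short comments: a suspected sock puppet who echoes a known user’s stock phrasing scores higher than an unrelated editor.

```python
from collections import Counter
import math

def phrase_profile(text, n=3):
    """Count the word n-grams ("phrases") in a user's talk-page comments."""
    words = text.lower().split()
    return Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))

def cosine_similarity(a, b):
    """Cosine similarity of two phrase profiles: 1.0 means identical phrasing habits."""
    dot = sum(a[k] * b[k] for k in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Invented sample comments: the suspect reuses the known user's pet phrases.
suspect = phrase_profile("per WP:NPOV this claim is undue weight and should be removed")
known = phrase_profile("per WP:NPOV that sentence is undue weight and must be removed")
control = phrase_profile("I added a citation to the history section yesterday")

# Overlapping phrasing habits score higher than unrelated text.
assert cosine_similarity(suspect, known) > cosine_similarity(suspect, control)
```

Real stylometric systems weight features, combine many signals beyond raw phrase counts, and, as noted above, still leave the final call to human editors and checkusers.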

Canvassing/meat puppeting

In addition to creating multiple accounts, Wikipedians can tilt editorial processes in their favor by recruiting like-minded individuals on Wikipedia and off-site forums. They can also call upon people they know offline. For instance, during the arbitration on “race and intelligence,” it was discovered that two accounts had been created within 24 hours of each other on the same computer, and that the users, who had been tag-team editing, were in a romantic relationship. The couple had also enlisted the help of another user whom they knew from the website DeviantArt.

In recent years, the proliferation of far-right online spaces, such as white nationalist forums, alt-right boards and HBD blogs, has created a ready-made pool of users who can be recruited to edit Wikipedia en masse. Before he was banned from the site, the aforementioned prolific sock puppeteer posted a message on Stormfront decrying the domination of Wikipedia by “Jews” and “cultural Marxists” and calling on others to help him. A quick search of the forum reveals several similar threads recruiting members to edit. The Wikipedia entry dedicated to Stormfront has been plagued by constant edit-warring initiated by one forum user who called for some help “keeping an eye on the page.”

The alt-right /pol/ board on 4chan has also acted as a platform for launching attacks on Wikipedia. Not long ago, members of the board vandalized the entry for BuzzFeed by adding crude jokes and changing the name of its owner to Donald Trump. The r/The_Donald subreddit, which has significant user overlap with /pol/, has dozens of threads directing members to make changes to various Wikipedia entries, ranging from a request to “unc---” the entry on the subreddit to a call for someone to fix Sean Hannity’s page. Most recently, there was a thread calling attention to Wikipedia’s labeling of Alex Jones’ Infowars as a “fake news website.”

While many of these efforts might only result in petty vandalism and trolling, some of the threads hint at a more sophisticated attempt to influence the content of Wikipedia. Posts on both Stormfront and The_Donald include discussions about how to operate within the boundaries of Wikipedia’s policies, which could lead to activity on the site that is much more difficult to deal with.

POV forks

“Watchlists” are one of the main safeguards against fringe perspectives on Wikipedia. Editors can receive notifications when changes are made to a page and respond quickly to fishy edits. But when editors find their point of view blocked on a controversial article they sometimes bypass these mechanisms by starting a new page dedicated to that position called a POV fork.

While new pages are patrolled for obvious violations of Wikipedia policy, a POV fork could deal with an esoteric topic, so the volunteer patroller might not have sufficient specialized knowledge to identify problems with it. Such is the case with fringe racialist theories.

Last November, Kirkegaard created a new page on “Cold winters theory,” a fringe “race realist” theory put forward by Lynn and Rushton to offer a pseudoscientific evolutionary explanation for alleged race differences in intelligence. In this case, Wikipedia’s mechanisms acted relatively quickly: following a short discussion, the article was nominated for deletion and redirected to Lynn’s page in less than a month. In other cases, the process can take much longer. Wikipedians spent three months deliberating over an article titled “Jews and communism,” which so closely resembled neo-Nazi propaganda that it was copied almost verbatim on Metapedia.

Similar forks have escaped detection. In September 2016, an entry on “Differential K theory,” another of Rushton’s core ideas, was added by a different user. The theory, an essential component of the “race realist” canon, is extensively covered on far-right alternative encyclopedias like Metapedia and Rightpedia as well as a number of HBD blogs and YouTube channels. The article has remained up for over a year now. Though it contains one or two lines of criticism, most of the academic responses included are from other researchers affiliated with the Ulster Institute or Pioneer Fund, like Lynn and Michael A. Woodley.

If there are enough different perspectives on a topic, a POV fork might avoid deletion altogether, but it can still be problematic, especially if research on a niche topic is dominated by a fringe group of scholars. The article “nations and intelligence” gives disproportionate weight to the views of Lynn and his co-author Tatu Vanhanen because they are among the few scholars to research the subject. Editors have done much to balance the article over the years, but for most of its history it served to promote and legitimize Lynn and Vanhanen’s thesis, and at one point it hosted charts of national IQ based on their dubious methodologies as well as an IQ map that has become an often-shared rightwing meme.

Soft Targets

Wikipedia’s watchlist system is only as good as the editors watching a given page and it functions best with high participation. A page with few watchers is vulnerable to being manipulated by a self-selected group.

The “race and intelligence” page has more than 700 watchers, while the page on the Afrikaner Weerstandsbeweging (AWB), a neo-Nazi South African separatist group, has fewer than 90. The edit history of the AWB page reveals a yearlong pattern of edit-warring contesting the characterization of the group as “neo-Nazi white supremacists” and accurate descriptions of the group’s use of Nazi imagery. Neutrality flags have been quickly removed.

The page’s bias would hardly be obvious to a reader who is not familiar with the AWB’s history, because it omits virtually all references to the group’s violence. Of the AWB’s activity in the run-up to multiracial elections, the page says only that “During bilateral negotiations to end apartheid in the early 1990s, the organization received much publicity” and, elsewhere, that it “threatened all-out war.” But it did much more than threaten.

The group unleashed a wave of violence aimed at derailing the transition to majority rule. In addition to bombing schools, ANC facilities and black taxi ranks, the AWB enforced “white by night” racial curfews with lethal force and massacred black travelers at makeshift roadblocks. While none of the bloodiest chapters from the group’s history made it onto its Wikipedia page, there is an entire section devoted to the AWB’s charity initiative known as Volkshulpskema, which is sourced entirely to the blog of South African neo-Nazi Arthur Kemp.

Interestingly, Kemp was cited not by any far-right extremist but by a veteran editor named Zaian, who has thousands of contributions to various articles on South Africa. In 2007, an anonymous editor added an unreferenced section titled “Volkshulpskema and Rise to Power,” which contained favorable comparisons to the social welfare aspects of Nazi Germany.

Apparently, Zaian was doing some housekeeping on the page and saw a section without references, so he added the only source he could find: Kemp’s book. On another talk page, Zaian had called out the same anonymous editor for making bizarre statements about race mixing. Yet, through inattention to detail, Zaian ultimately validated the contributions of a person he had described as an “extremist” and ensured those changes would still be around more than 10 years later.

What can be done?

The presence of white nationalists and other far-right extremists on Wikipedia is an ongoing problem that is unlikely to go away in the near future given the rightward political shift in countries where the majority of the site’s users live. Wikipedia’s leadership is acutely aware of the issue, but it faces the difficult task of balancing its mission of inclusiveness and tolerance of a diversity of viewpoints with the aim of maintaining the integrity, accuracy and neutrality of its content.

When asked how Wikipedia can maintain this balance, Hansen responded: “It cannot. Or at least that it can only do so to the extent that there are sufficient people who are sufficiently well versed in a specific topic and who have sufficient time and patience to keep defending the representation of it in long and tedious discussions with people who know less and use academically absurd standards of evidence.” Unfortunately for Wikipedia editors, Hansen said, those people “are often more numerous and more persistent.”

Follow us on Twitter @Hatewatch.