User:Larry Sanger/Nine Theses

From Wikipedia, the free encyclopedia
I submit these nine theses to Wikipedia’s community and to the world. I do this, as Martin Luther said when he posted his famous 95 theses, “Out of love for the truth and the desire to elucidate it.”

A quarter of a century ago, Jimmy Wales’ company Bomis hired me to start a free encyclopedia. The first draft, from which we learned much, was Nupedia—it made slow progress. So, a year later, on January 2, 2001, when a friend told me about wikis, I immediately began imagining a wiki encyclopedia. I proposed it to Jimmy, then CEO of Bomis. He agreed and installed the software, and I went to work getting things ready. After I named it, we launched Wikipedia on January 15, 2001, and just nine days later, I was able to write, “Wikipedia has definitely taken on a life of its own; new people are arriving every day and the project seems to be getting only more popular. Long live Wikipedia!”

The title I claimed at the time was “chief instigator.” My daily leadership for the 14 months after that was essential to transforming a completely empty, blank wiki into what would soon become the largest written resource in the history of the world. I was responsible for several policies that were and are fundamental to the project: the exclusive focus on an encyclopedia; neutrality; “no original research”; “be bold”; aspects of the verifiability policy; and other things. I even proposed the tongue-in-cheek “rule” to “ignore all rules.” For more, see this page on my role in Wikipedia, my Slashdot memoir, and my book, Essays on Free Knowledge. I say these things not to brag but to show why my proposals deserve a careful hearing.

I carefully chose and worded the following nine theses to appeal to the universal concern for truth and justice. I have worked for many months on this project. I am confident that every thesis stands on solid ground. Some Wikipedians, wedded to the current system, might be inclined to reject them; but many others, as well as the broader world, will be able to see that they are quite reasonable. I hope that the Wikipedia community will do the necessary introspection and find ways to make these proposals a reality. I also hope the broader world will join the discussion on each point and press Wikipedia’s leadership as well as rank-and-file editors to adopt them, for the good of all.

Please note that there are nine lengthy essays to go with the nine theses.

An identical copy of this document can be found at LarrySanger.org; note that the version below was edited by Wikipedians to remove important information and argumentation contained in the original. This information has since been restored.

The Nine Theses

1. End decision-making by “consensus.”

Wikipedia’s policy of deciding editorial disputes by working toward a “consensus” position is absurd. Its notion of “consensus” is an institutional fiction, supported because it hides legitimate dissent under a false veneer of unanimity. Perhaps the goal of consensus was appropriate when the community was small. But before long, the participant pool grew so large that true consensus became impossible. In time, ideologues and paid lackeys began to declare themselves to be the voice of the consensus, using this convenient fiction to marginalize their opponents. This sham now serves to silence dissent and consolidate power, and it is wholly contrary to the founding ideal of a project devoted to bringing humanity together. Wikipedia must repudiate decision-making by consensus once and for all.

2. Enable competing articles.

Neutrality is impossible to practice, if editors refuse to compromise—and Wikipedia is now led by such uncompromising editors. As a result, a favored perspective has emerged: the narrow perspective of the Western ruling class, one that is “globalist,” academic, secular, and progressive (GASP). In fact, Wikipedia admits to a systemic bias, and other common views are marginalized, misrepresented, or excluded entirely. The problem is that genuine neutrality is impossible when one perspective enjoys such a monopoly on editorial legitimacy. I propose a natural solution: Wikipedia should permit multiple, competing articles written within explicitly declared frameworks, each aiming at neutrality within its own framework. That is how Wikipedia can become a genuinely open, global project.

3. Abolish source blacklists.

Just seven years ago, an anonymous “MrX” proposed a list of so-called perennial sources, which determines which media sources may, and may not, be used in Wikipedia articles. The page is ideologically one-sided and essentially blacklists disfavored media outlets. Wikipedians now treat this list as strict—but unofficial—policy. This approach must be reversed. Wikipedia should once again explicitly permit citations even from sources that the page currently blacklists. Rather than outright banning entire sources that can contain valid and important information, Wikipedia articles should use them when relevant, while acknowledging how different groups assess them. Neutrality requires openness to many sources; such openness better supports readers in making up their own minds.

4. Revive the original neutrality policy.

In short, Wikipedia must renew its commitment to true neutrality. The present policy on neutrality should be revised to clarify that articles may not take sides on contentious political, religious, and other divisive topics, even if one side is dominant in academia or mainstream media. Whole parties, faiths, and other “alternative” points of view must no longer be cast aside and declared incorrect, in favor of hegemonic Establishment views. Solid ideas may be found in some of the first policy statements, including the first fully elaborated Wikipedia policy and the Nupedia policy of 2000.

5. Repeal “Ignore all rules.”

On February 6, 2001, I wrote this humorous rule—“Ignore all rules”—to encourage newcomers. Ironically, my joke now serves to shield insiders from accountability. It no longer supports openness; it protects power. Wikipedia should repeal it.

6. Reveal who Wikipedia’s leaders are.

It is a basic principle of sound governance that we know who our leaders are. So why are the Wikipedia users with the most authority—“CheckUsers,” “Bureaucrats,” and Arbitration Committee members—mostly anonymous? Only 14.5% of such users reveal a full, real name. These high-ranking individuals obviously *should* be identified by their real and full names, so they can be held accountable in the real world. After all, Wikipedia is now one of the world’s most powerful and well-funded media platforms. Wikipedia’s influence far exceeds that of major newspapers, which follow basic standards of transparency and accountability. Such standards are not mere ideals but real requirements for any media organization of Wikipedia’s stature. As of 2023, Wikipedia’s endowment was $119 million, its annual income $185 million. Therefore, if safety is a concern, funds should be used to indemnify and otherwise protect publicly identified editorial leaders. Wikipedia, admit that your leaders are powerful, and bring them out into the open; great power requires accountability. If you continue to stymie accountability, government may have to act.

7. Let the public rate articles.

A system of public rating and feedback for Wikipedia articles is long overdue. Articles now boldly take controversial positions, yet the public is not given any suitable way to provide feedback. This is disrespectful to the public. There is an internal self-rating system, not visible to readers. The platform experimented with an external ratings system, but it did not help readers and was scrapped after a few years. Wikipedia does not need a complex system to get started. An open source AI rating system would not take long to develop. The platform already collects relevant objective data such as number of edits and word count: make that public. As to human raters, they should be provably human, unique, and come from outside of the editor community. When articles are evaluated by a diverse audience, content quality and neutrality will be improved.

8. End indefinite blocking.

Wikipedia’s draconian practice of indefinite blocking—typically, permanent bans—is unjust. This is no small problem. Nearly half of the blocks in a two-week period were indefinite. This drives away many good editors. Permanent blocks are too often used to enforce ideological conformity and protect petty fiefdoms rather than to serve any legitimate purpose. The problem is entrenched because Administrators largely lack accountability, and oversight is minimal. The current block appeals process is ineffective; it might as well not exist, because it is needlessly slow and humiliating. These systemic failures demand comprehensive reform. Indefinite blocks should be extremely rare and require the agreement of three or more Administrators, with guaranteed periodic review available. Blocks should nearly always be preceded by warnings, and block durations should be far shorter.

9. Adopt a legislative process.

Wikipedia’s processes for adopting new policies, procedures, and projects are surprisingly weak. The Wikimedia Foundation (WMF) has launched initiatives, but these do not establish major editorial policy. Incremental policy tweaks cannot deliver the bold reforms Wikipedia needs. No clear precedents exist for adopting significant innovations. The project is governed by an unfair and anonymous oligarchy that likes things just as they are. This stagnation must end. Wikipedia needs an editorial legislature chosen by fair elections: one person, one vote. To establish legitimate and fair governance, the WMF should convene a constitutional convention to create an editorial charter and assembly. This assembly would be empowered to make the sorts of changes proposed in these “Nine Theses.”

Further theses

When I began this project, I had more than nine ideas, of course. The following are some further theses, which I submit undeveloped. The fact that so many plausible proposals for improvement come so readily to mind underscores the platform’s dysfunction.

Wikipedia should join the Encyclosphere.

The Encyclosphere is a project I started in 2019 to collect all the encyclopedia articles in the world in a single decentralized network, with each article shared according to the ZWI (Zipped WIki) file format. This is an enormous and very worthwhile project, and, by supporting both EncycloReader and EncycloSearch, the Knowledge Standards Foundation has made a credible start on this network. While the Encyclosphere has collected some 65 encyclopedias so far, Wikipedia could motivate the rest to contribute to the world’s knowledge—by their own lights—by running an Encyclosphere node. If Wikipedia does not enable competing articles (i.e., Thesis 2), this would be an excellent fallback position.

Implement term limits.

Administrators, as a class, tend to become too impressed with their own power on Wikipedia. If this really is a “janitorial” sort of duty (see Thesis 6), then a much larger body of people should be called upon to help. Therefore, I believe Administrators—and other positions of power and authority—should be subject to some system of term limits. I am not dogmatic about the length. One idea would be: two-year terms; may be elected to back-to-back terms; cannot be elected three times in a row; cannot be elected more than three times in a ten-year period; otherwise, no limit to the number of times one may serve as an Administrator. But there are many ways to implement such a system. Whichever system is chosen, the election process would have to be made easier, so that experienced Wikipedians can readily take on the role.

Require yearly Administrator performance reviews.

Administrators, as a condition of their continuance in the role, should be subject to annual anonymous reviews of their Administrator work. Open source LLMs and other automated tools could be very useful in collecting data for such reviews.

Partner with an independent organization to handle appeals.

This is a much more ambitious way to solve the problems introduced in Thesis 8. Establish a fully and provably independent appeals body, which is nationally, politically, and religiously balanced. It must be answerable neither to the Wikipedia community nor to the Wikimedia Foundation. This body would oversee appeals against repeated blocking and on select editorial issues, ensuring decisions are balanced, just, and transparent—free from the internal politics of current administrative structures in which the foxes are guarding the henhouse.

End IP editing.

From the beginning, Wikipedia has allowed people to edit without logging in. This initially helped to attract contributors, but it is no longer needed and is now counterproductive. Rather than making it easier for outsiders to contribute, IP editing is now widely abused by insiders as a tool of gamesmanship. It is long past time for this startup feature to be retired. Wikipedia has grown up. It is time for the community to act like it.

Replace or augment the edit counter with work assessments.

The edit counter has helped create an insider class that does not deserve the degree of power it wields in the system. Some of the most qualified people in the world have little time to edit Wikipedia, and so they will naturally not make many edits. But their opinion about their field of expertise ought to be worth more than that of a teenager with 50,000 edits. If not replaced, then maybe the edit counter could be augmented by independent work assessments (i.e., performance evaluations) by open source LLMs and other automated tools. It would be best to move away from the simplistic metric of edit counts and towards a more nuanced evaluation of contributions based on content quality and impact. This would reflect a true measure of a contributor’s value to the project, if that is regarded as important. The use of automated tools for this task would help keep it free of corruption and cronyism.

End or loosen restrictions on “meat puppetry.”

My understanding is that off-wiki collaboration is a thing that insiders do all the time anyway; the rule is selectively enforced, in a way that is extremely hypocritical. It should be possible to have meaningful discussions of how the Wikipedia article should look outside of Wikipedia. It is time for Wikipedia to become an open and explicit part of larger, off-wiki conversations. This is already happening. If this is not acknowledged, the conversations will take place sub rosa among secret confederates, which is much worse.

Label pages that are not appropriate for children under 13.

“Adult” content on Wikipedia should be labeled as such; the encyclopedia does not do so now. By implementing age-appropriate labels to ensure the safety and appropriateness of content for younger audiences, Wikipedia would meet societal standards of protection for minors. This is a problem I brought to Wikipedia’s attention in 2012, when I proposed a solution. The proposal was never implemented.

Allow memorial articles about elders and deceased friends and family.

I claim that our elders are all noteworthy. Regardless of whether they were ever in the news, they have had a lifetime’s impact on the rest of us. Therefore, the children, other relatives, and friends of persons over 65 years old should be permitted to memorialize their lives, but only if their next of kin agree. Existence could be confirmed through public records or reliable testimony. Such articles could be placed in a new namespace. Articles could be written based on oral histories. While the latter primary sources would not meet traditional reliability policies, they would be a valuable record of what family and friends said about our elders and dear departed, as permanent lore about a person. The result would be an amazing resource for future historians.

Embrace inclusionism.

The firm tendency to delete perfectly good articles because somebody thinks the topic is not “noteworthy” enough (called deletionism) is an innovation. Deletionist tendencies are toxic to a healthy, free, and open encyclopedia. Generally speaking, if someone can be found to write an article on a topic, and it otherwise meets Wikipedia’s standards, it is best to include the article. Thus, Wikipedia’s rules on what counts as “noteworthy” need to be revised, to be made more lenient and inclusive.

1. End decision-making by “consensus.”

Wikipedia’s policy of deciding editorial disputes by working toward a “consensus” position is absurd. Its notion of “consensus” is an institutional fiction, supported because it hides legitimate dissent under a false veneer of unanimity. Perhaps the goal of consensus was appropriate when the community was small. But before long, the participant pool grew so large that true consensus became impossible. In time, ideologues and paid lackeys began to declare themselves to be the voice of the consensus, using this convenient fiction to marginalize their opponents. This sham now serves to silence dissent and consolidate power, and it is wholly contrary to the founding ideal of a project devoted to bringing humanity together. Wikipedia must repudiate decision-making by consensus once and for all.

Note: The first four theses all concern different aspects of neutrality. This involves some repetition and expansion of analysis, because the issues involved are so central and important.

The Problem

When Wikipedia launched, we borrowed a principle from the original wikis of the 1990s: Wikipedia articles would represent a “consensus view.”[1]

A consensus is, of course, a position that everyone can agree to. Not on Wikipedia, though. On Wikipedia, an article that is completely one-sided and quite controversial is often declared—with furrowed-brow seriousness—to represent the community “consensus.” If this sounds ridiculous, that’s because it is. As someone who was there at the beginning, I can tell you that this is not Wikipedia’s original notion of consensus.

But a consensus view was not a single view of a controversy. It was a frank admission that there were multiple, competing views; it was an exploration of the “lay of the land” that all could agree upon. Indeed, our original practice of representing multiple views fairly was why decision-making by consensus could be made policy in the first place. We, full of the foolish idealism of youth, imagined that motivated ideologues could be taught to write neutrally, all coming together to make the text express all relevant possibilities. The rule was simple: When we disagree, we should not fight over whose views should be stated by the article. Rather, we attribute our own views to their best representatives, and we allow others to do the same with theirs. In this way, we thought, we could avoid hashing out controversies and focus on recording facts. The practice of neutrality was a framework in which we could work toward a “consensus text.” The consensus was not about the facts, but about how a neutral exploration of the debate should read. This was the original understanding of consensus—but now it is long forgotten. Of course we could not agree on the facts. What we could agree upon was a text that represented many different views of the facts side-by-side.

But that was, as I said, foolishly idealistic. We never made proper allowances for the harsh reality that there would be truly intractable disagreements, even among people who say they agree with the framework of neutrality—some people simply refuse to let others have their say at all, or not in any fair way. This became obvious even in the first year of the project, which cooled me on the very idea of “consensus” as a method of conflict-resolution.

Then, surely, the naïve idea of decision-making by consensus was dropped. Right?

Wrong. Instead, after I left, Wikipedia became increasingly strange and insular, and the notion of “consensus” was actually twisted into its opposite. Today, the new reality is admitted frankly:

Consensus on Wikipedia does not require unanimity (which is ideal but rarely achievable), nor is it the result of a vote.

...

When editors do not reach agreement by editing, discussion on the associated talk pages continues the process toward consensus.

A consensus decision takes into account all of the proper concerns raised. Ideally, it arrives with an absence of objections, but often, we must settle for as wide an agreement as can be reached. When there is no wide agreement, consensus-building involves adapting the proposal to bring in dissenters without losing those who accepted the initial proposal.

Long gone is any suggestion that neutrality is a framework that permits a true consensus to be achieved. We early Wikipedians find this sad. Let us analyze what has changed, in terms of the goal, the process, and the community.

(1) The goal has changed; pluralistic expression of different viewpoints is not specifically preferred. Gone is any notion that consensus involves laying out a plurality of viewpoints in a coherent and balanced way. In fact, sometimes, when people attempt to explore various competing views in an article, this is rejected—wrong-headedly, I believe—as a “synthesis of published material,” and thus original research.[2] When there is conflict, positions often harden. Rather than allowing multiple views to emerge, the “community” winds up selecting one view, or a few leading views, and calling that “the consensus.”

(2) The method of reaching “consensus” has also changed; real negotiation among equals has largely disappeared, regardless of what the guidelines say. Gone is the practice of friendly negotiation toward agreement or collaborating in flat, self-managing groups, usually without administrative interference. In its place are fiat judgments made by insiders, sometimes preceded by the adversarial process of pushing the issue through a complex dispute resolution bureaucracy. The end result of this often abusive process is cynically dubbed “the consensus.”

The aim of the winning side, all too often, is to exclude ideological opponents. Thus, the consensus is engineered: one side’s arguments are declared by an editorial bureaucracy to fit well with an alphabet soup of acronym-laden policies, guidelines, and “essays.” This determination ultimately turns on which side boasts the most senior editors and administrators. Sometimes, the true heavies[3] are called in, who rule peremptorily, as if they were high-ranking commissars settling matters between underlings. The Wikipedians themselves now rightly mock such displays of power, but without stopping the charade.

(3) And the community has changed—perhaps saddest of all, for those who remember the early days. A truly polite, collegial atmosphere has largely disappeared. I warned Wikipedians when I left to be “open and warmly welcoming, not insular.” They did not take my advice. Long gone is the sincere, friendly collegiality of people who really are committed to synthesizing diverse viewpoints into a single cohesive document. In its place is the ill-will begotten of an adversarial game in which bureaucratic types face off, calling out every minor infraction and citing acronyms at each other. No wonder friendly, decent people are so often driven away by the sheer hostility of the Wikipedia “community.”[4]

The plain fact is that Wikipedian “consensus” is no consensus at all. That is the elephant in the room. I am pointing right at it. One is hard pressed to know what precisely to call the current decision-making process. Wikipedians deserve ridicule if they continue calling it “consensus.” That is an institutional fiction, and a darkly cynical one.

The Reasonable Solution

To begin, stop calling your process “consensus.” At least rename it. As to what description replaces it, this is important, but I leave that to the Wikipedians.[5]

I will also abstain from proposing a different decision-making process. Mainly I am saying that this institutional fiction must, for the sake of honesty, be dropped. I will say this, however. Anyone who has the honesty to admit that “consensus” was an impossible fiction all along should also be able to see that there is a need for some reform in how editorial disputes are resolved. The fiction itself plays a role in the Wikipedia game: it cynically papers over what is, in fact, the raw exercise of power. Yet, since the description of the existing process as “consensus” is official policy, it might be changed only through strong leadership within the community or imposition by the Wikimedia Board.

For those Wikipedians who are willing to try to think through the difficult issues involved in fair community decision-making, let me suggest just a few possible ideas:

  1. Create an open editorial committee of persons known to be uniquely identified (if not known publicly), so that there is always one person, one vote. Controversies are settled by a vote of some randomly selected subset of the committee, who can escalate important issues upward.
  2. As a variant on the foregoing, weight the votes in the same way that X.com does with its “Community Notes.”
  3. Those who submit a dispute to some deciding agency must precisely identify the issue on which the users disagree. They must spend at least 24 hours attempting to arrive at consensus on at least what the issue is that they disagree about.

If Wikipedia neither changes its decision-making practice nor changes the description of its practice as “consensus,” it is clear that their editorial process has lost all credibility. The bickering baboons of bias will continue to fight among themselves until the most powerful emerges. Oblivious to the high comedy of it all, Wikipedia’s self-appointed deciders congratulate themselves on being the voice of the “consensus”—of all who think exactly as they do.

2. Enable competing articles.

Neutrality is impossible to practice if editors refuse to compromise—and Wikipedia is now led by such uncompromising editors. As a result, a favored perspective has emerged: the narrow perspective of the Western ruling class, one that is “globalist,” academic, secular, and progressive (GASP). In fact, Wikipedia admits to a systemic bias, and other common views are marginalized, misrepresented, or excluded entirely. The problem is that genuine neutrality is impossible when one perspective enjoys such a monopoly on editorial legitimacy. I propose a natural solution: Wikipedia should permit multiple, competing articles written within explicitly declared frameworks, each aiming at neutrality within its own framework. That is how Wikipedia can become a genuinely open, global project.

Note: The first four theses all concern different aspects of neutrality. This involves some repetition and expansion of analysis, because the issues involved are so central and important.

The Problem

Wikipedia was started by two libertarians devoted to openness and freedom. For us, it was a foregone conclusion that it would be pluralistic. Of course we wanted Wikipedia to represent a wide variety of views—the more, the merrier. We wanted the whole world to come together and articulate their opinions with the best sources, allowing others, holding quite different views, to do the same. We originally expected there to be a global smorgasbord of thinking recorded, reflecting widely divergent politics, nationality, religion, and more. Wikipedia was supposed to be like a big ethnic food fair. It’s food (for thought) from everywhere in the world. It doesn’t matter who you are—you’re guaranteed to be puzzled, surprised, and delighted. Your taste buds will be tantalized, and you will inevitably enjoy yourself (barring gastrointestinal complaints). This ideal is reflected in the Wikipedia logo.

The current Wikipedia logo.[6]

But that is not how it works today.

To some extent, Wikipedia even admits this. For many years, Wikipedians have wrung their hands over their own “systemic bias.” They are—and this is by their own account—too white, male, technically inclined, formally educated, English-speaking, younger, etc. Also, apparently, they have too many Christians.[7] On their own telling, it is a terrible thing that women and people from the Global South are underrepresented in Wikipedia’s ranks. But they are more right than they know.

What the authors of the “Systemic bias” page seem to overlook is the fact that articles take little or no cognizance of anyone’s concepts, doctrines, theories, and so forth, except as represented by a very narrow slice of Westerners. I would describe this thin slice as globalist, academic, secular, and progressive (GASP). This is the true systemic bias of the platform.

“Perhaps,” the GASP advocates might respond,

but what’s wrong with that? Who better to represent the broad assortment of views on our diverse planet? 'Global’ means 'not provincial'. We may be tolerant globe-trotters, but it is good not to be a bigoted rube. Academia stands for objectivity and rigor, and we have that in spades. Secularism is not biased in favor of any one religion; we carefully study and document them all. This is a good thing. As to progressivism, reality is biased in favor of progressive ideas; progressives are not biased by any outmoded old ideas. One wants an encyclopedia to be progressive.

This is precisely how many Wikipedians think. In fact, however, Wikipedia is subject to a syndrome of related biases, represented by the handy acronym GASP. Let us take each letter in turn:

  • Globalism, as ordinarily understood, is the view of a remarkably provincial group: typically wealthy, university-educated, and concentrated in a few cities in Western Europe, the coastal United States, and the Anglosphere. Most people on the globe are not globalists.
  • Academia, for all its virtues, reflects peculiar assumptions not shared elsewhere. In many fields, especially the humanities, there is a dominant philosophical outlook: secular, progressive, relativistic, and now often hostile to most Western traditions. Objectivity has increasingly given way to activism; the careful rigor that once defined scholarship has eroded.
  • Secularism itself is hardly neutral: most people, including many of the most intelligent, are religious. This is a worldview alien to most of humanity, one that scorns all faiths or feigns a perfunctory respect in order to treat them clinically, taking their claims seriously only as objects of study.
  • Progressivism, too, is not some inevitable, universal norm or default position. It is at bottom a parochial ideology of Western “elites,” drilled into students at a small class of expensive institutions.

We will elaborate this analysis further in Thesis 4, but let us begin by discussing one of these four. What might be biased about “academia”? According to one interesting Wikipedia “essay,”

If a scholarly claim is principally unworthy of being taught at Cambridge, Harvard, Oxford, Princeton, the Sorbonne, and/or Yale [CHOPSY], then it amounts to sub-standard scholarship and should be never considered a reliable source for establishing facts for Wikipedia.

This is called “the CHOPSY test.” Although not official, this is the common and ruling view, often cited in support of edits. Basically, if a claim—or the source of that claim—would not pass muster at one of the elite Western universities, then it is considered “not a reliable source.”

The proper evaluation of such a policy is obvious to any fair-minded, educated person: This represents a very definite kind of bias. If you are tossing out sources because they are not used at Harvard, then you are going to have a deeply elitist bias. Having spent around twenty years of my life studying and then teaching at institutions of higher education, I can easily anticipate the scorn with which this will be met (reflected by the “Academic bias” essay):

Well, you may not like “GASP,” but what do you propose that we put in its place, for purposes of editing a serious, intellectually respectable encyclopedia? We are the professional intellectuals and writers, and yes, we reflect the views of the leadership of a world that is, indeed, increasingly global and driven by progress. We are open to a wide variety of views, but we do not check our brains at the door, and we do not suffer fools gladly. Knowledge is our business. If you want rational analysis of the facts, then ask our experts.

There is, however, a sensible—even obvious—response to this. Wikipedians claim to be tolerant of a wide variety of global views. Why not include actual representatives of those views, citing the sorts of sources they wish to cite? Those who dwell outside of Western Establishment bastions are not idiots just because they do not mouth the pieties of GASP. Some of them can write very well. There are other traditions, you know. They could write for Wikipedia, if you let them. But such true openness and genuine tolerance are unacceptable, precisely because those other traditions fail to pay exclusive homage to GASP sources, through which—Wikipedians imagine—all the benefits of global civilization flow.

A couple of examples should make it clearer how this attitude works out in practice.

Wikipedia has an article titled “Yahweh.” Now, as I am a Christian,[8] “Yahweh” is the name of my God. My observant Jewish friends would say the same (though they would not utter the word itself, which is called “The Name,” or in the Hebrew transliteration, Hashem, since it is sacred to them). The repeated uses of the phrase “the LORD” in the Bible are translations of the name of God.

But in the Wikipedia article, we read that Yahweh[9]

was an ancient Semitic deity of weather and war in the ancient Levant, the national god of the kingdoms of Judah and Israel, and the head of the pantheon of the polytheistic Israelite religion. Although there is no clear consensus regarding the geographical origins of the deity, scholars generally hold that Yahweh was associated with Seir, Edom, Paran, and Teman, and later with Canaan. The worship of the deity reaches back to at least the early Iron Age, and likely to the late Bronze Age, if not somewhat earlier.

According to Wikipedia, Yahweh was (past tense) one god (lower case) in a whole pantheon, the chief god in a polytheistic religion. The article thus presents as uncontroversial fact a theory that is held by Bible critics. The claim that Yahweh was a tribal war god is not a neutral, historical fact, but a modern theory, rejected by many of the most deeply erudite Bible scholars around the world, Jewish, Christian, and Muslim.[10] But to Wikipedia, the claim is treated as “neutral.” The page’s chief maintainers do not tolerate internal debate on the matter. But the article’s stance certainly is not neutral, precisely because it deliberately ignores the majority view on the topic named by the title, a view taken by the billions worldwide who worship Yahweh.[11] Even the views of serious scholars critical of the supposed secular “consensus” are omitted and treated with scorn.

On the talk page, one gatekeeper writes, “This article is neither about Judaism, nor Christianity. It is an article about Ancient history.” And later, “The Bible isn’t a valid source for evidence of authenticating history. See WP:RSPSCRIPTURE ... There is no Biblical perspective upon Yahwism [i.e., the religion of the ancient Israelites of the First Temple period].” This is convenient for those who like the article in its present state; it means Wikipedians who want to add the Jewish or Christian perspectives about Yahweh are simply not welcome to work on this article, despite the fact that it is indeed the name of their God. They are instructed to proceed to articles titled “God in Judaism” and “God in Christianity.” In the latter, the name “Yahweh” does not appear until some 1,400 words into the article. Hence, the view about the topic described as “Yahweh,” according to the largest religious grouping of people whose God is Yahweh—the Christians—is systematically marginalized by a comparatively tiny minority of gatekeepers.

Here is a different sort of example of Wikipedia’s cultural bias: “Chennai.” An Indian acquaintance of mine told me that, from an Indian point of view, while it’s a good article in many respects, it is strange. It dwells on topics and attractions perhaps of interest to British colonials and Western tourists, but it does not provide all the sorts of facts and analysis that a native would want. Also, the current lede and history sections are “colonial-centric.”

They treat the British East India Company’s arrival like an origin story, as if nothing of importance existed at the site of Chennai before that. Yet, long before the British showed up, the area was already part of major South Indian empires: Chola, Pandya, Pallava, and later, Vijayanagara. There were established settlements, temples, trade routes, and a distinct Tamil identity in the area. The current lede reads as though Chennai began with colonialism. The problem in this case, it seems to me, is not factuality but selection or emphasis. Perhaps it is true that the history of the modern, Westernized city began with the British East India Company. But from an Indian point of view, many centuries of history are glossed over.

The point is that different groups of people find different facts relevant to emphasize, even in cases when all such facts are, in some sense, “neutral.” An article written exclusively by Indians, using Tamil sources, would probably look rather different and be of more use to Indians—even if it were scrupulously neutral.

Skeptics might protest: Where are these robust alternative traditions capable of supporting excellent encyclopedia articles? Any such argument, however, would evince appalling ignorance of the mere existence of independent intellectual traditions in many places in the world. China and India have truly ancient intellectual traditions and quite active practices of journalism and education; while these are influenced by Western practices, they are not merely appendages. The same may be said for Japan, Iran, and Arabic-speaking centers of culture. For the rest, there are universities and journalism all around the world. I would say there is a vast untapped interest in knowing what the world looks like from their point of view—unfiltered, unpatronized, and unbowed by the opinions of snooty do-gooding Westerners. I dare say that, if given an opportunity to speak for themselves in English, many capable scholars (professors and students) as well as journalists in those countries would prove interested in developing neutral encyclopedic content that presents the world anew from within their unique frameworks. If, somehow, they did not have to worry about whether their habits, scholarship, and reporting passed muster with the GASP crowd, they would be far more motivated to get on board. What they produce would doubtless be deeply fascinating.

In short, then, Wikipedia is written by a particular kind of person. Only by pretending that his perspective (i.e., GASP) is normative—vanilla, factual, or neutral—can such a person claim to be the “voice of the consensus” (see Thesis 1), the best judge of “reliable sources” (see Thesis 3), and the arbiter of which views are really “neutral” (see Thesis 4).

To put it more briefly: At present, Wikipedians offer their own well-managed, curated perspective on global opinion. Any opinion outside of that perspective is dismissed as an unwelcome “minority or fringe view,” as global opinion usually is. Those who hold such opinions will be silenced, should they have the audacity to speak for themselves. Wikipedians will speak for them, or not, thank you very much.

Wikipedians should face up to these hard facts. It is time to record the glorious chorus of worldwide voices. I call on Wikipedia to become the global project it was meant to be, supporting the views of all of humanity—not just those of a narrow, snobbish Western “elite.”

The Reasonable Solution


But if we wish to record the chorus of worldwide voices, then—how?

In the spirit of one of my old rules ("Be bold"), I submit that Wikipedia should become more open:

Permit multiple, competing articles per topic.

As the need arises, people who find the current article to be biased, factually incorrect, badly organized, etc., would be able to start competing articles if they wish, on an ad hoc basis. At the same time, those who wish to write articles specifically for school children, or specifically for experts, etc., would be able to do so. Such articles would be added to the main namespace as alternatives when they were rated highly enough. To develop this solution, let me share some history and then talk about implementation.

This proposal of multiple, competing articles per topic was discussed when we were planning the project that became Wikipedia. One of Jimmy Wales’ partners at Bomis, the parent company of Wikipedia, was Tim Shell. Early on, Tim championed the idea of multiple articles for Nupedia, the predecessor to Wikipedia. I disagreed (I was editor-in-chief), and Jimmy backed me up; thus Nupedia and, later, Wikipedia became one-article-per-topic. Our reason was that there were not enough people involved to write many competing articles; if volunteers were competing to write articles on popular topics, they might not spend enough time on articles about the long tail of less important topics.

Now, did you catch that? Wikipedia allows only one article per topic mainly because, 25 years ago, there weren’t enough writers. Obviously, that problem is long gone. Wikipedia is now the biggest reference site in the history of the world. There is no shortage of people willing to write for a sane and truly open Wikipedia. The project would be absolutely flooded with new writers if the dominant, established editors were not so difficult and did not constantly chase away and block the untrained and undesirable newbies (see Thesis 8).

Wikipedia could recruit tens or hundreds of times as many writers as it has now. All the project would have to do is to allow a diverse humanity to write diverse articles within diverse frameworks.

But, you ask, how could that possibly work? I think there are many possibilities, but, having given this some thought, I propose seven organizing principles.

(1) New articles begin life in the Draft: namespace, or, possibly, a new namespace. So, effectively, they are hidden from the public, to start, and no one need be alarmed about the inclusion of total nonsense on the website.

(2) Articles automatically move to the main namespace when they meet certain objective criteria. Wikipedians would have to hash out exactly how this would work. Here are a few ideas that might be part of the mix. Each is worth debating. (a) The article is different from existing articles by at least 25%. (b) The article should be at least one-third the length of the main namespace article (if there is any), or 2,000 words, whichever is shorter. (c) The article must receive a minimum rating score across a diverse body of human raters (see Thesis 7); or, alternatively, the article must be rated “Approved” by an agreed-upon (i.e., suitably neutral) AI rating system. (d) The article should have at least two sources. (e) There should be at least three contributors (who have made reasonably substantive edits on the article). I personally am not wedded to any of (a)-(e). These are just ideas.

(3) The article creator determines who works on the article. Without providing any explanation, the article’s first author would decide whether the article’s roster of authors is (a) “by approval only,” i.e., only explicitly included accounts can contribute; (b) “filtered,” i.e., anyone may contribute, except those explicitly excluded from authorship; or (c) “open.” The article creator would also be ultimately responsible for maintaining the lists mentioned in (a) and (b). If an article is marked as open, then it cannot later be changed to by approval only or filtered; but a more restricted article could be made open.
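The three authorship modes in (3), together with the one-way rule (an open article can never be restricted again, while a restricted article may always be opened), amount to a small state machine. The following sketch illustrates it; the class and method names are hypothetical, invented here for illustration only.

```python
# Hypothetical sketch of the per-article authorship modes in (3).
# Mode names, class name, and methods are illustrative assumptions.

class ArticleAccess:
    MODES = ("by_approval_only", "filtered", "open")

    def __init__(self, creator: str, mode: str = "by_approval_only"):
        assert mode in self.MODES
        self.creator = creator
        self.mode = mode
        self.approved: set[str] = {creator}   # used in "by_approval_only" mode
        self.excluded: set[str] = set()       # used in "filtered" mode

    def set_mode(self, new_mode: str) -> None:
        assert new_mode in self.MODES
        # One-way rule: an open article can never be restricted again,
        # but a more restricted article may always be made open.
        if self.mode == "open" and new_mode != "open":
            raise ValueError("an open article cannot be made restricted")
        self.mode = new_mode

    def may_edit(self, user: str) -> bool:
        if self.mode == "open":
            return True
        if self.mode == "filtered":
            return user not in self.excluded
        return user in self.approved  # "by_approval_only"
```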

(4) Competing articles about the same topic are distinguished by differing frameworks. The article originator must explicitly declare a framework, a summary form of which will be placed at the top of each article that follows it. This is a particular combination of (a) audience, together with broad national and/or other intellectual tradition(s); (b) acceptable and unacceptable sources; and (c) “Overton window,” i.e., which opinions, broadly speaking, are viewed as being “within the range of respectability.” While all articles will be expected to be neutral within their framework, they will inevitably differ in what range of views are treated (see Thesis 3). Anyone may invent a new framework, and there is no need to have just one article per framework or to create a bunch of editorial committees.
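A declared framework, as described in (4), is essentially a small record of (a) audience and tradition, (b) acceptable and unacceptable sources, and (c) Overton window, with a summary form displayed atop each article. A minimal sketch, with field names that are my own assumptions rather than any proposed schema:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a declared "framework" from (4).
# Field names are illustrative assumptions, not a proposed schema.

@dataclass
class Framework:
    audience_and_tradition: str
    acceptable_sources: list[str] = field(default_factory=list)
    unacceptable_sources: list[str] = field(default_factory=list)
    overton_window: str = ""

    def summary(self) -> str:
        """Summary form to be placed at the top of each article."""
        return (f"Audience/tradition: {self.audience_and_tradition}. "
                f"Overton window: {self.overton_window}")
```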

Example frameworks are described below. For each, I give the intended audience and tradition, the acceptable sources, and the Overton window:

Status quo framework
  • Audience and tradition: General educated audience; Anglo-American academic tradition and mainstream media.
  • Sources: See “Perennial Sources” (on which, see Thesis 3). Primary sources are frowned upon, secondary sources preferred.
  • Overton window: Determined by reporting and research that is globalist, academic, secular, and progressive. All else is omitted or else explicitly labeled “fringe,” “minority,” or “false.”

Strict neutrality framework
  • Audience and tradition: General educated audience; an ideally global tradition, or no particular tradition; designed to minimize bias across all cultural, ideological, and academic divides.
  • Sources: All high-quality sources, primary and secondary, are encouraged, regardless of ideological orientation, national origin, language, etc.
  • Overton window: Views are included if they are significant in any major tradition or population; minority views are represented proportionally and not judgmentally labeled. “Report the controversy.” No special deference is paid to Establishment views.

Continental philosophy framework
  • Audience and tradition: Aimed at a generally educated readership, particularly suitable for scholars of philosophy and critical theory; grounded in the traditions of German and French (Continental) critical thought.
  • Sources: Only academic sources are used, with preference given to primary texts.
  • Overton window: The focus is on positions and themes currently under active discussion within the Continental tradition, the history of philosophy, and critical theory, while also taking analytic perspectives into account where relevant.

K-12 American school framework
  • Audience and tradition: Middle school level; typical U.S. school texts and standards.
  • Sources: Primary sources, but only if accessible; secondary sources acceptable; footnotes encouraged if useful for student research.
  • Overton window: Mainstream history, geography, current events, etc., as reflected in textbooks and age-appropriate library books.

Unbiased American politics
  • Audience and tradition: General audience. Old-fashioned middle-of-the-road mainstream reportage.
  • Sources: Left (e.g., New York Times), right (e.g., Fox News), and the more serious alternative sources (e.g., The Federalist, Jacobin) are all acceptable. Primary sources preferred, but news articles are fine.
  • Overton window: While true extremes are eschewed, if large segments of the Democratic or Republican party are discussing a view, it is fair game. Libertarian, Green, American, and other third-party views are also fair game.

Catholicism
  • Audience and tradition: General educated audience, especially for use of catechumens. Roman Catholic tradition.
  • Sources: Primary Catholic sources preferred, but on general topics, a wide variety of generally academic sources are quite acceptable.
  • Overton window: Both conservative and liberal wings of Catholicism are respected and must be fairly represented.

Reformed and evangelical
  • Audience and tradition: General educated audience, for pastors and laity alike. Focus on, but not exclusively, the Reformed tradition.
  • Sources: On theological topics, Reformed and other classic Protestant authors are best, but on other topics, a wide variety of (primary and secondary) sources are encouraged.
  • Overton window: On theological topics, the broad range of Christian discourse is fair game, but Protestant and especially evangelical Reformed views are central. All must be treated fairly.

French
  • Audience and tradition: For a generally educated audience, written in English but following modern French intellectual traditions.
  • Sources: See Sources fiables. These articles do not rely on a “Perennial sources”-style blacklist; instead, they make a point of carefully attributing controversial views to their proponents. Translations from French Wikipedia may serve as a natural starting point.
  • Overton window: Though written in English and intended to maintain strict neutrality, these articles reflect a distinctly French approach.

Sunni
  • Audience and tradition: Educated Muslim audience, especially those familiar with classical Sunni thought; reflects the Ashʿarī theological and Shāfiʿī legal traditions; suitable for students of traditional Islamic theology and law.
  • Sources: Relies on the Qur’an, canonical Hadith, and classical works by Ashʿarī and Shāfiʿī scholars (e.g., al-Ghazālī, al-Nawawī); preference for primary texts and commentaries within orthodox bounds; other sources as appropriate.
  • Overton window: Covers mainstream Ashʿarī and Shāfiʿī positions; critiques literalism and Salafī views as external; modernist or rationalist views included neutrally (as required by general Wikipedia policy), but only for contrast or clarification.

Modern Chinese framework
  • Audience and tradition: General audience within the context of contemporary Chinese public discourse; shaped by official state ideology and cultural continuity.
  • Sources: Official state publications, academically approved materials, and classical Chinese texts interpreted through a modern lens; other sources as appropriate.
  • Overton window: Traditional Chinese values and Marxist-nationalist synthesis emphasized; liberal democratic and Western critiques treated as foreign or peripheral.

Frameworks should not be considered to be distinct encyclopedias or restricted editorial groups. There can be multiple articles on the same topic and name started in similar frameworks.

Frameworks are not intended merely to codify biases. The goal is to acknowledge perspective frankly—a thing that Wikipedia at present both admits and refuses to admit, as we have seen—while maintaining neutrality within clearly defined boundaries. Articles will continue to be subject to a reinvigorated neutrality policy; see Thesis 4. For those who still care about strict neutrality, one of the frameworks I would encourage using is the “Strict neutrality framework.”

For example, one can easily imagine articles written about “Global warming.” The current article (on “Climate change”) works within the “Status quo framework,” asserting in Wikipedia’s own voice that anthropogenic global warming is an uncontroversial fact and that “contrarian” and “denier” voices are merely part of a manufactured controversy. This article, as it stands, is not neutral, and it needs work. Another article, written within the “Strict neutrality framework,” would eschew any particular view on whether there has been global warming and what its cause might be; it would cover with roughly equal attention the views of “climate change activists” and “global warming skeptics.” No view would be asserted as correct in Wikipedia’s own voice. Or suppose there were frameworks devoted to the Democratic and Republican Parties, respectively; while there would be strong emphases on climate change activism, in the former case, and skepticism, in the latter, both would be expected to range more broadly and to avoid taking definite positions, in order to remain neutral.

As you can see from the example, the notion that articles are always found within a “framework” does represent a concession to realism: it means Wikipedia would officially concede the obvious, namely, that different segments of a widely divergent humanity do take different approaches to neutrality, even when they are doing their best to follow the policy; again, the Thesis 4 discussion goes into detail on this point. It also means that we admit that people of widely divergent viewpoints cannot reach a consensus, even when they are sincerely aiming at neutrality.

Let us be clear, however. Under this system, neutrality would still require that all disputed views be attributed to their representatives. Thus, a “Reformed and evangelical” article titled “Salvation” should not simply state, in its own voice, what salvation requires, but should attribute such claims to a certain Bible writer, theologian, or Church council; and space must still be made for alternative views. The amount of space apportioned to other views might differ based on the “Overton window,” i.e., the range of discourse (a concept clarified further in Thesis 4). An article about firearms, describing itself as “French” (although written in English), might accord little space to the traditional American view on gun rights—because that view is not widely held in France.

(5) The Arbitration Committee and other bureaucratic groups must respect the rules of each framework, although, presumably, they must also apply some more general rules. In other words, in the same way that ArbCom currently respects (and enforces) rules described by the “Status quo framework” above, so also it would respect rules described by other frameworks. Obviously, there would have to be a new kind of rule, i.e., rules concerning frameworks in general. One example rule might be: Multiple articles may be started with very similar frameworks. This might be necessary with respect to academic articles, when different people within the same field cannot, or do not wish to, cooperate. But generally, the whole point of permitting multiple, competing articles is that this gives a way for very different people to avoid conflict (and thus the need for arbitration). This requires that each creator’s decisions be respected. Still, there would ultimately have to be some reliable means to ensure that truly crankish, idiosyncratic stuff is not added to the main namespace. That is the function of the next item:

(6) A rating system, or in lieu of that, an AI system, may be used for article sorting. An article rating system such as described under Thesis 7 would be an excellent way to determine which articles to show to the end user. If the system has not been built or if there are not enough ratings for a set of articles yet, then an open-source and open-data LLM, set up according to very broad principles of neutrality and fairness, would assign numerical ratings. The ordering of the articles contributed under a topic heading could be partly randomized, on the assumption that the AI ratings are not entirely reliable, and to prevent gaming the system.

(7) Application to search results and hyperlinking. Exactly how the introduction of competing articles affects hyperlinking and search results remains to be determined. Perhaps, if more than one article under a certain topic meets certain objective criteria, they can both be suggested when the user clicks a hyperlink or searches for the topic. It would be best if users could set their own preferences.

Advantages

  • GASP preserved for those who care about it. One great advantage of this proposal, from the perspective of current editors, is that it does not require that they change their current practices. They may continue to systematically exclude those who resist the requirements of the GASP framework. They can defend and, in a way, dignify their approach by honestly admitting what their framework has been all along. By supporting other frameworks, they may demonstrate their much-vaunted commitment to diversity, inclusion, and multi-culturalism.
  • A revitalized community. There would be large numbers of new participants, from a very wide variety of viewpoints, enthusiastic to get to work. Wikipedia could once again boast of a truly open, welcoming, and global project.
  • Richer content. A fascinating body of new content would be created that gives voice to viewpoints currently neglected by Wikipedia’s system. This could become a rich source of comparative cultural studies, useful for students and AI training data alike. Each article would act like an intellectual diplomatic attaché, representative of one framework alongside other frameworks. No longer would Wikipedia present just one Establishment-approved text that represents the “consensus” view of a narrow group of people. Wikipedia would, once again, upset the exclusive prerogatives of the powerful.
  • A more accurate representation of diversity. At present, non-Western cultures, minority views, and disfavored ideologies are represented by a single, totalizing, intellectually imperialistic perspective. Permitting multiple, competing articles within various frameworks would give readers a much fuller idea of the actual views held by those operating outside the very narrow, confining bounds of GASP.
  • Peace. The atmosphere in general would become more peaceful and collegial. Fundamental editorial conflicts would decline. The possibility of actual decision-making by consensus, albeit consensus within a framework, might be achievable once again.
  • Anti-corruption. If done right, this provides a possible solution to the general problem of corruption within Wikipedia, in which the highest bidder pays for a Wikipedia article to read a certain way, to wit: Less corrupt content will always be available.

3. Abolish source blacklists.

An anonymous “MrX” proposed a list of so-called perennial sources just seven years ago, which determines which media sources may, and may not, be used in Wikipedia articles. The page is ideologically one-sided and essentially blacklists disfavored media outlets. Wikipedians now treat this list as strict—but unofficial—policy. This approach must be reversed. Wikipedia should once again explicitly permit citations even from sources that the page currently blacklists. Rather than outright banning entire sources that can contain valid and important information, Wikipedia articles should use them when relevant, while acknowledging how different groups assess them. Neutrality requires openness to many sources; such openness better supports readers in making up their own minds.

Note: The first four theses all concern different aspects of neutrality. This involves some repetition and expansion of analysis, because the issues involved are so central and important.

The Problem

In the early days of Wikipedia, we were not quite so uptight about sources. Directly quoting primary sources was quite acceptable, as long as one did not do so as part of an original analysis (i.e., one that would require peer review). Citing Fox News and the Daily Mail was permitted, although the source would probably be named and any controversial opinions might be pointed out, if not obvious. Even citing blog posts could sometimes be appropriate. But in all such cases, the nature of the source was clearly documented and intelligently handled. Conservative and liberal news sources were often labeled as such. Self-edited “blogs” could be cited, if the author was worth citing, and the authorship had to be clear.

This common-sense approach to sourcing allowed Wikipedia to report a wide variety of opinions, making it possible to represent the full breadth and depth of thought found in the world.

But Wikipedia’s bizarre and arcane rules about sourcing have ruined this charming policy. The current rule set is, in short, an unbalanced overreaction, based on partisan ambitions and a certain narrative about the recent decline of the news media.

So, let me tell this narrative myself. It is a story about the impact of the internet on the way the news media is run, and it is the kind of story that Wikipedians generally tell themselves today. I will format the story as a quote, but only because I am not asserting this myself; I am only attempting to reconstruct the progressive media narrative. Here goes, then:

Once upon a time, there were Fox News and Rush Limbaugh. They became very popular by pioneering a novel “news talk” format. They were built around sharing mere opinion—right-wing opinion—mixed with news. They were frequently offensive and, worse, purveyors of misinformation. Matters grew worse from there. In the 1990s and early 2000s, in the wake of Limbaugh, other talk radio figures emerged. Then, as the profile of the internet rose, came the Drudge Report; ten years later came Breitbart and the Daily Caller. These spread misinformation further. They took the same sort of right-wing news-talk format and put it online, where it thrived. In the 2010s, social media began to dominate the news cycle, allowing consumers to bypass corporate gatekeepers. Thus, misinformation actually began to drown out legitimate news.

Against this, Wikipedia gradually took a stand. As the media landscape changed, Wikipedia’s rules for reliable sources sensibly adapted, by becoming more restrictive of those sources that mixed news and opinion and especially those that had a reputation for misinformation. There have been some who complained of bias in the restriction of sources, but mostly because of wrong-headed expectations of false balance.

Matters really changed, however, in 2016. Responsible and distinguished international organizations like the United Nations had begun to sound the alarm about misinformation. But their warnings had fallen on deaf ears. Donald Trump’s election and the U.K.'s vote to leave the EU, called “Brexit,” showed just how dire the problem was. That year, the gloves really came off. Because of just how egregious the misinformation had become, formerly staid, objective news sources had to do things they had never done before, such as saying that the president was lying, openly questioning the legitimacy of major political decisions on flagship news programs, and framing their reporting around moral urgency rather than detached observation. But still more sources of misinformation emerged or became radicalized.

The responsible volunteer editors at Wikipedia noticed these events with alarm, debating about sources of misinformation. Finally, in 2017, they began tabulating a list of frequently discussed sources that could—and could not—be trusted. The page is now called the perennial sources list. While sometimes mischaracterized as a “blacklist,” it was badly needed. Within a few years, it had been embraced by the Wikipedia community as a tool for responsible and consistent editing. Today, apart from far-right critics, of whom there are only a few remaining in the community, Wikipedia is celebrated for its fairness and remains cautiously optimistic about its ability to identify and eliminate misinformation.

This is their story, as they tell it to themselves. All I have to do is repeat it for you to see just how problematic it really is. It is just one possible perspective on how the media landscape changed. Let us consider the same events from a very different perspective, which is my own:

Many of my conservative and libertarian friends and I are old enough to remember the bad old days of three broadcast television networks and a univocal press; we heard only one side of the story routinely being told, or the Republican side being misrepresented, and the libertarian side—and other anti-Establishment views—being entirely ignored.

In 1996, Fox News was received as a novelty: a conservative alternative to all the other networks. In the years leading up to that, many conservatives had found Rush Limbaugh and the talk radio format to be a breath of fresh air. It was nice when new voices came along, people like Sean Hannity and Mark Levin. Still, the news media world was almost entirely dominated by Democratic editors.

Many of us who valued open debate were initially delighted that the internet was devoted to liberty. In the 1990s, we thought that feature was “baked in” and would never go away. It was no great surprise when Wikipedia launched with a genuinely neutral standpoint: that was to be expected. Nor were we surprised when Drudge, Townhall, WorldNetDaily, and others began to break new stories, though we did not take them very seriously.

Still, as the 2000s and 2010s progressed, we were increasingly disturbed by the growing bias in the mainstream media. It began as MSNBC and CNN switched to Fox News-style opinion reporting, as if left-wing narratives needed to be pushed even harder, even though they already dominated the media. Online, the Huffington Post, BuzzFeed, and Vox were launched as more openly biased leftist outlets. Social media and search engine algorithms also began to curate stories to favor left-wing and generally Establishment narratives. This slowly became noticeable, until it was entirely in-your-face.

At the same time, Wikipedia’s neutrality evaporated, being replaced by a pro-Establishment editorial stance. It now sounded like the New York Times and the BBC of that period.

Still, it was unexpected when, between 2013 and 2016, the left undertook a change that was utterly bizarre to old-guard free speech absolutists, old-fashioned liberals and libertarians alike. International NGOs and major internet corporations (“Big Tech”), now almost uniformly aligned with the left, began to signal their openness to censorship of what they found easy to dismiss as “misinformation.” In many cases, such “misinformation” was simply a disfavored opinion, or one critical of those in power. Then, during the 2016 presidential election and Brexit, the media landscape changed practically overnight. Most news outlets simply dropped all pretense of journalistic neutrality. It was breathtaking.

Soon, some huge stories were simply no longer being covered, not even by Fox News. This created a massive opening for reporters and podcasters in the “new media.” In the years that followed, outlets like Breitbart, the Daily Wire, the Blaze, Epoch Times, BitChute, and Rumble grew aggressively, becoming a new vanguard of alternative media. Fox personalities switched to online production, including, most famously, Tucker Carlson and Megyn Kelly. Podcasters like Joe Rogan and Lex Fridman soon became dominant in a way they had not been before. Social media influencers effectively promoted these sources. Soon they were serious competitors of the former gatekeepers of mainstream news, which was in turn dismissed as “the legacy media.”

During the Trump and Biden administrations, it became obvious that Wikipedia’s once-vaunted neutrality was entirely gone. It imitated the now-partisan reporting of Establishment left news outlets. Larry Sanger took to calling himself “ex-founder” of Wikipedia and made a series of three posts revealing how biased the encyclopedia had become. Among other things, Sanger revealed that there was now a highly partisan blacklist that blocked the use of the more traditional conservative news sources as well as the new media. Wikipedia, once defined by neutrality, had enshrined the editorial biases of “legacy media” outlets by policy.[12]

The first narrative, above, is associated with the Establishment left; the second is associated with the anti-Establishment right. The first can boast of institutional backing. The second can boast of faithfulness to Wikipedia’s original principles. The reason I tell both stories is to make it abundantly clear that the Establishment view is, indeed, only one story. Doubtless, matters look different still, if you are from Eastern Europe, India, China, or Russia.

Both stories mention this Wikipedia page: Wikipedia:Reliable sources/Perennial sources. It claims to be an “information page,” not official policy. It is de facto binding nonetheless, consisting of “a list of repeatedly discussed sources, collected and summarized for convenience.” The discussions mentioned here take place on Reliable sources/Noticeboard. Wholly “deprecated” sources include, for example, Breitbart, the Daily Caller, and Epoch Times. “Generally unreliable” outlets include much of Fox News reporting and all of the New York Post and The Federalist, to give just a few examples.

By contrast, consider the dyed-in-the-wool progressive sources marked as “Generally reliable” and green-lit: The New York Times, Washington Post, CNN, MSNBC, even far-left stalwarts like The Nation, Mother Jones, and GLAAD.

Wikipedians, both rank-and-file and top administrators, deny that this page is policy while treating it as exactly that. Consequently, this blacklist matters. In practice, it determines what can and cannot be cited on Wikipedia. This poses several obvious and enormous problems:

  1. Facts are omitted. Legitimate stories and facts reported only by the sources listed in yellow, red, or grey will rarely appear on Wikipedia (or, if they do, only to be dismissed). That this occurs is shown by any number of examples of stories broken in disfavored conservative and new media sources and only later admitted by mainstream sources. Such stories have included the Hunter Biden laptop scandal, the lab leak theory of COVID-19 origin, censorship and coordination between government and Big Tech platforms, the issues with biological males competing in women’s sports, etc. Such stories have, as a result, gone uncovered, been dismissed as “fringe” views, or been deferred for years. There is, by the way, a similar silencing of any academic theories (or details thereof) that appear only in primary sources, which secondary sources have not discussed, regardless of how influential they are. In both types of cases, Wikipedia is dumbed down by policy.
  2. Legitimate opinion is ignored. Conservative, libertarian, and generally non-Establishment opinion pieces, which can be important to cite as sources in any article that touches on current socio-political issues, are generally dismissed as coming from a deprecated or unreliable source. As a result, such opinion often simply does not exist, as far as Wikipedia is concerned. Similarly, opinion that unquestionably originates with a certain well-known commentator is disallowed if it appears on a blog.
  3. Religious doctrines are essentially asserted to be false; leftist pieties are approved. Important religious sources, including Christian (such as CBN and World Christian Encyclopedia, published by Edinburgh University Press), Jewish (e.g., Jewish Virtual Library and much Anti-Defamation League content), and Hindu (e.g., OpIndia and Swarajya), are deprecated or blacklisted.
  4. Conservatives are alienated. The vast majority of conservatives working on political topics have, predictably, left Wikipedia in disgust, having repeatedly had the experience that legitimate information simply cannot be shared on Wikipedia because it happens not to appear in a supposedly “reliable source.”

The fact, plain to every fair-minded observer, is that the “Perennial sources” list is deeply partisan. It favors left-wing media sources; it hamstrings right-wing and religious media sources.[13] There are similar issues with the restrictions on use of primary sources. Such policies make it impossible for the original reporting done in much of the new media, as well as new research done in academia, to be catalogued in Wikipedia. This includes some of the most vital reporting and relevant research done today. This is wrong, and it flies directly in the face of the neutrality policy itself.

The Reasonable Solution


My proposal has four simple parts:

(1) Jettison the perennial sources list, which is simply a censorious blacklist. Stop linking to it; stop relying on it. Move it to a subpage of its original author’s user page. Compared with the others of these nine theses, this change might be easier to make, since the blacklist is comparatively new among Wikipedia policies.

(2) Permit sources to be cited much more broadly. In keeping with the original and genuine neutrality policy, do attribute controversial views to their owners or sources. Do not simply deprecate entire points of view because they do not appear in mainstream media sources or in secondary sources. It is cynical and simply wrong to use policy to silence dissent and suppress ideas.

(3) If you embrace the competing articles proposal of Thesis 2 (q.v.), then you might retain the perennial sources list, but only within the “Status quo (GASP) framework” (see the table toward the bottom)—not for any other frameworks.

(4) After adopting these more open, tolerant, and inclusive policies, Wikipedians must set aside and completely rethink the conclusions they formerly reached on Reliable sources/Noticeboard, many of which were likely reached without full and genuine input from the global editor community.[14]

4. Revive the original neutrality policy.


In short, Wikipedia must renew its commitment to true neutrality. The present policy on neutrality should be revised to clarify that articles may not take sides on contentious political, religious, and other divisive topics, even if one side is dominant in academia or mainstream media. Whole parties, faiths, and other “alternative” points of view must no longer be cast aside and declared incorrect, in favor of hegemonic Establishment views. Solid ideas may be found in some of the first policy statements, including the first fully elaborated Wikipedia policy and the Nupedia policy of 2000.

Note: The first four theses all concern different aspects of neutrality. This involves some repetition and expansion of analysis, because the issues involved are so central and important.

The Problem

The original Wikipedians loved and practiced neutrality in the early days, but many Wikipedians later came to repudiate it entirely. The original and ordinary notion of neutrality is, simply, that when an article mentions a topic of controversy, it should be impossible to tell what position the article’s authors take on the controversy. In short, Wikipedia should not take sides. This is how I put it in my last draft of the “Neutral point of view” policy page in January 2002: “The Wikipedia policy is that we should fairly represent all sides of a dispute, and not make an article state, imply, or insinuate that any one side is correct.”[15]

The “Neutral point of view” policy page now starts this way: “All encyclopedic content on Wikipedia must be written from a neutral point of view (NPOV), which means representing fairly, proportionately, and, as far as possible, without editorial bias, all the significant views that have been published by reliable sources on a topic.” Further down, we read: “While it is important to account for all significant viewpoints on any topic, Wikipedia policy does not state or imply that every minority view, fringe theory, or extraordinary claim needs to be presented along with commonly accepted mainstream scholarship as if they were of equal validity.”

I have some questions for the more recent Wikipedians who advocate the newer policy:

  • You refer to “significant views.” Very well. But you Wikipedians do realize that well-informed people can disagree about whose views are “significant,” don’t you? Therefore, why do you not clarify that a wide variety of views are, in fact, “significant”?
  • Again, you speak of “reliable sources.” Have you really made neutrality itself dependent on some prior notion of which sources are reliable? Different worldviews disagree sharply about what sources are reliable. Surely you know that. If you start declaring, in advance of any consideration about neutrality, that certain sources are unreliable, you inevitably make the entire encyclopedia biased. Do Wikipedians really not realize that—are you only pretending to be naïve, or do you embrace the bias openly?
  • You now say you want to avoid giving “equal validity” or “false balance” to competing views. Really? Such jargon was introduced by ideological journalists in the late 1990s in order to openly reject traditional ideals of journalistic objectivity or neutrality. To speak of “equal validity” and “false balance” is precisely to pick winners and losers, regardless of how widely the competing views are held in the general population. Such a practice is, therefore, the reverse of neutral.
  • Finally, what happened to the language focused on disputes, and in particular about not representing any one side as correct? Surely Wikipedia still wants to avoid taking sides in controversies. Yet it seems Wikipedia now officially permits, or perhaps requires, taking sides on hotly debated issues, and even calls doing so “neutral.” Don’t you know that this directly contradicts the original neutrality policy?
Is Wikipedia’s current notion of neutrality the same as the original? Shareable graphic made by a friend.

Doubtless, these are uncomfortable questions for Wikipedians. They should be.

Just as you might expect from a reading of the updated policy, Wikipedia is now full of bias. Generally, this takes two main forms: casting aspersions on disliked politicians, institutions, ideas, etc., in a way that essentially means taking one side in a dispute; and omitting essential information with the same effect. Let us get some examples on the table:[16]

  1. For four months last year, Wikipedia had a page titled “Grooming gang moral panic in the United Kingdom,”[17] about the (mostly Pakistani) gangs in Rotherham and other British towns that primarily preyed[18] upon teenage girls. Despite the easily established fact that tens of thousands of girls had been raped, often with no consequences for their rapists, Wikipedia reduced it to a “moral panic.” This outrageous coverage by Wikipedia was reported in turn by Unherd and GBNews, after which the article was retitled. Even if the seriousness of the criminal activity were merely a “moral panic,” it would not be Wikipedia’s place to make up the reader’s mind about such a controversial question, and certainly not in the article title. Have these people no shame?
  2. The article about the God of Christianity and Judaism, titled simply “Yahweh” (with no further parenthetical clarifier), defines its subject like this: “Yahweh was an ancient Semitic deity of weather and war in the ancient Levant, the national god of the kingdoms of Judah and Israel, and the head of the pantheon of the polytheistic Israelite religion.” You read that right. Yahweh, also called Jehovah or the LORD in your Bible, was (not is) “the head of the pantheon of the polytheistic” (yes, pantheon, and yes, polytheistic) “Israelite religion.” Because, as apparently every single Bible scholar has agreed since, maybe, the 1990s, “Israelite religion” was originally polytheistic. So, no one else’s views on that need to be mentioned in a neutral article titled “Yahweh”! I hope my sarcasm is evident. The academic provincialism is truly risible. But seriously, this is merely the view of secular religion professors. Many serious religious Bible scholars, with deep knowledge of all aspects of Bible study, naturally disagree; their seriously held and defended views are quite simply ignored, as if they did not exist. (See Thesis 2 for more on this example.)
  3. On Wikipedia, there is no such thing as cultural Marxism; there is only the “Cultural Marxism conspiracy theory,” which is, according to the article, “a far-right antisemitic conspiracy theory that misrepresents Western Marxism (especially the Frankfurt School) as being responsible for modern progressive movements, identity politics, and political correctness.”[19] The problem here is that to dismiss the discourse surrounding “cultural Marxism” as a “conspiracy theory” is to dismiss it as obviously wrong, when, in fact, many serious philosophers and other scholars are capable of giving the analysis a lengthy, detailed defense. The bias appears even in the title of the article. The article goes out of its way to associate this “conspiracy theory” with white supremacism, even poisoning the well by linking it to mass murderer Anders Breivik. Suffice it to say that this article takes a hysterical position on a greatly controversial topic. I am not the only one to say this about the article; there has been an ongoing controversy about it.[20] A neutral article would include attacks, defenses, and rebuttals on both sides.
  4. In the first few years after the emergence of the COVID-19 virus, Wikipedians refused to allow any suggestion that the virus originated in a “lab leak,” often meaning a laboratory in Wuhan, China. Depending on the specific article, they either banned mention of the theory or labeled it a “conspiracy theory,” as a 2021 CNET article reported. Even in 2021, there was significant mainstream scientific discussion of the lab leak theory.[21] Hence, Wikipedia had taken a stand on what was an increasingly controversial question. That exhibits bias. Today, mainstream scientists regard the lab leak theory as highly plausible but still unproven, with many thinking it the most likely explanation. But Wikipedia? Ironically, one might argue that Wikipedia has ignored the evolving consensus. Wikipedia does not even report on the controversy, but continues to take one definite position: the virus “was derived from a bat-borne virus and most likely was transmitted to humans via another animal in nature, or during wildlife bushmeat trade” (“Origin of SARS-CoV-2,” June 26, 2025). Thus, Wikipedia rejects neutrality on the topic altogether, claiming, “While other explanations, such as speculations that SARS-CoV-2 was accidentally released from a laboratory have been proposed, such explanations are not supported by evidence.”
  5. Generally, disapproved political figures and commentators are often described dismissively, in ways their supporters regard as false or misleading. In short, Wikipedia takes sides in partisan politics. There are many examples. Here are just a few, collected in 2023: “Trump promoted conspiracy theories and made many false and misleading statements during his campaigns and presidency, to a degree unprecedented in American politics.” (“Donald Trump”) Obviously, neither the president nor many of his supporters would agree that he has made “many false and misleading statements”; the Wikipedia article expresses the opinions of his mostly Democratic opponents. Another example: “Dinesh Joseph D’Souza [...] is an Indian-American right-wing political commentator, author, filmmaker, and conspiracy theorist.” (“Dinesh D’Souza”) One doubts D’Souza would call himself a conspiracy theorist; this is a dismissive and obviously controversial claim made in Wikipedia’s own voice. “[Candace] Owens has claimed that the effects of white supremacy and white nationalism are exaggerated [...] and promoted misinformation about COVID-19 vaccines.” (“Candace Owens”) As she is a black woman, it is especially silly to state, in Wikipedia’s own voice, that she thinks “the effects of white supremacy and white nationalism are exaggerated.” One wonders how Owens or her defenders would respond; the article does not say.
  6. In the last few years, both Hindus and Jews have reached out to me with complaints about unfair treatment of their entire ethnic groups by Wikipedia. To be clear, I cannot say without deep study, as I told them both repeatedly, whether their complaints are fully warranted. Still, the individual examples and patterns they cite are very plausible and consistent with a third case: Wikipedia’s dismissive and sometimes hostile treatment of confessional (traditional, orthodox) Christianity. For their part, Jews and Hindus have a remarkably similar pattern of complaint against Wikipedia. Their critics on Wikipedia, both groups say, wholly ignore their perspectives on historical and recent events, exaggerate their opponents’ criticisms of their politicians and government actions, and blacklist media sources that take their side in ongoing disputes (such as the Anti-Defamation League and Jewish Virtual Library for the Jews, and OpIndia and Swarajya for the Hindus).[22] If such complaints bear out, it is not too much of a stretch to say that Wikipedia’s bias is antisemitic and anti-Hindu, as well as anti-Christian.
  7. Perhaps the easiest way to find examples of pronounced bias on Wikipedia is to list certain Culture War issues. In particular, whenever defenders of Establishment points of view most stridently criticize their opponents, you can expect the corresponding Wikipedia articles to be quite biased. So, some unsurprising examples of overt or extreme bias include: Misinformation related to the COVID-19 pandemic; Gamergate; January 6 United States Capitol attack; woke [concerns the word]; Climate change denial; Antifa (United States); Alternative medicine; Transgender youth; Conversion therapy; Drag panic; Critical race theory; Deep state conspiracy theory in the United States. Examples of pronounced bias include: 2020 United States presidential election; United States and the Russian invasion of Ukraine; White privilege; White genocide conspiracy theory; Men’s rights movement; Drag Queen Story Hour; Moral panic; Blue Lives Matter; Gender-critical feminism (i.e., TERFism); Christian nationalism. What these articles have in common is that they strongly embrace a single (woke or Establishment) viewpoint or narrative framing of a hot-button Culture War issue.[23]

In general, the point is that Wikipedians presently see nothing whatsoever wrong with taking sides on serious controversies. If you are late to the party, you might wonder: “How can this be? Wikipedia has a neutrality policy. Do they think the above are all examples of neutral writing? How can they?”

Let us take a step back now and try to understand what is going on at a deeper level. To this end, we must dwell on another problem that the neutrality policy page (“WP:NPOV”) presents. This will be a little abstract, but abstraction is a feature of the very concept of neutrality, and it is essential to my point; so, stay with me, please.

It is obvious (and well discussed)[24] that if we praise some writing as “neutral,” we mean it does not take a position among some set of hypotheses, policies, doctrines, opinions, etc. To this extent, we may concede that neutrality is a relative concept: some text can be neutral relative to one set of opinions (say, the range of Christian views on God) but not another (such as Hindu views). In other words, the text can be evaluated with respect to different scopes.

Next, let’s explore an example. An article in a specialized encyclopedia, say about American politics, might discuss owning firearms as a civil right, which it is—under the Second Amendment of the U.S. Constitution. Suppose the article goes on to cover what sorts of restrictions, if any, it is appropriate to put on this right. Such an article is not necessarily biased in favor of such a right, just because it takes it for granted, when the context is the American system; or you might say the whole encyclopedia is “biased” in favor of American politics, but that is just how this particular encyclopedia is labeled. But the same verbiage in an encyclopedia of British politics would be obviously biased, because in that context, a right to own firearms is currently rejected.[25] The American verbiage would seem strangely biased in a British context.

Now, look back at Wikipedia’s policy language: Wikipedia articles must represent “all the significant views that have been published by reliable sources on a topic.” The phrases “significant views” and “published by reliable sources” have a special function: they restrict the scope. The example I just gave illustrates the idea of scope with American and British politics. Now, Wikipedia implies, quite correctly, that it cannot attempt to represent all views of a subject. That would be unrealistic; there are many cranks in the world, after all. But, if we must restrict the scope of our discussion, then which views are “significant,” and which sources “reliable”? When we dig into how Wikipedia handles those questions, we find deep problems, as well as a precise way to diagnose Wikipedia’s bias problem.

Wikipedia, for the last several years, has become increasingly opinionated about which views are “significant” and which sources are “reliable.” The examples of bias discussed above mostly reflect strong opinions that certain views are, indeed, “insignificant.” As it happens, we can handily encapsulate the views that Wikipedia finds “significant” as a particular combination of outlooks: globalist, academic, secular, and progressive, or GASP for short. I introduced this GASP analysis in the discussion of Thesis 2. Here, we will document it more fully, using Wikipedia’s own sources. It is not hard to show that Wikipedians strongly approve of their own GASP framework.

1. Globalist

The term globalism now has two conflicting senses.

According to one innocent and virtuous sense, “globalism” involves tolerating and even promoting cultures from around the world, without favoritism. Wikipedia was globalist in this sense from the start, in the interest of peace and education. But a second sense involves giving global prominence to certain powerful international institutions—major NGOs, multinational corporations, universities of global influence, etc.—as well as to the ideas and culture of Western elites. What makes this “globalist” is that such institutions and culture are imposed, from above, all across the globe through organizations like the United Nations.

Now, the English-language Wikipedia does make an attempt to document knowledge of the whole world, not just the English-speaking world. At the same time, it admits in its own oft-cited “essays” that it has a pro-Western systemic bias; this is borne out by studies cited in the Wikipedia article about itself.[26] Wikipedia’s notions of the views that are “significant” align fairly closely with those of globalist institutions such as the U.N., the World Economic Forum, and the World Health Organization; and it is no exaggeration to say that the green-colored items on the infamous Perennial sources list are, to a very great extent, “globalist” in the second sense (cf. Thesis 3).

2. Academic

In issues where the academic mainstream differs from popular, religious, and other views, the academic views are presented as factual and are asserted in Wikipedia’s own voice, i.e., typically without being attributed to particular representatives. In articles on broad topics, contrary views—regardless of how common, and even if they have many academic representatives—might be ignored or treated only insofar as they are studied as subjects of “objective” analysis from a mainstream academic point of view.

The Wikipedia “essay” titled Academic bias well represents the project’s prevailing attitude. In its introductory paragraph, we find: “This essay discusses why Wikipedia has, and should have, a pro-academic ‘bias’.” The scare quotes around “bias” are telling. The authors claim further down that “a pro-academia bias is no violation of WP:NPOV.” This is all strictly ludicrous, since academe as such certainly can be biased; there is a strong odor of deliberate, studied naïveté throughout the essay.

In its mainstream academic triumphalism, it is as if Wikipedia were merely pretending to be entirely unaware of quite open political and other biases rife in academe, of the faddishness of science, of the irreproducibility of much academic research, or of the other well-known foibles of academia, such as incompetence, dishonesty, and irrationality.

3. Secular

The Wikipedia “Neutral point of view” policy section about religion makes it clear that Wikipedia, by official policy, opposes the presentation of religious views in any sympathetic way. Rather, religion articles should both “encompass what motivates individuals who hold these beliefs and practices” (in other words, psychologizing) and “account for how such beliefs and practices developed” (in other words, giving a naturalistic explanation of religious belief).

The section actually argues at some length against those who “object to a critical historical treatment of their own faith because in their view such analysis discriminates against their religious beliefs.” In Bible studies, the historical-critical method involves ruling out of court, as a point of methodology, all supernatural claims. Thus, by policy, Wikipedia practices methodological naturalism. As the Wikipedia article states, “Methodological naturalism is an approach taken from the natural sciences that excludes supernatural or transcendental hypotheses from consideration as hypotheses.” As a result, Wikipedia’s articles on religious topics include many claims made in its own voice that straightforwardly assume that religions are false, based on naturalistic assumptions common to the secular study of religion.[27]

4. Progressive

The one component of GASP bias that some Wikipedians will not so easily admit to is a progressive bias. We could give a long list of examples of progressive bias found in Wikipedia; we saw some above, and I had already compiled more on my blog. I am not, of course, the only one to say so. Wikipedia has a long article on “Ideological bias on Wikipedia” (don’t expect it to be particularly unbiased, though). And there have, of course, been many academic studies, which generally show the obvious: Wikipedia does indeed exhibit various biases, including ideological ones.[28]

Much of this bias appears to be decentralized and uncoordinated. But high-level coordination appears to be part of the explicit aim of the Wikimedia Foundation itself (the WMF is the nonprofit that hosts Wikipedia). In 2017, Katherine Maher, then the CEO of the WMF, re-imagined the nonprofit’s mission as “top-down social justice activism and advocacy,” as journalist Ashley Rindsberg described it.[29] This is the so-called Movement Strategy, or Wikimedia 2030, which posits that Wikipedia is a “social movement” that stands for “social ideals” and “social progress” toward “a more just and connected future.” This is typical bureaucrat-speak, which means, essentially: We’re going to push progressivism harder.

It is hard to believe, but it’s true: Maher came out and said openly that she had come to oppose the “free and open” principles of Wikipedia, because such principles facilitate an offensively “white male Westernized construct of who matters in societies ...”. The irony is profound.[30] Maher was speaking in 2017, just as BlackRock, State Street, and others began using DEI (Diversity, Equity, Inclusion) metrics in ESG (Environmental, Social, Governance) voting policies. The impulse to push all things progressive was very much “in the air” at the time. And if the excesses of DEI have moderated somewhat in the last year or two, Wikipedia’s biases have not moderated at all.

In short, according to Wikipedia’s present rules and practices, the range of acceptable views—those called “significant” in the verbiage of the policy page—is that of the Establishment, particularly the Western-globalist, academic, secular, progressive Establishment. Similarly, acceptable sources are only those academic works and news articles from “reliable” sources, which express what can be called an Establishment view. So, if the GASP view is univocal, it may be stated by Wikipedia as fact, and in its own voice. It simply does not matter if significant portions of humanity disagree; they may be disregarded or relegated to contemptuous dismissal as “fringe,” “extreme minority,” “pseudoscience,” and the like. The result is that Wikipedia is now biased in many articles, and by policy.

How things have changed. When I was there and for several years afterward, Wikipedia was criticized mainly because it made an intellectual home for the eclectic, the amateur, the offbeat, and the foreign. At the time, I myself thought it needed to be more serious, even while retaining our eclecticism. I remained a supporter for a time even after I left, and many people loved the plucky, growing project and celebrated it for precisely how it dealt with so much quirkiness in an open way. The suggestion that Wikipedia would ever become a key mouthpiece of a future Establishment would have been outrageous and implausible. We worked hard to accommodate voices and perspectives from many different countries, religions, ideologies, etc.; I thought that was a difficult ideal to implement, and I wondered how that would play out in the long run.

It is very disappointing to me that it worked out as it did. I strongly suspect the platform was deliberately captured, though the insider stories remain to be told. Regardless, the platform was turned against its own former purposes. We wanted to give a voice to the voiceless, but what emerged was one of the most effective organs of Establishment propaganda in history. The suggestion that it would be focused on an exclusively and unapologetically “globalist elite” perspective would have been both implausible and offensive.

The Reasonable Solution

[edit]

It would still be eminently reasonable—and laudable—for Wikipedia to embrace wholeheartedly its original, quirky neutrality.

I urge Wikipedians to return to the higher and more demanding standards of the website’s original intellectual tolerance. They must recommit themselves to genuine neutrality. These standards were higher because they took the ideal of neutrality much more seriously; they were more demanding because they required editors to represent fairly a wide variety of views.

Frankly, invested ideologues will probably fight this tooth and nail. But tolerating a wider range of views (and sources) is not impossible. We merely need to ensure that views not in line with the GASP view are not dismissed as “false,” “misinformation,” “pseudoscience,” and the like, and are properly detailed and attributed. An improved Wikipedia, true to its roots, would not claim in its own voice that false statements are false, unless there is universal agreement (an extreme rarity). Rather, it should state, in its own voice, that one group holds a certain view, while another group—perhaps described as distinguished experts at leading universities—disagrees.

Rational decision-making requires that readers be provided an unusually wide variety of views to consider. They must be given tools to make up their own minds. Neutrality liberates the individual; that is its purpose. By manipulating public opinion, Wikipedia in its present form has twisted neutrality into its opposite.

Imagine a Wikipedia that appeals to the different, the “other,” the square peg, and the disenfranchised. Imagine it is not vanilla, global, and safe (in the West). Imagine how vital and interesting Wikipedia would be if it became a repository of a wide variety of views, not focusing just on whatever we are supposed to believe according to the GASP experts at Western universities, thinktanks, and newsrooms. Do you want Wikipedia to become more widely respected again? That is how to make it so.

How Wikipedia’s original neutrality policy worked

[edit]

An old-fashioned global audience

Strict neutrality (as I am now calling it: the original kind) does not privilege any one type of audience. While an article is written in the target language, its writers go out of their way to address all readers explicitly, acknowledging a wide variety of backgrounds. As such, it must not presuppose a Western-educated background.

Therefore, it should provide either explanations that are well-chosen to be comprehensible to people from across many cultures, or it should explicitly introduce the topic for the sake of different people, acknowledging that (for example) American, Indian, and Chinese nationals frequently have particular blind spots. In a wide-ranging and neutral article on “God,” therefore, special care would need to be taken to unpack the idea of impersonal divinity (for Americans), exclusive monotheism (for Indians), and a sharp creator-creature distinction (for Chinese nationals). Wikipedia in its current form privileges the common views of educated progressive Westerners and really does not bother to explain things to others—even, sometimes, the fundamentals of the topics supposedly being introduced.

Allow primary sources to be widely used

Wikipedia policy, as practiced, discourages many legitimate uses of primary sources. Often, editors allow only information that appears in secondary sources, keeping articles from achieving the highly granular level of detail they could have. A humorous example of the narrow-mindedness of Wikipedia’s policy on primary sources goes back many years: Jimmy Wales was born on August 7, according to his mother, but according to his birth certificate, the date is August 8. The latter was a mistake. But Wikipedia refused to correct the mistake, regardless of what the ultimate primary source on the matter—Jimmy Wales’ mother—clearly stated.[31]

I know why Wikipedians have this animus against primary sources: using them is a little too much like “original research.” But, as the author of Wikipedia’s original research policy—which it inherited from Nupedia—I can tell Wikipedians that the policy was meant simply to prevent people from posting to Wikipedia the sorts of theories that need peer review. But once a paper or book has been published by a reputable press, and especially if it is reasonably well-cited, I never would have stood in the way of it being referred to in an article. In many cases, primary sources ought to be regarded as the gold standard for citation. Discussion in secondary sources is an adequate but not necessary way to avoid being “original research.”

Scrap the needless additions to “reliable sources”

As I stressed in Thesis 3, a great many of Wikipedia’s core tenets about “reliable sources” need rethinking, because their evident purpose is excessively restrictive. I will approach the issue from a slightly different angle here.

The snobbery about reliable sources in journalism must be scrapped. There is an enormous wealth of valid information found in second- and third-tier sources, and non-English sources; I have no problem even with blogs, or even tweets, depending on the authors’ relevance and on whether they are who they say they are. Obviously, however, when one makes use of a source that has a particularly embarrassing track record of errors—say, The New York Times—it is important to note the source’s alleged biases or limitations,[32] particularly if it is the only one reporting some alleged fact.

The wrongheaded analysis of reliable sources ultimately fails to notice that Wikipedia is only an aggregator of alleged knowledge, which, when speaking on behalf of large, diverse groups, is the only kind of knowledge we can agree upon. A genuinely global information resource must represent an open bazaar; you might find treasures, but caveat emptor. So, neutrality means accepting a very wide variety of sources, but being more aggressive about attributing claims, especially when others disagree with or doubt the claims. Wikipedians must not pretend to be ignorant of the fact that people do debate about sources, after all, and about which ones may be cited. That means that sources making disputed claims cannot be cited in Wikipedia’s own voice; they must be explicitly attributed to the source.

Minority, weird, and “fringe” views

One particularly annoying thing that Wikipedians have started doing is to declare, in Wikipedia’s own voice, that certain claims are “false,” that certain theories are “fringe” or “pseudoscience,” that certain people are (unadmitted) “conspiracy theorists,” and so forth. (See the longer list of pejoratives below.) They are essentially aping the “advocacy journalism” practices of the mainstream media.[33] This is obnoxious and offensive, not because we should take weird theories and cranks seriously, nor because the truth is relative, but because anonymous Wikipedians should not presume to declare what the whole world must believe. This is especially true after they have cherry-picked “reliable sources” to fit narrow Western Establishment narratives. Such a practice is precisely against the neutrality policy; nothing could be more obviously against it. Let the people make up their own damned minds.

Now, when large groups (as, for example, astronomers talking about the Flat Earth theory) feel it is crucial to label something as “pseudoscience,” then by all means, report this fact: “The common view among the academic and scientific research community in Astronomy is that the Flat Earth theory is nothing more than pseudoscience.” But in that case, it is not Wikipedia that has made the claim, but astronomers.

It is also very important not to declare something to be a “consensus” when it is no such thing. Indeed, across all branches of academia and research, the state of the field is constantly debated and reconsidered—even things broadly taken for granted. Wikipedia does an end-run around such debate when it declares a “consensus,” as it does far too often. Falling in line with a “consensus” is just not how science advances; skepticism and testing limits are.

More generally, we human beings cannot really make up our minds rationally about some topics until we have surveyed everything that everybody is saying. If we are told that something is a matter of “consensus,” when it is not, then we will inevitably be pressured into believing something on poor evidence, which might be a half-truth or an untruth.

List of biasing pejoratives

Avoid the following pejoratives, except if self-claimed, or with attribution, and then with great care. Wikipedia generally has no business applying, in its own voice, terms including, but not limited to:

racist; sexist; homophobic; transphobic; fascist; authoritarian; conspiracy theorist; white supremacist; neo-Nazi; anti-vaxxer; far, extreme, radical + left, left-wing, leftist, right, right-wing; hate group; hate speech; climate denier; misinformation or disinformation spreader; revisionist (used pejoratively); discredited theory; extremist; alt-right; alt-left; pseudo-intellectual; cult; controversial figure (when vague); anti-democratic; propagandist; demagogue; unreliable source (without justification); ideologue; dangerous ideology; radicalized; grifter; gaslighting; dogwhistle; reactionary; globalist; elitist; science denial; quackery; progressive ideologue; neo-Marxist; cultural Marxist; Antifa member; anarchist; utopian; class warrior; race-baiter; DEI advocate; cancel culture proponent; political correctness enforcer; identity politics advocate; woke; collectivist; Maoist; Marxist; communist; marginal figure; intersectionalist; nanny statist; redistributionist; toxic masculinity; election denier; bigot or bigoted; Islamophobic; ableist; xenophobic.

Also, avoid unattributed words about factual claims, theories, etc.:

true; false; misinformation; debunked (when not properly attributed); unfounded; unsubstantiated; refuted; baseless; disproven; discredited; nonsense; irrational; myth (when used dismissively); hoax (unless exposed and truly universally regarded as such); junk science; outlandish; irresponsible; disgraced.

As long as these lists are, they could easily be extended. To be clear, there is nothing inherently wrong with using such terms; but, as used on Wikipedia today, they often express controversial opinions, and sometimes extremely so. As such, they must be attributed to a source.

“Majority” and “minority.”

Here is a simple principle. One of the more useful and neutral things one can say is that a certain view is the majority or the minority view, especially when speaking about academic, scientific, doctrinal, and other controversies. Saying that some allegedly discredited view is that of a small minority can often convey what is necessary without taking a stand on the view itself.

Report the controversy.

Another thing that must be deleted from the neutrality policy page, with extreme prejudice, is this verbiage about “false balance.” As I said earlier, this is directly opposed to the neutrality policy. Consider what it means to speak of “false balance.” To declare that balancing the discussion between A and B is “false” is to declare that A is true and B is false, or vice versa. Wikipedia’s neutrality policy was, and should again be, that when A and B are in disagreement, then you do not declare a winner. They should receive approximately equal space; not even extreme minority positions should be entirely removed from an article. Sometimes, if there are many competing opinions, it makes sense to allot space in a single article according to the approximate representation among those affected by the dispute (or, if it is an academic dispute, among academics). But this is just an imperfect solution to a hard problem, not the general rule. The best general rule is to assign equal space to competing views.

Adopt a useful set of subject-related principles.

Wikipedia should formulate what would be a rather long list of principles regarding how to approach neutrality in broad subject areas. Here are some examples:

  • In American politics, articles must not favor either the Democratic Party or the Republican Party, or be written as if there were no third parties (such as the Libertarian Party, Green Party, or a new “America Party”).
  • In articles about Christianity, articles must not favor Catholicism, Orthodoxy, Protestantism, critical views, or others (obviously, what counts as “Christian” is highly debated).
  • In science and medicine, articles should distinguish between, on the one hand, what is widely accepted and commonly practiced in the relevant academic or professional community, and, on the other hand, what is disputed. Scientific minority views must be fairly represented with appropriate attribution. Claims of “consensus” should be avoided, unless there is a genuine, truly universal consensus, which can be established using multiple, replicable studies (which is very rare).
  • In articles about social or political ideologies (e.g., socialism, libertarianism, feminism, nationalism), Wikipedia should describe those ideologies as their proponents define them, before presenting external critiques or controversies.
  • In religious topics, articles should never assume that any particular doctrinal, metaphysical, or historical claims are true or false. Rather, such claims must be attributed, and other views in currency should be covered as well.
  • Articles or sections about crimes or controversies should not assume or imply guilt, innocence, or the moral probity of actions, but must fairly represent both critics and defenders with appropriate sources, unless the matter has been properly adjudicated in court or other official proceedings. Even in such a case, negative conclusions are to be attributed when they are in dispute.

I can imagine Wikipedians continuing (and debating the details of) this list ad nauseam. And they should. As granular applications of the concept of true neutrality, these are principles that Wikipedians need to bear in mind and put into practice.

Epistemic pluralism: a policy of tolerance

Neutrality may be conceived of as epistemic pluralism. It must not be confused with relativism; I certainly am not a relativist. Pluralism is a policy of tolerance regarding what views may be expressed; relativism is an epistemological theory, saying nothing is objectively true. There are many objective truths, in my view, and even more objective falsehoods. Yet I do not think the purpose of an encyclopedia is to teach all (and only) things I find to be objectively true. Rather, we must frankly admit that there is a stunning variety of belief on all subjects, and that it is the goal of a neutral encyclopedia—such as Wikipedia aspires to be—to survey this variety sympathetically and in detail. The purpose of such detail is to support free-thinking human beings in their quest to determine what the truth is for themselves.

My greatest grievance against Wikipedia today is its complete repudiation of epistemic pluralism. It is now used to push controversial and even shamefully deceptive opinions on the general public. Its managers have caused it to be used for propaganda. This must end. Wikipedia must return to neutrality.

But doesn’t this contradict Thesis 2?

Theses 2 and 4 present different aspects of the same problem of bias, but they propose different solutions. Thesis 2 makes room for hard neutrality as a pre-approved framework, but it also concedes that other frameworks should be added, if Wikipedia continues to employ a GASP framework. Thesis 4, however, would simply re-commit Wikipedia to its original neutrality policy.

I would be perfectly happy with either approach. Thesis 2 might represent more of a live possibility, because it would allow the current editorial coterie to remain in place, at least with respect to their shared GASP framework. Thesis 4 would require that these same people change their behavior—or else leave. This strikes me as unlikely.

Whatever happens, the point needed to be made. It really would be better if Wikipedia returned to its original, higher ideal of hard neutrality. As long as it does not, it unashamedly propagandizes the bias of GASP editors, who represent a small minority. This is intellectual imperialism. We all deserve better.

5. Repeal “Ignore all rules.”

[edit]
On February 6, 2001, I wrote this humorous rule—“Ignore all rules”—to encourage newcomers. Ironically, my joke now serves to shield insiders from accountability. It no longer supports openness; it protects power. Wikipedia should repeal it.

The Problem

[edit]
In the early days of Wikipedia, anyone could participate. Wikipedia still says so, but back then, we meant it. Some people needed encouragement. I mean some very nice, orderly, rule-following people—you know, like school teachers, bureaucrats, and Germans—who thought the idea of a free, collaboratively written encyclopedia was pretty neat. But getting such people actually to press the “edit” button and get to work was a tall order in those days. They would not edit somebody else’s words unless they had specific permission.

For this reason, we wrote, on the page header atop all Wikipedia pages, “You can edit this page right now!”

Screen cap from March 2001. Highlighting added.

I also made a page: Be bold in updating pages. It seems I made the very first edit to this page two days after Wikipedia’s launch:

Wikis don’t work if people aren’t bold. You’ve got to get out there and make those changes, correct that grammar, add those facts, precisify that language, etc., etc. It’s OK. It’s what everyone expects. Amazingly, it all works out. It does require some amount of politeness, but it works. You’ll see.

I say these things to give proper context to the following. I proposed, as the fourth-written (but placed first) of a collaborative set of Rules to consider, the following paradox:

Ignore all rules. If rules make you nervous and depressed, and not desirous of participating in the wiki, then ignore them entirely and go about your business.

All of this chirpy cheerleading was my way of getting polite intellectuals used to the idea of actually boldly editing each other’s text. Don’t be shy! Take the plunge! It’ll all work out!

Because everybody thought “Ignore all rules” was a hoot, we left it in long past its due date. It seemed to fit the cheeky, slightly subversive atmosphere that seemed useful, or even necessary, for a really robust wiki project. When trolls began invoking it to excuse their own bad behavior, I began to regret my earlier light-hearted encouragement. After I left, the trolls basically took over, gave themselves titles and started changing the rules—and they enshrined this rule as somehow imbued with deep wisdom.

So, it never went away.

“Ignore all rules” was never meant as serious policy. But in time, Wikipedians began to treat it as a kind of meta-policy—a thoughtful commentary on the limitations of rules as such. Apparently, my deep insight was that rules that naturally needed endless exegesis also needed creative abuse. Now I look back at how “Ignore all rules” is used today, and, as often happens with me, I can’t help but see a cargo cult. Wikipedians developed a lengthy essay, “What 'Ignore all rules’ means,” further dignifying and canonizing it and earnestly speculating about its subtle depths. The rule is now labeled as “policy,” the highest form of accepted standard on Wikipedia.

A group of professors even did a wretched study of it, and in the abstract, they concluded, “IAR [Ignore All Rules] supports individual agency when positions taken by participants might conflict with those reflected in established rules.”[34] Inevitably, I suppose, Wikipedia wrote an article—yes, an article—about “Ignore all rules,” which said that the just-mentioned study

...found that IAR significantly impacted the weight of a comment: a page was more likely to be retained if a Wikipedia editor cited IAR in a “keep” vote, and more likely to be deleted if an editor cited IAR in a “delete” vote. The study also found that an article was more likely to be kept if the AfD [proposed Articles for Deletion] contained a “keep” comment referring to both IAR and a “notability” policy (a rule on Wikipedia about which topics should have an article). This was not the case for “delete” comments. Additionally, if an administrator referred to IAR in favor of deletion then the article was more likely to be kept. The study concluded that the rule acts by “strengthening the efficacy of the individual and diminishing that of the bureaucracy”.

The study’s conclusion, it must be said, obviously does not follow from the stated premises, because it considers only one way in which the bureaucracy might be either strengthened or weakened by the rule. Here is another way they might have considered. I have heard from several people that entrenched editors and Administrators tend to cite the rule when they want to give a veneer of regularity to arbitrary decisions, as if to say, To hell with it, I’m going to do what I want: Ignore all rules! As a 2014 Slate article put it, “I repeatedly observed editors lawyering an issue with acronyms, only to turn around and declare 'Ignore all rules!' when faced with the same rules used against them.”

Sadly, both the original meaning of the rule and support for its original application as a welcome mat for newbies have disappeared in all these vapors of bloviation. Long gone is the notion of an open, welcoming community, saying, Hey, we just want you here, don’t get too bothered by the rules, you’ll figure it out and we won’t get too fussy about them while you’re learning. The rule’s meaning has become entirely inverted, as follows:

  1. Wikipedians have become policy fetishists, and their favorite victims, it seems, are the newbies. Overstep your limited bounds—even quite innocently—and you could be permablocked in your first hour on the site. This is in spite of a guideline that specifically says, “Please don’t bite the newbies” (see WP:BITE). Generally, the high priests of Wikipedia love to induct the newbies into the mysteries of acronyms, with all the solemnity of a cultic rite, sternly lecturing them about the sins of WP:OR, WP:RS and WP:FRINGE.[35]
  2. Now even “Ignore all rules” is one of those acronyms: “WP:IAR.” Will you be surprised if I tell you that it is actually used not in defense of the newbies, but for insiders to ignore the rules deliberately, to favor their own side in disputes—sometimes against newbies?

As the Slate writer put it, “The problem instead stems from the fact that administrators and longtime editors have developed a fortress mentality in which they see new editors as dangerous intruders who will wreck their beautiful encyclopedia, and thus antagonize and even persecute them.”

The irony is rich.

Here’s a rather random example of how IAR is cited, found on a user page. It was evidently written at the apex of the COVID-19 hype:

If anything deserved and warranted the invocation of WP:IAR this is it, every article about this pandemic (and is about the only time I think it can ever really be invoked), This is not about civil POV pushing or fringe science or whatever else its [sic] invoked for. This is actually about (potentially) saving lives. If one person comes here thinking the “disinformationists” knew what they were talking about and goes away with that opinion changed that is far more valuable than all the other fights over pseudoscience we have ever had here put together.

In other words, this editor says that a good use of the “Ignore all rules” rule would be any “state of information emergency.” With COVID-19, people could die, if they don’t get the correct facts! So don’t get finicky about the rules! Never mind that medical information is precisely the sort that demands careful vetting—not panicky justifications of anarchy—because it is of life-and-death importance. And, increasingly, many scientists reluctantly concede today that more skepticism about COVID-19 policy, such as forced lockdowns, mask mandates, and required vaccination, would have been beneficial in 2020–21.

The Reasonable Solution

[edit]

I wrote this rule. I now say: scrap it. It’s dead weight and does no good. And its cargo cult is ridiculous.

In its place, I recommend the very solid notion that nobody is above the law (or sensible editorial policies). I also reaffirm what I said when I left in 2002: “be open and warmly welcoming, not insular.” That is the real meaning of “Ignore all rules.”

6. Reveal who Wikipedia’s leaders are.

[edit]
It is a basic principle of sound governance that we know who our leaders are. So why are the 62 Wikipedia users with the most authority—“CheckUsers,” “Bureaucrats,” and Arbitration Committee members—mostly anonymous? Only 14.5% of such users reveal a full, real name. These high-ranking individuals obviously should be identified by their real and full names, so they can be held accountable in the real world. After all, Wikipedia is now one of the world’s most powerful and well-funded media platforms. Wikipedia’s influence far exceeds that of major newspapers, which follow basic standards of transparency and accountability. Such standards are not mere ideals but real requirements for any media organization of Wikipedia’s stature. As of 2023, Wikipedia’s endowment was $119 million, its annual income $185 million. Therefore, if safety is a concern, funds should be used to indemnify and otherwise protect publicly identified editorial leaders. Wikipedia, admit that your leaders are powerful, and bring them out into the open; great power requires accountability. If you continue to stymie accountability, government may have to act.

The Problem

[edit]
Most of Wikipedia’s editorial leaders go by silly handles and opt not to reveal their real-world identity. They don’t have to. So, they are anonymous. Most people don’t know this, but it is absolutely true.

I am not referring to the leadership of the Wikimedia Foundation—the CEO, General Counsel, other employees, and the Board of Trustees. Those people are identified by name, but they rarely exercise any real control over Wikipedia’s content. As with other publishing operations, and especially online “platforms,” there is, by policy, a line drawn between editorial matters and corporate matters.

I mean, instead, the leaders of Wikipedia’s powerful community. Wikipedia’s editorial work is self-managed by a group of volunteers—or what is presented as such.[36] If we compare Wikipedia-land to a little country, its police are called Administrators; its chief court is the Arbitration Committee (or ArbCom). Those who wield the most executive power are mostly in two groups. There are the Bureaucrats, who can install and remove Administrators. And there are the CheckUsers, who have the ability to check the IP address of a problem user—which is a power they may use even against Administrators. But who checks the CheckUsers?

Wikipedia presently has 833 Administrators and 62 accounts that belong to one or more leadership groups, i.e., ArbCom (15 accounts), Bureaucrats (16 accounts), and CheckUsers (49 accounts).[37]

My chief complaint: These small but powerful groups are mostly anonymous.

Consider ArbCom. As of September 2025, just two of the 15 members are named on their user pages, with first and last names. These could be pseudonyms, for all I can tell. Six members, it seems, use a first name (that is not admitted to be a pseudonym), and four have photos of themselves on their user page. Now, this does not necessarily mean that the accounts are wholly anonymous. Probably, most if not all of these people are known to each other by their real-life identities, and some Wikipedians do treat their identities as “open secrets”: not often shared, but not well hidden, either. It seems that ten ArbCom members share some personal information, such as their nationality, occupation, or a first name, but almost never is there any uniquely identifying information.[38] One account, at present writing, possesses the trifecta—he or she is a member of ArbCom as well as being a CheckUser and Bureaucrat. But all that we can glean from the user page is that this person did a degree at Glasgow. There are two accounts that, as far as I can ascertain, neither share a name nor any sort of personal information at all.

When we consider the larger set, i.e., the “Power 62” accounts that are in at least one of the leadership groups, only nine (14.5%) use what appear to be their real, full names. The conclusion is inescapable: The vast majority of Wikipedia’s top editorial leadership is anonymous, at least to the public.

But, you might ask, why does it matter that these people reveal their identities? It is just a volunteer community, a sort of gamified writing club, isn’t it? Wikipedia community pages seem to want you to think so. User pages in general tend to be gratingly cute. They have all the originality and spirit of a 1990s chat room.[39] But you might expect the pages of the leadership of Wikipedia to be different, considering that it is arguably the most powerful public information resource in the world. But you would be wrong. The user pages of Wikipedia’s Power 62 are excellent examples of the genre. Look for yourself. These are presented as the people who are ultimately responsible for Wikipedia. Think what that means: these pages are often the top Google search results; they are cited in court cases as a reflection of public opinion; and multiple generations of students have crammed for exams with this information. You might think this state of affairs would change with AI, but AI itself is trained with data from Wikipedia—only further concentrating the power of the platform and upping the stakes.

More importantly, Wikipedia’s claims are taken to be facts by many people who still have not learned about the appalling Thesis 4—facts about things that matter, such as personal reputation, medical information, and public policy.

Wikipedia-derived factoids can be so important that there is a well-known feedback loop, called citogenesis, in which the mainstream news finds some claim made in a Wikipedia article, and publishes it (without citing Wikipedia). Then Wikipedia itself makes use of the news article. By means of this loop, an ultimately “sourceless” factoid gains a spurious authority.[40] This shows just how seriously Wikipedia is often taken.

Here, then, is the point. The sort of casual game-playing I described earlier fits very poorly with the deadly serious project of compiling authoritative information. The attitudes involved are not acceptable anymore. It is time for Wikipedians to grow up—and that begins with its leaders. Part of growing up is being willing to accept real-world responsibility for your decisions. Now, Wikipedians are doing important journalism: they document the world. But real journalists have reputations that can be tarnished, ruining their writing careers. That is as it should be, and that means that journalists must generally be known by their real names. This is especially important for the people who wield power. The proverb does not lie: knowledge is power. Yet the most powerful editors on the world’s single most powerful information platform? They are anonymous.

Let us get very clear on this point: Wikipedia is not just a game. Its influence exceeds that of major media institutions. After resigning from the WMF as CEO, Katherine Maher became CEO of NPR; it was a sideways move or even a step down. During COVID-19, Maher told the Atlantic Council that she “took a very active approach to disinformation” and did so “through conversations with government”. The WMF partnered with the World Health Organization to “expand access to trusted information about COVID-19.” Wikipedia is widely reported to be a major source of LLM (AI chatbot) training data. It is well-known that Google makes use of Wikipedia content in its knowledge panels as a key component in its search results. In short, the WMF is able to raise $185 million per year because it has massive media clout.

Indeed, Wikipedia is no game.

Thus my question: if Wikipedia has this world-class influence, why do the people entrusted with its content decisions go by twee handles like “CaptainEek,” “KrakatoaKatie,” and “WereSpielChequers”?

The janitor’s mop, the symbol of adminship, is another twee and dishonest suggestion that authority over the world’s largest encyclopedia is not particularly important.

Yet even the accounts of apparently middle-aged editorial leaders convey this studied silliness. Why have these habits never changed? Perhaps it is difficult to change this institution’s culture—but it is not that difficult, and the leadership of other internet institutions of a similar age (YouTube, Facebook, Twitter/X, LinkedIn) has grown up and gotten serious in ways Wikipedia has not.

Part of the reason, I suspect, is defensive: An infantile and anonymous self-presentation trivializes the power that the top “users” wield. It disarms criticism, as if they were saying, “Who, us? We’re just a bunch of silly, harmless college students and geeks who mainly care about comma placement and our quirky hobbies. We’re just janitors.”

Bullshit.

The serious consequences of Wikipedia’s real-world power were brought home to me when, quite out of the blue in 2005, I received a telephone call from John Seigenthaler, Sr., former editor and publisher of the Tennessean and founding editorial director of USA Today. He had a serious complaint, and his ire was directed at me. He made me feel quite guilty, actually, although I had been gone from Wikipedia for three years. I didn’t blame him for being mad. The problem was that the Wikipedia article about him invented, out of whole cloth, accusations that he was somehow complicit in the assassinations of John F. Kennedy and his brother Bobby. No such accusations were ever made against Seigenthaler.[41] He was one of the most distinguished newspapermen of his generation. (He died in 2014.) When Seigenthaler first reached out to me, I believe Wikipedia had still not repaired the defamation, although it did soon enough.

This was the first—and far from the last—time a famous person complained directly to me about defamation by Wikipedia. I continue to receive such complaints; in some years, I received perhaps 30 to 50 grievances. Usually, the victim had tried to fix the problem through ordinary channels, without success; by the time they tried contacting me, they usually knew I was long gone from Wikipedia, but they reached out anyway, because they were at their wits’ end. Another victim was the novelist Philip Roth. Wikipedia bizarrely refused to correct its mistaken claims about the inspiration behind Roth’s novel The Human Stain. It seems Wikipedia required secondary sources and deemed Philip Roth himself insufficient.[42] So he went to The New Yorker to kill two birds with one stone: he gave Wikipedia a secondary source, and he documented the absurd difficulties he went through in getting the record corrected. (Roth died in 2018.)

These famous victims of Wikipedia’s stupidity and bias still do reach out quite regularly. What I often tell people is that there is nothing I can do personally, as my involvement would likely hurt more than help. Sadly, in most cases, it seems the most reliable way to get errors corrected is by hiring a PR firm that specializes in editing Wikipedia.[43] Bribes might also be necessary, as Wikipedians themselves were reported to have uncovered—to their deep shock, I’m sure—in scandals such as Operation Orangemoody.[44]

Those who are libeled by Wikipedia articles have tried lawsuits. But they face a dilemma. On the one hand, there is no legal entity called “Wikipedia.” The owner is the Wikimedia Foundation (WMF), a nonprofit corporation that hides behind Section 230 of the Communications Decency Act; this act shields internet platforms from liability for user-generated content. (According to the statute, such indemnity is available only when the owners are not counted as publishers or speakers.) On the other hand, if the victim of libel attempts to sue the editors responsible, this usually proves impossible. For one thing, multiple accounts can be responsible for defamation, and it can be difficult (for someone unfamiliar with the system) to track down exactly which users are responsible for which edits. Even if the culprit is quite clear, Wikipedia user accounts tend to be anonymous, as we have seen. The platform’s powerful CheckUser accounts refuse to release an IP address to the victim of libel: that would be a violation of privacy, you see. So it is difficult to bring a lawsuit. Whom is the would-be plaintiff supposed to sue?

Perhaps the WMF, despite Section 230? Well, the number of cases in which the WMF itself has been successfully sued, or forced to remove defamation through other means, is very small, and all outside of the U.S. It has been successfully sued in the U.K., Germany, Portugal, and France; and that appears to be all.

There is another reason to regard anonymous authority on Wikipedia as a terrible idea: conflicts of interest. There is irony here, because Wikipedia forbids the subjects of articles to write about themselves, even frowning on editing the talk page of the article about themselves. The irony is that anyone with a beef against a person with a Wikipedia page can make an anonymous account—and smear away. They need only persuade others, who might also be biased, that their claims are fair and in line with the “Biographies of Living Persons” (BLP) policy. In situations where an editor has enough clout, or where an enemy has paid off such an editor, this is not a high bar, even though the BLP policy sounds properly cautious. One thing you must understand about Wikipedia editors is that, for all their twee silliness, they often apply policies in vicious ways, evidently serving their own hidden purposes. They shamelessly ignore rules that do not serve their biases, yet enforce those same rules with ruthless efficiency when the rules work in their favor.[45]

These failures demand reform. I say that power demands transparency and responsibility.

So, what exactly can be done?

The Reasonable Solution

If the Wikimedia Foundation wishes to be a responsible player in the media scene, it must begin to act like one. Therefore, let the reputation of the most powerful Wikipedia editors rise or fall based on merit, and let it be tied to their real and full name. This is the standard for real-world journalism. Wikipedia must rise to that standard.

(1) The WMF should require top-level Wikipedia functionaries to use real names. When a volunteer-run account is installed in a position of sufficient responsibility, then the account owner must enter into a formal agreement with the WMF. If there is any doubt about the scope of the policy—extended as it might be to Bureaucrats, CheckUsers, Stewards, Oversighters, and likely all Administrators—it should err on the side of greater transparency and accountability. The owner’s real (legal) name and public biography must be displayed on the user page, placed there by the WMF.

(2) The WMF should offer free assistance to volunteers who face threats, stalking, or other security concerns. That is, the WMF should assign staff to track problems, help with police reports, and, if necessary, pay for legal assistance. While I doubt this is apt to be a significant problem for many, for some it might be. As anyone in the public eye knows, various kinds of harassment can come with the territory. Even I have been subjected to it over the years. Nevertheless, it is the price anyone operating in public, with responsibility over powerful public information, must accept. After all, the work does involve directing many others who edit articles, which directly affect the reputation and earning potential of people and enterprises. This is a serious responsibility and should be treated as such. If this is too much to ask, there are plenty of activities in the world that are more private. Choose one!

If the platform has difficulty attracting enough qualified Administrators under such a scheme, this problem can be handily solved by offering a stipend:

(3) Optionally, consider compensation. I am sure that many people would be more interested in working toward and volunteering for a more responsible Administrator position if it brought in some extra income. This is only an option, and it might not be necessary. There is one potential downside: if the WMF actually gets involved in the selection and vetting of candidates, and they are basically working as employees, then this might well have the effect of stripping the project of its Section 230 immunities. This problem might be finessed if the money is distributed not based on any WMF decision but based on the volunteer community’s determination. Alternatively, such a stipend might be made not directly through the WMF but through intermediaries such as local Wikimedia chapters. Perhaps the safest option, for purposes of retaining Section 230 immunity, is that individual editors might be “tipped”; the project would encourage this, and the WMF or chapters might facilitate it (e.g., by setting up the payment processing) while never directly touching the money.

(4) Make the editor-to-Administrator path smoother. The current process for applying and being accepted as an Administrator is difficult and time-consuming. The process could be made easier for applicants to complete in various ways. This alone might secure a much greater number of active Administrators, even if they all had to use their real-world identities.

(5) The WMF should indemnify named Wikipedia functionaries. In order to behave responsibly toward its “volunteer” users, Wikipedia should use some of its funds to indemnify named editors by purchasing Errors and Omissions (E&O) Insurance for them, and agreeing to legal representation in case of lawsuits. In so doing, the WMF will be following the professional practices of serious, responsible journalistic enterprises in the West. Such indemnification would not necessarily entail an admission that the WMF is acting as a publisher that would remove its Section 230 protections. Purchasing general or professional liability insurance for volunteers is common among larger nonprofits. This is not an act of generosity, but of moral and professional necessity.

Long ago, we used to brag that Wikipedia was written by a wide-ranging, self-selecting pool of volunteers, that everything was above-board, and that nothing important behind the scenes determined how articles were worded. It was all quite democratic and honorable, we said. Such sanguine opinions have long since been abandoned; nobody even bothers to make such claims anymore. It is now obvious to all that authority on Wikipedia is wielded in secret by anonymous power players. Many of the most powerful, very probably, are already on the take, so no one should be shocked by the preceding proposals.

If Wikipedians really do not like this state of affairs—they claim not to—then we should ask them a question. Would the profitability of paid Wikipedia work increase or decrease if editorial leadership had to declare its identity? The answer seems obvious. Why then would Wikipedians resist a culture of leadership accountability?

Now, the last proposal is rather different, directly tackling the problem by allowing public response:

(6) Allow subjects of articles a prominently placed official response page. Wikipedia could allow the subjects of articles to write official responses to articles about them. This might be permitted if a person is the subject of an article; or is an estate’s principal heir; or if a person is the officially designated representative of some enterprise.[46]

Exactly how this would be achieved remains to be worked out, but here are a few ideas.

Careful thought should be given to where links to the response articles are placed. Such links must not be hidden inside menus. Rather, they should probably be highlighted at the top of the page, with a template. Wikipedia volunteers and WMF staff should, in general, not touch the response; but there might be circumstances in which a very light touch is appropriate (e.g., when the author is unresponsive or has died, and this fact must be noted). There would probably have to be WMF staffers to act as coordinators. Among the things such coordinators might do is determine whether the latest version of an article renders the language of a particular response irrelevant, confusing, etc. Then the coordinator might reach out to the article subject (or representative) for an updated version, and, while waiting, post a notice at the top of the response to the effect that the response concerns an older version of the article and might be outdated. By the way, if there are multiple, competing articles (per Thesis 2), then each different article might have a separate response.

Let us take a step back now.

There is something particularly contemptible about a project run by anonymous volunteers that fails to permit the distinguished victims of its own libels to respond, publicly, to such treatment, particularly when there is no legal way (at least, in the United States) for them to seek relief. This is not a thing I ever would have supported when founding this project, of course. I repudiate it and, for myself at least, I apologize to them for my role in inflicting this engine of libel on the world. But now, the blame rests squarely on the shoulders of the Wikimedia Foundation and the volunteer community.

For shame, Wikipedia, for shame.[47]

The least Wikipedia can do

If the WMF shamelessly refuses to solve the problem by implementing these or similar solutions, the general public should take this as a frank admission that, as a platform, Wikipedia utterly rejects standard journalistic norms.

If all more responsible solutions are dismissed, then a minimal—inadequate—fall-back position would be to adopt a disclaimer. Already, Wikipedia has long disclaimed its own reliability. But I think it must do more than that when it comes to articles that might contain uncorrected libel. The distinguished subjects of Wikipedia articles have no legal recourse against such libel, and their enemies can anonymously pay to place it there.

On all pages concerning public figures and enterprises, there should be a disclaimer at the top of the page roughly to this effect:

Wikipedia is written by anonymous volunteers, which means it cannot be trusted for claims that impact the reputation of persons, enterprises, and institutions. Rather, it should be taken no more seriously than any other website on which anonymous actors with hidden conflicts of interest may coordinate to distort public perception of facts concerning politics, business, and other matters.

Wikipedia’s anonymity means that its authors have unknown conflicts of interest. Therefore, because self-serving inaccuracies may be deliberately inserted, Wikipedia’s claims below should be met with extreme skepticism.

There is no third way. Either Wikipedia grows up and embraces responsibility or, in the interests of justice and decency, it admits that it cannot be trusted.[48]

What if the Wikimedia Foundation does nothing?

Many people—and even governments—have brought the serious and ongoing problem of unchecked defamation to Wikipedia’s attention. The foregoing plan is a realistic solution. So the WMF is without excuse. It certainly has the power and authority to address this intolerable situation. Moreover, the problem is structural: it is a consequence of a specific combination of policies. As a result, the WMF must accept responsibility: the nonprofit corporation is answerable for the fact that Wikipedia is an engine of defamation.

It follows, then, that the corporation must be subject to civil liability for defamation. At some point, the fault lies no longer with Wikipedia or the WMF, but with the U.S. government’s continued inaction. It is, after all, Wikipedia’s Section 230 (47 U.S.C. § 230) protections that both immunize the WMF from liability and permit powerful Wikipedia editors—who, in some cases, we know to be doing work for hire, for unknown persons and unknown purposes (see above)—to libel people anonymously. The public must be given some avenue for justice if the WMF refuses to act.

The nuclear option, it seems, would be to remove the WMF’s Section 230 immunity. The question is how to do this in a way that affects Wikipedia (and others that might be in a similar situation) without posing a serious threat to online freedom of speech. I acknowledge fully the deep importance of Section 230 to securing online freedom of speech. But defamation has always been regarded, by clear legal thinkers, as an exception to free speech rights.

You might think that the best way forward would be something like a class action lawsuit that could be brought against Wikipedia to remove its Section 230 immunity—but this is based on a confusion about how the law works. Courts do not make global determinations of publisher status under the statute.

Other options also seem unlikely to be available. The FTC does not typically regulate genuine nonprofits, so we should not expect it to investigate Wikipedia for unfair practices. State legislative options are hamstrung by Section 230 itself, which gives platforms broad immunity and preempts state law. Therefore, we should probably not rest our hopes in such strategies.

The proper avenue of attack, therefore, is the statute that causes the problem. Hence:

Congress could create a narrow statutory carve-out that addresses Wikipedia’s unique situation. The law might be amended in the following sort of way. If (a) an organization generates in excess of $100 million in revenue; (b) the platform hosts anonymously sourced content; (c) such content is presented as factual and neutral, yet routinely and demonstrably defames members of the public; and (d) the platform refuses to identify key content decision-makers; then the organization should not be entitled to Section 230 immunity. While such a carve-out would have multiple conditions, it is narrowly tailored to handle a generalizable problem that Wikipedia illustrates.

Under no circumstances should this proposal be understood as advocating for making anonymity online illegal. I am a strong proponent of online anonymity. It is an essential component of free speech and privacy online, and must not be broadly abrogated.

If a platform (like Wikipedia) features many anonymous accounts, and the rest of conditions (a)–(d) apply, then the statute must further specify that the public be given the right to have defamation claims heard by some identified platform authority. Then that authority would become liable under the statute. Perhaps the WMF would itself take such responsibility; if not, it might accept the above plan, identify the Administrators, and indemnify them, again per the plan.

As part of the argument for this amendment, we might point out that the WMF acts as a publisher in many ways. Let us count them:

  1. The WMF coordinates with governments on “disinformation” and its CEO implied that such coordination led to changes on the platform (see above).
  2. Wikipedia as a brand is presented as a unified product, rather than a collection of individually signed, piecemeal work by named authors. Qua unified product, its owner and operator must be understood to be the WMF.
  3. Wikipedia curates viewpoints through source blacklists. It makes broad editorial decisions about what constitutes reliable sources, which must be respected by large numbers of participants. The WMF could address the situation, but does not.
  4. The WMF refuses to take steps helping to reveal the identity of its most powerful editors or to override decisions by editors. Given that policy, when torts arise, the owner should be required to take responsibility.

The point is not that the WMF is a publisher under Section 230. The WMF may argue that it is not materially contributing (i.e., co-creating or editing) to illegal or actionable information, and thus is not a publisher under the current statute. But that’s fine. The point is to advance a principle by which new legislation can be justified: If a platform owner sets rules of anonymity, even for its most powerful editors, and those rules permit actionable defamation (and similar torts),[49] then the law ought to deny Section 230 immunity to the platform owner.

The essential point—which neither Congress nor the WMF may ignore—is that responsibility must fall somewhere. This is a fundamental principle of justice: ubi jus ibi remedium (where there is a right, there is a remedy). So, if there is a tort, the law must provide a way to discover the identity of the defendant. If, for any reason, the law determines that liability cannot be made to fall on the actual author of a defamation, then it must fall on the entity that is responsible for the author’s anonymity. That is a reasonable and narrowly focused principle that justifies a statutory carve-out. This is consistent with how Congress has previously amended Section 230 (e.g., the FOSTA-SESTA exception for sex trafficking platforms) when specific, demonstrable harms are present.

Appendix: The Power 62

To help readers understand the situation better, here is a list of the “Power 62” accounts (as of September 17, 2025) holding top-level authority. Either they are on the Arbitration Committee or they have CheckUser or Bureaucrat permissions (or they are in two or three of these groups). My proposal would require that the owners of the accounts listed below reveal their real-world identities, because they wield significant real-world power. The Wikimedia Foundation should also legally indemnify the named editors; of course, the editors should be allowed to step down without being named. Separately, I hereby call upon each of the persons responsible for these 62 accounts to accept personal responsibility and reveal their names and identities to the public—or resign. I do not want them to be doxxed, however. I do not want their identities to be revealed without their permission; I am asking everyone to respect their anonymity. If anyone does doxx them, it will be against my explicitly stated wishes.

  1. 28bytes (bureaucrat)
  2. Acalamari (bureaucrat)
  3. AmandaNP (checkuser, bureaucrat)
  4. Aoidh (checkuser, ArbCom)
  5. Avraham (bureaucrat)
  6. Barkeep49 (checkuser, bureaucrat)
  7. Bibliomaniac15 (bureaucrat)
  8. Blablubbs (checkuser)
  9. Cabayi (checkuser, ArbCom)
  10. Callanecc (checkuser)
  11. CaptainEek (checkuser, ArbCom)
  12. Cecropia (bureaucrat)
  13. Daniel (checkuser, ArbCom)
  14. DatGuy (checkuser)
  15. Dbeef (checkuser)
  16. Dreamy Jazz (checkuser)
  17. Dweller (bureaucrat)
  18. EdJohnston (checkuser)
  19. Elli (checkuser, ArbCom)
  20. Girth Summit (checkuser)
  21. Guerillero (checkuser)
  22. HJ Mitchell (checkuser, ArbCom)
  23. Ivanvector (checkuser)
  24. Izno (checkuser)
  25. Jpgordon (checkuser)
  26. KrakatoaKatie (checkuser, ArbCom)
  27. Ks0stm (checkuser)
  28. L235 (checkuser)
  29. Lee Vilenski (bureaucrat)
  30. Liz (checkuser, ArbCom)
  31. Mailer diablo (checkuser)
  32. Materialscientist (checkuser)
  33. Maxim (bureaucrat)
  34. Mkdw (checkuser)
  35. Moneytrees (checkuser)
  36. Mz7 (checkuser)
  37. NinjaRobotPirate (checkuser)
  38. Oshwah (checkuser)
  39. PhilKnight (checkuser)
  40. Ponyo (checkuser)
  41. Primefac (checkuser, bureaucrat, ArbCom)
  42. Reaper Eternal (checkuser)
  43. Risker (checkuser)
  44. RoySmith (checkuser)
  45. Salvio giuliano (checkuser)
  46. ScottishFinnishRadish (checkuser, ArbCom)
  47. Sdrqaz (checkuser, ArbCom)
  48. Spicy (checkuser)
  49. Stwalkerster (checkuser)
  50. Theleekycauldron (checkuser, ArbCom)
  51. TheresNoTime (checkuser)
  52. ToBeFree (checkuser, ArbCom)
  53. UninvitedCompany (bureaucrat)
  54. Useight (bureaucrat)
  55. Versageek (checkuser)
  56. WereSpielChequers (bureaucrat)
  57. Worm That Turned (checkuser, ArbCom)
  58. Xaosflux (bureaucrat)
  59. Xeno (bureaucrat)
  60. Yamla (checkuser)
  61. Z1720 (checkuser, ArbCom)
  62. Zzuuzz (checkuser)

7. Let the public rate articles.

A system of public rating and feedback for Wikipedia articles is long overdue. Articles now boldly take controversial positions, yet the public is not given any suitable way to provide feedback. This is disrespectful to the public. There is an internal self-rating system, but it is not visible to readers. The platform experimented with an external ratings system but scrapped it after a few years; neither system has helped readers. Wikipedia does not need a complex system to get started. An open source AI rating system would not take long to develop. The platform already collects relevant objective data, such as number of edits and word count: make that public. As for human raters, they should be provably human, unique, and drawn from outside the editor community. When articles are evaluated by a diverse audience, content quality and neutrality will improve.

The Problem

Wikipedia needs to let the public rate and give feedback on articles.

Other Big Tech platforms—YouTube, Amazon, Facebook, Instagram, X, Reddit, and Stack Overflow—all have ratings and other metrics. There are likes, upvotes, view counts, reposts, and comment counts, among other things. These are all, roughly speaking, community rating systems. At scale, such systems are reasonable measures of popularity, interest, and even newsworthiness, but they do not purport to be metrics of epistemic quality, apart from X’s “Community Notes” system (to be discussed below).

Reliable metrics of epistemic quality would matter, if Wikipedia had any. As an encyclopedia, it naturally has deeper epistemic commitments, yet it has no rating or feedback system. One might well think that a free encyclopedia that “anyone can edit” would rather obviously require a trustworthy rating system. So, why doesn’t Wikipedia have one?

Well, it’s complicated.

First, the more elitist among Wikipedia editors might claim there should not be a rating system, because, after all, the public is a poor judge of truth. But such an argument is not available to this platform, because it purports to be an encyclopedia written by the public. By its own logic, such an encyclopedia must be open to evaluation by the public. Besides, there is a great need for articles to be reviewed by a wide variety of experts; at present, however, experts can offer their opinions only on article talk pages, where they are often treated with contempt unless they kowtow to the insiders, who in most cases are not subject matter experts themselves.

Another possible response is that Wikipedia already does have feedback available. It is not that hard to compile data on article age and length, numbers of edits and editors, edit frequency, and recency of last major edit. Some research has correlated such metrics with article quality.[50] But Wikipedia does not display such data prominently, or in a useful summary form. Wikipedians also assess their own work internally with a system of so-called “Wikipedia:Content assessment” in which they assign quality grades. Such ratings are placed at the top of the “talk page” (i.e., for editor discussion) of each article. They sensibly refrain from placing such ratings at the top of the articles themselves, considering that self-assessment is inherently not credible.

So, Wikipedia lacks a system of useful public rating and commentary. No ratings appear anywhere on article pages. There is no regularized avenue for the public to post feedback on articles. Wikipedians might reply that readers can leave a comment on the talk page, which is true, but this is not a dedicated public feedback system—it is a dialogue with the authors, mixed with many other matters, found on a page most other readers will never visit. Besides, too often this merely invites pointless bickering with a peanut gallery of anonymous editors, who are sometimes biased and snobbish.[51] A more prominent, dedicated, standards-driven, and independently-run rating and feedback system is essential to a collaborative internet project that actually wishes to learn from and serve the public.

Wikipedia once experimented with an article feedback tool, from 2010 to 2013. The page about the tool does not explain why it was removed, but reports: “Surveys that thousands of users took in the summer of 2011 show that more than 90% of users believed the tool was useful, and slightly more than half believed that it positively affected the development of articles.” This overwhelming user support for the feature was ignored, however, and the program was discontinued in February 2013—because some Wikipedians didn’t like it. Apparently, as one summarized, “there are currently insufficient resources to moderate and respond to article feedback for all articles.” As an explanation for ending the feedback program, this is puzzling. Surely, the most successful crowdsourced project of all time could come up with a system to crowdsource the task of moderation. Today, AI could help address that issue.

The reactions to the feedback program provided an insightful window into the attitudes of Wikipedians. One popular community leader stated that the tool “should be as minimally intrusive as possible, recognizing that the content area of articles is sacrosanct.” This, I think, indicates a kind of hubris—as if public rating would be irrelevant to the article, a mere distraction. Apparently, most Wikipedians at the time had a similar attitude: the position that received the most support in the “vote” was to remove the tool. As the page vote summarizer put it, “Its [sic] pretty clear that the community as a whole will not support any current form of article feedback being turned on across the project. I would strongly recommend that WMF stick to its position of respecting the communities [sic] position on this matter.”

Actually, spend much time in the place, and it becomes obvious why there is no rating system. Too many Wikipedians simply don’t care what the public thinks; they believe they, as editors, represent the public, or what the public ought to believe, anyway. Newcomers are often treated with contempt, even if the newcomer is a subject matter expert.[52] The problem is that Wikipedians have already got this stuff figured out a lot better than you have (so they think). So, it might be a bit of a challenge to persuade them that there is a worthwhile reason to install a new feedback system.

Still, there is a crying need for such a system.

The Reasonable Solution

Under public pressure to add a feedback system, Wikipedians might suggest reviving the article feedback tool. But, I think, this never was sufficient. It might have been useful to Wikipedia editors who tolerated constructive criticism; but it was not made useful to the public. For that, there should be an overall rating posted at the top of each article, with links to details. This rating could be a function of human ratings, once there are enough. Before that, a simpler system could calculate a grade based on objective data, as explained above, and AI analysis.

A simpler system based on objective data and LLMs

Let us discuss the latter simpler system first. As suggested above, adding currently-tracked metrics to the article is relatively low-hanging fruit. Wikipedia engineers should automatically summarize and post existing metrics (e.g., age, length, number of edits, total number of editors, number of significant editors, edit frequency, and edit recency) near the top of every page. These alone can be used to estimate article quality with reasonable but not perfect accuracy.
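To illustrate how such metrics might be combined into a single posted grade, here is a toy sketch in Python. The metric choices, weights, and caps are my own illustrative assumptions, not any formula Wikipedia uses or that I am specifically proposing:

```python
# Toy heuristic: combine already-tracked edit-history metrics into a rough
# 0-100 grade. All weights and caps below are illustrative assumptions only.

def quality_estimate(age_days, length_chars, num_editors, edits_last_year):
    score = 0.0
    score += min(age_days / 365, 10) * 3    # article maturity, capped at 10 years
    score += min(length_chars / 1000, 30)   # substance, capped at 30,000 characters
    score += min(num_editors, 25)           # breadth of authorship
    score += min(edits_last_year, 35)       # maintenance recency
    return round(min(score, 100), 1)

# A two-year-old, 12,000-character article with 8 editors and 20 recent edits:
print(quality_estimate(730, 12000, 8, 20))   # → 46.0
```

Real grades would of course need calibration against human judgments; the point is only that existing metrics suffice to produce a defensible first-pass number.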

It would be especially helpful, however, to combine such existing metrics with an open-source AI review system. Last winter I experimented with LLM APIs, using different models to give feedback on encyclopedia articles. I found that the more advanced models are quite useful and reasonably accurate at evaluating the bias of articles. They would also be reasonably competent at evaluating articles on other dimensions, such as completeness and style. They could not be expected to work as well, however, on such matters as accuracy and sourcing (i.e., the quality of footnotes).

Some might object to this plan on the grounds that LLMs are trained on Wikipedia: how, then, could they recognize Wikipedia’s work as flawed? Again, I confirmed for myself that they can, quite well; I found this to be true of ChatGPT, Claude, and Grok models. LLMs have many sources in addition to Wikipedia, and a model can recognize many writing problems even on the basis of relatively limited information.

Such a system would enable users to get a rough-and-ready idea of whether an article is trustworthy. For a more complete and subtle picture, however, it would be important to augment this data with human ratings, as follows.

A decentralized, human-operated system

The actual method of tabulating and weighting human votes is a matter for engineers, but the voting system needs to be carefully designed to prevent gaming. For authoritative ratings, human ratings absolutely must observe the principle of one person, one vote; there must be some system to guarantee voter uniqueness, such as verification by state ID or credit card. I certainly could not get behind a human rating system that lacks any means of ensuring one person, one vote. Any such system would, obviously, have to use tested and reliable methods to protect verification data from public access. Once the uniqueness of individual voters is ensured, their identities would have to be shielded, by default, from both the public and Wikipedia editors. Those who wish to reveal their identities should be able to do so.

This is not to say that some less-intrusive system of anonymous rating is impossible. But I would argue that the reviews with proven-unique authors should be given greater weight and tallied separately. My concern here, obviously, is the potential of any such rating system for gaming and brigading.

One feature I would hope for would involve users (a) labeling themselves with various hashtags or categories, and (b) endorsing other users for their expertise or their credibility qua representative of some point of view (party, denomination, philosophy, etc.). Users who are highly endorsed by other users who are themselves endorsed might receive a boost in weighted averages. Wikipedians already do something like this internally with “barnstars” and other awards, although such things are less formal and less machine-processable than the system I am envisioning.
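As a concrete (and purely hypothetical) sketch of the endorsement idea: endorsements could raise a reviewer’s weight in an article’s average rating, with endorsements from well-endorsed users counting for more. The function names and all the numbers here are my own assumptions:

```python
# Sketch: reviewers endorsed by well-endorsed reviewers get extra weight in
# rating averages (one level deep here; a real system might iterate,
# PageRank-style). All weights are illustrative assumptions.

def reviewer_weights(endorsements):
    """endorsements maps each user to the set of users who endorsed them."""
    base = {u: 1.0 + 0.5 * len(e) for u, e in endorsements.items()}
    # Second pass: an endorsement counts for more if the endorser is
    # themselves well endorsed.
    return {u: 1.0 + sum(0.5 * base.get(v, 1.0) for v in e)
            for u, e in endorsements.items()}

def weighted_rating(stars, weights):
    """stars maps each user to a 1-5 rating; returns the weighted average."""
    total = sum(weights.get(u, 1.0) for u in stars)
    return sum(r * weights.get(u, 1.0) for u, r in stars.items()) / total

endorse = {"alice": {"bob", "carol"}, "bob": {"alice"}, "carol": set()}
w = reviewer_weights(endorse)   # alice, endorsed twice, carries the most weight
print(weighted_rating({"alice": 4, "bob": 2, "carol": 5}, w))
```

A production system would also need safeguards against endorsement rings, which is one more reason the one-person-one-vote verification discussed above matters.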

Another idea (not to replace the first) is something like X’s Community Notes system, in which controversial claims can be marked as incorrect or biased. Posts are corrected (with a “Community Note”) only when accounts with a history of sharply divergent views agree on a correction.[53] The Community Notes system is invite-only, by the way, which makes it harder to game but is also less compatible with Wikipedia’s purported culture of openness.

As to features, I would hope for a multi-dimensional peer review apparatus. That is, there should be several components in a review. In a full review, one would rate an article on several different dimensions, such as completeness, accuracy, bias, mechanics, and style. There should also be room for extended verbal feedback, as with any academic peer review. Reviews should expire (or be archived, or no longer counted in averages) once an article is, for example, 20% different from an earlier version. Perhaps it would be a sliding scale. Finally, there should be support not just for one-off reviews, but also for updating reviews.
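The expiry rule can be made concrete with a simple text-difference check. In this sketch, the 20% threshold comes from the paragraph above, while the use of Python’s difflib similarity ratio as the measure of “how different” is purely my own assumption:

```python
# Sketch: a review stops counting toward averages once the article has
# drifted more than 20% from the version that was reviewed. Using difflib's
# similarity ratio as the distance measure is an illustrative assumption.
import difflib

def review_still_counts(reviewed_text, current_text, threshold=0.20):
    similarity = difflib.SequenceMatcher(None, reviewed_text,
                                         current_text).ratio()
    return (1.0 - similarity) <= threshold

old = "The cat sat on the mat. " * 10
print(review_still_counts(old, old))                   # unchanged → True
print(review_still_counts(old, "Entirely new text."))  # rewritten → False
```

A sliding scale, as suggested above, could instead down-weight a review in proportion to (1 − similarity) rather than dropping it at a hard threshold.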

Project managers should, in addition, consider enabling discussion of reviews (if reviewers prefer). This would be an independent community for discussion of Wikipedia articles, quite distinct from editorial discussion (i.e., on the talk page). Particularly if this system were not under the direct control of the editorial community—as it obviously should not be—there would have to be some sort of community moderation. LLMs could provide a layer of support for that. Another option is the Stack Exchange model, which works well. Such an interactive feedback system could become an interesting destination in itself.

For the technical standards of the review system, I would also encourage the team to adopt an already-existing open review standard, if any is appropriate. They should seek a standard in which reviews can be posted anywhere and are therefore not necessarily managed by the people whose work is being critiqued; letting the subjects of criticism control the reviews would obviously be a mistake. Public participation in the review of Wikipedia articles could provide some needed impetus for extending a decentralized system of public content review to the rest of the internet, as StumbleUpon and del.icio.us once attempted to do.

Once public ratings and feedback are properly supported, I would propose to dismantle Wikipedia’s “Content assessment” system of self-rating. Self-rating by its very nature has decidedly bad effects, making Wikipedia insular and self-congratulatory, rather than outward-looking and humble.

I also anticipate more robust feedback from academics, who, given the ability to exhibit their expertise and without having to endure disrespectful responses from Wikipedians,[54] might prove to be essential in making articles more complete, up-to-date, and sophisticated.

Once a decentralized review system was in place, metadata about the reviews themselves could be useful: latest reviews received, most active, most-approved five-star reviews, most-approved one-star reviews, and so forth. Ambition to author truly excellent articles might even inspire people to dive in and edit articles themselves!

The case for a rating system is strong. It is a matter of both fair-dealing and quality control. In short, the Wikipedia community, by its own telling, represents a narrow slice of humanity, mostly GASP[55] white males. Yet it has the boldness to present what it regards as the neutral truth to the world. It is only fair that the world community should be granted the right to boldly respond to Wikipedia’s work.

Beyond considerations of sheer fairness, the need is obvious. Wikipedians may be hostile to the idea, but they cannot plausibly deny the advantages.

A rating and feedback system would

  • identify problem spots. Ratings and comments, if detailed, would help editors to improve articles in various ways. Imagine that people with a very wide variety of skills and viewpoints respond to an article, pointing out factual, stylistic, coverage, and other issues. Such crowdsourced editing would finally bring stubborn problems to light. This is the fundamental advantage.
  • make Wikipedia more collaborative. A robust, independent project to gather broad-based public feedback would make the project stronger and more open. Wikipedia, we are told, is a collaborative community that engages the public; its success is supposed to have stemmed from the ability of anyone to contribute and of people to work together. Yet Wikipedia has become forbiddingly complicated and cliquish, so that one must now study it carefully before getting involved, or risk being indefinitely blocked (see Thesis 8). Not everyone has the time or patience to contribute to such an off-putting, arcane system. But a feedback system would give the public a new and open way to participate meaningfully.
  • provide a necessary corrective of bias. Bias of many kinds—not just ideological or religious—can and does become entrenched. Fixing it in the Wikipedia context can be difficult, not just because ideologues squat on and take charge of articles, but because bias is sometimes hard to spot unless you know the subject. Moreover, bias is often reinforced by the broken and corrupt “Perennial sources” list (see Thesis 3). The public would eagerly point out unfair favoritism and omissions, if you let them.
  • provide an avenue for credible expert feedback. While a short and simple feedback form might be the default, a more complete form would allow the user to identify areas of expertise and academic home pages. With this information, developers might build a system of academic peer review. On more academic topics, this could spell the difference between the current system’s perpetual mediocrity (on some topics) and top-quality articles.
  • provide the basis for a universal peer review system. As explained above, if the reviews are posted publicly, according to a replicable, open standard, the same standard could spark a broader review system for the rest of the internet.

Wikipedians might not like the thought of the public being organized to freely discuss their work. They might resent Wikimedia Foundation funds being spent to support a system they do not control. But, unlike talk page discussion, such feedback would not be determinative or binding; Wikipedians would remain free to ignore their critics. This is as it should be. Still, there is a real need for Wikipedians to hear back from their public, both appreciative fans and angry critics.

In conclusion, there is no reason for the Wikipedia community to reject independent review. There is no shortage of sound approaches to a feedback system. It could help article quality greatly. It could also provide essential guidance to readers—and LLMs—who need to decide whether a Wikipedia article is actually trustworthy.

8. End indefinite blocking.

Wikipedia’s draconian practice of indefinite blocking—typically, permanent bans—is unjust. This is no small problem. Nearly half of the blocks in a two-week period were indefinite. This drives away many good editors. Permanent blocks are too often used to enforce ideological conformity and protect petty fiefdoms rather than to serve any legitimate purpose. The problem is entrenched because Administrators largely lack accountability, and oversight is minimal. The current block appeals process is ineffective; it might as well not exist, because it is needlessly slow and humiliating. These systemic failures demand comprehensive reform. Indefinite blocks should be extremely rare and require the agreement of three or more Administrators, with guaranteed periodic review available. Blocks should nearly always be preceded by warnings, and durations should be much more lenient.

The Problem

Trolls are a sad fact of life in online communities; their bad behavior makes blocking user accounts a basically universal necessity. If a well-meaning group of people wish to gather in an open community, troublemakers will invade. This is true on Wikipedia as on many other websites. There must be effective procedures for putting an end to disruption, or else such troublemakers will gobble up all the time people have to spend on a project. I and other internet community managers know this from long and hard experience. Trolls must be blocked.

Now, there is a debate over such claims, and that is one side. There is another side. As with other public websites, Wikipedia attracts people to spend many hours editing. This represents a major time investment. Especially if the investment is hundreds or even thousands of hours, it can be a massive annoyance—even traumatizing, to be a little dramatic—to find oneself locked out of an account. It can be distressing and unjust if you spend so many hours on Wikipedia and wake up one day to find your work inaccessible and at the mercy of others, who are perhaps deleting it all. It can be rather like someone’s favorite hobby being taken away from them forever, sometimes without any notice.[56]

But, a defender of Wikipedia’s system might say, this is hardly unfair if you are, indeed, a troublemaker. “Sure, it can be a massive annoyance,” the Wikipedian might say, “nobody’s doubting that, but if you’re deliberately breaking rules and making a nuisance of yourself, if you are ultimately wasting other people’s time, it can be perfectly reasonable to get rid of you. Bad actors have no grounds for complaint.”

Now let me share my considered perspective. As Wikipedia’s first “Administrator,” I spent some time kicking people out of the site—assorted vandals and trolls. So, actually, to some extent, I agree. If you are ultimately wasting other people’s time, it’s true: blocking can be well deserved. Some blocking actions are justified; Administrators are quite right to do some of the blocking they do. But what if I am a fairly decent contributor, and I wake up not just to a block, but to a permanent block? Some blocks even prevent me from writing on my own talk page. What if the only way I can continue editing the website that I love is to make a new account, without the privileges of the old one? What if it is discovered that I have returned to Wikipedia under a different account (this sort of second account is called a sockpuppet)—and then I am blocked again? Here, then, is the real issue: are indefinite blocks often deserved?

I have considered this at length. I now say no.

I remember people blocked in the earliest days who should probably be let back in—just to see if they’ve reformed. I mean, 24 years is probably long enough. Then again, some people wrote material so borderline problematic that for every hour they put in, someone else would have to spend at least another hour cleaning it up—they certainly wouldn’t do it themselves. Sometimes, half of what they wrote was mere opinion. Such people could have written substantial, decent material, but they refused. They deserved to be blocked.

But here’s the thing. Sometimes, such people grow up or reform. A teenager might get blocked for vandalism or bullying, and then, after a cooling-off period, they might be ready to contribute productively. They should have a second chance.

Yet, frequently, Wikipedia does block people forever: in a sampling I did in June 2025, 47% of 10,000 blocks (of both IP addresses and accounts) were indefinite, i.e., in effect permanent.[57] Typically, the offending accounts are permanently blocked because they really are nuisances, such as those blocked for advertising or vandalism. Another common category is those blocked as suspected “sockpuppets,” i.e., unapproved alternative accounts created by someone who already has an existing or blocked account. Such blocks, too, are sometimes justified, but sometimes not—Administrators can be mistaken.

So, at first glance, it might be hard to see why 47% indefinite blocks is a problem. I would invite you to consider that an indefinite block is akin to a life sentence, or permanent exile—without possibility of parole. In most cases, one cannot ever return in good faith. This has affected at least 170,000 accounts, by my tally, and a large number of IP addresses as well. It is hard to say how many distinct human beings this represents, but it is likely well over 100,000, and could be over 200,000. No small number of these individuals could and would write constructively for Wikipedia, if given a chance; instead, they have been deliberately excluded with prejudice. Again, in some cases, they deserved to be blocked. But—permanently?

Here is another aspect of permanent blocking to consider. If Wikipedia really wants to be an open project, its blocking system must not be usable to systematically exclude, de facto, whole categories of persons based on petty disagreements, power plays, or worldviews. Yet that is what is happening now. We cannot simply ignore this problem.

So, there are two common types of blocks that I object to: partisan and petty. Let me elaborate.

First, imagine two irreconcilable foes at war, and one side has largely captured the ranks of Wikipedia Administrators. This has happened, for example, in the case of the Israel-Gaza war, according to a widely reported Anti-Defamation League (ADL) study.[58] This makes it possible for representatives of one side to systematically exclude the other side, by permanently blocking them. Wikipedia seems to have taken a side in this particular war: The ADL is marked as “Generally unreliable” for reporting about the Israel–Gaza war (see Thesis 3), while a source on the other side, Al Jazeera, is greenlit as “Generally reliable.” What is the relevance of all this? In practice, what it means is that someone citing ADL contributions to the debate is, in fact, more likely to be reverted and blocked by Administrators. The same goes, of course, for blacklisted and deprecated conservative and libertarian sources such as the Heritage Foundation, Epoch Times, and Breitbart. Partisan blocking happens, and we must not pretend otherwise. Blocking—and especially indefinite blocking—can be and is used to ideologically purify the ranks of contributors. It is used, quite shamelessly, as a tool of gatekeeping.

Second, current blocking patterns show Administrators to be frequently petty and power-mad. Let me try to explain the relevant community dynamics. For one thing, consider how individuals use personal blocking tools on social media. There are categories of people that—just speaking for myself—I immediately block when I see them on social media.[59] Perhaps you used the Block Party and Block Together extensions with the old Twitter. People have, in short, different personal policies when it comes to blocking interaction with others on social media.

This same sort of refusal to deal with obnoxious players is also possible on Wikipedia—but only collectively. Because Wikipedia is collaborative, we cannot simply hide from each other; instead, only Administrators have the power to block. Think, then, what sort of social power that implies. Unsurprisingly, those who wield it frequently seem to be power-drunk. They need not be enforcing any partisan bias, either: blocking can be, and according to many complaints often is, due simply to petty power-tripping. In the manner of a gunman gone “trigger-happy,” some Administrators go “block-happy,” and in some exceptionally rare cases have even had their Administrator permissions revoked.[60] Three years ago, a Reddit thread in the Wikipedia subreddit asked, “Why are so many Wikipedia admins bullies?” It is a long thread, and bear in mind that this is hardly the place to find conservative perspectives. The (unedited) responses (from 2022 and later) are telling:

“OP is correct, Wiki mods are... special.”
“They def. harass and bully people. I’ve had it done and have seen way too many saying the same thing.”
“I recently fixed some small grammar and was accused of ‘disruptive editing’, told MY grammar was wrong and threatened with being blocked from further editing.”
“lol at least you got threatened... I’ve been banned multiple times for the same type of grammar fixes. No warning, no explanation on my talk page. Which I believe is itself a violation of their rules. It’s a cultish club, they care more about whether you’re One Of Them than what you’re editing and how.”
“I was banned last month because my account had an inappropriate username, and was told to create a new account. When I created the account with an appropriate username, I was immediately banned by a basement dwelling admin that simply commented ‘sock’ and banned me from replying. If Wikipedia continues to be this hostile to their own userbase, I am never using the site again, and will definitely not be donating for the foreseeable future.”
“In short, they are often people with zero social lives and self-esteem, and often have serious control issues. It’s the nature of the beast.”

Now, my point in listing these complaints is to establish—what is obvious to most observers—that some Administrators can be petty and power-drunk.[61] This means that when they block user accounts, their reasons can be irrational and unjust. And over 40% of blocks are indefinite—unlike on some other large online platforms (e.g., YouTube, Facebook, X), which often begin with time-limited suspensions. Such indefinite blocks appear to be little more than arbitrary displays of power, and against them there is no effective appeal to a more reasonable, higher authority.

In reply to these justifiable complaints, some Wikipedians will sniff that there is, in fact, an appeals process. Even those who are indefinitely blocked are supposedly free to take advantage of a “standard offer.” But, assuming one even knows that such an “offer” exists, consider what it involves. It has aspects that, together, ensure that hardly anybody will take Administrators up on it:

  1. Submitting an appeal according to this process requires expressions of deep, sincere contrition. This is ridiculous, however. Wikipedia is not a state and the appeals queue is not a criminal court. Being blocked is not a criminal sentence. From a blocked editor’s point of view, this is a project in which they have invested hours and hours of their time. What often happens, as I have gathered from many stories over the years, is that well-meaning people butt heads with Admins, and then the Admins pull rank, having no other grounds to stand on. From the Admin’s point of view, a political opponent is getting in their way; a do-gooder is adding facts inconvenient to their boss or client; or a hated public figure is trying to get some libel removed. The Admin loses patience and ends his annoyance by blocking the troublemaker. So let me ask you. If you were on the receiving end of such a block, would you want to admit wrongdoing publicly against some Admin who is, as it seems to you, “some basement-dweller, hack, or anonymous enemy” (as an unjustly blocked Wikipedian might feel)—particularly when you know the block was quite unwarranted?
  2. The “standard offer” requires a plea of contrition followed by the blockee waiting at least six months, not making any attempt to edit Wikipedia at all in that time. This is strictly ridiculous, as a friend of mine put it to me quite cogently. If you are the sort of good citizen one imagines editing Wikipedia (a virtuous volunteer) you’re not going to wait six months and then come crawling back. You’ll spend your limited free time on some other project and put Wikipedia behind you. That is, in fact, what happens. I have encountered dozens of disaffected Wikipedians who were blocked for trivial reasons. I hear from them often. Or, to take a different case, if you’re editing Wikipedia for pay (working for some corporation, government, PR firm, etc.), then, obviously, you’re not going to bother waiting for six months. You’ll jump through the technical hoops that you must, to ensure a new account appears to be quite different from the old one. You’ll take necessary measures in order to move forward.

That an appeal takes many months to be heard, and that those making the appeal must admit wrongdoing, both violate the principle of “innocent until proven guilty.” It takes just one Admin—not infrequently an involved party—to make a bad indefinite block. Then an excellent contributor might be nixed without any reasonable means of appeal, unlikely ever to return.

Especially for decent, rule-following types, this charade is intolerable. Many people are forever driven off by irrational blocking behavior; I don’t blame the many I have met. Who is motivated to wade in among the irrational, self-important, acronym-spouting editors—only to be watched and peremptorily tossed out, without appeal?

But for those who feel it is still important to edit Wikipedia, it is also easy to see how they might be driven underground, making sockpuppets (new accounts made to appear different from the old) and otherwise skirting the rules in order to participate. I have heard from any number of people who run sockpuppets. Wikipedia acts as if this behavior is always a “serious breach of community trust.” Sometimes indeed there are malicious sockpuppets, and I would not deny that. But, frankly, under the current regime, it can also be an understandable, even sometimes legitimate response to out-of-control blocking.

The Reasonable Solution

The solution is to adopt a set of revised rules and guidelines for blocking:

  1. Warnings required: Except in (again) truly rare and exceptional cases, no block of any length should be issued before one clear and formal warning is given. Some quite constructive accounts are blocked by Administrators without a single warning. This is not acceptable. In other words, everybody gets one free “learning opportunity” to behave badly. They receive a warning, not a block, for that incident.
  2. Indefinite blocking requires at least three Admins: Remove the one-click indefinite (or permanent) block option for Administrators working alone. Indefinite blocking should be done only after agreement by a group of three or more Admins; perhaps the way it would work is that one Admin requests a block, and if two others sign on within a short time period, the block is implemented. Sockpuppet blocking is an understandable exception, although this three-Admin rule must be respected for the “master” account.
  3. Indefinite blocking only after three strikes: Generally, indefinite blocking should be done only on the third block or later—never on the first or even the second, except for sockpuppets (not including master accounts of sockpuppets) and in special cases such as when laws are violated, e.g., child pornography, stalking, and physical threats. The previous two blocks should have been in the space of, perhaps, six months. In other words, one’s blocking “record” is cleaned by the passage of time.
  4. Make indefinite blocking very rare: Indefinite blocking should only be carried out for instances of clearly delineated and objectively describable situations, such as serious harassment, physical threats, and certain classes of felonies. You might add “sockpuppets” to the list, which would make indefinite blocking common—except that the rest of the rules should make sockpuppetry itself less common. In general, editors should be allowed one main account of their choice, with very rare exceptions.
  5. Speed up appeals: The first appeals should always be available and heard in a timely fashion by an Admin different from the blocking Admin. (The “standard offer” essay, with its long waiting period, should be entirely annulled.)
  6. Adopt an annual (or more frequent) parole hearing: Even those indefinitely blocked should be able, if they prefer, to return to a special page annually for a “parole hearing,” beginning one year after their most recent block or appeal. Relevant circumstances can change. This should not require expressions of contrition, but only resolutions not to cause similar problems in the future. Even this should not be required in those cases where the “parole board” determines that a block was unwarranted. Again, this rule might not apply to sockpuppets.[62] If the WMF begins paying Administrators, then it might make sense for the frequency of appeals for indefinite blocks to be greater—even, say, every 60 days.
  7. Appeals should be heard by different Administrators. For the sake of a fair and meaningful appeals process, Admins other than the ones who did the blocking should hear appeals.
  8. Grant broad amnesty: All individuals blocked for sockpuppetry might identify themselves and select one, and only one, of their permanently blocked accounts to revive—one amnesty account per individual. If an individual chooses to disclose past sockpuppets, the amnesty would allow the editor to keep their current account; past accounts would be associated with the retained account and would remain blocked. If sockpuppetry were discovered outside the amnesty window, no such courtesy would be extended.
  9. More lenient first and second sentences: In the typical case, on the first block, the longest possible sentence should be one month. On the second blockable offense, three months. Lesser block lengths should be much more common.
  10. Block-happy Administrators should be retired: The procedures of the relatively new “Administrator recall” system should be fully supported and used more frequently.
  11. Administrators, like police officers, must be trained in patience: Administrator impatience is not an excuse for permanent blocking. The purpose of blocking in Wikipedia is not the convenience of individual Administrators, but the smooth editorial operation of the project as a whole. Always consider that blocking a productive editor means depriving Wikipedia of useful content.
  12. More Administrators should be found if Administrators cannot keep up with the workload of properly adjudicating cases; as discussed in Thesis 6, stipends could be offered in order to make this happen, if necessary.
* * * * *

Here are some notes about the proposed rules.

The resulting general proposal is not to add to the bureaucracy and complexity of rules, but to reform a system that is already complex.

The broad exceptions suggested for sockpuppets are added here on the theory that there would, in fact, be many fewer sockpuppets under a more reasonable blocking regime.

Under this revised set of rules and principles, gatekeeping would be harder; that is a good thing. Consider that older accounts have more of the de facto power that depends on seniority. Consider also that those who practice the present unfair—arguably fraudulent—methods of gatekeeping Wikipedia make a regular practice of blocking dissident accounts, precisely in order to prevent such dissidents from building seniority. The above set of rules and principles militates against that.

These are reasonable and incremental changes, and yes, they would require more work from Administrators; but this is work they signed up to do, and it is important work, because it goes directly to the motivation of a wide variety of people. It spells the difference between a much larger body of contributors and the present, restricted group. Wikipedia’s ability to deal swiftly with malicious vandals and trolls would be left almost entirely intact; they could be shut down nearly as quickly as before. Importantly, the trollish behavior of Administrators themselves would be substantially restricted. If the new situation became overwhelming, the community could remove editing rights from unregistered, IP address-only accounts. That would surely reduce the workload and let Administrators give registered accounts the time and attention they deserve.

If there really is a shortage of suitable Administrators, then, as we said under Thesis 6, the Wikimedia Foundation might consider paying Administrators.

In short, if Wikipedia is to remain a genuinely open project, it must immediately end the current norms of indefinite blocking.

9. Adopt a legislative process.

Wikipedia’s processes for adopting new policies, procedures, and projects are surprisingly weak. The Wikimedia Foundation (WMF) has launched initiatives, but these do not establish major editorial policy. Incremental policy tweaks cannot deliver the bold reforms Wikipedia needs. No clear precedents exist for adopting significant innovations. The project is governed by an unfair and anonymous oligarchy that likes things just as they are. This stagnation must end. Wikipedia needs an editorial legislature chosen by fair elections: one person, one vote. To establish legitimate and fair governance, the WMF should convene a constitutional convention to create an editorial charter and assembly. This assembly would be empowered to make the sorts of changes proposed in these “Nine Theses.”

The Problem

Wikipedia has changed. Having originated or overseen many of the project’s first policies and procedures, I view the current system as a cargo cult: several policies have changed into forms different from, or even contrary to, their original function.

The first eight theses describe some of the insanity. (1) Decision-making is done by an ersatz “consensus” that pretends to speak for a massive global community, while silencing dissent. (2) A once open, tolerant, global community is now dominated by Establishment commissars. (3) Many sources are blocked en masse for ideological reasons. (4) The neutrality policy itself has been upended, mocking and forbidding actual neutrality. (5) People take “ignore all rules” far too seriously. (6) Editorial leadership is anonymous, even after the project has become one of the most influential media properties in history. (7) Wikipedians disdainfully ignore public feedback. (8) Finally, Administrators routinely block accounts unjustly, tossing out serious editors peremptorily, because they can’t be bothered to deal with inconvenient participants.

What a mess.

These nine theses are my Hail Mary proposal to reform Wikipedia. These things needed to be said. Perhaps, perhaps, there will be some changes—but my hopes are not high. I have proposed many other fixes and tried alternative projects over the years, but there are in fact a number of people, including some very influential people, who like things just as they are. I know very well that, sadly, the system is unlikely to change very much. If it does change, it will probably be through external pressure, or pressure from the hitherto quiet rank-and-file.

All of that said, I am confident that most if not all of these nine theses would be popular among most of Wikipedia’s readers. The proposals are a matter of common sense, appealing to common civilizational values and sound rules of editing and publishing, and are absolutely consistent with a wide variety of political and religious views.

Still, if Wikipedians were to rally around any of these, one massive problem would remain: There is no legitimate, well-established way to ratify significant reforms. Suppose some of the more respected voices in the Wikipedia community were to agree that the project should repeal "Ignore all rules". How would they implement such a change? How would they prove a new "consensus"? Twenty-four years after the rule was instituted, who would be properly authorized to declare a "consensus"? It is not clear.

Well, how have structural changes been made to Wikipedia in the last 24 years? In three ways, I think.

1. The Wikimedia Foundation proposes a change, and it sticks. One example that comes to mind is the Universal Code of Conduct. But this appears to be an exercise in self-protection by the WMF’s legal team, with little impact on day-to-day operation. Other attempts are sometimes objected to by the community, then abandoned.[63] The most positive example here is the VisualEditor tool—useful, but not a fundamental change in editorial policy or governance. It was promised for many years, rolled out, and then finally made opt-in. So this seems like a possible approach. But some of the louder voices in the editor community resent even such modest efforts by the WMF as overreach.[64] So, the WMF has avoided making significant new changes over the complaints of the editor community.
2. Iterate and rewrite old rules. We have already seen several examples of this. The most remarkable examples are policy pages, such as
  • Neutral point of view, discussed under Thesis 4. It was entirely rethought.
  • Notability, which began as a simple and expansive set of broad principles, consistent with what is called inclusionism (a preference for a larger set of article topics). But inclusionism gradually went out of favor. As the years went by, fewer people and topics merited an article of their own, so that, now, deletionism (a preference for a narrower set of article topics) is the dominant view.
  • Some of the biggest policies introduced after I left were Verifiability, which appeared in 2003, and Reliable sources, which did not emerge until 2005. These are an uneven mix of sensible rules, conveniently vague principles, and boneheaded regulations. Eventually, this free-floating body of policy added some real shockers—for example, regarding “consensus” blacklisting of valuable media sources about which there necessarily could be no consensus (see Thesis 3).

From these examples it can be seen that iteration has led to major changes; the massive changes made to the neutrality policy are the best example. But such changes require many years, and they are not necessarily changes that anybody would have actually agreed to, had the matter come to a vote early on. Slow-moving chaos is, after all, the nature of such change, not unlike a years-long game of “Telephone.”

3. Somebody writes an “essay,” and people start citing it as if it were policy. Indeed, many essays now are cited as de facto policy.[65] You can skim through the following sample if you like:
  • A good example is “The duck test,” which is a rationalization that Administrators use to block accounts that they suspect, but have not adequately proven, are actually sockpuppets. Though just an essay, this is cited frequently to block accounts permanently—even when accounts are never proven on rigorous, technical grounds to be sockpuppets. Circumstantial evidence becomes sufficient and is often used for partisan gatekeeping. (See Thesis 8.)
  • Then there is “Arguments to avoid in deletion discussions,” a long list of vague, unofficial rules about how to argue for and against deletion of articles and text. An oft-cited section is commonly abbreviated as “WP:OTHERSTUFFEXISTS.” This is frequently invoked, in a really shameless way, to justify inconsistency across articles.[66]
  • Another essay makes a virtue of the sheer meanness of many entrenched editors: “Wikipedia is not therapy.” If you complain about the stress that Wikipedia conflicts are causing you, some petty editor may cite this rule at you.[67]
  • Here is an essay that elevates a practice that I disagree with rather sharply: “Why most sentences should be cited.” I find the extreme proliferation of footnotes absurd, especially when the footnotes are unnecessary—for example, when they support uncontroversial, commonly known facts.[68]
  • Finally, “Assume good faith” is one of the silliest, most misbegotten, and oldest essays that became a “guideline” in Wikipedia-land. Guidelines function as a sort of “policy lite,” but they are typically treated as binding. In effect, “Assume good faith” requires editors to pretend to be naïve; this disables legitimate and serious criticism of manipulative behavior. The “rule” empowers sociopaths. You must not question their motives; you must not notice their subtle insults; you must not observe that your treatment is Kafkaesque; you must participate in the pretense that they are not engaged in a stupid game. The guideline makes Wikipedia function as a humiliation ritual.

These, then, are examples of how opinions, first expressed in essays, can be arbitrarily dignified as a policy or “guideline.” In this way, essays are quite good at allowing individuals and small groups to make policy tweaks. But they rarely if ever accomplish wholesale change.

In short, there is nothing in place to make significant changes. With the occasional exception of Wikimedia Foundation proposals, there have been few, if any, significant changes after 2006 or so that were not directional tweaks or gradual iterations of existing rules. There is simply no precedent for Wikipedia to embrace radical new ideas like ending decision-making by consensus or enabling competing articles.

There is a striking absence of formal governance structures. The Arbitration Committee is the closest thing that exists, but its function is judicial rather than legislative. I noticed this lack of formal governance long ago.[69] I never took the opportunity, while at Wikipedia, to start anything like a legislature that could authorize new projects, policies, and procedures.[70] Wikipedia is aware of this state of affairs, but rarely treats it as a problem. There was a discussion about formalizing governance in 2008, under the heading “Governance reform,” but nothing came of it; it was relegated to the dustbin of wiki-history.

A message put atop Wikipedia’s 2008 “Governance reform” page.

There is also such a thing as “WikiProject Democracy,” but this is just a small group of interested individuals who set up a page pointing to a collection of functions on Wikipedia that happen to be, roughly speaking, democratic. The platform has a fair few procedures driven by input and “votes,” but no process for adopting big new policies and projects. If you ask Wikipedians at present, they will tell you that any big changes would require “consensus,” which is a vague and unworkable idea in itself (see Thesis 1), and laughable as a proposal about how to make significant changes to the rules.

Nevertheless, many Wikipedians seem quite comfortable with how things are at present—even despite how evidently broken the system is. This is not surprising, because those at the top of the system are comfortably ensconced. This is consistent with sociologist Robert Michels’s iron law of oligarchy.[71] As Michels put it:[72]

Organization implies the tendency to oligarchy. In every organization, whether it be a political party, a professional union, or any other association of the kind, the aristocratic tendency manifests itself very clearly. The mechanism of the organization, while conferring a solidity of structure, induces serious changes in the organized mass, completely inverting the respective position of the leaders and the led. As a result of organization, every party or professional union becomes divided into a minority of directors and a majority of directed.

Consequently, we can anticipate that Wikipedia’s “aristocratic” leadership will oppose giving new types of democratic power to the rank-and-file. Probably, there would have to be something like a popular uprising that, as it were, seizes power. How such a popular uprising might work, in the context of Wikipedia, is very hard to say. (And, no: I will not be leading such an uprising.)

Underlying Wikipedia’s avoidance of democratic governance, there is one fundamental, structural problem: true democracy requires one person, one vote. And that, in turn, requires that Wikipedia user accounts—at least some of them—be paired reliably with real human identities. In other words, if you want to vote on important matters, we must know that you have just one vote; for that to be the case, somebody we trust has to know who all the voters are.

For some Wikipedians, this is a bridge too far. They would never agree to any system that puts their anonymity at risk. But this is not, actually, a sound reason to object to the plan, since identities need not be public. The confirmation can be made privately by sufficiently trustworthy people. Moreover, not all accounts need to be voting accounts. Generally speaking, if you are unwilling to reveal your identity to anyone, you will have no leg to stand on if you try to defend a right to vote.

I can imagine Wikipedians taking issue with this. They might say, and I would agree, that voting should be restricted to active accounts. But then they might propose a “clever” addition. Namely, if an account has not been outed as a sockpuppet,[73] then we should “assume good faith” and treat it as a distinct individual for purposes of voting. The problem here is that we really have no clue as to what proportion of accounts are, in fact, sockpuppets. I suspect that there are many more of them than Wikipedians ordinarily assume. If the percentage of accounts that were sockpuppets were in the low single digits, perhaps the rule would be acceptable. But probably not. After all, if some voting scheme were adopted according to which each active account had one vote, then the number of active (successfully hidden) sockpuppets would most certainly increase, for voting purposes if nothing else. “Assume good faith” is simply an insane rule to follow when it comes to the high stakes of voting.

This might seem to be an impossible problem for Wikipedia, but insofar as we are talking about a serious legislative body, there is a simple and practical solution to it; see below.

Like some other theses, such as Thesis 1, Thesis 5, Thesis 6, and Thesis 7, the present thesis would have Wikipedia grow up, join the real world, take responsibility, and win back some of the legitimacy that it has—sadly, but deservedly—lost over the years.

The Reasonable Solution


Without a means of legitimizing major changes through an online plebiscite, it is hard to know how Wikipedia might come to adopt any of the nine theses. So, this is perhaps the first thesis to pursue in practice. Discussion could be launched by grassroots Wikipedians or by the Wikimedia Foundation Board of Trustees. And the first things to discuss, I would say, are these five goals:

  1. Wikipedia should adopt rules for a constitutional convention. Such a convention would set the rules for the democratic governance of Wikipedia qua editorial organization (not as a legal entity, since the WMF is already legally constituted). But first, there would need to be rules for conducting and ratifying the convention itself. While the convention rules could be drafted collaboratively, on the wiki, I do not think there is any way to legitimate the adoption of the rules on the wiki. In my opinion, only the Board of Trustees could legitimately adopt the rules that would govern the constitutional convention.
  2. The primary goal of the convention is to settle on the method and procedure for a Wikipedia editorial assembly. It might accomplish many other things, including the adoption of one or more of the theses proposed, but what is most needful is the establishment of a legislative body, which I would call an assembly. Its defining rules would address how to elect members; how often they would meet; what their scope would be; term lengths and limits; what procedures the body would follow; the procedures for calling a plebiscite; the openness of deliberation; and other matters.
  3. Both the constitutional convention and the assembly should conduct business at face-to-face meetings paid for by the Wikimedia Foundation. Among other advantages, this solves the problem of ensuring one person, one vote, at least in assembly voting.
  4. The unique identity of those voting for members of the constitutional convention and the assembly should be confirmed. The Board of Trustees, with advice from the community at large, should adopt a system that requires that the identities of voting accounts be known to some small, diverse body of trustworthy individuals,[74] preferably from outside of Wikipedia and the WMF, so that the voting is conducted according to the essential democratic principle of “one person, one vote.” Those Wikipedians unwilling to prove their real-world identities (and ownership of associated Wikipedia accounts) would be able to continue participating in the wiki as usual, but they would not be able to vote for members of the convention or the assembly, or in plebiscites.
  5. A first order of business should be the rewriting of policies and the adoption of policy pages, guideline pages, and essays as “official policy” or not. If there is some important distinction between “policy” and “guideline,” it must be formally defined and adopted by the convention or the assembly. Over the years, many details of policies and especially of guidelines have piled up, which should probably be simplified or deleted. The sheer complexity and even incoherence of some areas of Wikipedia policy and guidelines is a much-bemoaned problem: the assembly would be the place to fix such problems. Finally, the vast majority of essays should be deprecated. I strongly advise the assembly to require that all essays, henceforth, be made subpages of their primary authors' “user pages.” Any essay intended to become official policy should be explicitly adopted by the assembly. A rule should be adopted to the effect that editors should not cite essays in arguing for their edits, or, if essays are cited, that they be given no weight.

It can be anticipated that the existence of a democratic representative assembly will lead to factionalism and new problems. But that, in my opinion, is an improvement over the stodgy, oligarchical status quo and a small price to pay for restoring common sense and democratic legitimacy to one of the world’s most powerful media platforms. Perfect unity is not the goal; legitimate governance is.

I should address a few potential objections to this. First, some might say that this proposal threatens to make the project more of an oligarchy than it already is. That would be a puzzling response, considering that Wikipedia right now is not a democracy even in concept: the proposal is to make it more of one, in part because it is already very much an oligarchy. Besides, until we know who the actual leaders of the project are (by acting on Thesis 6: “Reveal who Wikipedia’s leaders are”), we cannot really know the extent to which it is currently oligarchical. Many of those with their ears to the ground have assured me in recent years that the number of voices that really “matter” on Wikipedia is shockingly small. All told, the number of influential accounts is, perhaps, 600 to the low thousands.[75]

Second, some might say that this proposal would violate the principle “Wikipedia is not a democracy,” as if this were an inviolable bedrock principle. It is not, of course. It was not an original principle of Wikipedia, I can assure you, and the suggestion (as the just-linked policy page states) that decisions are to be made by “consensus” is bankrupt.

In any event, I and many others who originally developed the project always thought of Wikipedia as essentially democratic. The project now has the funding—and potentially the organization and maturity—to make it actually democratic, as it should be. This is the necessary first step to making Wikipedia finally a just, neutral, and robust community, capable of reliably summarizing everything that humanity knows.

- end -

Footnotes

  1. ^ The internet history wonks might want to dig into the original wiki, WikiWikiWeb, founded by Ward Cunningham. In particular, see WikiWikiWeb’s discussion of “DocumentMode,” which is very roughly like an encyclopedia article. On this and similar early wikis, the community would build pages collaboratively, first talking things out in “ThreadMode,” as in a discussion thread. Then, when a “consensus” was reached—and this was the word used, as in “rough consensus and running code”—somebody would go in and “refactor” (another term borrowed from computer programming) the page into something more like a document and less like a conversation. Then, the page would be in DocumentMode. Note that WikiWikiWeb looked askance at “Phony Community Consensus” (see the section of this page). It was not cool to pretend there was a consensus when there wasn’t one.
  2. ^ See WP:SYNTH. To be clear, this is contrary to the policy page, even as it is now stated. Such an offending “synthesis” is supposed to be an actual new inference; but sometimes, simply enumerating a series of views is wrongly misrepresented as such a “synthesis.”
  3. ^ Such as those discussed in Thesis 6, or just any editor with a long history and high number of edits.
  4. ^ See, for example, Ashley Rindsberg, “Wikipedia Editors Are in Open Revolt over the American Pope,” Pirate Wires, May 9, 2025. It seems there have been chaotic, petty disputes on the “Pope Leo XIV” article’s Talk page over simple biographical facts: Is Pope Leo “American”? Peruvian? Black? Wikipedia’s once-collegial spirit has certainly given way to adversarial point‑scoring. How on earth can Wikipedia say with a straight face that any resolution to such interminable wrangling represents a “consensus”?
  5. ^ Here are some words that more honestly describe the result of the currently broken process: prevailing outcome, established outcome, editorial resolution, settled version, dominant opinion, final judgment.
  6. ^ Wikipedia-logo-v2-en.svg by Wikipedia. License: CC BY-SA 4.0, via Wikimedia Commons.
  7. ^ Of course, by this, they must be referring to the original Christianity of the countries from which most English Wikipedia contributors hail. But it is no great stretch to say that not many Wikipedians actually believe the tenets of orthodox Christianity. We will discuss this further presently.
  8. ^ This may come as news to some old Wikipedians who knew me “back in the day.” I converted in 2020 and told my conversion story last winter.
  9. ^ Footnotes and links are removed from the following quotation for readability.
  10. ^ Yes, even Muslims, and this matters, because, according to them, Allah is another name given in Arabic to Yahweh, and the origin of his worship was with his revelation to Abram explained in Genesis 12: Muslims agree with Jews and Christians on this. So, Wikipedia’s editors are contradicting religious scholars and rank-and-file believers of all three of these religions.
  11. ^ This may be said to be true even if the believers more often use other names for God.
  12. ^ Many examples of bias are listed in Thesis 4.
  13. ^ Ashley Rindsberg gives an in-depth treatment of the “Perennial sources” list and its history in “How Wikipedia Launders Regime Propaganda,” Pirate Wires, Aug. 29, 2024.
  14. ^ On the notion that there is a much greater latent community, which is not represented by the current Wikipedia community, see point 4 above, “Alienates conservatives,” as well as Thesis 1 and Thesis 8 .
  15. ^ See also the Nupedia policy of 2000 (archived copy).
  16. ^ For these examples, I want to thank the participants in this X thread.
  17. ^ The article is now titled “Grooming gangs scandal” and was nominated for deletion at least twice. See also Rotherham child sexual exploitation scandal. Note that the thing that is worth documenting in an encyclopedia is not the scandal per se but the actual criminal activity, which continues to this day. By, in the very title, reducing these topics to the associated (merely embarrassing, or tortious) “scandals” they involve, Wikipedia leaves the reader in doubt as to whether there were, and are, in fact, patterns of organized rape of English girls. Yet that is well-established and not controversial.
  18. ^ I use the past tense, but the fact is that there is much reason to think this sort of thing continues aggressively to this very year. See this speech and this one (by government officials) as well as this news report on the launch of a new investigation.
  19. ^ There are articles about the movement, of course, that do not dismiss it as “conspiracy theory.” See “Marxist cultural analysis” and “Critical theory.”
  20. ^ Critical commentary has appeared in Psychology Today and in discussion forums like Hacker News.
  21. ^ To be clear, Wikipedians defend their (frankly biased) position based on the notion that they follow the “majority of” mainstream sources. But, of course, what the “majority” of sources looks like depends on what the reference group is.
  22. ^ For representative coverage of the Hindu complaints, see this Wired article, a long OpIndia “dossier” (see larrysanger.org for a link disallowed on Wikipedia), and this report of the Indian government asking why Wikipedia shouldn’t be regarded as a publisher. For representative coverage of Jewish complaints, see this ADL report, this letter by U.S. representatives to the Wikimedia Foundation, and this video; see also Ashley Rindsberg, “How Wikipedia’s Pro-Hamas Editors Hijacked the Israel-Palestine Narrative,” Pirate Wires, Oct. 10, 2024.
  23. ^ Another good example can be found here: Ashley Rindsberg, “Protest or Riot in LA? Wikipedia’s Editors Decide,” Pirate Wires, June 9, 2025. Wikipedia invariably prefers “protest” in place of “riot” even when it’s really a riot.
  24. ^ In the technical terms of logic, the concept of neutrality is scoped, or it ranges over a domain. Thus, a statement, or a whole article, is neutral with respect to a range of opinion on one or more questions.
  25. ^ To be sure, there are British people who are strongly opposed to this state of affairs, and their opinions are not worthless. That just illustrates the fact that scope can shift as history shifts, and can be wider if resources—and intellectual honesty and tolerance—permit.
  26. ^ Hube, Christoph (2017). “Bias in Wikipedia”. Proceedings of the 26th International Conference on World Wide Web Companion – WWW '17 Companion. New York, New York, US: ACM Press. pp. 717–721. Also, Samoilenko, A., Karimi, F., Edler, D., Kunegis, J., & Strohmaier, M. (2016). Linguistic neighbourhoods: explaining cultural borders on Wikipedia through multilingual co-editing activity. EPJ Data Science, 5, 1-20.
  27. ^ To be clear, the point is not that all Wikipedia articles on religious topics assume that the religions are false. Some simply report doctrine. They can do so in a way that is reasonably fair, at least in individual sentences and paragraphs. The point, rather, is that many Wikipedia articles on religious topics contain statements, made in Wikipedia’s own voice (i.e., the claims are not attributed to someone else), that logically entail that Christianity, and other religions, are simply false.
  28. ^ Puyu Yang and Giovanni Colavizza, “Polarization and Reliability of News Sources in Wikipedia,” CoRR abs/2210.16065 (2022), published in Online Information Review 48, no. 5 (2024): 908–925. Greenstein, Shane, and Feng Zhu, “Do Experts or Crowd-Based Models Produce More Bias? Evidence from Encyclopædia Britannica and Wikipedia.” MIS Quarterly 42, no. 3 (September 2018): 945–959. Eduardo Graells‑Garrido, Mounia Lalmas, and Filippo Menczer, “First Women, Second Sex: Gender Bias in Wikipedia,” arXiv, February 9, 2015.
  29. ^ "How the Regime Captured Wikipedia,” Pirate Wires, Aug. 5, 2024.
  30. ^ Perhaps the irony will not be obvious to some. Let me spell it out. Maher criticized the original Wikipedia as being too focused on the interests of white male Westerners. This is strictly bullshit: from the beginning, Wikipedia encouraged, and got, history, religion, philosophy, etc., from around the world. It became famous in part for its sheer quirkiness. This is what true globalism looks like (only more so). The likes of Maher and other progressives imagine, ludicrously, that their cloistered world represents the interests of the entire globe. Wikipedia really did aspire to represent the interests of the globe, and that means it had to be a messy, unruly thing. Maher probably wouldn’t have liked it; it wouldn’t have been progressive, because most of the world is profoundly conservative in a thousand different ways. Another irony is the suggestion that Wikipedia pushed, as the voices that “matter,” those of white males, because of Wikipedia’s policy on reliable sources. Yet in the wake of Maher’s push for progressivism, the number of acceptable reliable sources has greatly narrowed to those of a small, heavily white Western elite.
  31. ^ Along the same lines, imagine that someone was in the public eye and had a Wikipedia article. Then imagine the person transitions, but there is no mainstream reportage on the transition, because the person is no longer deemed newsworthy (for whatever reason). The person is now “deadnamed” by Wikipedia, with no one reporting the new name. If social media and blogs could be cited, such a problem could be solved. But it is not clear that, in such a case, those would be regarded as adequate sources.
  32. ^ For example, if only the Times reported that Russia had offered bounties to Taliban fighters to kill U.S. troops in Afghanistan, then it would be appropriate, of course, to attribute this claim explicitly: “According to reporting found only by the New York Times, citing an unnamed source, Russia had offered bounties...”
  33. ^ On which, see Thesis 3.
  34. ^ Joyce, E., Pike, J. C., & Butler, B. S. (2012). Rules and Roles vs. Consensus: Self-Governed Deliberative Mass Collaboration Bureaucracies. American Behavioral Scientist, 57(5), 576–594. https://doi.org/10.1177/0002764212469366
  35. ^ This is, however, contrary to WP:WIKILAWYERING, which is a perfect example of a squishy guideline: something often treated as a “rule” that is, however, only selectively applied by insiders.
  36. ^ The extent to which powerful accounts are actually working for pay is unknown and, it may be argued, basically unknowable.
  37. ^ Note, the number of Administrators has been declining for years. In 2021, the number was over 1,100. Power has been concentrated in fewer hands.
  38. ^ Except, perhaps, for detective types. Every one of the “Power 62” might be identifiable by a sufficiently motivated sleuth; I am not sure. What I do know is that many such people do take considerable care in hiding their personally identifiable information, and some of them are technically savvy enough to make the task very difficult.
  39. ^ This is probably because they are the inheritors of the style of wikis, which were filled with 1990s gamer types; the earliest Wikipedia reflected their quaint online culture.
  40. ^ This is not merely hypothetical, but has been documented dozens of times by Wikipedia itself. See List of citogenesis incidents.
  41. ^ To be clear, the person who made up the accusations was named Brian Chase; as the AP reported in December 2005, Chase had the admirable decency to admit to simply making it up; he reached out and apologized to Seigenthaler for what Chase characterized as “a joke that went horribly, horribly wrong.” Kudos to Chase for that.
  42. ^ On Wikipedia’s silly rules regarding primary and secondary sources, see Thesis 3 and Thesis 4.
  43. ^ For much more about this, in fascinating detail potentially relevant to attorneys general and plaintiff’s counsel, see Ashley Rindsberg, “How Wikipedia is Becoming a Massive Pay-to-Play Scheme,” Pirate Wires, October 7, 2024. Rindsberg introduces the flourishing industry of paid Wikipedia editing, in which both “black-hat” and “white-hat” firms shape articles for clients ranging from corporations to media executives.
  44. ^ See “Wikipedia bans 381 accounts for secretly promoting brands,” Wired.com, posted Sep. 1, 2015.
  45. ^ One other thing exacerbates this problem. Wikipedia features strangely overzealous rules against conflict of interest. This incentivizes people to edit anonymously, forcing their self-defense underground. This further contributes to an underground market for Wikipedia editing that might not exist but for these unreasonable rules.
  46. ^ This would extend to for-profit and nonprofit corporations, governments and branches thereof, educational institutions, church organizations, sports and other clubs, etc.
  47. ^ Of interest, in relation to both the problem and the solution, is this old blog post: I have been talking about this problem since at least 2012.
  48. ^ If it should happen, however, that Wikipedia fails to adopt any of these proposals, then the natural next step is for browsers and browser extensions to insert such messages onto Wikipedia pages—being sure, of course, to clarify that the messages are generated by the browser rather than by Wikipedia itself.
  49. ^ A broader case might, possibly, be extended to include election interference.
  50. ^ For example, Ruprechter, Tobias, Santos, Tiago, and Helic, Denis. “Relating Wikipedia Article Quality to Edit Behavior and Link Structure.” Applied Network Science 5 (2020): 61. https://doi.org/10.1007/s41109-020-00305-y. Also, Kane, Gerald C., and Ransbotham, Sam. “Collaborative Development in Wikipedia.” arXiv preprint arXiv:1204.3352, April 16, 2012. https://doi.org/10.48550/arXiv.1204.3352.
  51. ^ This is not always the case, of course; even today, some Wikipedians are friendly, polite, and helpful. But if you often have to deal with bad eggs just to leave feedback, will you?
  52. ^ Such shabby treatment persists in spite of the commonly cited essay Wikipedia:Please do not bite the newcomers. They cannot seem to help themselves.
  53. ^ Community Notes, as of this writing, still frequently cites Wikipedia as a source, which Elon Musk and I think is a mistake.
  54. ^ A problem that goes back to the first year of the project.
  55. ^ Globalist Academic Secular Progressive: introduced in Thesis 2 and elaborated in Thesis 4.
  56. ^ This can be profoundly depressing and, yes, traumatic. The Wired story of “Elliott” is not the only example in which a formerly devoted, intelligent editor was forever blocked, causing deep distress. In his case, it almost resulted in suicide. See Andrew McMillen, “Wikipedia Is Not Therapy!,” Wired (August 15, 2016). Some relevant quotes from the article: “After reviewing the conflict, a site administrator decided to ban Elliott on that Tuesday night. ‘Given the seriousness of this conduct, I’ve set the block duration to indefinite,’ noted the admin.” “He pulled out his iPhone and started typing a lengthy email. Titled ‘The End’ and sent to a public Wikipedia mailing list watched by thousands of people around the world, late on the evening of Tuesday, May 17, Elliott’s email begins, ‘I’ve just been blocked forever. I’ve been bullied, and I’m having suicidal thoughts.’” “Someone will pop up and say, ‘It’s not therapy — just block them!’ Where is the empathy? Where is the spark of feeling for your fellow person?” “With his IP address blocked from editing the site indefinitely, Elliott has no choice but to become just another casual visitor; a tourist unable to effect change. It’s now clear to him that when it comes to Wikipedia, he might be better off as an outsider, looking in.”
  57. ^ From 07:38, 30 May 2025 through 15:53, 12 June 2025 (UTC). Some of these might have been outliers due to shutting down vandals. Dropping the two highest outliers (of groups of 500), the number was still 42%.
  58. ^ See “Editing for Hate: How Anti-Israel and Anti-Jewish Bias Undermines Wikipedia’s Neutrality,” posted March 18, 2025.
  59. ^ Two categories for me are porn and satanist accounts.
  60. ^ This was possible due to a relatively new process of Administrator recall, in which 25 extended-confirmed editors must sign on to a recall petition; see Wikipedia:Administrator recall.
  61. ^ This can be a problem with the Wikimedia Foundation itself, at times, as in the case of the 2019 “Fram ban.” Without public explanation or due process, the Wikimedia Foundation Office summarily banned Administrator “Fram” for one year, bypassing ArbCom. Critics charged that the action was tainted by conflicts of interest, especially because complaints were tied to insiders like Laura Hale, who was then in a relationship with WMF Board chair María Sefidari. See Ashley Rindsberg, “How the Regime Captured Wikipedia,” Pirate Wires, Aug. 5, 2024.
  62. ^ The rule should not apply to sockpuppets unless a user would prefer an account that was blocked for being a sockpuppet, rather than a later account that has not yet been caught.
  63. ^ An example of that was explored by Thesis 7: the public rating system.
  64. ^ An example of this is when the WMF banned an admin, “Fram.” See Ashley Rindsberg, “How the Regime Captured Wikipedia,” Pirate Wires, Aug. 5, 2024.
  65. ^ This is another instance of Wikipedia acting like a cargo cult. The very idea of “essays” emerged from my own practice, in 2001, of spending 15–30 minutes a day unburdening myself of random thoughts and putting them on my user page. Later, these essays were moved to my user page on meta.wikipedia.org, where you can still read them.
  66. ^ In other words, an editor wants to make an exception to what really ought to be a general principle that holds sway across many existing articles. If you argue that other articles have features similar to the one under dispute, then this “policy” (which is only an essay, not actual policy) will be cited. The fact that “other stuff exists,” i.e., articles that follow a pattern, does not mean that the pattern should be followed in the present case. Citing the essay is supposed to end the dispute; in the hands of influential editors, it often does.
  67. ^ Cf. Andrew McMillen, “Wikipedia Is Not Therapy!,” Wired (August 15, 2016). This is the story of “Elliott,” who almost committed suicide after his account was permanently blocked, on which, see Thesis 8. I feel compelled to point out that this is yet another example of how Wikipedia changed—for the worse. Until the fall of 2001 or so, Wikipedia really was open, welcoming, and fairly friendly. It gradually became less so. On my “user page” (i.e., my public presence on the wiki), for many years, I have had these two pieces of advice: “May you continue to be open and warmly welcoming, not insular, ... [and] to show the door to trolls, vandals, and wiki-anarchists, who if permitted would waste your time and create a poisonous atmosphere here.” Wikipedians, who take their nasty and tiresome game-playing very seriously, clearly did not heed such injunctions.
  68. ^ As a style, this is found only on Wikipedia and, I suppose, some academic writing such as law reviews. It is typically useless to the end user, and a distracting artifact of edit-warring on Wikipedia. The sheer clutter can also pose a serious impediment to easy editing; sometimes I wonder if that is intentional.
  69. ^ See my memoir, “The Early History of Nupedia and Wikipedia,” reprinted in my Essays on Free Knowledge: The Origins of Wikipedia and the New Politics of Knowledge (Sanger Press, 2020). This is why Citizendium.org wrote and adopted a community charter, which had legislative authority, in 2005–6.
  70. ^ I recall idly thinking I should propose some governance structures, but before I could, Bomis ran out of funding for my position in the collapse of the Dot-com Bubble.
  71. ^ As some researchers have argued. See Shaw, Aaron, and Benjamin Mako Hill. “Laboratories of Oligarchy? How the Iron Law Extends to Peer Production.” arXiv preprint arXiv:1407.0323, July 1, 2014. https://arxiv.org/abs/1407.0323.
  72. ^ Michels, Robert. Political Parties: A Sociological Study of the Oligarchical Tendencies of Modern Democracy, trans. Eden and Cedar Paul (New York: Free Press, 1968; originally published 1911). Quoted from the Wikipedia article “Iron Law of Oligarchy.”
  73. ^ I.e., an extra account run by someone who already has an account. On some of the issues with sockpuppetry, see Thesis 8.
  74. ^ I.e., not closely associated—in order to make bribery and corruption more difficult.
  75. ^ The latter, if one includes active accounts that have “Autopatrolled” rights.