User:HaeB/Timeline of distributed Wikipedia proposals
Timeline of "distributed Wikipedia" proposals
A timeline of (mostly independent) proposals for a kind of distributed Wikipedia (abolishing the principle that there is only one current article version for each topic), and more specifically, proposals to apply the principles of distributed revision control (as exemplified by Git in software development) to wikis in general and Wikipedia in particular.
Also noting significant related material.
The historical notes in V. Grishchenko's paper Deep Hypertext with Embedded Revision Control Implemented in Regular Expressions (WikiSym 2010) mention several other examples of distributed wiki software and credit Ward Cunningham (1997) as author of the first distributed wiki proposal; the paper also describes relations to ideas from Project Xanadu.
- 1993: Interpedia: "several independent 'Seal-of-approval' (SOAP) agencies were envisioned which would rate Interpedia articles based on criteria of their own choosing; users could then decide which agencies' recommendations to follow." (from the Wikipedia article, unsourced)
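The SOAP idea amounts to a simple filtering scheme: independent agencies attach seals to articles, and each reader decides which agencies to trust. A minimal sketch of that model (all article and agency names here are invented for illustration; the Interpedia proposal never specified an implementation):

```python
# Hypothetical sketch of Interpedia's "Seal-of-approval" (SOAP) idea:
# independent agencies rate articles; each reader filters by trusted agencies.

seals = {
    "Ocelot": {"AcmeReview", "OpenFacts"},   # agencies that approved each article
    "Phlogiston": {"OpenFacts"},
    "Perpetuum mobile": set(),               # no seals at all
}

def visible_articles(trusted_agencies):
    """Return articles carrying a seal from at least one trusted agency."""
    return sorted(a for a, s in seals.items() if s & trusted_agencies)

print(visible_articles({"AcmeReview"}))   # ['Ocelot']
print(visible_articles({"OpenFacts"}))    # ['Ocelot', 'Phlogiston']
```

A reader trusting no agency (or only unknown ones) would simply see no recommended articles, which is the "decide which agencies' recommendations to follow" behavior described above.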
- 1997: Ward Cunningham: Folk memory: A minimalist architecture for adaptive federation of object servers
- 1997: The Distributed Encyclopedia (proposal by Ulrich Fuchs, who would later become one of the first admins on the German Wikipedia and in 2005 founded a fork, "Wikiweise") "Whenever possible, the author stores the essay in html format on his or her own web site. ... all essays will have a uniform layout. ... All essays can be accessed via a centralized index." [1].
In his book "Good Faith Collaboration", Joseph Reagle comments: "The irony here is that while it became clear that the Web would play a fundamental role [for an Internet-based encyclopedia, something that wasn't clear in the earlier Interpedia proposal], and an enormous strength of the Web is its hypertextual and decentralized character, Wikipedia itself is not decentralized in this way. It is not a collection of articles, each written by a single author, strewn across the Web. Instead, many authors can collaborate on a single article, stored in a central database that permits easy versioning, formatting and stylistic presentation. Furthermore, there is a vibrant common culture among Wikipedians that contributes to Wikipedia's coherence." (However, the "Distency" seemed to aim at one article per topic: "... we will accept (mostly) everything on every headword. But, of course, we want to avoid two people writing on the same subject the same time. It's very unpleasant for you to write something we can't accept any more because someone other was faster with an essay about the same subject." [2])
- 2001/2002: According to Andrew Famiglietti, "the history of Wikipedia and Wikipedia like projects shows a long list of failures to implement a 'marketplace of ideas' model. GNUpedia, an attempt by the FSF to build its own encyclopedia in 2001, imploded after selecting a technologically ambitious plan to build a repository of texts users could filter by their own criteria. ... Wikipedia users batted around plans to build similar 'multiple stable versions' in the fall of 2001/spring 2002. None were ever implemented." (see also Wikinfo)
- July 2005: Meta:User:TomLord/distributed wikis ("...a kind of p2p take on wikis. I should be able to selectively import articles from yours to mine, edit them locally and upload. If I have a specialized wiki of my own, you should be able to merge it into your more general wiki..."), see also meta:Versioning and distributed data
- August 2005: In Ward Cunningham's keynote (video and slides) at the first Wikimania, a kind of distributed wiki concept, similar or identical to "Folk memory", is described within the last third of the presentation. (Watching the video is recommended; the slides alone are not easy to understand.)
- February 2008: Possibility of a git-based fully distributed Wikipedia Thread on Foundation-l (inspired by the release of git-wiki)
- July 2008: "Federating Wikipedia as Open Educational Resource". Presentation at Wikimania 2008 by Murugan Pal from the CK-12 foundation, about extracting content from Wikipedia for use in other formats, also discusses synchronizing (merging content back into Wikipedia), includes some demonstration of their "FlexBooks" software.
- August 2009: Side remark in my Wikimania talk that git-like forking/merging tools might foster cooperation between Wikipedia and Citizendium
- August 2009: strategy:Proposal:Distributed Wikipedia ("Communities can then decide who to view as 'authoritative'. In other words, the entire Wikipedia database could in theory entirely be forked. Democratically. In this way, much of the criticism of Wikipedia's process simply... melts away.")
- October 2009: Wikipedia meets git Thread on Foundation-l. Also mentions git-wiki, gitit, ikiwiki, wigit, DSMW (Distributed Semantic Media Wiki), ...
- November 2009: During a very public controversy about deletions on the German Wikipedia (mainly fueled by members of the Chaos Computer Club), German hacker Scytale announces Levitation, a software project to import XML Wikipedia dumps into Git repositories. The project produces a functional importer (tested on dumps of some large Wikipedias), but peters out before achieving Scytale's vision of an "Omnipedia" ("everyone his own Wikipedia").
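Levitation's core task, turning each revision in a dump into a Git commit, can be sketched with Git's fast-import stream format. A heavily simplified illustration (real dumps are enormous and must be streamed; the element names follow the MediaWiki export schema, everything else here, including the sample dump, is invented):

```python
# Sketch: emit a `git fast-import` stream from a tiny inline MediaWiki-style
# XML dump, one commit per revision -- the core idea behind Levitation.
import xml.etree.ElementTree as ET

DUMP = """<mediawiki>
  <page><title>Ocelot</title>
    <revision><timestamp>2009-11-01T00:00:00Z</timestamp>
      <contributor><username>Alice</username></contributor>
      <text>The ocelot is a wild cat.</text></revision>
    <revision><timestamp>2009-11-02T00:00:00Z</timestamp>
      <contributor><username>Bob</username></contributor>
      <text>The ocelot is a small wild cat.</text></revision>
  </page>
</mediawiki>"""

def fast_import_stream(xml_text):
    lines, mark = [], 0
    for page in ET.fromstring(xml_text).iter("page"):
        title = page.findtext("title")
        for rev in page.iter("revision"):
            mark += 1
            user = rev.findtext("contributor/username")
            text = rev.findtext("text")
            lines += [
                f"blob\nmark :{mark}",                  # revision text as a blob
                f"data {len(text.encode())}\n{text}",
                "commit refs/heads/master",             # one commit per revision
                f"committer {user} <{user}@example.invalid> 0 +0000",
                "data 0",                               # empty commit message
                f"M 100644 :{mark} {title}.mediawiki",  # page as a file
            ]
    return "\n".join(lines) + "\n"

stream = fast_import_stream(DUMP)
# The resulting stream could be piped into `git fast-import`
# inside a freshly initialized repository.
```

Dates and real committer e-mail addresses are glossed over here; mapping dump timestamps and contributors to commit metadata faithfully was one of the things Levitation actually had to get right.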
- November 2009: mspr0: Die Multipedia: Schafft ein, zwei, viele Wikipedien! (In German - Google translation) Written independently, but inspired by the same controversy
- November 2009: Distripedia (short blog post, apparently without many consequences)
- March 2010: Maja van der Velden: "When Knowledges Meet: Database Design and the Performance of Knowledge", talk at the "Critical Point of View" (CPOV) conference, summary, video, suggests "decentering Wikipedia further" to a "distributed database of local ontologies" (ca. 14:20 onward in the video), cf. [3]
- May 2010: Gitizendium by Tom Morris ("An attempt to move a little chunk of Citizendium into Git", mainly motivated by a desire to handle Citizendium's "approval" process, which forks articles into a stable and a "Draft" version, in a more natural way, see Citizendium forum posts: [4], [5])
- May 2010: CPOV interview with Florian Cramer (also mentions Levitation)
- July 2010: Federating Wikipedia (presentation at Wikimania 2010, by V. Grishchenko)
- August 2010: Making GitHub More Open: Git-backed Wikis (GitHub announcement) "Each wiki is a Git repository, so you're able to push and pull them like anything else."
- September 2010: Anil Dash: Forking is a Feature (blog post, suggesting among other observations: "... one of the best ways for Wikipedia to reinvigorate itself, and to break away from the stultifying and arcane editing discussions that are its worst feature, could be to embrace the idea that there's not One True Version of every Wikipedia article. A new-generation Wikipedia based on Git-style technologies could allow there to be not just one Ocelot article per language, but an infinite number of them, each of which could be easily mixed and merged into your own preferred version")
- November 2010: Discussion (WebCite) about a possible fork of Citizendium: "I've been experimenting with Gollum, a wiki engine that uses Git. Distributed revision control means we get rid of a huge amount of politics, and people can work on things in a distributed fashion, with branches and offline editing and so on. ... Gollum really rocks and could really be the future of wikis."
- January 2011: Tony Sidaway: [6] "A few years ago I tried to work out how a peering arrangement for parallel Wikipedias could work. Peer sites would in effect have proxy accounts, and edits would appear on selected pages. ... Peering is performed by an exchange of revisions, after which each wiki is returned to its “native” state (ie, the latest revision is a copy of the local state of the article before the peered revisions were introduced)."
- January 2011: "Distributed Wikis" (talk by Pfctdayelise at linux.conf.au, abstract, slides). "... now that distributed version control systems (DVCS) have made forking trivial, are there implications for the political act as well? How does political forking work within collaborative prose text projects (i.e. wikis)? English Wikipedia is so large as to be practically unforkable ... One of the core Wikipedia rules is “one topic, one article”, which would seem to prohibit forking, but could we adhere to this principle and still take advantage of DVCS?"
Additions welcome, but note that this is not about the related proposals to host/distribute Wikipedia (in its current form) using P2P transfer (such as meta:P2P or this)