Timeline of "distributed Wikipedia" proposals
A timeline of (mostly independent) proposals for a kind of distributed Wikipedia (abolishing the principle that there is only one current article version for each topic), and more specifically of proposals to apply the principles of distributed revision control (as exemplified by Git in software development) to wikis in general and to Wikipedia in particular.
Also noting significant related material.
- 1993: Interpedia: "several independent 'Seal-of-approval' (SOAP) agencies were envisioned which would rate Interpedia articles based on criteria of their own choosing; users could then decide which agencies' recommendations to follow." (from the Wikipedia article, unsourced)
- 1997: Ward Cunningham: Folk memory: A minimalist architecture for adaptive federation of object servers
- 1997: The Distributed Encyclopedia (proposal by Ulrich Fuchs, who would later become one of the first admins on the German Wikipedia and in 2005 founded a fork, "Wikiweise"): "Whenever possible, the author stores the essay in html format on his or her own web site. .... all essays will have a uniform layout. ... All essays can be accessed via a centralized index." [1].
In his book "Good Faith Collaboration", Joseph Reagle comments: "The irony here is that while it became clear that the Web would play a fundamental role [for an Internet-based encyclopedia, something that wasn't clear in the earlier Interpedia proposal], and an enormous strength of the Web is its hypertextual and decentralized character, Wikipedia itself is not decentralized in this way. It is not a collection of articles, each written by a single author, strewn across the Web. Instead, many authors can collaborate on a single article, stored in a central database that permits easy versioning, formatting and stylistic presentation. Furthermore, there is a vibrant common culture among Wikipedians that contributes to Wikipedia's coherence." (However, the "Distency" seemed to aim at one article per topic: "... we will accept (mostly) everything on every headword. But, of course, we want to avoid two people writing on the same subject the same time. It's very unpleasant for you to write something we can't accept any more because someone other was faster with an essay about the same subject." [2])
- 2001/2002: According to Andrew Famiglietti, "the history of Wikipedia and Wikipedia like projects shows a long list of failures to implement a 'marketplace of ideas' model. GNUpedia, an attempt by the FSF to build its own encyclopedia in 2001, imploded after selecting a technologically ambitious plan to build a repository of texts users could filter by their own criteria. .. Wikipedia users batted around plans to build similar 'multiple stable versions' in the fall of 2001/spring 2002. None were ever implemented." (see also Wikinfo)
- November 2004: Remark in Authority metric proposal by Tim Starling: "... Only articles which are well-written and polished would be protected in this way. New or incomplete articles would be open to editing, just like on Wikipedia. And we could implement simple methods to carry over changes from Wikipedia to the proposed site. ... the whole idea of forking is that you can try things which aren't politically possible on the original site."
- July 2005: Meta:User:TomLord/distributed wikis ("...a kind of p2p take on wikis. I should be able to selectively import articles from yours to mine, edit them locally and upload. If I have a specialized wiki of my own, you should be able to merge it into your more general wiki..."), see also meta:Versioning and distributed data
- August 2005: In Ward Cunningham's keynote (video and slides) at the first Wikimania, a kind of distributed wiki concept, similar or identical to "Folk memory", is described in the last third of the presentation (watching the video is recommended; the slides alone are not easy to understand)
- December 2006: Wikipedia:Branching support / bug 8265
- October 2007: "Decentralizing Wikipedia and its sister projects" (Wikiversity): "... When someone wants to view an article on cows, they basically ping a tracker server ... The tracker server would say, 'In order the amount of trust you can put into the integrity of the article, you can find the most up to date revision of the article on Cows on the servers x, y, z, and c'. The person who requested the article on cows would then send a message off to machines x, y, z, and c ... They make their changes and send them off to a few of the servers that contain the article on cows. This would then be propagated and perhaps reviewed by editors and then integrated into the best version." (A toy sketch of this tracker scheme appears after the timeline.)
- February 2008: Possibility of a git-based fully distributed Wikipedia (thread on Foundation-l, inspired by the release of git-wiki)
- July 2008: "Federating Wikipedia as Open Educational Resource". Presentation at Wikimania 2008 by Murugan Pal from the CK-12 foundation, about extracting content from Wikipedia for use in other formats, also discusses synchronizing (merging content back into Wikipedia), includes some demonstration of their "FlexBooks" software.
- August 2009: Side remark in my Wikimania talk that git-like forking/merging tools might foster cooperation between Wikipedia and Citizendium
- August 2009: strategy:Proposal:Distributed Wikipedia ("Communities can then decide who to view as 'authoritative'. In other words, the entire Wikipedia database could in theory entirely be forked. Democratically. In this way, much of the criticism of Wikipedia's process simply... melts away.")
- October 2009: Wikipedia meets git (thread on Foundation-l). Also mentions git-wiki, gitit, ikiwiki, wigit, DSMW (Distributed Semantic Media Wiki), ...
- November 2009: During a very public controversy about deletions on the German Wikipedia (mainly fueled by members of the Chaos Computer Club), German hacker Scytale announces Levitation, a software project to import Wikipedia XML dumps into Git repositories. The project produced a functional version (tested on some large Wikipedias), but petered out before achieving Scytale's vision of an "Omnipedia" ("everyone his own Wikipedia").
- November 2009: mspr0: Die Multipedia: Schafft ein, zwei, viele Wikipedien! ("The Multipedia: Create one, two, many Wikipedias!"; in German, Google translation available). Written independently, but inspired by the same controversy
- November 2009: Distripedia (short blog post, apparently with little follow-up)
- March 2010: Maja van der Velden: "When Knowledges Meet: Database Design and the Performance of Knowledge", talk at the "Critical Point of View" (CPOV) conference (summary, video), suggests "decentering Wikipedia further" to a "distributed database of local ontologies" (from ca. 14:20 in the video), cf. [3]
- May 2010: Gitizendium by Tom Morris ("An attempt to move a little chunk of Citizendium into Git", mainly motivated by a desire to handle Citizendium's "approval" process, which forks articles into a stable and a "Draft" version, in a more natural way, see Citizendium forum posts: [4], [5])
- May 2010: CPOV interview with Florian Cramer (also mentions Levitation)
- July 2010: Federating Wikipedia (presentation at Wikimania 2010, by V. Grishchenko)
- August 2010: Making GitHub More Open: Git-backed Wikis (GitHub announcement) "Each wiki is a Git repository, so you're able to push and pull them like anything else."
- September 2010: Anil Dash: Forking is a Feature (blog post, suggesting among other observations: "... one of the best ways for Wikipedia to reinvigorate itself, and to break away from the stultifying and arcane editing discussions that are its worst feature, could be to embrace the idea that there's not One True Version of every Wikipedia article. A new-generation Wikipedia based on Git-style technologies could allow there to be not just one Ocelot article per language, but an infinite number of them, each of which could be easily mixed and merged into your own preferred version")
- November 2010: Discussion (WebCite) about a possible fork of Citizendium: "I've been experimenting with Gollum, a wiki engine that uses Git. Distributed revision control means we get rid of a huge amount of politics, and people can work on things in a distributed fashion, with branches and offline editing and so on. ... Gollum really rocks and could really be the future of wikis."
- January 2011: Tony Sidaway: [6] "A few years ago I tried to work out how a peering arrangement for parallel Wikipedias could work. Peer sites would in effect have proxy accounts, and edits would appear on selected pages. ... Peering is performed by an exchange of revisions, after which each wiki is returned to its “native” state (ie, the latest revision is a copy of the local state of the article before the peered revisions were introduced)." (A toy sketch of this revision exchange appears after the timeline.)
- January 2011: "Distributed Wikis" (talk by Pfctdayelise at linux.conf.au, abstract, slides, video). "... now that distributed version control systems (DVCS) have made forking trivial, are there implications for the political act as well? How does political forking work within collaborative prose text projects (i.e. wikis)? English Wikipedia is so large as to be practically unforkable ... One of the core Wikipedia rules is “one topic, one article”, which would seem to prohibit forking, but could we adhere to this principle and still take advantage of DVCS?"
- June 2011: Foundation-l discussion, Alec Conroy: "In 2002, we sort of 'forked off' from the 'mainstream' Free Software movement, and this 2002ish model of revision control is the model we use in our wikis. ... A users could create a whole new 'project' without using any Wikimedia resources at all ... If a new project was popular, it could be seamlessly and automatically shared with the entire world, again, at no expense to the foundation. 'Bad' projects would get weeded out because no one would share them, while 'good' projects would rise to the top automatically." [7] David Gerard: "Adapting MediaWiki to git has been tried a few times. I suspect the problem is that the software deeply assumes a database behind it, not a version-controlled file tree." [8]
- June 2011: Ward Cunningham: Smallest Federated Wiki ....
- June 2011: Thoughts about using a git-backed wiki for a proposed "Encyclopaedia of original research"
- August 2011: A Wikimania talk about opening up Wikipedia as a data platform cites statistics from open source projects that switched to decentralized revision control, concluding that "decentralizing interaction increases participation".
- January 2012: Project idea at the San Francisco Hackathon: "No more edit conflicts", a proposal to avoid edit conflicts by "Enhanc[ing] revision ids to support version vector, initial goal will be to ease the 3-way merge bottleneck. Tolerate article text in an ambiguous state, meaning that unresolved conflicts will result in a fork which can be cleaned up later by the original poster or an editor. ... A rough analogy is that MediaWiki databases would act like a distributed version control system (git) such that people could clone and fork them." (A minimal version-vector sketch appears after the timeline.)
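The tracker scheme from the October 2007 Wikiversity entry above can be made concrete with a minimal Python sketch of the lookup-and-propagate cycle it describes. Everything here (the Tracker and Node classes, the method names, the choice of three receiving servers) is a hypothetical illustration; the proposal itself specifies no protocol or API.

```python
# A toy model of the tracker-based lookup from the 2007 Wikiversity
# proposal. All names here are hypothetical illustrations.

from dataclasses import dataclass, field


@dataclass
class Node:
    """A server holding article revisions, keyed by title."""
    name: str
    articles: dict = field(default_factory=dict)  # title -> list of revisions

    def latest(self, title):
        revisions = self.articles.get(title, [])
        return revisions[-1] if revisions else None

    def accept(self, title, revision):
        self.articles.setdefault(title, []).append(revision)


@dataclass
class Tracker:
    """Answers: which servers hold the most up-to-date revision of a title?"""
    index: dict = field(default_factory=dict)  # title -> list of Nodes

    def locate(self, title):
        return self.index.get(title, [])


def read_article(tracker, title):
    # Ping the tracker, then fetch from the first node that has the article.
    for node in tracker.locate(title):
        revision = node.latest(title)
        if revision is not None:
            return revision
    return None


def publish_edit(tracker, title, new_text):
    # "Send them off to a few of the servers that contain the article";
    # three is an arbitrary choice here. Propagation and editorial review,
    # which the proposal mentions next, are not modeled.
    for node in tracker.locate(title)[:3]:
        node.accept(title, new_text)


x = Node("x")
x.accept("Cows", "Cows are mammals.")
tracker = Tracker(index={"Cows": [x]})
assert read_article(tracker, "Cows") == "Cows are mammals."
```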
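Tony Sidaway's January 2011 peering idea can likewise be rendered as a toy, under the assumption that a page history is just a list of revision strings: the two wikis exchange revisions, and each is then "returned to its native state" by re-appending a copy of its own pre-exchange head.

```python
# A toy rendering of the January 2011 "peering" idea: two wikis exchange
# revisions for a page, then each is returned to its "native" state by
# re-appending its own pre-exchange revision as the latest one.
# The list-of-strings history model is an assumption for illustration.

def peer_exchange(history_a, history_b):
    """Merge two page histories, then restore each wiki's native head."""
    native_a, native_b = history_a[-1], history_b[-1]

    # Exchange: each side appends the revisions it has not seen yet.
    history_a += [rev for rev in history_b if rev not in history_a]
    history_b += [rev for rev in history_a if rev not in history_b]

    # Return each wiki to its native state: the latest revision is again
    # a copy of the local article as it stood before peering.
    history_a.append(native_a)
    history_b.append(native_b)


wiki_a = ["v1", "v2-local-a"]
wiki_b = ["v1", "v2-local-b"]
peer_exchange(wiki_a, wiki_b)
# Both wikis now carry each other's revisions in their histories,
# but each still displays its own local version.
assert wiki_a[-1] == "v2-local-a" and wiki_b[-1] == "v2-local-b"
```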
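Finally, a minimal sketch of the version-vector behaviour from the January 2012 hackathon proposal, assuming a revision is a dict carrying its text and a per-editor counter vector (a format the proposal does not specify): two revisions whose vectors are incomparable are concurrent, so the page tolerates a fork instead of raising an edit conflict.

```python
# A minimal version-vector sketch for the 2012 "No more edit conflicts"
# idea. The revision format (dict of text plus counter vector) is assumed.

def dominates(vv_a, vv_b):
    """True if vector vv_a has seen everything vv_b has."""
    return all(vv_a.get(site, 0) >= count for site, count in vv_b.items())


def reconcile(rev_a, rev_b):
    """Keep the newer revision, or fork when the edits are concurrent."""
    vv_a, vv_b = rev_a["vv"], rev_b["vv"]
    if dominates(vv_a, vv_b):
        return [rev_a]     # rev_a already incorporates rev_b
    if dominates(vv_b, vv_a):
        return [rev_b]
    # Concurrent: tolerate an ambiguous, forked state, to be cleaned
    # up later by the original poster or an editor.
    return [rev_a, rev_b]


older = {"text": "Cows are mammals.", "vv": {"alice": 1}}
newer = {"text": "Cows are domesticated mammals.", "vv": {"alice": 1, "bob": 1}}
assert reconcile(older, newer) == [newer]

fork_a = {"text": "Cows moo.", "vv": {"alice": 2}}
fork_b = {"text": "Cows graze.", "vv": {"bob": 2}}
assert len(reconcile(fork_a, fork_b)) == 2  # concurrent edits -> fork
```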
Additions welcome, but note that this is not about the related proposals to host/distribute Wikipedia (in its current form) using P2P transfer (such as meta:P2P or "A Decentralized Wiki Engine for Collaborative Wikipedia Hosting")
Other overviews
- The historical notes in V. Grishchenko's paper Deep Hypertext with Embedded Revision Control Implemented in Regular Expressions (WikiSym 2010) mention several other examples of distributed wiki software and credit Ward Cunningham (1997) as the author of the first distributed wiki proposal; the paper also describes relations to ideas from Project Xanadu.
- http://www.delicious.com/pfctdayelise/decentralisedwiki