Social Epistemology
A Journal of Knowledge, Culture and Policy
Volume 26, 2012 - Issue 1

Open Source Production of Encyclopedias: Editorial Policies at the Intersection of Organizational and Epistemological Trust

Pages 71-103 | Published online: 02 Dec 2011

Abstract

The ideas behind open source software are currently being applied to the production of encyclopedias. A sample of six English-language, text-based, neutral-point-of-view online encyclopedias of this kind is identified: h2g2, Wikipedia, Scholarpedia, Encyclopedia of Earth, Citizendium and Knol. How do these projects deal with the problem of trusting their participants to behave as competent and loyal encyclopedists? Editorial policies for soliciting and processing content are shown to range from high discretion to low discretion; that is, from granting unlimited trust to granting limited trust. The projects' conceptions of the proper role for experts are also explored, and it is argued that these conceptions to a great extent determine editorial policies. Subsequently, internal discussions about quality assurance at Wikipedia are reviewed. All indications are that review and "super-review" of new edits will become policy, to be performed by Wikipedians with a better reputation. Finally, since for encyclopedias the issue of organizational trust largely coincides with epistemological trust, a link is made with theories about the acceptance of testimony. It is argued that both non-reductionist views (the "acceptance principle" and the "assurance view") and reductionist ones (an appeal to background conditions, and a newly defined "expertise view") have been implemented in editorial strategies over the past decade.

Acknowledgements

Acknowledgements are due to the participants of the ETHICOMP2010 conference in Tarragona, Spain (April 2010), and to two perceptive reviewers of this journal.

Notes

[1] All websites mentioned in the text and the references were last accessed on 14 May 2011.

[2] Whenever web-based communities solicit input from contributors and simply amass it all together (like photographs, videos, music samples or logo designs), dubious input does little harm as long as it remains isolated. When, however, contributors interact continuously to create content together with the intention of raising quality ever higher, the problem becomes acute (de Laat Citation2010). Any questionable input at any stage in the process potentially propagates to subsequent versions. Apart from collectively composed software, journals and books, this is especially the case for encyclopedias.

[3] Note that we are talking here about the question whether and to what extent communities consider their members trustworthy and build their socio-technical architecture accordingly—not the other way round. The reverse issue—how communities may inspire trust in their members—is the usual focus of discussions on structuring web presences (Shneiderman Citation2000; critical review in Riegelsberger, Sasse, and McCarthy Citation2005). Nevertheless, the issues are closely related because of the close connection between users and producers of knowledge: precisely by granting the audience-as-producers a degree of trust, embedded in their design, tailored to the extent that this is justified, encyclopedias may inspire confidence in the audience-as-users.

[4] But not completely, since, for example, technical and social aspects of cooperation with other volunteers in collaborative spaces are also important, and these fall outside the epistemic purview. Being a trustworthy expert on something, whether an academic subject or more mundane affairs, can best be seen as a necessary condition for becoming a good encyclopedist.

[5] Other authors refer to this as “open content” production.

[6] The list can be obtained from the author on request.

[7] The ratings assigned to an entry are visible to all users, albeit in a barely noticeable spot: at the very end of the corresponding discussion page.

[8] I adopt the convention of exclusively reserving the terms trust and trustworthiness for situations of dependence on people (and systems), while the term credibility applies to the issue of believability of text. I do so because I want to avoid the common practice of using the word trust(worthiness) for just about anything under the sun (see Tseng and Fogg Citation1999).

[9] Kramer, Gregorowicz, and Iyer (Citation2008) propose a simpler approach to quantifying reputation, focusing on the edit survival rate of an editor's contributions after a constant number of revisions (10, say). In this way they circumvent the measurement problem that occurs when all edits ever performed by an author are taken into consideration at a specific sampling moment: some edits are old and properly tested, while others are still young and relatively untested. In other words: the lifetime of dead edits is known, while this is uncertain for live edits.

[10] Another type of proposal tries to stay away from computational metrics and continues to rely on human judgment. In order to empower users, a "trust-aware" type of recommendation system is developed that calculates a rating for each entry based on recommendations from one's "friends." As in de Alfaro's model, trust values of one's friends are variable: they change according to the proximity of their particular verdict to the verdict of others. The system can be installed as a proxy on the user's own machine (Korsgaard and Jensen Citation2009).

[11] At the same time, de Alfaro and coworkers declare themselves reluctant to actually display Wikipedian author scores, since doing so would excessively disturb "the spirit of friendliness and cooperation" in Wikipedia. To them, it is mainly to be seen as a mathematical tool for computing textual trust and for taking measures accordingly (http://www.wikitrust.net/frequently-asked-questions-faq).

[12] For several language versions a Firefox extension can be downloaded that, for the moment, connects to servers at the University of California at Santa Cruz—not yet directly to the servers of the Wikipedia Foundation (for details see http://www.wikitrust.net).

[13] To mention just some larger ones (in chronological order of actual introduction of the review system): the German, Russian, Polish, Hungarian, Chinese (2008), Arabic (2009), and Indonesian (2010) versions of Wikipedia (data taken from http://meta.wikimedia.org/wiki/FlaggedRevs).

[14] In the next section on testimony I return to the criteria used for appointing reviewers and/or super-reviewers, interpreting them as indicators, respectively, of good intentions towards Wikipedia and "interactional" expertise.

In contrast to the English Wikipedia, in the English Wikinews (the journal-that-anyone-may-edit) a similar review system has been functioning for several years now (since mid-2008). After development in the "newsroom," any new article has to be reviewed before official publication on the "main page": it should respect copyrights, be newsworthy, be fully sourced, be written in a neutral manner, and conform to a style guide. The task is performed by editors with reviewer status; one has to apply for it and obtain sufficient favorable votes. The status also allows immediate official publication of one's own new articles (http://en.wikinews.org/wiki/Wikinews:Reviewing_articles). Remarkably enough, the German Wikinews still operates without official reviewers, presumably because "vandalism" does not constitute much of a problem there.

[15] So a cycle can be observed involving both individual testimony and group testimony. Individual producers contribute testimony as input to the encyclopedic production process; as a result, in the long run the encyclopedic institution is able to provide group testimony to its readers. Subsequently those readers, of course, can turn into producers again, completing the cycle. Note that my approach implies separating the two intertwined components in Bruns' characterization as "produsers" (Bruns Citation2008): users and producers of content are considered separately, since they are involved in different practices.

[16] Note that by developing such procedures Wikipedia may be said to have “encapsulated” the interests of its readers; namely, that entries are (or become) reliable, precise and complete. In accordance with the views of Hardin (Citation2002, ch. 1), readers may develop a long-lasting relationship of epistemic trust towards Wikipedia in this fashion. Remarkably, this relation is not a direct one between readers and contributors, but between readers and Wikipedia as intermediary quasi-institution.

[17] Notice the substantial difference from talk pages: people posting on them are urged to sign with their username or IP-address. A non-anonymous conversation is strongly preferred.

[18] The 21st post among users opposing the proposal that only registered users may edit the site; posted August 2006 at http://meta.wikimedia.org/wiki/Anonymous_users_should_not_be_allowed_to_edit_articles.

[19] Notice I am not arguing here that qualified experts are the only ones able to perform as experts, or to provide expert contributions to an encyclopedia. Obviously, exceptionally talented people can become "true" experts outside official circles of qualification. I am only saying that encyclopedias may adopt this criterion because they consider it an easy and efficient procedure.

[20] H2g2 has not been mentioned here since their “weak” implementation of the expertise view (with “editing” experts) is hardly noticeable: the peer review option is voluntary and actually chosen by only 5% of authors.

[21] About 5% of such requests (totaling more than 1,000 as of January 2011) are actually turned down for lack of experience, misbehavior, or both.

[22] On behalf of its readers, Wikipedia creates incentives for contributors to display trustworthy behavior. The quality interest of readers had been encapsulated before in Wikipedian procedures (see note [16] above); as a result, readers could trust Wikipedia (with mature articles to be interpreted as “group testimony”; see Tollefsen Citation2009). This is now taken one step further in another Hardinian maneuver: the quality interest is also encapsulated in the reputational mechanism so that Wikipedia may trust their own contributors (on an individual basis).
