Control Rates in User Generated Content: PoliticalBase.com, the Moderated Political Wikipedia

Technological developments, which produced free and user-friendly interface applications, led to the second step in the evolution of the World Wide Web: Web 2.0. It marks a paradigm shift from the “read web”, essentially another platform of mass communication whose advantages over traditional media were functional (better storage of large amounts of data, better manipulation and selection by means of hypertext and linking), towards the “read/write web”. Web 2.0 is a typical manifestation of a new paradigm, convergence culture: a space where the concepts of author and audience/user can no longer be distinguished, a space of horizontal co-authoring, augmented by the use of technology as a platform of communication exchange.

Although the main purpose of Web 2.0 is to empower users, who group themselves into online social networks, the degree of control over user-generated content in the Web 2.0 space depends on the application. PoliticalBase.com is a comprehensive application that defines itself as a “user-powered online community providing bi-partisan commentary, information and conversation about US Politics”. It is thus an online community of interest, gathering collective knowledge through user contributions and centered on the topic of political news in the US.

The application offers various tools. The front page of PoliticalBase.com highlights news submitted by Political Base users. The most innovative tool is the Money Track, which shows how much money U.S. political candidates have raised, and from which states, counties and even individuals, by building diagrams from data collected from the Federal Election Commission. This, however, is not a Web 2.0 feature, as only the administrators can enter data. The other tools are topic-centered wikis, organized into several categories (political people, political issues and political groups), which are edited by users and evaluated by moderators. The application also has a forum section. Where control over content is concerned, the difference between PoliticalBase.com and Wikipedia is that the structure of the PoliticalBase.com wiki compels users to enter specific types of content into fixed categories, the content being afterwards reviewed by staff moderators. For example, on the page for Democratic candidate Barack Obama, you can vote on your perception of this politician in different areas, see his political affiliations, and find out where he stands on the issues, all based on the data entered into the wiki categories.

Despite ranking well in terms of data and functionality, PoliticalBase.com is a rather hierarchical and undemocratic online community, since the moderators have veto power: “For brand new users Political Base moderates all new content submitted into the system. The content is either added or deleted within 24 hours after the submission is made. As users submit quality content, they earn points which rank them in the community and open up more editing options in a less strict environment, often earning live edit access to the majority of the site. Submissions made are moderated by an internal staff and by high scoring users from within our user base. At any time you can view your point score at the top-right of your “my base” section.”

The application thus distinguishes between two categories of contributors, moderators and users, and places the judgment of the moderators above that of the readers, which may lead to biased information, especially given the political nature of the website. In this respect the application disregards the social dimension of Web 2.0 applications and maintains the traditional hierarchical structure of collaboration, which may push potential users away. On the other hand, the meritocratic system of earning editorial liberties through points can be seen as a way to make users responsible for the content they place online.
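The mechanics of such a point-based system are easy to sketch. The following is a minimal illustration in Python, assuming a hypothetical threshold and point reward; it is not PoliticalBase.com’s actual implementation, whose internal rules and values are not public.

    from dataclasses import dataclass

    # Hypothetical threshold above which users earn "live edit" access.
    # The real PoliticalBase.com value is not public; 100 is an
    # assumption for illustration only.
    LIVE_EDIT_THRESHOLD = 100

    @dataclass
    class User:
        name: str
        points: int = 0  # earned when submissions are approved

    @dataclass
    class Submission:
        author: User
        content: str

    moderation_queue = []  # submissions awaiting review

    def submit(submission):
        """Route a submission live or into the moderation queue."""
        if submission.author.points >= LIVE_EDIT_THRESHOLD:
            # High-scoring users bypass pre-moderation entirely.
            return "published"
        # New or low-scoring users: staff and high-scoring users
        # approve or delete the content within 24 hours.
        moderation_queue.append(submission)
        return "queued for moderation"

    def approve(submission, reward=10):
        """Approving quality content raises the author's rank."""
        moderation_queue.remove(submission)
        submission.author.points += reward

Seen this way, the moderators’ veto power is a gate that gradually dissolves as a contributor’s score grows, which is exactly what makes the system both hierarchical and, in principle, open.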

Florian Cramer on “Why Semantic Search is Flawed”

Florian Cramer, head of the Networked Media Master at the Piet Zwart Institute in Rotterdam, closed the final session of the Society of the Query conference. The Alternative Search 2 session presented a few of the latest web technologies as potential directions for the web and for search engine design in the near future: RDFa, which would enable the shift to what Steven Pemberton called the web 3.0, and semantic search, as implemented in the Europeana project.

Florian Cramer concluded this series of presentations with a critical and somewhat pessimistic evaluation of the current state of the web and the idea of a semantic web and semantic search, as one of its potential futures. His three main arguments revolved around: “why search is not just web search (and not just Google),” “why semantic search is flawed,” and “why the world wide web is broken.”

The first point expressed his frustration with the narrow understanding of the notions of query and search engine on which the conference focused. As he explained, wikis and social networking sites also include search engine functionality.

As far as semantic search is concerned, Cramer usefully pointed out the difference between folksonomies, the currently used form of semantic tagging, and the universal semantic tagging that a semantic web would require. While folksonomies are “unsystematic, ad-hoc, user-generated and site-specific tagging systems” (Cramer, 2007), like the tagging system of Flickr for example, the semantic web would require a structured, universal tagging and classification system applying to the entire web. Cramer is skeptical about the possibility of creating this unified, ‘objective’ meta-tagging system, because classifications, or taxonomies, are not arbitrary but expressions of ideologies, which calls for a discussion of the politics of meta-tagging. Meta-tagging may have its advantages, such as arguably empowering web users and weakening the position of large web services corporations (while still maintaining the need for search engines to aggregate data), but it also has several potential weaknesses: a semantic web model must be based on trust in order to prevent predictable problems such as massive spamming.
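The contrast is concrete at the data level. Below is a minimal sketch in Python using the rdflib library: a folksonomy is just a bag of free-form strings attached to an item, whereas the semantic web expresses the same information as triples drawn from a shared formal vocabulary (Dublin Core here, chosen only as an example; the photo URI is hypothetical).

    from rdflib import Graph, Literal, URIRef
    from rdflib.namespace import DC

    # Folksonomy: ad-hoc, user-generated, site-specific strings.
    # Nothing relates "nyc" to "new_york" or to any formal concept.
    flickr_style_tags = {"obama", "nyc", "politics!", "new_york"}

    # Semantic web: the same assertions as triples against a shared
    # vocabulary, interpretable by any agent on the web.
    g = Graph()
    photo = URIRef("http://example.org/photos/1234")  # hypothetical URI
    g.add((photo, DC.subject, Literal("Barack Obama")))
    g.add((photo, DC.coverage, Literal("New York City")))

    print(g.serialize(format="turtle"))

Cramer’s objection is visible even in this toy example: someone had to decide that the vocabulary is Dublin Core and that the place belongs under dc:coverage, and such decisions, multiplied across the entire web, are precisely where ideology enters the classification.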

In the concluding section, Cramer expressed his concern that the Internet as a medium for publication and information storage is not sustainable, and argued for redundancy in web archiving. However, this desire for permanence raises questions about the nature of the medium itself.

Photos by Anne Helmond.