What Data Journalists Can Learn From New Media Research

Earlier this month I wrote an article for the London School of Economics Impact of Social Sciences blog about how journalists can use the web and social media as a source of data about the state of issues, debates and information flows in different societies. 


You can read the full post here.

List of Academic Papers about Data Journalism and Computational Journalism

In parallel to my work at the European Journalism Centre, for the past couple of years I have been working on and off on a research project that examines sourcing and knowledge production practices in data journalism and how these might be challenging traditional journalism epistemologies. I gave a talk at Stanford University last year about the first part of this study. Thanks to a four-year PhD grant from the University of Groningen and the University of Ghent, I will be able to dedicate more time to this project over the next few years and to expand and improve it.

Below is a list of academic papers about data journalism and computational journalism that I have collected during my work so far. A few of them, such as Schudson (2010) and Peters (2010), do not directly reference the practice of data journalism but discuss related and relevant developments.

Continue reading

Sourcing Practices in Data Journalism – Slides from My Talk at Stanford

Earlier this year I gave a talk on data journalism at a conference at Stanford University that focused on the right to information and transparency in the digital age. The talk examined sourcing practices in data journalism and was based on a research project that I am currently working on. The project examines sourcing practices and knowledge production at the Guardian, the New York Times and ProPublica, based on interviews with journalists and analysis of data journalism projects.

Below are the slides from my talk.

Continue reading

Amazon as a Research Engine: Best Selling Issues in the Climate Change Debate

My colleagues at the Digital Methods Initiative (Erik Borra, Natalia Sanchez-Querubin and Sophie Waterloo) and I just submitted an abstract for a social media theory and methods conference featuring great names in this space: Jean Burgess (Queensland University of Technology), Axel Bruns (Queensland University of Technology), Greg Elmer (Ryerson University) and Ganaele Langlois (U. of Ontario Institute of Technology).

The paper is called “Amazon as a Research Engine: Best Selling Issues in the Climate Change Debate” and proposes a protocol for repurposing Amazon.com as a tool for debate mapping. Below is our abstract:

Continue reading

Top 10 Most Tweeted Links from NICAR 2013

The annual US National Institute for Computer-Assisted Reporting (NICAR) conference brings together hundreds of data journalists, many of them among the most experienced in the field and mostly US-based, and is packed with sessions where you can learn about the latest developments, tools and techniques in the field. Since I didn’t make it to NICAR last week, I followed the most widely used conference hashtag, #NICAR13, to stay on top of the discussions. The abundance of sessions and presentations at NICAR makes it impossible for anyone to absorb everything that was discussed, so here is a list of the most tweeted links from the conference, which might be useful to come back to. (Unfortunately I missed capturing tweets from the first day of the conference.)
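
For readers curious how such a tally can be produced, here is a minimal Python sketch of the idea: given a file of captured tweets, extract the links they contain and count the most frequent ones. The file name nicar13_tweets.csv and the text column are assumptions for illustration, not a description of the actual capture tool I used.

    # Minimal sketch: tally the most tweeted links from a set of captured tweets.
    # The input file name and the "text" column are assumptions.
    import csv
    import re
    from collections import Counter

    URL_PATTERN = re.compile(r"https?://\S+")

    def top_links(path, n=10):
        counts = Counter()
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                # Pull every URL out of the tweet text. Twitter shortens
                # links (t.co), so a fuller analysis would expand them
                # before counting.
                counts.update(URL_PATTERN.findall(row["text"]))
        return counts.most_common(n)

    if __name__ == "__main__":
        for url, count in top_links("nicar13_tweets.csv"):
            print(f"{count:5d}  {url}")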

Continue reading

Seeing the Web Through the Ages – Three Stages of Internet Research

This morning I attended the first lecture of the Digital Methods for Internet Research course at the University of Amsterdam. Richard Rogers, chair and professor in the university’s New Media & Digital Culture programme, introduced three stages of seeing the Web, or of doing Internet research, that we have witnessed so far.

Continue reading

Remediation and Premediation as Medium Specificity in Jay David Bolter and Richard Grusin

What is new about new media? As the term “new media” carries powerful ideological connotations, such as “new equals better,” the boundaries between old and new media have been intensely debated in media studies. Are the old and the new media completely separate entities, or are new media simply old media delivered with new technologies? Bolter and Grusin’s theory of remediation offers yet another way of thinking about new media and of answering these questions. For Bolter and Grusin the specificity of new media, their “newness,” lies in the way they remediate older media. Building on McLuhan, they define remediation as “the representation of one medium in another.”[1] Against the technologically progressive view that celebrates new media as an improvement on and a complete break with old media, this notion lays the groundwork for conceptualizing the relationship between old and new media not as oppositional but as part of a media genealogy, focusing, in Foucauldian fashion, on their connections and affiliations instead.[2]

Continue reading