Earlier this month I wrote an article for the London School of Economics Impact of Social Sciences blog about how journalists can use the web and social media as a source of data about the state of issues, debates and information flows in different societies.
You can read the full post here.
In parallel with my work at the European Journalism Centre, for the past couple of years I have been working on and off on a research project that examines sourcing and knowledge production practices in data journalism, and how these might be challenging traditional journalism epistemologies. I gave a talk at Stanford University last year about the first part of this study. Thanks to a four-year PhD grant from the University of Groningen and the University of Ghent, I will be able to dedicate more time to this project over the next few years, and to expand and improve it.
Below is a list of academic papers about data journalism and computational journalism that I have collected during my work so far. A few of them, such as Schudson (2010) and Peters (2010), do not directly reference the practice of data journalism but discuss related and relevant developments.
Earlier this year I gave a talk at a Stanford University conference on the right to information and transparency in the digital age. My talk addressed sourcing practices in data journalism and drew on a research project that I am currently working on. The project examines sourcing practices and knowledge production at the Guardian, the New York Times and ProPublica, based on interviews with journalists and analysis of data journalism projects.
Below are the slides from my talk.
In January 2012 I was interviewed by Alex Howard, O’Reilly Media’s Government 2.0 correspondent, about the state of data journalism. The interview was published on 14 February on O’Reilly Radar.