Abstract
This article examines the features of day-to-day data journalism produced by The New York Times and The Washington Post in the first half of 2017. The content analysis evaluates story characteristics linked to the concepts of transparency, interactivity, diversity, and information provenance. It finds that the data journalism produced by those outlets comes from small teams, focuses on “hard news,” provides fairly simple data visualizations with low levels of interactivity, relies primarily on institutional sources and offers little original data collection, and incorporates just two data sources on average, in a generally opaque manner. This leads to the conclusion that “general data journalism” still has a long way to go before it can live up to the optimism and idealization that characterize much of the data turn in journalism. Instead, contemporary day-to-day data journalism is perhaps better characterized as evolutionary rather than revolutionary, with its celebrated potential to serve as a leap forward for journalism and to engender greater trust in it remaining untapped.
Acknowledgement
The author wishes to thank Sabrina Negrón and Rachel Perry-Gore for their many contributions to this project.
Disclosure Statement
No potential conflict of interest was reported by the author.
Notes
1 Though this definition was conceived prior to the publication of Lowrey and Hou’s (2018) study, there is considerable convergence between the two definitions. This suggests the scholarship is moving toward a more consistent content-oriented conceptualization of data journalism.
2 For example, a simple bar graph would be considered a univariate visualization, with the categories on the X-axis comprising the sole variable and the associated values displayed on the Y-axis not counted.
3 Both organizations instead tended to link to the homepage of the organization responsible for the supporting data, rather than to the specific database or dataset used.