ABSTRACT
This project analyzed the impact of a 2.5-day science training boot camp for political reporters on their use of scientific sources in published reporting. Immediately following the boot camp, most survey respondents indicated they would try to incorporate more scientific material into future stories. We used both automated text analysis and human-coded content analysis to examine whether actual changes in reporting behavior occurred. Automated text analysis revealed that while journalists did not use more explanatory language overall in the 6 months following the training, they wrote with greater certainty and less tentativeness in their published articles than before the training. The more targeted content analysis revealed modest increases in journalists' inclusion of scientific material overall, and of peer-reviewed studies and scientists' quotes in particular. We discuss the implications of these findings for science training, journalism, and reporting.
Acknowledgements
The authors would like to acknowledge the support of the SciLine team at the American Association for the Advancement of Science (AAAS) for planning the boot camp, with particular thanks to SciLine Associate Director for Science, Dr. Meredith Drosback.
Disclosure Statement
No potential conflict of interest was reported by the author(s).
Notes
1 Two-sample chi-square tests were run due to the nominal nature of the data. Results indicated that while the number of articles that included scientific material and sources increased after the training, the only statistically significant increase was in the use of peer-reviewed studies, χ2(1, N = 439) = 7.78, p < .01, V = 0.13.