References
- Allaire, J. J., Y. Xie, R Foundation, H. Wickham, Journal of Statistical Software, R. Vaidyanathan, Association for Computing Machinery, et al. 2020. rticles: Article formats for R markdown. CRAN. R package version 0.14.1. Accessed August 31, 2020. https://github.com/rstudio/rticles.
- Barba, L. A. 2018. Terminologies for reproducible research. arXiv:1802.03311. http://arxiv.org/abs/1802.03311.
- Barnes, N. 2010. Publish your computer code: It is good enough. Nature News 467 (7317):753. doi: https://doi.org/10.1038/467753a.
- Bhandari Neupane, J., R. P. Neupane, Y. Luo, W. Y. Yoshida, R. Sun, and P. G. Williams. 2019. Characterization of leptazolines A–D, polar oxazolines from the cyanobacterium Leptolyngbya sp., reveals a glitch with the “Willoughby–Hoye” scripts for calculating NMR chemical shifts. Organic Letters 21 (20):8449–53. doi: https://doi.org/10.1021/acs.orglett.9b03216.
- Boettiger, C. 2015. An introduction to Docker for reproducible research. ACM SIGOPS Operating Systems Review 49 (1):71–79. doi: https://doi.org/10.1145/2723872.2723882.
- Brinckman, A., K. Chard, N. Gaffney, M. Hategan, M. B. Jones, K. Kowalik, S. Kulasekaran, B. Ludascher, B. D. Mecum, J. Nabrzyski, et al. 2019. Computing environments for reproducibility: Capturing the “Whole Tale.” Future Generation Computer Systems 94:854–67. doi: https://doi.org/10.1016/j.future.2017.12.029.
- Brunsdon, C. 2016. Quantitative methods I: Reproducible research and quantitative geography. Progress in Human Geography 40 (5):687–96. doi: https://doi.org/10.1177/0309132515599625.
- Buck, S. 2015. Solving reproducibility. Science 348 (6242):1403. doi: https://doi.org/10.1126/science.aac8041.
- Chang, W., J. Cheng, J. J. Allaire, Y. Xie, and J. McPherson. 2020. shiny: Web application framework for R. CRAN. R package version 1.4.0.2. Accessed August 31, 2020. https://CRAN.R-project.org/package=shiny.
- Chen, X., S. Dallmeier-Tiessen, R. Dasler, S. Feger, P. Fokianos, J. B. Gonzalez, H. Hirvonsalo, D. Kousidis, A. Lavasa, S. Mele, et al. 2019. Open is not enough. Nature Physics 15 (2):113–19. doi: https://doi.org/10.1038/s41567-018-0342-2.
- Claerbout, J., and M. Karrenbach. 1992. Electronic documents give reproducible research a new meaning. In SEG Technical Program Expanded Abstracts 1992, 601–4. Tulsa, OK: Society of Exploration Geophysicists. doi: https://doi.org/10.1190/1.1822162.
- Clyburne-Sherin, A., X. Fei, and S. A. Green. 2019. Computational reproducibility via containers in psychology. Meta-Psychology 3. doi: https://doi.org/10.15626/MP.2018.892.
- Code Ocean. 2018. De Gruyter partners with Code Ocean to improve research reproducibility. Accessed April 24, 2020. https://codeocean.com/press-release/de-gruyter-partners-with-code-ocean-to-improve-research-reproducibility.
- Donoho, D. L. 2010. An invitation to reproducible computational research. Biostatistics 11 (3):385–88. doi: https://doi.org/10.1093/biostatistics/kxq028.
- Easing the burden of code review [editorial]. 2018. Nature Methods 15 (9):641. doi: https://doi.org/10.1038/s41592-018-0137-5.
- Eaton, J. W. 2012. GNU Octave and reproducible research. Journal of Process Control 22 (8):1433–38. doi: https://doi.org/10.1016/j.jprocont.2012.04.006.
- Eglen, S. J., B. Marwick, Y. O. Halchenko, M. Hanke, S. Sufi, P. Gleeson, R. A. Silver, A. P. Davison, L. Lanyon, M. Abrams, et al. 2017. Toward standard practices for sharing computer code and programs in neuroscience. Nature Neuroscience 20 (6):770–73. doi: https://doi.org/10.1038/nn.4550.
- Eglen, S. J., R. Mounce, L. Gatto, A. M. Currie, and Y. Nobis. 2018. Recent developments in scholarly publishing to improve research practices in the life sciences. Emerging Topics in Life Sciences 2 (6):775–78. doi: https://doi.org/10.1042/ETLS20180172.
- Eglen, S. J., and D. Nüst. 2019. CODECHECK: An open-science initiative to facilitate sharing of computer programs and results presented in scientific publications. In The 14th Munin Conference on Scholarly Publishing 2019, Septentrio Conference Series. University Library, UiT The Arctic University of Norway. doi: https://doi.org/10.7557/5.4910.
- Emsley, I., and D. De Roure. 2018. A framework for the preservation of a Docker container. International Journal of Digital Curation 12 (2):125–35. doi: https://doi.org/10.2218/ijdc.v12i2.509.
- Estop, H. 2019. SAGE trials Code Ocean to improve research reproducibility. Accessed April 24, 2020. https://journalsblog.sagepub.com/blog/sage-trials-code-ocean-to-improve-research-reproducibility.
- Fanelli, D. 2018. Opinion: Is science really facing a reproducibility crisis, and do we need it to? Proceedings of the National Academy of Sciences 115 (11):2628–31. doi: https://doi.org/10.1073/pnas.1708272114.
- Foster, I. 2018. Research infrastructure for the safe analysis of sensitive data. The Annals of the American Academy of Political and Social Science 675 (1):102–20. doi: https://doi.org/10.1177/0002716217742610.
- Gentleman, R., and D. Temple Lang. 2007. Statistical analyses and reproducible research. Journal of Computational and Graphical Statistics 16 (1):1–23. doi: https://doi.org/10.1198/106186007X178663.
- Gil, Y., C. H. David, I. Demir, B. T. Essawy, R. W. Fulweiler, J. L. Goodall, L. Karlstrom, H. Lee, H. J. Mills, J.-H. Oh, et al. 2016. Toward the geoscience paper of the future: Best practices for documenting and sharing research from data to software to provenance. Earth and Space Science 3 (10):388–415. doi: https://doi.org/10.1002/2015EA000136.
- Giraud, T., and N. Lambert. 2017. Reproducible cartography. In Advances in cartography and GIScience, ed. M. P. Peterson, 173–83. Cham, Switzerland: Springer. doi: https://doi.org/10.1007/978-3-319-57336-6_13.
- Greenbaum, D., J. Rozowsky, V. Stodden, and M. Gerstein. 2017. Structuring supplemental materials in support of reproducibility. Genome Biology 18 (1):64. doi: https://doi.org/10.1186/s13059-017-1205-3.
- Gronenschild, E. H. B. M., P. Habets, H. I. L. Jacobs, R. Mengelers, N. Rozendaal, J. van Os, and M. Marcelis. 2012. The effects of FreeSurfer version, workstation type, and Macintosh operating system version on anatomical volume and cortical thickness measurements. PLoS ONE 7 (6):e38234. doi: https://doi.org/10.1371/journal.pone.0038234.
- Harris, R., D. O’Sullivan, M. Gahegan, M. Charlton, L. Comber, P. Longley, C. Brunsdon, N. Malleson, A. Heppenstall, A. Singleton, et al. 2017. More bark than bytes? Reflections on 21+ years of geocomputation. Environment and Planning B: Urban Analytics and City Science 44 (4):598–617. doi: https://doi.org/10.1177/2399808317710132.
- Higman, R., D. Bangert, and S. Jones. 2019. Three camps, one destination: The intersections of research data management, FAIR and open. Insights 32 (1):1–9. doi: https://doi.org/10.1629/uksg.468.
- Hinz, M., D. Nüst, B. Proß, and E. Pebesma. 2013. Spatial statistics on the geospatial Web. In The 16th AGILE International Conference on Geographic Information Science, Short papers, ed. D. Vandenbroucke, B. Bucher, and J. Crompvoets, 1–7. Leuven, Belgium: AGILE. doi: https://doi.org/10.31223/osf.io/j8x2e.
- Hirst, T. 2019. Fragment—Some rambling thoughts on computing environments in education. Accessed April 24, 2020. https://blog.ouseful.info/2019/03/20/fragment-some-rambling-thoughts-on-computing-environments-in-education/.
- Howe, B. 2012. Virtual appliances, cloud computing, and reproducible research. Computing in Science & Engineering 14 (4):36–41. doi: https://doi.org/10.1109/MCSE.2012.62.
- Jupyter Project, M. Bussonnier, J. Forde, J. Freeman, B. Granger, T. Head, C. Holdgraf, K. Kelley, G. Nalvarte, A. Osheroff, et al. 2018. Binder 2.0—Reproducible, interactive, sharable environments for science at scale. In Proceedings of the 17th Python in Science Conference, 113–20. doi: https://doi.org/10.25080/Majora-4af1f417-011.
- Kedron, P., A. E. Frazier, A. B. Trgovac, T. Nelson, and A. S. Fotheringham. 2019. Reproducibility and replicability in geographical analysis. Geographical Analysis. Advance online publication. doi: https://doi.org/10.1111/gean.12221.
- Kluyver, T., B. Ragan-Kelley, F. Pérez, B. Granger, M. Bussonier, J. Frederic, K. Kelley, et al. 2016. Jupyter Notebooks—A publishing format for reproducible computational workflows. In Proceedings of the 20th International Conference on Electronic Publishing, ed. F. Loizides and B. Schmidt, 87–90. Amsterdam, The Netherlands. doi: https://doi.org/10.3233/978-1-61499-649-1-87.
- Knoth, C., and D. Nüst. 2017. Reproducibility and practical adoption of GEOBIA with open-source software in Docker containers. Remote Sensing 9 (3):290. doi: https://doi.org/10.3390/rs9030290.
- Knuth, D. E. 1984. Literate programming. The Computer Journal 27 (2):97–111. doi: https://doi.org/10.1093/comjnl/27.2.97.
- Konkol, M., and C. Kray. 2019. In-depth examination of spatiotemporal figures in open reproducible research. Cartography and Geographic Information Science 46 (5):412–27. doi: https://doi.org/10.1080/15230406.2018.1512421.
- Konkol, M., C. Kray, and M. Pfeiffer. 2019. Computational reproducibility in geoscientific papers: Insights from a series of studies with geoscientists and a reproduction study. International Journal of Geographical Information Science 33 (2):408–29. doi: https://doi.org/10.1080/13658816.2018.1508687.
- Konkol, M., C. Kray, and J. Suleiman. 2019. Creating interactive scientific publications using bindings. Proceedings of the ACM on Human-Computer Interaction 3:1–16. doi: https://doi.org/10.1145/3331158.
- Kray, C., E. Pebesma, M. Konkol, and D. Nüst. 2019. Reproducible research in geoinformatics: Concepts, challenges and benefits (Vision paper). In 14th International Conference on Spatial Information Theory (COSIT 2019), ed. S. Timpf, C. Schlieder, M. Kattenbeck, B. Ludwig, and K. Stewart, 1–8. Dagstuhl, Germany: Schloss Dagstuhl–Leibniz-Zentrum fuer Informatik. doi: https://doi.org/10.4230/LIPIcs.COSIT.2019.8.
- Kurtzer, G. M., V. Sochat, and M. W. Bauer. 2017. Singularity: Scientific containers for mobility of compute. PLoS ONE 12 (5):e0177459. doi: https://doi.org/10.1371/journal.pone.0177459.
- Lees, J. M. 2012. Open and free: Software and scientific reproducibility. Seismological Research Letters 83 (5):751–52. doi: https://doi.org/10.1785/0220120091.
- Lewis, K. P., E. Vander Wal, and D. A. Fifield. 2018. Wildlife biology, big data, and reproducible research. Wildlife Society Bulletin 42 (1):172–79. doi: https://doi.org/10.1002/wsb.847.
- Markowetz, F. 2015. Five selfish reasons to work reproducibly. Genome Biology 16:274. doi: https://doi.org/10.1186/s13059-015-0850-7.
- Marwick, B. 2015. How computers broke science—and what we can do to fix it. Accessed April 24, 2020. https://theconversation.com/how-computers-broke-science-and-what-we-can-do-to-fix-it-49938.
- Marwick, B. 2017. Computational reproducibility in archaeological research: Basic principles and a case study of their implementation. Journal of Archaeological Method and Theory 24 (2):424–50. doi: https://doi.org/10.1007/s10816-015-9272-9.
- Marwick, B., C. Boettiger, and L. Mullen. 2018. Packaging data analytical work reproducibly using R (and friends). The American Statistician 72 (1):80–88. doi: https://doi.org/10.1080/00031305.2017.1375986.
- Muenchow, J., S. Schäfer, and E. Krüger. 2019. Reviewing qualitative GIS research—Toward a wider usage of open-source GIS and reproducible research practices. Geography Compass 13 (6). doi: https://doi.org/10.1111/gec3.12441.
- Munafò, M. R., B. A. Nosek, D. V. M. Bishop, K. S. Button, C. D. Chambers, N. Percie Du Sert, U. Simonsohn, E-J. Wagenmakers, J. J. Ware, and J. P. A. Ioannidis. 2017. A manifesto for reproducible science. Nature Human Behaviour 1:0021. doi: https://doi.org/10.1038/s41562-016-0021.
- National Academies of Sciences, Engineering, and Medicine. 2019. Reproducibility and replicability in science. Washington, DC: National Academies Press. doi: https://doi.org/10.17226/25303.
- Nosek, B. A., G. Alter, G. C. Banks, D. Borsboom, S. D. Bowman, S. J. Breckler, S. Buck, C. D. Chambers, G. Chin, G. Christensen, et al. 2015. Scientific standards: Promoting an open research culture. Science 348 (6242):1422–25. doi: https://doi.org/10.1126/science.aab2374.
- Nüst, D., C. Boettiger, and B. Marwick. 2018. How to read a research compendium. arXiv:1806.09525. http://arxiv.org/abs/1806.09525.
- Nüst, D., C. Granell, B. Hofer, M. Konkol, F. O. Ostermann, R. Sileryte, and V. Cerutti. 2018. Reproducible research and GIScience: An evaluation using AGILE conference papers. PeerJ 6:e5072. doi: https://doi.org/10.7717/peerj.5072.
- Nüst, D., and M. Hinz. 2019. containerit: Generating Dockerfiles for reproducible research with R. Journal of Open Source Software 4 (40):1603. doi: https://doi.org/10.21105/joss.01603.
- Nüst, D., M. Konkol, E. Pebesma, C. Kray, M. Schutzeichel, H. Przibytzin, and J. Lorenz. 2017. Opening the publication process with executable research compendia. D-Lib Magazine 23 (1–2). doi: https://doi.org/10.1045/january2017-nuest.
- Nüst, D., F. Ostermann, R. Sileryte, B. Hofer, C. Granell, M. Teperek, A. Graser, K. Broman, and K. Hettne. 2019. AGILE reproducible paper guidelines. OSF. doi: https://doi.org/10.17605/OSF.IO/CB7Z8.
- Nüst, D., and M. Schutzeichel. 2017. An architecture for reproducible computational geosciences. Poster presented at AGILE 2017, Wageningen, The Netherlands, June. doi: https://doi.org/10.5281/zenodo.1478542.
- O’Loughlin, J., P. Raento, J. P. Sharp, J. D. Sidaway, and P. E. Steinberg. 2015. Data ethics: Pluralism, replication, conflicts of interest, and standards in political geography. Political Geography 44:A1–A3. doi: https://doi.org/10.1016/j.polgeo.2014.11.001.
- Pebesma, E., R. Bivand, and P. J. Ribeiro. 2015. Software for spatial statistics. Journal of Statistical Software 63 (1):1–8. doi: https://doi.org/10.18637/jss.v063.i01.
- Pebesma, E., D. Nüst, and R. Bivand. 2012a. R for reproducible geographical research. Paper presented at the AAG Annual Meeting 2012, New York, February 24. Accessed August 31, 2020. http://pebesma.staff.ifgi.de/r_repr.pdf.
- Pebesma, E., D. Nüst, and R. Bivand. 2012b. The R software environment in reproducible geoscientific research. Eos: Transactions of the American Geophysical Union 93 (16):163. doi: https://doi.org/10.1029/2012EO160003.
- Pebesma, E., W. Wagner, M. Schramm, A. V. Beringe, C. Paulik, M. Neteler, and J. Reiche. 2017. OpenEO—A common, open source interface between Earth observation data infrastructures and front-end applications. Zenodo. doi: https://doi.org/10.5281/zenodo.1065474.
- Peng, R. D. 2011. Reproducible research in computational science. Science 334 (6060):1226–27. doi: https://doi.org/10.1126/science.1213847.
- Pérignon, C., K. Gadouche, C. Hurlin, R. Silberman, and E. Debonnel. 2019. Certify reproducibility with confidential data. Science 365 (6449):127–28. doi: https://doi.org/10.1126/science.aaw2825.
- Perkel, J. M. 2019. Make code accessible with these cloud services. Nature 575 (7781):247–48. doi: https://doi.org/10.1038/d41586-019-03366-x.
- Piwowar, H. 2013. Altmetrics: Value all research products. Nature 493 (7431):159. doi: https://doi.org/10.1038/493159a.
- Preston, B., and M. W. Wilson. 2014. Practicing GIS as mixed method: Affordances and limitations in an urban gardening study. Annals of the Association of American Geographers 104 (3):510–29. doi: https://doi.org/10.1080/00045608.2014.892325.
- R Core Team. 2019. R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing. https://www.R-project.org/.
- Ram, K. 2013. Git can facilitate greater reproducibility and increased transparency in science. Source Code for Biology and Medicine 8 (1):7. doi: https://doi.org/10.1186/1751-0473-8-7.
- Rechert, K., T. Liebetraut, S. Kombrink, D. Wehrle, S. Mocken, and M. Rohland. 2017. Preserving containers. In Forschungsdaten managen, ed. J. Kratzke and V. Heuveline, 143–51. Heidelberg, Germany: heiBOOKS. http://books.ub.uni-heidelberg.de/heibooks/catalog/book/285.
- Rule, A., A. Birmingham, C. Zuniga, I. Altintas, S.-C. Huang, R. Knight, N. Moshiri, M. H. Nguyen, S. B. Rosenthal, F. Perez, et al. 2019. Ten simple rules for writing and sharing computational analyses in Jupyter Notebooks. PLoS Computational Biology 15 (7):e1007007. doi: https://doi.org/10.1371/journal.pcbi.1007007.
- Sandve, G. K., A. Nekrutenko, J. Taylor, and E. Hovig. 2013. Ten simple rules for reproducible computational research. PLoS Computational Biology 9 (10):e1003285. doi: https://doi.org/10.1371/journal.pcbi.1003285.
- Santana-Perez, I., and M. S. Pérez-Hernández. 2015. Towards reproducibility in scientific workflows: An infrastructure-based approach. Scientific Programming 2015:1–11. doi: https://doi.org/10.1155/2015/243180.
- Schönbrodt, F. 2019. Training students for the open science future. Nature Human Behaviour 3 (10):1031. doi: https://doi.org/10.1038/s41562-019-0726-z.
- Shannon, J., and K. Walker. 2018. Opening GIScience: A process-based approach. International Journal of Geographical Information Science 32 (10):1911–26. doi: https://doi.org/10.1080/13658816.2018.1464167.
- Sidhu, N., E. Pebesma, and G. Câmara. 2018. Using Google Earth Engine to detect land cover change: Singapore as a use case. European Journal of Remote Sensing 51 (1):486–500. doi: https://doi.org/10.1080/22797254.2018.1451782.
- Šimko, T., L. Heinrich, H. Hirvonsalo, D. Kousidis, and D. Rodríguez. 2019. REANA: A system for reusable research data analyses. EPJ Web of Conferences 214. doi: https://doi.org/10.1051/epjconf/201921406034.
- Singleton, A. D., S. Spielman, and C. Brunsdon. 2016. Establishing a framework for open geographic information science. International Journal of Geographical Information Science 30 (8):1507–21. doi: https://doi.org/10.1080/13658816.2015.1137579.
- Spielman, S. E., and A. Singleton. 2015. Studying neighborhoods using uncertain data from the American Community Survey: A contextual approach. Annals of the Association of American Geographers 105 (5):1003–25. doi: https://doi.org/10.1080/00045608.2015.1052335.
- Stark, P. B. 2018. Before reproducibility must come preproducibility. Nature 557:613. doi: https://doi.org/10.1038/d41586-018-05256-0.
- Stodden, V. 2009. The legal framework for reproducible scientific research: Licensing and copyright. Computing in Science & Engineering 11 (1):35–40. doi: https://doi.org/10.1109/MCSE.2009.19.
- Stodden, V., and S. Miguez. 2014. Best practices for computational science: Software infrastructure and environments for reproducible and extensible research. Journal of Open Research Software 2 (1):e21. doi: https://doi.org/10.5334/jors.ay.
- Stodden, V., J. Seiler, and Z. Ma. 2018. An empirical analysis of journal policy effectiveness for computational reproducibility. Proceedings of the National Academy of Sciences of the United States of America 115 (11):2584–89. doi: https://doi.org/10.1073/pnas.1708290115.
- Sui, D., and P. Kedron. 2020. Reproducibility and replicability in the context of the contested identities of geography. Annals of the American Association of Geographers. doi: https://doi.org/10.1080/24694452.2020.1806024.
- Sui, D., and S.-L. Shaw. 2018. Outlook and next steps: From human dynamics to smart and connected communities. In Human dynamics research in smart and connected communities, ed. S.-L. Shaw and D. Sui, 235–45. Cham, Switzerland: Springer International. doi: https://doi.org/10.1007/978-3-319-73247-3_13.
- Tennant, J. P., H. Crane, T. Crick, J. Davila, A. Enkhbayar, J. Havemann, B. Kramer, R. Martin, P. Masuzzo, A. Nobes, et al. 2019. Ten hot topics around scholarly publishing. Publications 7 (2):34. doi: https://doi.org/10.3390/publications7020034.
- The Turing Way Community, B. Arnold, L. Bowler, S. Gibson, P. Herterich, R. Higman, A. Krystalli, A. Morley, M. O'Reilly, and K. Whitaker. 2019. The Turing Way: A handbook for reproducible data science. Zenodo. doi: https://doi.org/10.5281/zenodo.3233986.
- Vandewalle, P., J. Kovacevic, and M. Vetterli. 2009. Reproducible research in signal processing. IEEE Signal Processing Magazine 26 (3):37–47. doi: https://doi.org/10.1109/MSP.2009.932122.
- Verstegen, J. A. 2019. JudithVerstegen/PLUC_Mozambique: First release of PLUC for Mozambique (Version v1.0.0). Zenodo. doi: https://doi.org/10.5281/zenodo.3519987.
- Verstegen, J. A., D. Karssenberg, F. van der Hilst, and A. Faaij. 2012. Spatio-temporal uncertainty in spatial decision support systems: A case study of changing land availability for bioenergy crops in Mozambique. Computers, Environment and Urban Systems 36 (1):30–42. doi: https://doi.org/10.1016/j.compenvurbsys.2011.08.003.
- Wainwright, J. 2020. Is critical human geography research replicable? Annals of the American Association of Geographers. doi: https://doi.org/10.1080/24694452.2020.1806025.
- Waters, N. 2020. Motivations and methods for replication in geography: Working with data “streams.” Annals of the American Association of Geographers. doi: https://doi.org/10.1080/24694452.2020.1806027.
- Wilkinson, M. D., M. Dumontier, I. J. J. Aalbersberg, G. Appleton, M. Axton, A. Baak, N. Blomberg, J-W. Boiten, L. Bonino da Silva Santos, P. E. Bourne, et al. 2016. The FAIR guiding principles for scientific data management and stewardship. Scientific Data 3:160018. doi: https://doi.org/10.1038/sdata.2016.18.
- Wilson, J. P., and P. A. Burrough. 1999. Dynamic modeling, geostatistics, and fuzzy classification: New sneakers for a new geography? Annals of the Association of American Geographers 89 (4):736–46. doi: https://doi.org/10.1111/0004-5608.00173.
- Wilson, J. P., K. Butler, S. Gao, W. Li, and D. J. Wright. 2020. The replicability and reproducibility of the GIS software and algorithms used in environmental applications. Annals of the American Association of Geographers. doi: https://doi.org/10.1080/24694452.2020.1806026.
- Xie, Y. 2015. Dynamic documents with R and knitr. 2nd ed. Boca Raton, FL: CRC.