Research and Teaching

Measuring Data Skills in Undergraduate Student Work: Development of a Scoring Rubric

References

  • Allie, S., Buffler, A., Campbell, B., Lubben, F., Evangelinos, D., Psillos, D., & Valassiades, O. (2003). Teaching measurement in the introductory physics laboratory. The Physics Teacher, 41(7), 394–401.
  • American Association for the Advancement of Science (AAAS). (2011). Vision and change in undergraduate biology education: A call to action. AAAS.
  • Association of American Colleges and Universities (AAC&U). (2011). The LEAP vision for learning: Outcomes, practices, impact, and employers’ views. AAC&U.
  • Baumer, B. (2015). A data science course for undergraduates: Thinking with data. The American Statistician, 69(4), 334–342.
  • Borne, K. D., Jacoby, S., Carney, K., Connolly, A., Eastman, T., Raddick, M. J., Tyson, J. A., & Wallin, J. (2009). The revolution in astronomy education: Data science for the masses. http://arxiv.org/PS_cache/arxiv/pdf/0909/0909.3895v1.pdf
  • Eagan, M. K., Stolzenberg, E. B., Berdan Lozano, J., Aragon, M. C., Suchard, M. R., & Hurtado, S. (2014). Undergraduate teaching faculty: The 2013–2014 HERI faculty survey. Higher Education Research Institute, UCLA.
  • Education Development Center (EDC). (2016). Building global interest in data literacy: A dialogue: Workshop report. EDC.
  • Finzer, W. (2013). The data science education dilemma. Technology Innovations in Statistics Education, 7(2). https://escholarship.org/uc/item/7gv0q9dc
  • Gould, R., Sunbury, S., & Dussault, M. (2014). In praise of messy data: Lessons from the search for alien worlds. The Science Teacher, 81(8), 31–36.
  • Grant, M., & Smith, M. (2018). Quantifying assessment of undergraduate critical thinking. Journal of College Teaching & Learning, 15(1), 27–38.
  • Hart Research Associates. (2016a). Recent trends in general education design, learning outcomes, and teaching approaches: Key findings from a survey among administrators at AAC&U member institutions. Association of American Colleges and Universities.
  • Hart Research Associates. (2016b). Trends in learning outcomes assessment: Key findings from a survey among administrators at AAC&U member institutions. Association of American Colleges and Universities.
  • Hayes, A. F., & Krippendorff, K. (2007). Answering the call for a standard reliability measure for coding data. Communication Methods and Measures, 1(1), 77–89.
  • Holmes, N. G., Wieman, C. E., & Bonn, D. A. (2015). Teaching critical thinking. Proceedings of the National Academy of Sciences, 112(36), 11199–11204.
  • Kastens, K., Krumhansl, R., & Baker, I. (2015). Thinking big. The Science Teacher, 82(5), 25–31.
  • Kastens, K., Krumhansl, R., & Peach, C. (2013, March). EarthCube education end-users workshop report. http://nagt.org/nagt/programs/earth-cube/index.htm
  • Kerlin, S. C., McDonald, S. P., & Kelly, G. J. (2010). Complexity of secondary scientific data sources and students’ argumentative discourse. International Journal of Science Education, 32(9), 1207.
  • Kjelvik, M. K., & Schultheis, E. H. (2019). Getting messy with authentic data: Exploring the potential of using data from scientific research to support student data literacy. CBE—Life Sciences Education, 18(es2), 1–8.
  • Krumhansl, R., Busey, A., Kochevar, R., Mueller-Northcott, J., Krumhansl, K., & Louie, J. (2016). Visualizing oceans of data: Ocean Tracks—A case study. CreateSpace Independent Publishing Platform. http://www.oceansofdata.org/our-work/visualizing-oceans-data-ocean-tracks-%E2%80%93-case-study
  • Krumhansl, R., Peach, C., Foster, J., Busey, A., & Baker, I. (2012). Visualizing oceans of data: Educational interface design. Education Development Center, Inc.
  • Ledley, T. S., Prakash, A., Manduca, C. A., & Fox, S. (2008). Recommendations for making geoscience data accessible and usable in education. Eos, Transactions American Geophysical Union, 89(32), 291.
  • Louie, J. (2016). Ocean Tracks college edition: Year 1 baseline data and needs assessment findings. Education Development Center, Inc.
  • Louie, J., & Hoyle, C. (2017, April). Development of an assessment measuring basic competency in scientific data interpretation and argumentation [Paper presentation]. Annual meeting of the National Association for Research in Science Teaching, San Antonio, TX.
  • Lubben, F., Allie, S., & Buffler, A. (2010). Experimental work in science. In M. Rollnick (Ed.), Identifying potential for equitable access to tertiary level science (pp. 135–152). Springer, Dordrecht.
  • Madura, J., & Louie, J. (2017, April). Measuring interest in Earth sciences [Paper presentation]. Annual meeting of the National Association for Research in Science Teaching, San Antonio, TX.
  • National Academies of Sciences, Engineering, and Medicine (NASEM). (2018). Data science for undergraduates: Opportunities and options. The National Academies Press. https://doi.org/10.17226/25104
  • NGSS Lead States. (2013). Next generation science standards: For states, by states. The National Academies Press.
  • Ocean Tracks. (2017). www.oceantracks.org.
  • Popham, W. J. (1997). What’s wrong—and what’s right—with rubrics. Educational Leadership, 55(2), 72–75.
  • Reddy, Y. M., & Andrade, H. (2010). A review of rubric use in higher education. Assessment & Evaluation in Higher Education, 35(4), 435–448.
  • Sickler, J., & Hayde, D. (2015). Faculty needs assessment: Findings from front-end interviews and survey. Lifelong Learning Group.
  • Slater, S. J., Slater, T. F., & Olsen, J. K. (2009). Survey of K-12 science teachers’ educational products needs from planetary scientists. Astronomy Education Review, 8(1).
  • Stein, B., Haynes, A., Redding, M., Harris, K., Tylka, M., & Lisic, E. (2010). Faculty driven assessment of critical thinking: National dissemination of the CAT instrument. In Technological Developments in Networking, Education and Automation (pp. 55–58). Springer, Dordrecht.
  • Stevens, D. D., & Levi, A. J. (2013). Introduction to rubrics: An assessment tool to save grading time, convey effective feedback, and promote student learning. Stylus Publishing.
  • Zwickl, B. M., Hu, D., Finkelstein, N., & Lewandowski, H. J. (2015). Model-based reasoning in the physics laboratory: Framework and initial results. Physical Review Special Topics - Physics Education Research, 11(2), 1–12.
