Measuring Primary Students’ Graph Interpretation Skills Via a Performance Assessment: A case study in instrument development

References

  • Baron, J. B. (1991). Strategies for the development of effective performance exercises. Applied Measurement in Education, 4(4), 305–318. doi:10.1207/s15324818ame0404_4
  • Becker-Klein, R., Stylinski, C., Peterman, K., & Phillips, T. (2014). Using embedded assessment to measure science skills within STEM education. Paper presented at the meeting of the American Evaluation Association, Denver, Colorado.
  • Berg, C. A., & Smith, P. (1994). Assessing students’ abilities to construct and interpret line graphs: Disparities between multiple-choice and free-response instruments. Science Education, 78(6), 527–554. issn: 0036-8326
  • Bertin, J. (1983). Semiology of graphics: Diagrams, networks, maps (W. J. Berg, Trans.). Madison, WI: The University of Wisconsin Press.
  • Bertin, J. (2001). Matrix theory of graphics. Information Design Journal, 10(1), 5–19. issn: 1876-486X
  • Boote, S. K. (2012). Assessing and understanding line graph interpretations using a scoring rubric of organized cited factors. Journal of Science Teacher Education, 25(3), 333–354. doi:10.1007/s10972-012-9318-8
  • Bybee, R. W. (2011). Scientific and engineering practices in K-12 classrooms. Science Teacher, 78, 34–40.
  • Bybee, R. W. (2012). The next generation of science standards: Implications for biology education. The American Biology Teacher, 74(8), 542–549. doi:10.1525/abt.2012.74.8.3
  • Curcio, F. (1987). Comprehension of mathematical relationships expressed in graphs. Journal for Research in Mathematics Education, 18(5), 382–393. doi:10.2307/749086
  • Friel, S. N., Curcio, F. R., & Bright, G. W. (2001). Making sense of graphs: Critical factors influencing comprehension and instructional implications. Journal for Research in Mathematics Education, 32(2), 124–158. doi:10.2307/749671
  • Gardner, H. (1992). Assessment in context: The alternative to standardized testing. In B. R. Gifford & M. C. O'Connor (Eds.), Changing assessments (pp. 77–119). Netherlands: Springer. doi:10.1007/978-94-011-2968-8_4
  • Gipps, C. (1995). What do we mean by equity in relation to assessment? Assessment in Education: Principles, Policy & Practice, 2(3), 271–281. doi:10.1080/0969595950020303
  • Glazer, N. (2011). Challenges with graph interpretation: A review of the literature. Studies in Science Education, 47(2), 183–210. doi:10.1080/03057267.2011.605307
  • Johnson, B., Duffin, M., & Murphy, M. (2012). Quantifying a relationship between place-based learning and environmental quality. Environmental Education Research, 18(5), 609–624. doi:10.1080/13504622.2011.640748
  • Johnson, R. L., Penny, J. A., & Gordon, B. (2009). Assessing performance: Designing, scoring, and validating performance tasks. New York, NY: Guilford Press.
  • Konold, C., & Higgins, T. L. (2003). Reasoning about data. In J. Kilpatrick, W. G. Martin, & D. Schifter (Eds.), A research companion to principles and standards for school mathematics (pp. 193–215). Reston, VA: National Council of Teachers of Mathematics.
  • Konold, C., Higgins, T., Russell, S. J., & Khalil, K. (2014). Data seen through different lenses. Educational Studies in Mathematics, 88(3), 305–325. doi:10.1007/s10649-013-9529-8
  • Malcom, S. M. (1991). Equity and excellence through authentic science assessment. In G. Kulm & S. Malcom (Eds.), Science assessment in the service of reform (pp. 313–330). Washington, DC: American Association for the Advancement of Science.
  • McKenzie, D. L., & Padilla, M. J. (1986). The construction and validation of the test of graphing in science (TOGS). Journal of Research in Science Teaching, 23(7), 571–579. doi:10.1002/tea.3660230702
  • National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Committee on a Conceptual Framework for the New K-12 Science Education Standards. Washington, DC: National Academies Press.
  • National Research Council. (2014). Developing assessments for the Next Generation Science Standards. J. W. Pellegrino, M. R. Wilson, J. A. Koenig, & A. S. Beatty (Eds.). Committee on Developing Assessments of Science Proficiency in K-12, Board on Testing and Assessment and Board on Science Education, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
  • Norman, R. R. (2012). Reading the graphics: What is the relationship between graphical reading processes and student comprehension? Reading and Writing, 25(3), 739–774. doi:10.1007/s11145-011-9298-7
  • Padilla, M. J., McKenzie, D. L., & Shaw, E. L., Jr. (1986). An examination of the line graphing ability of students in grades seven through twelve. School Science and Mathematics, 86(1), 20–26.
  • Parmar, R. S., & Signer, B. R. (2005). Sources of error in constructing and interpreting graphs: A study of fourth- and fifth-grade students with LD. Journal of Learning Disabilities, 38(3), 250–261. issn: 0022-2194
  • Peterman, K. (2013). Show me what you can do: Performance-based assessments as a measure of scientific practice. A panel presented at the meeting of the American Evaluation Association, Washington, DC.
  • Rural School & Community Trust. (2001). Assessing Student Work. Retrieved from http://www.ruraledu.org/user_uploads/file/Assessing_Student_Work.pdf
  • Russell, S. J., Schifter, D., Bastable, V., Konold, C., & Higgins, T. (2002). Working with data: Casebook. Parsippany, NJ: Dale Seymour.
  • Savin-Baden, M., & Major, C. H. (2004). Foundations of problem-based learning. Buckingham: SRHE/Open University Press.
  • Steck, T. R., DiBiase, W., Wang, C., & Boukhtiarov, A. (2012). The use of open-ended problem-based learning scenarios in an interdisciplinary biotechnology class: Evaluation of a problem-based learning course across three years. Journal of Microbiology & Biology Education, 13(1), 2–10. doi:10.1128/jmbe.v13i1.389
  • Stufflebeam, D. (2001). The meta-evaluation imperative. American Journal of Evaluation, 22(2), 183–209. Retrieved from http://www.wmich.edu/evalphd/wp-content/uploads/2011/02/The_Metaevaluation_Imperative.pdf
  • Wiggins, G. (1993). Assessment: Authenticity, context, and validity. Phi Delta Kappan, 75, 200–214. Retrieved from http://www.jstor.org/stable/20405066
  • Wilson, M., & Sloane, K. (2000). From principles to practice: An embedded assessment system. Applied Measurement in Education, 13(2), 181–208. issn: 0895-7347
