Original Articles

Development and Validation of a Multimedia-based Assessment of Scientific Inquiry Abilities


References

  • Abd-El-Khalick, F., Boujaoude, S., Duschl, R., Lederman, N. G., Mamlok-Naaman, R., Hofstein, A., … Tuan, H. L. (2004). Inquiry in science education: International perspectives. Science Education, 88, 394–419.
  • Adams, R. J., & Khoo, S. T. (1996). Quest: The interactive test analysis system. Camberwell, Australia: Australian Council for Educational Research.
  • Adams, R. J., & Wu, M. L. (2002). PISA 2000 technical report. Paris: OECD.
  • American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.
  • de Ayala, R. J. (2008). Methodology in the social sciences: Theory and practice of item response theory. New York, NY: Guilford Press.
  • Bennett, R. E., & Bejar, I. I. (1998). Validity and automated scoring: It's not only the scoring. Educational Measurement: Issues and Practice, 17(4), 9–17. doi:10.1111/j.1745-3992.1998.tb00631.x
  • Bennett, R. E., Persky, H., Weiss, A., & Jenkins, F. (2010). Measuring problem solving with technology: A demonstration study for NAEP. The Journal of Technology, Learning and Assessment, 8(8). Retrieved from http://ejournals.bc.edu/ojs/index.php/jtla/article/view/1627/1471
  • Briggs, D. C., Alonzo, A. C., Schwab, C., & Wilson, M. (2006). Diagnostic assessment with ordered multiple-choice items. Educational Assessment, 11(1), 33–63. doi:10.1207/s15326977ea1101_2
  • Brown, N. J. S., & Wilson, M. (2011). A model of cognition: The missing cornerstone of assessment. Educational Psychology Review, 23(2), 221–234. doi:10.1007/s10648-011-9161-z
  • Buckley, B. C., Gobert, J. D., Horwitz, P., & O'Dwyer, L. M. (2010). Looking inside the black box: Assessing model-based learning and inquiry in BioLogica™. International Journal of Learning Technology, 5(2), 166–190. doi:10.1504/IJLT.2010.034548
  • Chung, G., de Vries, L. F., Cheak, A. M., Stevens, R. H., & Bewley, W. L. (2002). Cognitive process validation of an online problem solving assessment. Computers in Human Behavior, 18(6), 669–684. doi:10.1016/S0747-5632(02)00023-7
  • Crundwell, R. M. (2005). Alternative strategies for large scale student assessment in Canada: Is value-added assessment one possible answer? Canadian Journal of Educational Administration and Policy, 41, 1–21.
  • Duschl, R. A., Schweingruber, H. A., & Shouse, A. W. (Eds.). (2007). Taking science to school: Learning and teaching science in grades K-8. Washington, DC: National Academy Press.
  • Frederiksen, J. R., & White, B. Y. (1998). Teaching and learning generic modeling and reasoning skills. Interactive Learning Environments, 5(1), 33–51. doi:10.1080/1049482980050103
  • Garden, R. A. (1999). Development of TIMSS performance assessment tasks. Studies in Educational Evaluation, 25(3), 217–241. doi:10.1016/S0191-491X(99)00023-1
  • Gobert, J. D., Sao Pedro, M., Raziuddin, J., & Baker, R. S. (2013). From log files to assessment metrics: Measuring students’ science inquiry skills using educational data mining. Journal of the Learning Sciences, 22(4), 521–563. doi:10.1080/10508406.2013.837391
  • Gonzalez, E., & Rutkowski, L. (2010). Principles of multiple matrix booklet designs and parameter recovery in large-scale assessments. IEA-ETS Research Institute Monograph, 3, 125–156.
  • Kind, P. M. (2013). Establishing assessment scales using a novel disciplinary rationale for scientific reasoning. Journal of Research in Science Teaching, 50(5), 530–560. doi:10.1002/tea.21086
  • Kind, P. M., Kind, V., Hofstein, A., & Wilson, J. (2011). Peer argumentation in the school science laboratory: Exploring effects of task features. International Journal of Science Education, 33(18), 2527–2558. doi:10.1080/09500693.2010.550952
  • Krajcik, J. S., Blumenfeld, P. C., Marx, R. W., Bass, K. M., Fredricks, J., & Soloway, E. (1998). Inquiry in project-based science classrooms: Initial attempts by middle school students. Journal of the Learning Sciences, 7(3–4), 313–350. doi:10.1080/10508406.1998.9672057
  • Krajcik, J. S., & Czerniak, C. M. (2007). Teaching children science in elementary and middle school: A project-based approach. New York, NY: Routledge.
  • Kuhn, D. (2007). Reasoning about multiple variables: Control of variables is not the only challenge. Science Education, 91, 710–726. doi:10.1002/sce.20214
  • Kuo, C.-Y., & Wu, H.-K. (2013). Toward an integrated model for designing assessment systems: An analysis of the current status of computer-based assessments in science. Computers & Education, 68, 388–403. doi:10.1016/j.compedu.2013.06.002
  • Lee, H.-S., & Liu, O. L. (2009). Assessing learning progression of energy concepts across middle school grades: The knowledge integration perspective. Science Education, 94(4), 665–688. doi:10.1002/sce.20382
  • Lee, H.-S., Liu, O. L., & Linn, M. C. (2011). Validating measurement of knowledge integration in science using multiple-choice and explanation items. Applied Measurement in Education, 24(2), 115–136. doi:10.1080/08957347.2011.554604
  • Liu, O. L., Lee, H., Hofstetter, C., & Linn, M. C. (2008). Assessing knowledge integration in science: Construct, measures, and evidence. Educational Assessment, 13, 33–55. doi:10.1080/10627190801968224
  • Lorch, R. F., Lorch, E. P., Calderhead, W. J., Dunlap, E. E., Hodell, E. C., & Freer, B. D. (2010). Learning the control of variables strategy in higher and lower achieving classrooms: Contributions of explicit instruction and experimentation. Journal of Educational Psychology, 102(1), 90–101. doi:10.1037/a0017972
  • Martin, M. O., & Mullis, I. V. S. (2012). Methods and procedures in TIMSS and PIRLS 2011. Chestnut Hill, MA: TIMSS & PIRLS International Study Center, Boston College.
  • Ministry of Education. (1999). Curriculum outlines for “nature science and living technology”. Taipei, Taiwan: Ministry of Education.
  • Ministry of Education. (2008). Curriculum outlines for senior high schools. Taipei, Taiwan: Ministry of Education.
  • Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons’ responses and performances as scientific inquiry into score meaning. American Psychologist, 50(9), 741–749. doi:10.1037/0003-066X.50.9.741
  • National Research Council. (1996). National science education standards. Washington, DC: National Academy Press.
  • National Research Council. (2000). Inquiry and the national science education standards: A guide for teaching and learning. Washington, DC: National Academy Press.
  • National Research Council. (2001). Knowing what students know: The science and design of educational assessment. Committee on the Foundations of Assessment. In J. Pellegrino, N. Chudowsky, & R. Glaser (Eds.), Board on Testing and Assessment, Center for Education. Division of Behavioral and Social Sciences and Education. Washington, DC: National Academies Press.
  • Organization for Economic Co-operation and Development. (2010). PISA computer-based assessment of student skills in science. Paris: Author.
  • Pine, J., Aschbacher, P., Roth, E., Jones, M., McPhee, C., Martin, C., … Foley, B. (2006). Fifth graders’ science inquiry abilities: A comparative study of students in hands-on and textbook curricula. Journal of Research in Science Teaching, 43(5), 467–484. doi:10.1002/tea.20140
  • Quellmalz, E. S., & Pellegrino, J. W. (2009). Technology and testing. Science, 323(5910), 75–79. doi:10.1126/science.1168046
  • Quellmalz, E. S., Timms, M. J., Silberglitt, M. D., & Buckley, B. C. (2012). Science assessments for all: Integrating science simulations into balanced state science assessment systems. Journal of Research in Science Teaching, 49(3), 363–393. doi:10.1002/tea.21005
  • Ruiz-Primo, M. A., & Shavelson, R. J. (1996). Rhetoric and reality in science performance assessments: An update. Journal of Research in Science Teaching, 33(10), 1045–1063. doi:10.1002/(SICI)1098-2736(199612)33:10<1045::AID-TEA1>3.0.CO;2-S
  • Shin, N., Stevens, S. Y., & Krajcik, J. (2010). Tracking student learning over time using construct-centered design. In S. Rodrigues (Ed.), Using analytical frameworks for classroom research: Collecting data and analysing narrative (pp. 38–68). London: Taylor & Francis.
  • Songer, N. B., Lee, H. S., & McDonald, S. (2003). Research towards an expanded understanding of inquiry science beyond one idealized standard. Science Education, 87(4), 490–516. doi:10.1002/sce.10085
  • Wenning, C. J. (2007). Assessing inquiry skills as a component of scientific literacy. Journal of Physics Teacher Education Online, 4(2), 21–24.
  • Wilson, M. (2005). Constructing measures: An item response modeling approach. Mahwah, NJ: Erlbaum.
  • Wright, B. D., & Stone, M. H. (1999). Measurement essentials. Wilmington, DE: Wide Range.
  • Wu, M. L., & Adams, R. J. (2007). Applying the Rasch model to psycho-social measurement: A practical approach. Melbourne: Educational Measurement Solutions.
  • Wu, M. L., Adams, R. J., Wilson, M. R., & Haldane, S. A. (2007). ACER ConQuest version 2.0: Generalised item response modelling software. Melbourne: Australian Council for Educational Research.
  • Wu, H.-K., & Hsieh, C.-E. (2006). Developing sixth graders' inquiry skills to construct scientific explanations in inquiry-based learning environments. International Journal of Science Education, 28(11), 1289–1313. doi:10.1080/09500690600621035
  • Wu, P. H., Wu, H.-K., & Hsu, Y. S. (2014). Establishing the criterion-related, construct, and content validities of a simulation-based assessment of inquiry abilities. International Journal of Science Education, 36(9–10), 1630–1650. doi:10.1080/09500693.2013.871660
  • Wu, H.-K., Wu, P. H., Zhang, W. X., & Hsu, Y. S. (2013). Investigating college and graduate students' multivariable reasoning in computational modeling. Science Education, 97, 337–366. doi:10.1002/sce.21056
  • Zachos, P., Hick, T. L., Doane, W. E., & Sargent, C. (2000). Setting theoretical and empirical foundations for assessing scientific inquiry and discovery in educational programs. Journal of Research in Science Teaching, 37(9), 938–962. doi:10.1002/1098-2736(200011)37:9<938::AID-TEA5>3.0.CO;2-S
  • Zoanetti, N. (2010). Interactive computer based assessment tasks: How problem-solving process data can inform instruction. Australasian Journal of Educational Technology, 26(5), 585–606.