Investigating disciplinary context effect on student scientific inquiry competence

Pages 2736–2764 | Received 25 Mar 2019, Accepted 22 Nov 2019, Published online: 06 Dec 2019
