Sources of difficulty in assessment: example of PISA science items

Pages 468-487 | Received 15 May 2016, Accepted 09 Feb 2017, Published online: 08 Mar 2017

