
Analysing formal formative assessment activities in the context of inquiry at primary and upper secondary school in Switzerland

Pages 407-427 | Received 26 Oct 2018, Accepted 31 Aug 2019, Published online: 08 Oct 2019

References

  • Abd-El-Khalick, F., BouJaoude, S., Duschl, R. A., Lederman, N. G., Mamlok-Naaman, R., Hofstein, A., … Tuan, H. (2004). Inquiry in science education: International perspectives. Science Education, 88(3), 397–419. doi: 10.1002/sce.10118
  • Andrade, H. (2010). Students as the definitive source of formative assessment. In H. Andrade & G. J. Cizek (Eds.), Handbook of formative assessment (pp. 90–105). New York: Routledge.
  • Andrade, H., & Valtcheva, A. (2009). Promoting learning and achievement through self-assessment. Theory Into Practice, 48(1), 12–19. doi: 10.1080/00405840802577544
  • ARG (Assessment Reform Group). (2002). Assessment for learning: 10 Principles. London: ARG. Retrieved from http://www.assessment-reform-group.org
  • Artigue, M., & Baptist, P. (2012). Inquiry in mathematics education. Background resources for implementing inquiry in science and mathematics at school. Paris: Université Paris Diderot.
  • Bell, B., & Cowie, B. (2001). The characteristics of formative assessment in science education. Science Education, 85(5), 536–553. doi: 10.1002/sce.1022
  • Bell, T., Urhahn, D., Schanze, S., & Ploetzner, R. (2010). Collaborative inquiry learning: Models, tools, and challenges. International Journal of Science Education, 32(3), 349–377. doi: 10.1080/09500690802582241
  • Black, P. (1993). Formative and summative assessments by teachers. Studies in Science Education, 21, 49–97. doi: 10.1080/03057269308560014
  • Black, P., & Harrison, C. (2004). Science inside the black box. London: GL Assessment.
  • Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2004). Working inside the black box: Assessment for learning in the classroom. Phi Delta Kappan.
  • Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7–73.
  • Blanchard, M. R., Osborne, J. W., Wallworks, C., & Harris, E. S. (2013). Progress on implementing inquiry in North Carolina: Nearly 1,000 elementary, middle and high school science teachers weigh in. Science Educator, 22(1), 37–47.
  • Börlin, J. (2012). Das Experiment als Lerngelegenheit. Vom interkulturellen Vergleich des Physikunterrichts zu Merkmalen seiner Qualität [The experiment as an opportunity to learn. From the intercultural comparison of physics education to characteristics of its quality]. Berlin: Logos Verlag.
  • Börlin, J., & Labudde, P. (2014). Practical work in physics instruction: An opportunity to learn? In H. E. Fischer, P. Labudde, K. Neumann, & J. Viiri (Eds.), Quality of instruction in physics (pp. 111–128). Münster: Waxmann.
  • Burke, K. (2006). From standards to rubrics in 6 steps. Heatherton, Victoria: Hawker Brownlow Education.
  • Bybee, R. (1997). Achieving scientific literacy: From purposes to practices. Portsmouth: Heinemann.
  • Cowie, B., & Bell, B. (1999). A model for formative assessment. Assessment in Education, 6(1), 101–116. doi: 10.1177/107319119900600111
  • D-EDK Deutschschweizer Erziehungsdirektoren-Konferenz. (2014). Lehrplan 21 [Curriculum 21]. Luzern: D-EDK.
  • Dolin, J. (2012). Assess inquiry in science, technology and mathematics education: ASSIST-ME proposal. Copenhagen: University of Copenhagen.
  • EDK Schweizerische Konferenz der kantonalen Erziehungsdirektoren. (1994). Rahmenlehrplan für die Maturitätsschulen [Curricular framework for upper secondary schools]. Bern: Schweizerische Konferenz der kantonalen Erziehungsdirektoren.
  • Euler, M. (2011). PRIMAS survey report on inquiry-based learning and teaching in Europe. Kiel: IPN Kiel.
  • Falchikov, M. (1991). Group process analysis. In S. Brown & P. Dove (Eds.), Self- and peer-assessment (pp. 15–27). Birmingham: Standing conference on educational development, paper 63.
  • Furtak, E. M., & Ruiz-Primo, M. A. (2008). Making students’ thinking explicit in writing and discussion. An analysis of formative assessment prompts. Science Education, 92(5), 799–824. doi: 10.1002/sce.20270
  • Furtak, E. M., Seidel, T., Iverson, H., & Briggs, D. C. (2012). Experimental and quasi-experimental studies of inquiry-based science teaching: A meta-analysis. Review of Educational Research, 82(3), 300–329. doi: 10.3102/0034654312457206
  • Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11(3), 255–274. doi: 10.3102/01623737011003255
  • Grob, R. (2017). Towards the implementation of formal formative assessment in inquiry-based science education in Switzerland. PhD thesis. Basel: University of Basel.
  • Grob, U., & Maag Merki, K. (2001). Überfachliche Kompetenzen. Theoretische Grundlegung und empirische Erprobung eines Indikatorensystems [Cross-curricular competences. Theoretical background and empirical validation of a system of indicators]. Bern: Peter Lang.
  • HarmoS: Konsortium HarmoS Naturwissenschaften. (2008). HarmoS Naturwissenschaften+. Kompetenzmodell und Vorschläge für Bildungsstandards. Wissenschaftlicher Schlussbericht [HarmoS science education+. Model of competences and suggestions for educational standards. Final scientific report]. Bern: Konsortium HarmoS Naturwissenschaften.
  • Hartig, J., Klieme, E., & Leutner, D. (2008). Assessment of competencies in educational contexts. Göttingen: Hogrefe Publishing GmbH.
  • Hattie, J. (2009). Visible learning. A synthesis of over 800 meta-analyses relating to achievement. London & New York: Routledge.
  • Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112. doi: 10.3102/003465430298487
  • Heritage, M. (2010). Formative assessment: Making it happen in the classroom. Thousand Oaks, California: Corwin Press.
  • Herman, J. L., Osmundson, E., & Silver, D. (2010). Capturing quality in formative assessment practice: Measurement challenges, CRESST report 770. Los Angeles: National Center for Research on Evaluation, Standards, and Student Testing (CRESST).
  • Husfeld, V. (2009). Aus der Praxis der Leistungsbeurteilung [On the practice of summative assessment]. In D. Fischer, A. Strittmatter, & U. Vögeli-Mantovani (Eds.), Noten, was denn sonst? Leistungsbeurteilung und -bewertung (pp. 33–40). Zürich: Verlag LCH.
  • Jundt, W. (2013). Unpassendes zur Beurteilung [Inappropriate things on assessment]. profiL Magazin für das Lehren und Lernen, 1, 10–11.
  • Kessler, J. H., & Galvan, P. M. (2007). Inquiry in action: Investigating matter through inquiry. Washington, DC: American Chemical Society.
  • Kronig, W. (2009). Schulnoten - Glasperlen des Bildungssystems [Grades – Glass beads of the educational system]. In D. Fischer, A. Strittmatter, & U. Vögeli-Mantovani (Eds.), Noten, was denn sonst? Leistungsbeurteilung und -bewertung (pp. 27–32). Zürich: Verlag LCH.
  • Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33, 159–174. doi: 10.2307/2529310
  • Labudde, P. (2000). Konstruktivismus im Physikunterricht der Sekundarstufe II [Constructivism in upper secondary physics education]. Bern / Stuttgart / Wien: Haupt Verlag.
  • Leahy, S., Lyon, C., Thompson, M., & Wiliam, D. (2005). Classroom assessment: Minute by minute, day by day. Educational Leadership, 63(3), 19–24.
  • Labudde, P. (2007). How to develop, implement and assess standards in science education? 12 challenges from a Swiss perspective. In D. Waddington, P. Nentwig, & S. Schanze (Eds.), Making it comparable: Standards in science education (pp. 277–301). Münster: Waxmann.
  • Labudde, P., Nidegger, Ch., Adamina, M., & Gingins, F. (2007). The development, validation, and implementation of standards in science education: Chances and difficulties in the Swiss project HarmoS. In D. Waddington, P. Nentwig, & S. Schanze (Eds.), Making it comparable: Standards in science education (pp. 235–259). Münster: Waxmann.
  • Mayring, P. (2010). Qualitative Inhaltsanalyse. Grundlagen und Techniken. [Qualitative content analysis. Theoretical background and techniques]. Weinheim: Beltz.
  • McLoughlin, E., Finlayson, O., & van Kampen, P. (2012). SAILS – report on mapping the development of key skills and competencies onto skills developed in IBSE: WP 1 – Deliverable 1.1. Dublin: Dublin City University.
  • Moskal, B. M. (2003). Recommendations for developing classroom performance assessments and scoring rubrics. Practical Assessment, Research & Evaluation, 8(14).
  • OECD (Organisation for Economic Co-operation and Development). (2005). The definition and selection of key competences. Executive summary. Paris, France: OECD Publishing.
  • OECD (Organisation for Economic Co-operation and Development). (2013). Synergies for better learning: An international perspective on evaluation and assessment. OECD reviews of evaluation and assessment in education. Paris: OECD Publishing.
  • Paris, S. G., & Paris, A. H. (2001). Classroom applications of research on self-regulated learning. Educational Psychologist, 36, 89–101. doi: 10.1207/S15326985EP3602_4
  • Priemer, B. (2011). Was ist das Offene beim offenen Experimentieren? [What is open in open experimentation?]. Zeitschrift für Didaktik der Naturwissenschaften, 17, 315–337.
  • Rönnebeck, S., Bernholt, S., & Ropohl, M. (2016). Searching for a common ground – A literature review of empirical research on scientific inquiry activities. Studies in Science Education, 52(2), 161–197. doi: 10.1080/03057267.2016.1206351
  • Ruiz-Primo, M. A., Furtak, E. M., Ayala, C., Yin, Y., & Shavelson, R. J. (2010). Formative assessment, motivation, and science learning. In H. Andrade & G. J. Cizek (Eds.), Handbook of formative assessment (pp. 139–158). New York: Routledge.
  • Schoonenboom, J., & Johnson, R. B. (2017). How to construct a mixed methods research design. KZfSS Kölner Zeitschrift für Soziologie und Sozialpsychologie, 69(Supplement 2), 107–131. doi: 10.1007/s11577-017-0454-1
  • Shavelson, R. J., Young, D. B., Ayala, C. C., Brandon, P. R., Furtak, E. M., Ruiz-Primo, M. A., … Yin, Y. (2008). On the impact of curriculum-embedded formative assessment on learning: A collaboration between curriculum and assessment developers. Applied Measurement in Education, 21(4), 295–314. doi: 10.1080/08957340802347647
  • Smit, R. (2009). Die formative Beurteilung und ihr Nutzen für die Entwicklung von Lernkompetenz. Eine empirische Studie in der Sekundarstufe 1 [Formative assessment and its use for the development of learning competence. An empirical study at lower secondary school level]. Baltmannsweiler: Schneider Verlag Hohengehren GmbH.
  • Smit, R., & Birri, T. (2014). Assuring the quality of standards-oriented classroom assessment with rubrics for complex competencies. Studies in Educational Evaluation, 43, 5–13. doi: 10.1016/j.stueduc.2014.02.002
  • Smit, R., Weitzel, H., Blank, R., Rietz, F., Tardent, J., & Robin, N. (2017). Interplay of secondary pre-service teacher content knowledge (CK), pedagogical content knowledge (PCK) and attitudes regarding scientific inquiry teaching within teacher training. Research in Science & Technological Education, 35(4), 477–499. doi: 10.1080/02635143.2017.1353962
  • Stiggins, R. J., Griswold, M. M., & Wikelund, K. R. (1989). Measuring thinking skills through classroom assessment. Journal of Educational Measurement, 26, 233–246. doi: 10.1111/j.1745-3984.1989.tb00330.x
  • Topping, K. J. (2003). Self and peer assessment in school and university: Reliability, validity and utility. In M. Segers, F. Dochy, & E. Cascaller (Eds.), Optimising new modes of assessment: In search of qualities and standards (pp. 55–87). The Netherlands: Kluwer Academic Publishers.
  • Tsivitanidou, O., Zacharia, Z. C., & Hovardas, A. (2011). High school students’ unmediated potential to assess peers: Unstructured and reciprocal peer assessment of web-portfolios in a science course. Learning and Instruction, 21, 506–519. doi: 10.1016/j.learninstruc.2010.08.002
  • Vögeli-Mantovani, U. (1999). SKBF Trendbericht Nr. 3: Mehr fördern, weniger auslesen. Zur Entwicklung der schulischen Beurteilung in der Schweiz [SKBF trend report no. 3: More support, less selection. On the development of educational assessment in Switzerland]. Aarau: Schweizerische Koordinationsstelle für Bildungsforschung.
  • White, B. Y., & Frederiksen, J. R. (1998). Inquiry, modelling, and metacognition: Making science accessible to all students. Cognition and Instruction, 16(1), 3–118. doi: 10.1207/s1532690xci1601_2
  • Widmer Märki, I. (2011). Fächerübergreifender naturwissenschaftlicher Unterricht: Umsetzung und Beurteilung von Schülerleistungen im Gymnasium [Interdisciplinary science education: Implementation and assessment of student achievement at the gymnasium]. PhD Thesis. Basel: University of Basel.
  • Zachos, P., Hick, T. L., Doane, W. E. J., & Sargent, C. (2000). Setting theoretical and empirical foundations for assessing scientific inquiry and discovery in educational programs. Journal of Research in Science Teaching, 37(9), 938–962. doi: 10.1002/1098-2736(200011)37:9<938::AID-TEA5>3.0.CO;2-S
