
The development and validation of the Instructional Practices Log in Science: a measure of K-5 science instruction

Pages 335-357 | Received 22 Mar 2016, Accepted 10 Jan 2017, Published online: 06 Mar 2017

References

  • Abd-El-Khalick, F., Boujaoude, S., Duschl, R., Lederman, N. G., Mamlok-Naaman, R., Hofstein, A., … Tuan, H. (2004). Inquiry in science education: International perspectives. Science Education, 88(3), 397–419.
  • Achieve. (2010). Taking the lead in science education: Forging next-generation science standards. Washington, DC: Author. Retrieved from http://achieve.org/files/InternationalScienceBenchmarkingReport.pdf
  • AERA/APA/NCME. (2014). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.
  • Ainsworth, S., & Loizou, A. T. (2003). The effects of self-explaining when learning with text or diagrams. Cognitive Science, 27, 669–681. doi: 10.1207/s15516709cog2704_5
  • Ainsworth, S., Prain, V., & Tytler, R. (2011). Drawing to learn in science. Science, 333(26), 1096–1097. doi: 10.1126/science.1204153
  • American Association for the Advancement of Science. (1994). Benchmarks for science literacy. New York, NY: Oxford.
  • Appleton, K. (Ed.). (2013). Elementary science teacher education: International perspectives on contemporary issues and practice. New York, NY: Routledge.
  • Banilower, E. R. (2005). A study of the predictive validity of the LSC classroom observation protocol. Arlington, VA: National Science Foundation.
  • Banilower, E., Cohen, K., Pasley, J., & Weiss, I. (2010). Effective science instruction: What does research tell us? (2nd ed.). Portsmouth, NH: RMC Research Corporation, Center on Instruction.
  • Barton, A. C., Borko, H., Bybee, R., Carlson, J., Minner, D., Minstrell, J., … Zembal-Saul, C. (2012). Guide for video analysis of science teaching: Coherence of the science content storyline. Colorado Springs, CO: BSCS.
  • Bell, C. A., Gitomer, D. H., McCaffrey, D. F., Hamre, B. K., Pianta, R. C., & Qi, Y. (2012). An argument approach to observation protocol validity. Educational Assessment, 17, 62–87. doi: 10.1080/10627197.2012.715014
  • Bowen, N. K., & Guo, S. (2011). Structural equation modeling. New York, NY: Oxford.
  • Burstein, L., McDonnell, L. M., Van Winkle, J., Ormseth, T. H., Mirocha, J., & Guiton, G. (1995). Validating national curriculum indicators. Santa Monica, CA: RAND.
  • Byrne, B. (2012). Structural equation modeling with Mplus: Basic concepts, applications, and programming (multivariate applications series). New York, NY: Routledge.
  • Camburn, E., & Barnes, C. A. (2004). Assessing the validity of a language arts instruction log through triangulation. The Elementary School Journal, 105(1), 49–73. doi: 10.1086/428802
  • Camburn, E. M., Han, S. W., & Sebastian, J. (2017). Assessing the validity of an annual survey for measuring the enacted literacy curriculum. Educational Policy, 31(1), 73–107. doi: 10.1177/0895904815586848
  • Campbell, T., Abd-Hamid, N. H., & Chapman, H. (2010). Development of instruments to assess teacher and student perceptions of inquiry experiences in science classrooms. Journal of Science Teacher Education, 21(1), 13–30. doi: 10.1007/s10972-009-9151-x
  • Danielson, C. (2007). Enhancing professional practice: A framework for teaching (2nd ed.). Alexandria, VA: ASCD.
  • DeBoer, G. E. (2011). The globalization of science education. Journal of Research in Science Teaching, 48(6), 567–591. doi: 10.1002/tea.20421
  • Desimone, L. M. (2006). Consider the source: Response differences among teachers, principals, and districts on survey questions about their education policy environment. Educational Policy, 20, 640–676. doi: 10.1177/0895904805284056
  • Duschl, R., & Osborne, J. (2002). Supporting and promoting argumentation discourse in science education. Studies in Science Education, 38(1), 39–72. doi: 10.1080/03057260208560187
  • Edens, K. M., & Potter, E. (2003). Using descriptive drawings as a conceptual change strategy in elementary science. School Science and Mathematics, 103(3), 135–144. doi: 10.1111/j.1949-8594.2003.tb18230.x
  • Ferguson, R. F. (2012). Can student surveys measure teaching quality? Phi Delta Kappan, 94(3), 24–28. doi: 10.1177/003172171209400306
  • Garry, M., Sharman, S. J., Feldman, J., Marlatt, G. A., & Loftus, E. F. (2002). Examining memory for heterosexual college students’ sexual experiences using an electronic mail diary. Health Psychology, 21(6), 629–634. doi: 10.1037/0278-6133.21.6.629
  • Geldhof, G. J., Preacher, K. J., & Zyphur, M. J. (2014). Reliability estimation in a multilevel confirmatory factor analysis. Psychological Methods, 19(1), 72–91. doi: 10.1037/a0032138
  • Gobert, J., & Clement, J. (1999). Effects of student-generated diagrams versus student-generated summaries on conceptual understanding of causal and dynamic knowledge in plate tectonics. Journal of Research in Science Teaching, 36(1), 39–53. doi: 10.1002/(SICI)1098-2736(199901)36:1<39::AID-TEA4>3.0.CO;2-I
  • Greive, E. L., Carrier, S. J., Minogue, J., Walkowiak, T. A., & Zulli, R. A. (2014). The development of a science instructional log to evaluate science instructional practices in elementary classrooms. Paper presented at the 2014 American Educational Research Association (AERA) annual meeting, Philadelphia, PA.
  • Grissom, J. A., Loeb, S., & Master, B. (2013). Effective instructional time use for school leaders: Longitudinal evidence from observations of principals. Educational Researcher, 42(8), 433–444. doi: 10.3102/0013189X13510020
  • Grossman, P. L., Loeb, S., Cohen, J., Hammerness, K., Wyckoff, J. H., Boyd, D. J., & Lankford, H. (2010). Measure for measure: The relationship between measures of instructional practice in middle school English language arts and teachers’ value-added scores. Washington, DC: National Center for Analysis of Longitudinal Data in Educational Research.
  • Groves, R. M., Fowler, F. J., Couper, M. P., Lepkowski, J. M., Singer, E., & Tourangeau, R. (2009). Survey methodology. Hoboken, NJ: John Wiley.
  • Hayes, K. N., Lee, C. S., DiStefano, R., O’Connor, D., & Seitz, J. (2016). Measuring science instructional practice: A survey tool for the age of NGSS. Journal of Science Teacher Education, 27, 137–164. doi: 10.1007/s10972-016-9448-5
  • Hennessey, M. G. (2003). Probing the dimensions of metacognition: Implications for conceptual change teaching-learning. In G. M. Sinatra & P. R. Pintrich (Eds.), Intentional conceptual change (pp. 103–132). Mahwah, NJ: Lawrence Erlbaum Associates.
  • Hill, H. C. (2005). Content across communities: Validating measures of elementary mathematics instruction. Educational Policy, 19(3), 447–475. doi: 10.1177/0895904805276142
  • Hill, H. C., Blunk, M. L., Charalambous, C. Y., Lewis, J. M., Phelps, G. C., Sleep, L., & Ball, D. L. (2008). Mathematical knowledge for teaching and the mathematical quality of instruction: An exploratory study. Cognition & Instruction, 26, 430–511. doi: 10.1080/07370000802177235
  • Hill, H. C., Charalambous, C. Y., & Kraft, M. A. (2012). When rater reliability is not enough: Teacher observation systems and a case for the G-study. Educational Researcher, 41(2), 56–64. doi: 10.3102/0013189X12437203
  • Jones, L. R., Wheeler, G., & Centurino, V. A. (2015). TIMSS 2015 science framework. In I. V. S. Mullis & M. O. Martin (Eds.), TIMSS 2015 assessment frameworks (pp. 29–59). Chestnut Hill, MA: TIMSS & PIRLS International Study Center, Boston College.
  • Kane, M. T. (2006). Validation. In R. L. Brennan (Ed.), Educational measurement (pp. 17–64). Westport, CT: American Council on Education and Praeger.
  • Kane, M. T. (2013). The argument-based approach to validation. School Psychology Review, 42(4), 448–457.
  • Kane, T. J., & Staiger, D. O. (2012). Gathering feedback for teaching: Combining high-quality observations with student surveys and achievement gains. Bill and Melinda Gates Foundation.
  • Kuhn, D. (1993). Science as argument: Implications for teaching and learning scientific thinking. Science Education, 77(3), 319–337. doi: 10.1002/sce.3730770306
  • Marshall, J. C., Smart, J., & Horton, R. M. (2010). The design and validation of EQUIP: An instrument to assess inquiry-based instruction. International Journal of Science and Mathematics Education, 8(2), 299–321. doi: 10.1007/s10763-009-9174-y
  • Martone, A., & Sireci, S. G. (2009). Evaluating alignment between curriculum, assessment, and instruction. Review of Educational Research, 79(4), 1332–1359. doi: 10.3102/0034654309341375
  • Mayer, D. P. (1999). Measuring instructional practice: Can policymakers trust survey data? Educational Evaluation and Policy Analysis, 21(1), 29–45. doi: 10.3102/01623737021001029
  • Millar, R., & Osborne, J. (Eds.). (1998). Beyond 2000: Science education for the future. London: King’s College.
  • Muthén, L. K., & Muthén, B. O. (1998–2010). Mplus user’s guide (6th ed.). Los Angeles, CA: Muthén & Muthén.
  • National Center for Educational Research & Development. (1997). Manahej al-ta’alim al-a’am wa ahdafaha [Public educational curricula and goals]. Beirut: Author.
  • National Research Council. (1996). National science education standards. Washington, DC: The National Academy Press.
  • National Research Council. (2003). How people learn: Brain, mind, experience, and school. J. D. Bransford, A. L. Brown, & R. R. Cocking (Eds.). Washington, DC: National Academy Press.
  • National Research Council. (2005). How students learn: Science in the classroom. M. S. Donovan & J. D. Bransford (Eds.). Washington, DC: National Academy Press.
  • National Research Council. (2007). Taking science to school: Learning and teaching science in grades K-8. Committee on Science Learning, Kindergarten Through Eighth Grade. R. A. Duschl, H. A. Schweingruber, & A. W. Shouse (Eds.). Board on Science Education, Center for Education. Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
  • National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press.
  • NGSS Lead States. (2013). Next Generation Science Standards: For states, by states. Washington, DC: The National Academies Press.
  • Neumerski, C. M. (2013). Rethinking instructional leadership, a review: What do we know about principal, teacher, and coach instructional leadership, and where should we go from here? Educational Administration Quarterly, 49(2), 310–347. doi: 10.1177/0013161X12456700
  • OECD. (2016). PISA 2015 assessment and analytical framework: Science, reading, mathematic and financial literacy. Paris: Author. doi:10.1787/9789264255425-en
  • Padilla, M. J. (1990). Research matters to the science teacher: The science process skills. National Association for Research in Science Teaching. Retrieved from http://www.educ.sfu.ca/narstsite/publications/research/skill.htm
  • Pianta, R. C., Hamre, B. K., Haynes, N. J., Mintz, S. L., & La Paro, K. M. (2007). Classroom assessment scoring system manual, middle/secondary version. Charlottesville, VA: University of Virginia.
  • Piburn, M., & Sawada, D. (2001). Reformed teaching observation protocol (RTOP) training guide (ACEPT Technical Report Number IN00-3).
  • Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods. Thousand Oaks, CA: Sage.
  • Rowan, B., Camburn, E., & Correnti, R. (2004). Using teacher logs to measure the enacted curriculum: A study of literacy teaching in third-grade classrooms. The Elementary School Journal, 105(1), 75–101. doi: 10.1086/428803
  • Rowan, B., & Correnti, R. (2009). Studying reading instruction with teacher logs: Lessons from the study of instructional improvement. Educational Researcher, 38(2), 120–131. doi: 10.3102/0013189X09332375
  • Rowan, B., Harrison, D., & Hayes, A. (2004). Using instructional logs to study mathematics curriculum and teaching in the early grades. The Elementary School Journal, 105(1), 103–127. doi: 10.1086/428812
  • Rowan, B., Jacob, R., & Correnti, R. (2009). Using instructional logs to identify quality in educational settings. New Directions for Youth Development, 2009(121), 13–31. doi: 10.1002/yd.294
  • Sampson, V., & Blanchard, M. R. (2012). Science teachers and scientific argumentation: Trends in views and practice. Journal of Research in Science Teaching, 49(9), 1122–1148. doi: 10.1002/tea.21037
  • Sampson, V., Enderle, P., Grooms, J., & Witte, S. (2013). Writing to learn by learning to write during the school science laboratory: Helping middle and high school students develop argumentative writing skills as they learn core ideas. Science Education, 97(5), 643–670. doi: 10.1002/sce.21069
  • StataCorp. (2015). Stata statistical software: Release 14. College Station, TX: Author.
  • Tekkumru-Kisa, M., Stein, M. K., & Schunn, C. (2015). A framework for analyzing cognitive demand and content-practices integration: Task analysis guide in science. Journal of Research in Science Teaching, 52(5), 659–685. doi: 10.1002/tea.21208
  • Tourangeau, R., Rips, L. J., & Rasinski, K. (2000). The psychology of survey response. New York, NY: Cambridge University Press.
  • Van Meter, P. (2001). Drawing construction as a strategy for learning from text. Journal of Educational Psychology, 93(1), 129–140. doi: 10.1037/0022-0663.93.1.129
  • Van Meter, P., & Garner, J. (2005). The promise and practice of learner-generated drawing: Literature review and synthesis. Educational Psychology Review, 17(4), 285–325. doi: 10.1007/s10648-005-8136-3
  • Waddington, D., Nentwig, P., & Schanze, S. (Eds.). (2007). Making it comparable: Standards in science education. Münster: Waxmann.
  • Walkowiak, T. A., Adams, E. L., Porter, S. R., Lee, C. W., & McEachin, A. (in press). The development and validation of the Instructional Practices Log in Mathematics (IPL-M). Learning and Instruction.
  • Weiss, I. R., Pasley, J. D., Smith, P. S., Banilower, E. R., & Heck, D. J. (2003). Looking inside the classroom. Chapel Hill, NC: Horizon Research.
  • Wenning, C. J. (2005). Levels of inquiry: Hierarchies of pedagogical practices and inquiry processes. Journal of Physics Teacher Education, 2(3), 3–11.
  • Willis, G. B. (2005). Cognitive interviewing: A tool for improving questionnaire design. Washington, DC: SAGE.
  • Zeichner, K. M., & Liston, D. P. (2013). Reflective teaching: An introduction. Abingdon: Routledge.
