
Beyond Empirical Adequacy: Learning Progressions as Models and Their Value for Teachers

Pages 1–37 | Published online: 24 Jan 2019

Abstract

As scientific models of student thinking, learning progressions (LPs) have been evaluated in terms of one important, but limited, criterion: fit to empirical data. We argue that LPs are not empirically adequate, largely because they rely on problematic assumptions of theory-like coherence in students’ thinking. Through an empirical investigation of physics teachers’ interactions with an LP-based score report, we investigate 2 other criteria of good models: utility and generativity. When interacting with LP-based materials, teachers often adopted finer-grained perspectives (in contrast to the levels-based perspective of the LP itself) and used these finer-grained perspectives to formulate more specific, actionable instructional ideas than when they reasoned in terms of LP levels. However, although teachers did not use the LP-based materials in ways envisioned by LP researchers, the teachers’ interactions with the score reports embodied how philosophers envision the fruitful use of good models of dynamic, complex systems. In particular, teachers took a skeptical, inquiring stance toward the LP, using it as an oversimplified starting place for generating and testing hypotheses about student thinking and using concepts from the model in ways that moved beyond the knowledge available in the LP. Thus, despite—and perhaps even because of—their empirical inadequacy, LPs have the potential to serve teachers as productive models in ways not envisioned by LP researchers: as tools for knowledge generation.

Notes

1 The term learning progressions has also been defined in more subject-general literature (e.g., Heritage, 2008) to refer to progressions of content, rather than of student ideas. The argument we develop in this article is particular to the way LPs are defined in the science education literature. (For a comparison of differences between the two definitions and implications for teachers’ use of LPs, see Alonzo, 2018.)

2 A comparison between mathematics and science education is beyond the scope of this article. Although we acknowledge differences between the two constructs, for the point being made here, learning trajectories in mathematics can be considered similar to learning progressions in science.

3 This point addresses a debate in the LP literature about what constitutes validity arguments for LPs and associated materials. We view studies of teachers’ use of LPs (discussed previously) as reflecting a focus on consequential validity: instead of attending only to empirical adequacy (construct validity), researchers are also attending to how well LPs support their intended uses (consequential validity). For example, Songer, Kelcey, and Gotwals (2009) claimed that it is not possible to directly evaluate LPs; rather, LPs “serve as a resource for the generation of products”—such as curriculum materials—“that are constructed from the LP” (p. 612). However, although we align ourselves with this move beyond construct validity, we argue that between the construct validity of the LP itself and its efficacy for producing better results (e.g., student learning) lies the black box of how teachers’ use of the LP leads to those results. In advocating the need to look inside the black box, we draw on discussions of validity for another use of LPs: informing assessments. Following Kane and Bejar’s (2014) discussion of validity claims for LP-based assessment, we argue that an LP validity argument must also include a theory of action, which describes how the LP is to be used, accompanied by evidence that the LP is used in those ways.

4 Pseudonyms are used for all teachers.

5 In the most recent design cycle (2014–2015), some of the concept inventory items on momentum and energy were replaced with OMC items, as LPs for these topics were developed.

6 Standard protocols were used; in particular, before working with the score reports, teachers were familiarized with think-aloud procedures through modeling by the interviewer and through practice with feedback from the interviewer.

7 These probabilities were calculated using the attribute hierarchy method (AHM; Gierl, Leighton, & Hunka, 2007), as part of an investigation of this approach (Briggs, Circi, & McClarty, 2014) for modeling the unique features of the OMC item type (Briggs & Alonzo, 2012). The conceptual basis of the AHM was explained in the materials teachers received prior to the second interview.

8 The cognition underlying classroom instruction is known to be tacit and highly situated (e.g., Kagan, 1990; Korthagen & Kessels, 1999) and thus likely to be only partially represented in a clinical setting (e.g., Loughran, Milroy, Berry, Gunstone, & Mulhall, 2001). Although teachers did provide some specific examples of responses to student thinking in their screening interviews, these descriptions also tended to be vague, even though other evidence (i.e., researcher recommendations) suggested that teachers were skilled at responding in their own classrooms. Therefore, our data likely underestimate teachers’ ability to actually respond to LP-based information (as opposed to describing responses). However, because this underestimation applies to all instructional responses proposed in the interview setting, we are still able to compare the responses teachers generated using LP-levels-based reasoning with those they generated using other models.

Additional information

Funding

This research was supported in part by grants from NCS Pearson, Inc. and the National Science Foundation (Grant No. DRL-1253036). Any opinions, findings, and conclusions or recommendations expressed in this paper are those of the authors and do not necessarily reflect the views of the funding agencies. We acknowledge the assistance of Elizabeth Xeng de los Santos, who conducted the interviews for the empirical study.
