Special Topic Section on Unlocking the Promise of Multitiered Systems of Support (MTSS) for Linguistically Diverse Students

Predicting Interim Assessment Outcomes Among Elementary-Aged English Learners Using Mathematics Computation, Oral Reading Fluency, and English Proficiency Levels

Pages 498-516 | Received 30 Oct 2020, Accepted 03 Feb 2022, Published online: 20 May 2022
 

Abstract

The current study examined the validity of curriculum-based measures (CBM) in mathematics computation (M-COMP) and oral reading fluency (R-CBM) in predicting spring mathematics and reading performance level and performance risk (>1 SD below the national mean) among students classified as English Learners (ELs). Additionally, the current study assessed the incremental predictive value of English language proficiency (ELP) beyond CBM performance. The results indicated that ELP explains a significant portion of variability above M-COMP and R-CBM and increases the accuracy of predicting at-risk performance status on spring measures of mathematics and reading. The findings highlight the challenges of assessing the predictive accuracy of M-COMP and R-CBM among students classified as ELs, as well as the extent to which comprehensive measures of ELP account for variance in both performance level and at-risk status beyond CBMs. The implications for school data-based decision-making for language-minoritized students and directions for future research are discussed.
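The incremental-validity question described above can be illustrated with a small sketch: fit a logistic regression predicting at-risk status (&gt;1 SD below the mean) from CBM scores alone, then refit with an ELP score added, and compare the area under the ROC curve (AUC) of the two models. This is a hypothetical illustration with simulated data, not the study's actual data or analysis code; all variable names and coefficients are invented for the example.

```python
# Illustrative sketch of an incremental predictive validity check:
# does an ELP score improve prediction of at-risk status beyond CBMs?
# Data and effect sizes are simulated, not from the study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
m_comp = rng.normal(0, 1, n)   # mathematics computation CBM (standardized)
r_cbm = rng.normal(0, 1, n)    # oral reading fluency CBM (standardized)
elp = rng.normal(0, 1, n)      # ELP composite (standardized)

# Simulate a spring outcome influenced by all three predictors,
# then flag students scoring >1 SD below the sample mean as at risk.
spring = 0.5 * m_comp + 0.4 * r_cbm + 0.5 * elp + rng.normal(0, 1, n)
at_risk = (spring < spring.mean() - spring.std()).astype(int)

X_cbm = np.column_stack([m_comp, r_cbm])
X_full = np.column_stack([m_comp, r_cbm, elp])

auc_cbm = roc_auc_score(
    at_risk, LogisticRegression().fit(X_cbm, at_risk).predict_proba(X_cbm)[:, 1]
)
auc_full = roc_auc_score(
    at_risk, LogisticRegression().fit(X_full, at_risk).predict_proba(X_full)[:, 1]
)
print(f"AUC, CBM only:  {auc_cbm:.3f}")
print(f"AUC, CBM + ELP: {auc_full:.3f}")
```

In this simulation the full model's AUC exceeds the CBM-only AUC because ELP carries independent information about the outcome, which parallels the pattern of results the abstract reports.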

Impact Statement

Equity in Response-to-Intervention (RTI) is predicated on accurate measurement of skills within universal screening. The current study’s findings suggest that CBMs alone explain less variance and are less predictive of academic performance than when combined with English language proficiency scores. The predictive accuracy of R-CBM and M-COMP varied between students classified as ELs and non-ELs, although these differences were measurable in only very limited circumstances. These results indicate that although CBMs are an efficient screening system among non-ELs, it is also necessary to consider students’ ELP levels when making decisions within RTI models.


DISCLOSURE STATEMENT

The authors have no conflicts of interest to report.

Notes

1 The category of EL comprises only a subset of students who could be multilingual or have differing levels of proficiency in multiple languages. When describing studies that focus on linguistically diverse students (not limited to ELs), we use the term “language-minoritized,” unless the sample in those studies consists only of students who meet the federal definition of EL (in which case we use the term EL).

2 ACCESS is administered in the winter, and scores are typically reported back to schools in late spring. As a result, ACCESS scores from the prior year typically inform EL status in any given current year. However, students were not followed consistently across school years in the current study, so we used ACCESS scores from the student’s current year (i.e., Grade 1 ACCESS scores are the scores students obtained in the winter of Grade 1). As we note later in the limitations, this is an important practical barrier to using ACCESS data: schools would not know students’ current-year scores until the end of that year. Although all predictive validity studies are necessarily retrospective, the fact that schools would likely not be able to use current-year ACCESS scores for data-based decision-making is a significant practical limitation.

3 The roc function and a simple logistic regression produce identical ROC curves. One robust logistic regression did not converge properly, so we used standard (nonrobust) logistic regression in that case; the robust and nonrobust models yielded identical AUCs. Consequently, the roc function produced results identical to those of all other models.
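The equivalence noted in Note 3 follows because, with a single continuous predictor, logistic regression applies a strictly monotone transform to that predictor, and ROC curves depend only on the rank ordering of scores. A small sketch with simulated data (illustrative only; the note refers to an R roc function, shown here via Python's scikit-learn for demonstration) makes the point concrete:

```python
# Demonstration that a ROC curve computed directly from a raw predictor
# and one computed from single-predictor logistic-regression probabilities
# yield the same AUC: the logistic fit is a monotone transform, so the
# rank ordering (and hence the ROC curve) is unchanged. Simulated data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
x = rng.normal(0, 1, 300)                       # single continuous predictor
y = (x + rng.normal(0, 1, 300) > 1).astype(int)  # binary outcome related to x

auc_raw = roc_auc_score(y, x)  # ROC on the raw predictor directly
probs = (
    LogisticRegression()
    .fit(x.reshape(-1, 1), y)
    .predict_proba(x.reshape(-1, 1))[:, 1]
)
auc_logit = roc_auc_score(y, probs)  # ROC on fitted probabilities

print(abs(auc_raw - auc_logit) < 1e-9)
```

Because only the ranks of the scores matter, the two AUCs agree regardless of the fitted intercept and slope, provided the slope is positive.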

Additional information

Funding

The research reported here was supported by the Institute of Education Sciences, U.S. Department of Education, through Awards #R305B150003 to the University of Wisconsin-Madison and #R305A100585 to Craig A. Albers. The opinions expressed are those of the authors and do not represent views of the U.S. Department of Education.

Notes on contributors

Garret J. Hall

Garret J. Hall, PhD, is an assistant professor in the Department of Educational Psychology and Learning Systems at Florida State University.

Mitchell A. Markham

Mitchell A. Markham, PhD, is currently a school psychologist in the Madison Metropolitan School District in Madison, WI.

Meghan K. McMackin

Meghan K. McMackin, PhD, is a licensed clinician at Austin Anxiety and OCD Specialists.

Elizabeth C. Moore

Elizabeth C. Moore, PhD, is a psychologist at Nationwide Children’s Hospital in Columbus, OH. She earned her PhD in school psychology at the University of Wisconsin-Madison.

Craig A. Albers

Craig A. Albers, PhD, is an associate professor of school psychology in the Department of Educational Psychology at the University of Wisconsin-Madison.
