Special Topic Section on Unlocking the Promise of Multitiered Systems of Support (MTSS) for Linguistically Diverse Students

Predicting Interim Assessment Outcomes Among Elementary-Aged English Learners Using Mathematics Computation, Oral Reading Fluency, and English Proficiency Levels

Pages 498-516 | Received 30 Oct 2020, Accepted 03 Feb 2022, Published online: 20 May 2022

Abstract

The current study examined the validity of curriculum-based measures (CBMs) in mathematics computation (M-COMP) and oral reading fluency (R-CBM) in predicting spring mathematics and reading performance level and performance risk (>1 SD below the national mean) among students classified as English Learners (ELs). Additionally, the current study assessed the incremental predictive value of English language proficiency (ELP) beyond CBM performance. The results indicated that ELP explained a significant portion of variability beyond M-COMP and R-CBM and increased the accuracy of predicting at-risk performance status on spring measures of mathematics and reading. The findings highlight the challenges of assessing the predictive accuracy of M-COMP and R-CBM among students classified as ELs, as well as the extent to which comprehensive measures of ELP account for variance in both performance level and at-risk status beyond CBMs. The implications for school data-based decision-making for language-minoritized students and directions for future research are discussed.

Impact Statement

Equity in Response-to-Intervention (RTI) is predicated on accurate measurement of skills within universal screening. The current study’s findings suggest that CBMs alone explain less variance in, and are less predictive of, academic performance than CBMs combined with English language proficiency scores. The predictive accuracy of R-CBM and M-COMP varied between students classified as ELs and non-ELs, although these differences were measurable in only very limited circumstances. These results indicate that although CBMs are an efficient screening system among non-ELs, it is also necessary to consider students’ ELP levels when making decisions within RTI models.

DISCLOSURE STATEMENT

The authors have no conflicts of interest to report.

Notes

1 The category of EL comprises only a subset of students who could be multilingual or have differing levels of proficiency in multiple languages. When describing studies that focus on linguistically diverse students (not limited to ELs), we use the term “language-minoritized,” unless the sample in those studies consists only of students who meet the federal definition of EL (in which case we use the term EL).

2 It is important to note that the ACCESS is administered in the winter, and scores are typically reported back to schools in late spring. As a result, ACCESS scores from the prior year typically inform EL status in any given current year. However, students were not followed consistently across school years in the current study, so we use ACCESS scores from the student’s current year (i.e., Grade 1 ACCESS scores are the scores students obtained in the winter of Grade 1). As we note later in the limitations, this is an important practical barrier to using ACCESS data, as schools would not know students’ current-year scores until the end of that year. Although all predictive validity studies are necessarily retrospective, the fact that schools would likely not be able to use current-year ACCESS scores for data-based decision-making is a significant practical limitation.

3 The roc function and a simple logistic regression analysis produce identical ROC curves. Moreover, one robust logistic regression did not converge properly, so we used regular (nonrobust) logistic regression in that case; robust and nonrobust models nonetheless resulted in identical AUCs. Consequently, the roc function produced results identical to those of all other models.
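The equivalence described in this note follows because, with a single predictor, a logistic regression’s fitted probabilities are a strictly monotone transform of that predictor, and ROC curves and AUC depend only on the rank ordering of scores. A minimal sketch in Python with simulated data (an illustration of this general property, not the study’s analysis code, which used the R roc function; all variable names and values here are hypothetical):

```python
# Sketch: ROC/AUC from a raw predictor vs. from logistic-regression fitted
# probabilities. Because the fitted probabilities are a monotone transform of
# the single predictor, both orderings (and hence both AUCs) coincide.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)                  # hypothetical screening score
p = 1 / (1 + np.exp(-(0.5 + 1.2 * x)))  # simulated true risk probability
y = rng.binomial(1, p)                  # simulated at-risk status (1 = at risk)

# AUC computed directly on the raw predictor
auc_raw = roc_auc_score(y, x)

# AUC computed from logistic-regression fitted probabilities
model = LogisticRegression().fit(x.reshape(-1, 1), y)
auc_fit = roc_auc_score(y, model.predict_proba(x.reshape(-1, 1))[:, 1])

print(auc_raw, auc_fit)  # the two AUCs agree
```

The same rank-invariance argument explains why the robust and nonrobust logistic models in the note yielded identical AUCs: any two monotone-in-the-predictor score functions produce the same ROC curve.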

Additional information

Funding

The research reported here was supported by the Institute of Education Sciences, U.S. Department of Education, through Awards #R305B150003 to the University of Wisconsin-Madison and #R305A100585 to Craig A. Albers. The opinions expressed are those of the authors and do not represent views of the U.S. Department of Education.

Notes on contributors

Garret J. Hall

Garret J. Hall, PhD, is an assistant professor in the Department of Educational Psychology and Learning Systems at Florida State University.

Mitchell A. Markham

Mitchell A. Markham, PhD, is currently a school psychologist in the Madison Metropolitan School District in Madison, WI.

Meghan McMackin

Meghan K. McMackin, PhD, is a licensed clinician at Austin Anxiety and OCD Specialists.

Elizabeth C. Moore

Elizabeth C. Moore, PhD, is a psychologist at Nationwide Children’s Hospital in Columbus, OH. She graduated with her PhD in school psychology at the University of Wisconsin-Madison.

Craig A. Albers

Craig A. Albers, PhD, is an associate professor of school psychology in the Department of Educational Psychology at the University of Wisconsin-Madison.
