ABSTRACT
Item stem format can alter both the cognitive complexity and the types of abilities required to solve mathematics items; consequently, it may also affect the dimensional structure of mathematics assessments. This empirical study investigated the relationship between item stem format and the dimensionality of mathematics assessments. A sample of 671 sixth-grade students took two forms of a mathematics assessment in which mathematical expression (ME) items and word problems (WP) measured the same content. The roles of mathematical language and reading abilities in responding to ME and WP items were explored using unidimensional and multidimensional item response theory models. The multidimensional model fit the response data better than the unidimensional model, indicating that WP and ME items differ in the underlying abilities required to answer them. For accurate assessment of mathematics achievement, students’ reading and mathematical language abilities should therefore be considered when mathematics assessments include ME and WP items.
Notes
1 The R code for running the unidimensional and multidimensional IRT models with the TAM package can be obtained from the corresponding author of this study.
2 We also applied principal axis factoring (PAF) as a secondary approach. The results in terms of the number of dimensions and the amount of variance explained were nearly identical, because PCA and PAF yield very similar results when the number of variables is quite large (Thompson & Vidal-Brown, 2001).
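The convergence of PCA and PAF noted above can be illustrated with a small simulation. The sketch below uses hypothetical data (not the study's assessment responses) and NumPy only: it generates item scores from two latent factors, then extracts two dimensions with both methods and compares the variance explained and the communalities.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate item scores driven by two latent factors (illustrative data
# only -- not the study's actual assessment data).  With many observed
# variables, PCA and PAF recover almost the same structure.
n_persons, n_items, n_factors = 500, 30, 2
F = rng.normal(size=(n_persons, n_factors))
loadings = rng.uniform(0.4, 0.8, size=(n_items, n_factors))
X = F @ loadings.T + rng.normal(scale=0.7, size=(n_persons, n_items))
R = np.corrcoef(X, rowvar=False)

def pca(R, k):
    """Top-k eigenvalues and loadings of the full correlation matrix."""
    vals, vecs = np.linalg.eigh(R)
    order = np.argsort(vals)[::-1][:k]
    L = vecs[:, order] * np.sqrt(vals[order])
    return vals[order], L

def paf(R, k, n_iter=100):
    """Principal axis factoring: re-estimate communalities on the
    diagonal of the reduced correlation matrix until they stabilize."""
    h2 = 1.0 - 1.0 / np.diag(np.linalg.inv(R))  # SMC start values
    Rr = R.copy()
    for _ in range(n_iter):
        np.fill_diagonal(Rr, h2)
        vals, vecs = np.linalg.eigh(Rr)
        order = np.argsort(vals)[::-1][:k]
        L = vecs[:, order] * np.sqrt(np.clip(vals[order], 0.0, None))
        h2 = (L ** 2).sum(axis=1)
    return vals[order], L

pca_vals, pca_L = pca(R, n_factors)
paf_vals, paf_L = paf(R, n_factors)

# Variance explained by the retained dimensions (as a proportion of
# n_items) and communalities (rotation-invariant row sums of squared
# loadings) are nearly identical across the two methods.
var_pca = pca_vals.sum() / n_items
var_paf = paf_vals.sum() / n_items
h2_pca = (pca_L ** 2).sum(axis=1)
h2_paf = (paf_L ** 2).sum(axis=1)
print(round(var_pca, 3), round(var_paf, 3))
```

PCA retains unique variance on the diagonal, so its variance-explained figure is slightly larger than PAF's, but with 30 items the gap is small, consistent with the Thompson and Vidal-Brown (2001) observation cited above.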