Reading Proficiency and Comparability of Mathematics and Science Scores for Students From English and Non-English Backgrounds: An International Perspective


REFERENCES

  • Abedi, J. (2004). The No Child Left Behind Act and English language learners: Assessment and accountability issues. Educational Researcher, 33(1), 4–14.
  • Abedi, J. (2014). The use of computer technology in designing appropriate test accommodations for English language learners. Applied Measurement in Education, 27, 261–272.
  • Abedi, J., & Gándara, P. (2006). Performance of English language learners as a subgroup in large-scale assessment: Interaction of research and policy. Educational Measurement: Issues and Practice, 25(4), 36–46.
  • Abedi, J., Hofstetter, C., & Lord, C. (2004). Assessment accommodations for English language learners: Implications for policy-based empirical research. Review of Educational Research, 74(1), 1–28.
  • Abedi, J., Leon, S., & Mirocha, J. (2003). Impact of student’s language background on content-based assessment: Analyses of extant data (CSE Tech. Rep. No. 603). Los Angeles: University of California, National Center for Research on Evaluation, Standards, and Student Testing.
  • Abedi, J., & Lord, C. (2001). The language factor in mathematics tests. Applied Measurement in Education, 14(3), 219–234.
  • Au, K. (2013). Multicultural issues and literacy achievement. Mahwah, NJ: Lawrence Erlbaum.
  • Butler, F.A., Bailey, A.L., Stevens, R., Huang, B., & Lord, C. (2004). Academic English in fifth-grade mathematics, science, and social studies textbooks (Final deliverable to IES, Contract No. R305B960002; currently available as CSE Report No. 642). Los Angeles: University of California, National Center for Research on Evaluation, Standards, and Student Testing.
  • CTB/McGraw-Hill. (1991). PARDUX [Computer software]. Monterey, CA: CTB/McGraw-Hill.
  • Ercikan, K., & McCreith, T. (2002). Effects of adaptations on comparability of test items and test scores. In D. Robitaille & A. Beaton (Eds.), Secondary analysis of the TIMSS results: A synthesis of current research (pp. 391–407). Dordrecht, The Netherlands: Kluwer Academic.
  • Ercikan, K., Roth, W.-M., & Asil, M. (in press). Cautions about uses of international assessments. Teachers College Record.
  • Ercikan, K., Roth, W.-M., Simon, M., Lyons-Thomas, J., & Sandilands, D. (2014). Assessment of linguistic minority students. Applied Measurement in Education, 27, 273–285.
  • Green, S.B., & Salkind, N.J. (2011). Using SPSS for Windows and Macintosh: Analyzing and understanding data (6th ed.). Upper Saddle River, NJ: Prentice Hall.
  • Henson, R.K. (1998, November). ANCOVA with intact groups: Don't do it! Paper presented at the annual meeting of the Mid-South Educational Research Association, New Orleans, LA.
  • Hudson, R.F., Lane, H.B., & Pullen, P.C. (2005). Reading fluency assessment and instruction: What, why, and how? The Reading Teacher, 58(8), 702–714.
  • Kane, M.T. (2013). Validating the interpretations and uses of test scores. Journal of Educational Measurement, 50(1), 1–73.
  • Kankaraš, M., & Moors, G. (2013). Analysis of cross-cultural comparability of PISA 2009 scores. Journal of Cross-Cultural Psychology, 45(3), 381–399.
  • Kopriva, R. J., Gabel, D., & Cameron, C. (2011, April). Designing dynamic and interactive assessments for English learners which directly measure targeted science constructs. Paper presented at the American Education Research Association Annual Meeting, New Orleans, LA.
  • Lane, S. (in press). Psychometric challenges in assessing English language learners and students with disabilities. Review of Research in Education.
  • Lord, F.M. (1980). Applications of item response theory to practical testing problems. Hillsdale, NJ: Erlbaum.
  • Luykx, A., Lee, O., Mahotiere, M., Lester, B., Hart, J., & Deaktor, R. (2007). Cultural and home language influences on children's responses to science assessments. The Teachers College Record, 109(4), 897–926.
  • Martiniello, M. (2008). Language and the performance of English-language learners in math word problems. Harvard Educational Review, 78(2), 333–368.
  • Maxwell, S.E., O’Callaghan, M.F., & Delaney, H.D. (1993). Analysis of covariance. In L.K. Edwards (Ed.), Applied analysis of variance in behavioral science (pp. 63–104). New York, NY: Marcel Dekker.
  • Mendelovits, J., Ramalingam, D., & Lumley, T. (2012). Print and digital reading in PISA 2009: Comparison and contrast. Retrieved from http://research.acer.edu.au/pisa/6
  • Messick, S. (1989). Meaning and values in test validation: The science and ethics of assessment. Educational Researcher, 18(2), 5–11.
  • Mislevy, R.J. (1991). Randomization-based inference about latent variables from complex samples. Psychometrika, 56(2), 177–196.
  • Monseur, C., & Adams, R. (2009). Plausible values: How to deal with their limitations. Journal of Applied Measurement, 10(3), 320–334.
  • Muraki, E. (1992). A generalized partial credit model: Application of an EM algorithm. Applied Psychological Measurement, 16(2), 159–176.
  • Nguyen, H.T., & Cortes, M. (2013). Focus on middle school: Teaching mathematics to ELLs: Practical research-based methods and strategies. Childhood Education, 89(6), 392–395.
  • Noble, T., Rosebery, A., Suarez, C., Warren, B., & O’Connor, C. (2014). Science assessments and English language learners: Validity evidence based on response processes. Applied Measurement in Education, 27, 248–260.
  • Oliveri, M.E., Ercikan, K., & Zumbo, B. (2013). Analysis of sources of latent class DIF in international assessments. International Journal of Testing, 13, 272–293.
  • Oliveri, M.E., Olson, B.F., Ercikan, K., & Zumbo, B.D. (2012). Methodologies for investigating item- and test-level measurement equivalence in international large-scale assessments. International Journal of Testing, 12(3), 203–223.
  • Organisation for Economic Co-Operation and Development. (2010a). PISA 2009 results: What students know and can do—Student performance in reading, mathematics and science. Retrieved from http://dx.doi.org/10.1787/9789264091450-en
  • Organisation for Economic Co-Operation and Development. (2010b). PISA 2009 assessment framework: Key competencies in reading, mathematics and science. Retrieved from http://www.oecd.org/pisa/pisaproducts/44455820.pdf
  • Organisation for Economic Co-Operation and Development. (2012). PISA 2009 technical report. Retrieved from http://dx.doi.org/10.1787/9789264167872-en
  • Penfield, R.D., & Lee, O. (2010). Test-based accountability: Potential benefits and pitfalls of science assessment with student diversity. Journal of Research in Science Teaching, 47(1), 6–24.
  • Roth, W.-M., Oliveri, M.E., Sandilands, D.D., Lyons-Thomas, J., & Ercikan, K. (2013). Investigating linguistic sources of differential item functioning using expert think-aloud protocols in science achievement tests. International Journal of Science Education, 35(4), 546–576.
  • Schofield, L.S., Junker, B., Taylor, L.J., & Black, D.A. (in press). Predictive inference using latent variables with covariates. Psychometrika.
  • Shadish, W.R., Cook, T.D., & Campbell, D.T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston, MA: Houghton Mifflin.
  • Solano-Flores, G. (2008). Who is given tests in what language by whom, when, and where? The need for probabilistic views of language in the testing of English language learners. Educational Researcher, 37(4), 189–199.
  • Solano-Flores, G. (2009). The testing of English language learners as a stochastic process: Population misspecification, measurement error, and overgeneralization. In K. Ercikan & W.-M. Roth (Eds.), Generalizing from educational research (pp. 33–48). New York, NY: Routledge.
  • Solano-Flores, G. (2014). Probabilistic approaches to examining linguistic features of test items and their effect on the performance of English language learners. Applied Measurement in Education, 27, 233–247.
  • Solano-Flores, G., & Trumbull, E. (2003). Examining language in context: The need for new research and practice paradigms in the testing of English-language learners. Educational Researcher, 32(2), 3–13.
  • Vale, C., Weaven, M., Davies, A., Hooley, N., Davidson, K., & Loton, D. (2013). Growth in literacy and numeracy achievement: Evidence and explanations of a summer slowdown in low socio-economic schools. The Australian Educational Researcher, 40(1), 1–25.
  • Yen, W.M. (1993). Scaling performance assessments: Strategies for managing local item dependence. Journal of Educational Measurement, 30, 187–213.
  • Zumbo, B.D. (2003). Does item-level DIF manifest itself in scale-level analyses? Implications for translating language tests. Language Testing, 20(2), 136–147.
