
Engineering a Twenty-First Century Reading Comprehension Assessment System Utilizing Scenario-Based Assessment Techniques

Pages 1-23 | Received 15 Jan 2018, Accepted 19 Nov 2018, Published online: 12 Mar 2019

