Student engagement with evidence-based supports for literacy on a digital platform

Pages 177-187 | Received 09 May 2020, Accepted 06 Sep 2020, Published online: 16 Oct 2020
Abstract

There is a persistent gap between education research and practice, including in the design of educational technology. This study bridges this gap and examines our partnership with a literacy platform to implement evidence-based digital features that support different learner needs. We examined students’ (N = 1857; Grades 1-6) use of these new optional features (e.g., audio supports, text magnification, annotation) when completing assignments online and found that 92% of students tried at least one. Importantly, we also found that students showed greater engagement with harder assignments when they used the features. Use of the support features did not differ by student characteristics such as reading proficiency, special education, or socioeconomic status, suggesting that developing research-informed features could potentially benefit all students when they need extra scaffolding.

Acknowledgements

We would like to thank the school district that partnered with us and our collaborators at ReadWorks who supported this research: Kathy Bloomfield, Jeff Fleishhacker, Ruben Kleiman, and Susanne Nobles. We would also like to thank our colleagues at Digital Promise who supported this project: Andrew Krumm, Julie Neisler, and Wendy Xiao, as well as the funders of this project: Oak Foundation, Chan-Zuckerberg Initiative, and Overdeck Family Foundation.

Notes

1 ReadWorks gives teachers the option to enable audio support for their students, allowing students to listen to an article read aloud. Teachers can enable audio either for particular students or for their whole class. The audio is either a human recording or computerized text-to-speech rendering of the written passage.

2 One possible objection might be that attempting questions on this platform is simply another type of “feature,” and that these results might simply indicate that students who use some features are more likely to use other features. Two patterns of results contradict this interpretation. First, while article difficulty reduced the likelihood of question attempts as described above, article difficulty increased the likelihood of using supportive features such as the split-screen feature (z = 2.65, p < .01). Second, correlations between uses of different features were relatively low, ranging from −0.2 to 0.27.

Additional information

Notes on contributors

Medha Tare

Medha Tare is the Director of Research for the Learner Variability Project at Digital Promise.

Alison R. Shell

Alison R. Shell is a Research Scientist for the Learner Variability Project at Digital Promise.

Scott R. Jackson

Scott R. Jackson is a Senior Data Scientist at TeamWorx Security.

