Student engagement with evidence-based supports for literacy on a digital platform

Medha Tare, Alison R. Shell & Scott R. Jackson
Pages 177-187 | Received 09 May 2020, Accepted 06 Sep 2020, Published online: 16 Oct 2020

Abstract

There is a persistent gap between education research and practice, including in the design of educational technology. This study bridges that gap by examining our partnership with a literacy platform to implement evidence-based digital features that support different learner needs. We examined students’ (N = 1857; Grades 1-6) use of these new optional features (e.g., audio supports, text magnification, annotation) when completing assignments online and found that 92% of students tried at least one. Importantly, we also found that students showed greater engagement with harder assignments when they used the features. Use of the support features did not differ by student characteristics such as reading proficiency, special education status, or socioeconomic status, suggesting that research-informed features could potentially benefit all students when they need extra scaffolding.

Acknowledgements

We would like to thank the school district that partnered with us and our collaborators at ReadWorks who supported this research: Kathy Bloomfield, Jeff Fleishhacker, Ruben Kleiman, and Susanne Nobles. We would also like to thank our colleagues at Digital Promise who supported this project: Andrew Krumm, Julie Neisler, and Wendy Xiao, as well as the funders of this project: Oak Foundation, Chan-Zuckerberg Initiative, and Overdeck Family Foundation.

Notes

1 ReadWorks gives teachers the option to enable audio support for their students, allowing students to listen to the article read aloud. Teachers can enable audio either for particular students or for their whole class. The audio is either a human reading or computerized text-to-speech of the written passage.

2 One possible objection might be that attempting questions on this platform is simply another type of “feature,” and that these results merely indicate that students who use some features are more likely to use others. Two patterns of results contradict this interpretation. First, while article difficulty reduced the likelihood of question attempts as described above, article difficulty increased the likelihood of using supportive features such as the split-screen feature (z = 2.65, p < .01). Second, correlations among uses of the different features were relatively low, ranging from -0.2 to 0.27.

Additional information

Notes on contributors

Medha Tare

Medha Tare is the Director of Research for the Learner Variability Project at Digital Promise.

Alison R. Shell

Alison R. Shell is a Research Scientist for the Learner Variability Project at Digital Promise.

Scott R. Jackson

Scott R. Jackson is a Senior Data Scientist at TeamWorx Security.
