ABSTRACT
In this article, we examine the sociopolitical implications of AI technologies as they are integrated into writing instruction and assessment. Drawing from new materialist and Black feminist thought, we consider how learning analytics platforms for writing are animated by and through entanglements of algorithmic reasoning, state standards and assessments, embodied literacy practices, and sociopolitical relations. We conduct a close reading of research and development documents associated with Essay Helper, a machine learning platform that provides formative feedback on student writing based on standards-aligned rubrics and training data. In particular, we consider the performative acts of the algorithm in the Essay Helper platform – both in the ways it reconstitutes material-discursive relations of difference and in its implications for transactions of teaching and learning. We argue that, through these processes, the algorithms function as racializing assemblages, and we conclude by suggesting pathways toward alternative futures that reconfigure the sociopolitical relations the platform inherits.
Disclosure statement
No potential conflict of interest was reported by the authors.
Notes on contributors
Ezekiel Dixon-Román is an associate professor in the School of Social Policy & Practice at the University of Pennsylvania.
T. Philip Nichols is an assistant professor in the Department of Curriculum and Instruction at Baylor University.
Ama Nyame-Mensah is a University of Pennsylvania-affiliated researcher.
Notes
1 While AI refers to theories and approaches for producing computer-generated performances of tasks associated with human intelligence, machine learning is a subset of AI that extracts patterns from data in order to enable automated algorithmic decision-making. We use these terms interchangeably throughout the paper.
2 Per our agreement, we use pseudonyms (e.g., BeowulfEd, Essay Helper, and WriteGuide) for the names of the educational technology company and its learning analytics platform in order to maintain anonymity.