Editorial

Challenging the inequitable impacts of edtech


New forms of bias, injustice, discrimination and inequality have become harmful characteristics of contemporary digital landscapes. In 1996, Batya Friedman and Helen Nissenbaum developed a framework for identifying bias in computer systems, proposing three categories of bias. ‘Preexisting bias has its roots in social institutions, practices and attitudes’ (Friedman and Nissenbaum Citation1996, 332). It exists when computer systems embody racist, classist, heteropatriarchal or other injustices that exist independently of the system and predate its development. ‘Technical bias arises from technical constraints or considerations’ (ibid.). This form of bias originates from limits in the design of the hardware or software, from decontextualised algorithms, or from flawed quantification of qualitative data. ‘Emergent bias arises in a context of use’ (ibid.). It emerges when computer systems are used in real contexts with people who may not be the same users as those imagined during the design process: for instance, when they have different values, practices or knowledges, or, as with recent facial recognition systems, when users ‘look different’ from the training data (Buolamwini and Gebru Citation2018).

In the almost thirty years since Friedman and Nissenbaum’s article, a strong body of research has emerged on educational technology and ‘bias’ (or, depending on the authors’ theoretical framework, on edtech and injustice, inequality, inequity, oppression, racism, discrimination, exclusion) (Swist and Gulson Citation2023). Studies of preexisting bias have built on a rich tradition of research on the reproduction of structural inequality in educational institutions. These studies have shown how structural inequalities are now also being reproduced through the hardware, software and infrastructures for learning; through algorithmic injustice, discriminatory forms of datafication and what is now called artificial intelligence (Eynon Citation2024). Investigating the design of edtech, other studies have explored the technical bias encoded into the systems, exploring how the design of, for instance, interfaces, platforms, learning analytics and automated processes create or exacerbate inequalities (Macgilchrist Citation2024).

At Learning, Media and Technology, we receive, however, surprisingly few manuscripts on the ‘emergent bias’ that appears in contexts of use. It is difficult to draw on the kinds of qualitative, in-depth, cultural, sociological, historical and arts-based methods used in the LMT community to ‘capture’ inequalities as they play out in contexts of use. The goal here is not to compare an experimental group with a control group, conduct large-scale assessments of test outcomes, or use measurements of unequal outcomes as a means to develop technical ‘de-biasing’ solutions or ethical standards (Sahlgren Citation2023). Rather, research relevant to the journal should respond to the challenge of examining the ‘effects of take-up, use and meaning-making’ that arise when diverse young people, their families, teachers and schools encounter and experience technologies of datafication, automation and AI in specific contexts (Pangrazio and Sefton-Green Citation2022, 8).

Some recent articles in the journal have developed inventive methods to explore contexts of use. Kiran Vinod Bhatia, Payal Arora and Siddhi Gupta (Citation2024), for instance, conducted family ethnographies to investigate how students in low-income communities in India used edtech platforms such as Vedantu and BYJU’s in their homes. The paper focuses on how students make sense of these systems. It highlights the disconnect between the imagined user and the students in these families. For example, one student interprets BYJU’s as expecting a self-motivated learner, but describes herself as someone who does not enjoy learning. Instead, she is motivated to learn because she wants to make her teachers proud. Now that she realises that no-one at BYJU’s cares about what she learns, she cares less about her own learning. Care is also relevant for students for whom the school provides place-based support to help them deal with neglect or domestic violence at home. This aspect of public education rarely features in edtech visions of learning as a smooth route to academic and career success. More generally, students living in poverty described school as a safe space where they can enact aspirational identities that are supported by their teachers, yet which conflict with the entrenched idea that they have little chance of academic success. These students thrive on the separation of home and school life, and do not, for example, want to turn on their cameras when they study at home. The article also highlights gender issues. Some girls felt unsafe in online learning environments where other students could secretly take photos. Some families aimed to keep them safe by preventing them from attending online classes.

At issue is not only, however, how emergent bias arises in contexts of use, but also how preexisting and technical bias are navigated in contexts of use. Practice may undermine the kinds of bias encoded into edtech, with students deconstructing the norms, values, stereotypes or other aspects of dominant culture that they see reproduced in their educational technologies. An image we saw in a social media post years ago shows a primary school student deconstructing the hegemonic knowledge inscribed in a worksheet, one of the most classic forms of educational technology. The text began: ‘A family consists of a ______, a _______ and one or more ________.’ Words were provided to fill in the gaps: ‘mother’, ‘father’ and ‘children’. This student completed the sentence but also crossed out ‘A family’ and replaced it with ‘Some families’. They then pencilled in a further sentence: ‘Some families consist of two mothers or two fathers’. What would this student have done if they had encountered this text on an app? Attending to subversive practices, Bhatia, Arora, and Gupta (Citation2024) show how girls found strategies to continue learning online, by, for instance, using their brothers’ mobile phones or attending online classes together in groups.

How younger children navigate structural digital divides is explored by Karin Murris et al. (Citation2023) in a recent paper on children’s play with technology in South Africa and the UK. A key challenge for research on inequality in today’s world is that researchers will inevitably identify entrenched structural inequalities. Indeed, this paper found socio-economic, linguistic, religious, gendered, racialised and technical inequalities in the play environments in both South Africa and the UK. Access disparities have not – despite the optimistic research of the 2000s – disappeared. However, a key finding is how children in under-resourced spaces play creatively with technologies. Scenes in the article show children weaving PlayStation games with their play in paper diaries, turning drawings into games. Their use of drawings as inventive technologies defied ‘the neat boundary [between digital and non-digital] constructed by the research design’ (Murris et al. Citation2023, 551).

Murris et al.’s reflexive paper raises a crucial question for researchers of inequality: How do research designs themselves contribute to the reproduction of inequalities through a deficit framing that focuses on disadvantage? In an open letter to educational researchers, Eve Tuck (Citation2009) called for a moratorium on ‘damage-centred’ research, ‘research that intends to document peoples’ pain and brokenness to hold those in power accountable for their oppression’. This kind of research reproduces – perhaps inadvertently – dominant discriminatory understandings of people’s lives. Instead, Tuck calls for scholars to reimagine research so that the findings can be used by, for and with the communities in question. For Murris and colleagues, one approach to rethink research on learning, media, technology and inequality is to ‘begin with what children do have’ (Murris et al. Citation2023, 552).

Beginning with what children have, or with what children bring to a situated encounter with technology, suggests finding ways of creating a research space which is also a ‘third space’, however contested and difficult that term is (Hawley and Potter Citation2022). At its heart is the notion of ‘researcher dwelling’, flattening hierarchies, and accepting the forms of ‘porous expertise’ that children and young people from widely different circumstances have about the material conditions of their technological use (Potter and McDougall Citation2017). Participatory research, with all its attendant caveats (Sarria-Sanz, Alencar, and Verhoeven Citation2024), means finding ways of storying the encounter between texts, artefacts and practices that seek to ‘undo the digital’ (Burnett and Merchant Citation2020). The value of such smaller-scale and more in-depth research, which involves making and recording responses to provocations and explorations, is illustrated in the two studies described above (Bhatia, Arora, and Gupta Citation2024; Murris et al. Citation2023). These approaches arguably offer ways of seeing the world which generate alterity and allow us to critically examine the learning processes at work in (post)digital times, with social actors and technology imbricated with one another. This is urgent and important because the all-encompassing worlds of hegemonic intensity baked into the design, sale and use of educational technologies are blind to the multiple characteristics and identities which make up actual lived experience. In such circumstances, emergent bias is reinforced when the social actors involved are invited to misrecognise themselves as particular kinds of learners, and the system perpetuates itself.

These recent publications in Learning, Media and Technology, alongside related research, foreground the urgency of examining, in up-close empirical detail, the inequitable impacts of educational technologies in diverse settings. This is all the more important as digital platforms continue proliferating into and reconfiguring sites of teaching and learning (Kavanagh, Bernhard, and Gibbons Citation2024), and particularly as AI is promoted uncritically – with little evidence or attention to contexts of use – as a global solution to problems of access, inclusion, and inequality of educational outcomes (Williamson, Molnar, and Boninger Citation2024). We welcome further submissions to the journal that examine (and challenge) the emergent biases and inequalities that are created in situated experiences with educational technologies. As always, Learning, Media and Technology welcomes research on these issues from researchers in diverse contexts using a variety of methods informed by social theory and social science, humanities, media and arts disciplines.

References

  • Bhatia, K. V., P. Arora, and S. Gupta. 2024. “Edtech Platforms from Below: A Family Ethnography of Marginalized Communities and their Digital Learning Post-Pandemic.” Learning, Media and Technology. https://doi.org/10.1080/17439884.2024.2328693.
  • Buolamwini, J., and T. Gebru. 2018. “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification.” Proceedings of Machine Learning Research 81:1–15.
  • Burnett, C., and G. Merchant. 2020. Undoing the Digital: Sociomaterialism and Literacy Education. London: Routledge.
  • Eynon, R. 2024. “Algorithmic Bias and Discrimination through Digitalisation in Education: A Socio-Technical View.” In World Yearbook of Education 2024: Digitalisation of Education in the Era of Algorithms, Automation and Artificial Intelligence, edited by B. Williamson, J. Komljenovic, and K. N. Gulson, 245–260. London: Routledge.
  • Friedman, B., and H. Nissenbaum. 1996. “Bias in Computer Systems.” ACM Transactions on Information Systems 14 (3): 330–347. https://doi.org/10.1145/230538.230561.
  • Hawley, S., and J. Potter. 2022. “Can a Research Space Be a Third Space? Methodology and Hierarchies in Participatory Literacy Research.” In Unsettling Literacies: Directions for Literacy Research in Precarious Times, edited by C. Lee, C. Bailey, C. Burnett, and J. Rowsell, 19–31. Springer Singapore. https://doi.org/10.1007/978-981-16-6944-6_2.
  • Kavanagh, S. S., T. Bernhard, and L. K. Gibbons. 2024. “‘Someone Else in the Universe is Trying to Teach You’: Teachers’ Experiences with Platformized Instruction.” Learning, Media and Technology. https://doi.org/10.1080/17439884.2024.2337396.
  • Macgilchrist, F. 2024. “Design Justice and Educational Technology: Designing in the Fissures.” In World Yearbook of Education 2024: Digitalisation of Education in the Era of Algorithms, Automation and Artificial Intelligence, edited by B. Williamson, J. Komljenovic, and K. N. Gulson, 294–310. London: Routledge.
  • Murris, K., F. Scott, B. Stjerne Thomsen, K. Dixon, T. Giorza, J. Peers, and C. Lawrence. 2023. “Researching Digital Inequalities in Children’s Play with Technology in South Africa.” Learning, Media and Technology 48 (3): 542–555. https://doi.org/10.1080/17439884.2022.2095570.
  • Pangrazio, L., and J. Sefton-Green. 2022. “Learning to Live Well with Data: Concepts and Challenges.” In Learning to Live with Datafication: Educational Case Studies and Initiatives from Around the World, edited by L. Pangrazio and J. Sefton-Green, 1–16. London: Routledge.
  • Potter, J., and J. McDougall. 2017. Digital Media, Culture and Education: Theorising Third Space Literacies. London: Palgrave Macmillan/Springer.
  • Sahlgren, O. 2023. “The Politics and Reciprocal (re)Configuration of Accountability and Fairness in Data-Driven Education.” Learning, Media and Technology 48 (1): 95–108. https://doi.org/10.1080/17439884.2021.1986065.
  • Sarria-Sanz, C., A. Alencar, and E. Verhoeven. 2024. “Using Participatory Video for Co-Production and Collaborative Research with Refugees: Critical Reflections from the Digital Place-Makers Program.” Learning, Media and Technology 49 (2). https://doi.org/10.1080/17439884.2023.2166528.
  • Swist, T., and K. N. Gulson. 2023. “Instituting Socio-Technical Education Futures: Encounters with/Through Technical Democracy, Data Justice, and Imaginaries.” Learning, Media and Technology 48 (2): 181–186. https://doi.org/10.1080/17439884.2023.2205225.
  • Tuck, E. 2009. “Suspending Damage: A Letter to Communities.” Harvard Educational Review 79 (3): 409–428. https://doi.org/10.17763/haer.79.3.n0016675661t3n15.
  • Williamson, B., A. Molnar, and F. Boninger. 2024. Time for a Pause? Without Effective Public Oversight, AI in Schools Will do More Harm Than Good. Boulder, CO: National Education Policy Center. http://nepc.colorado.edu/publication/ai.
