Changing English
Studies in Culture and Education
Volume 31, 2024 - Issue 2
Research Article

Awareness of Artificial Intelligence as an Essential Digital Literacy: ChatGPT and Gen-AI in the Classroom


ABSTRACT

This discussion article examines the potential integration of Generative Artificial Intelligence (Gen-AI), including advanced Large Language Models such as the popular platform ChatGPT, into subject English education. Following the significant public and academic attention these technologies received through 2023, this paper considers the transformative potential and challenges posed by Gen-AI in educational settings. Central to the discussion is how English teachers can leverage Gen-AI to enrich student learning beyond the obvious domain of writing skills. The article foregrounds the necessity of students’ understanding of Gen-AI as an essential component of digital literacy. While acknowledging ethical concerns such as plagiarism, equity, and access, the paper presents an argument for the productive use of Gen-AI in the classroom to augment reading, viewing, and interpretation lessons. Avoiding both evangelical and dystopian views of AI, this discussion piece explores the time-critical and urgent issue of how, when, and why English can engage with the technology.

Introduction

Sophisticated forms of creative content generated by Artificial Intelligence (AI) began to receive mainstream attention during 2022, when simple-to-operate applications emerged that could produce images with highly convincing levels of ‘humanness’. Although ChatGPT is not the first text-generating AI, it represented a significant breakthrough in generative AI (Gen-AI) due to its user-friendly chat-like interface, and it quickly attracted mainstream attention as a novelty. This advancement led to optimistic forecasts about AI’s application across multiple sectors, including medicine, law, and education (Ahuja 2019; Grassini 2023), though it also very quickly provoked complaints that school and university students would be able to plagiarise essays (Crawford, Cowling, and Allen 2023; Oravec 2023). For example, it was shown to create plausible-sounding text in response to prompts such as: ‘Explain the meaning of Shakespeare’s Macbeth’ (Huff 2023). OpenAI, the developers of ChatGPT, responded to these claims by suggesting that they would develop a kind of digital watermark that would enable readers to know whether a text had been created with the platform (Hern 2022). Other writers were more tepid in their predictions (Grassini 2023; Kasneci et al. 2023; Metzler 2022), indicating that while there are obvious misuses for AI in education, it could be adopted to adjust how students learn and are assessed.

In this article, I will outline why the current – and next – generation of Gen-AI should be of interest to English teachers. I acknowledge there are some concerns with the technology, though I share McKnight’s (2021) optimism that AI can be incorporated into the classroom in productive ways, such as introducing students to the practices of ‘co-composing’ with AI that will inevitably be part of their everyday working lives after leaving school (Yu 2023). My discussion here extends this work by foregrounding the opportunities to leverage the technology to improve outcomes in all aspects of subject English, for students and teachers, beyond writing instruction. As Ciampa, Wolfe, and Bronstein (2023) acknowledge, the ‘interactive’ nature of ChatGPT and other types of AI may also have motivational effects on student engagement in a variety of ‘reading, writing, and critical thinking skills’ (2023, 186). However, the potential for Gen-AI to augment reading and viewing lessons, assisting students to comprehend and interpret texts in sophisticated ways, has not yet been sufficiently explored in the literature.

While much of the public and academic discussion in 2023 revolved around ChatGPT’s impact on student writing, this article aims to broaden the narrative by exploring Gen-AI’s role in enhancing digital literacy skills in the domains of reading and visual literacy. Drawing on experiences from workshops with English teachers throughout 2023, this article is concerned with how subject English could take advantage of the technology presently available to inspire, motivate, and benefit students within the boundaries of the current curriculum and within contemporary pedagogies of digital literacy (Luke et al. 2018; O’Mara, Auld, and Wood 2021). This is a pragmatic issue. I intend to reframe the discussion to problematise one of the typical ways in which subject English teaches, constructs, and ultimately assesses students’ critical literacy.

Background: an English teacher and the AI

Throughout 2023 it became commonplace for presenters, academics, and others delivering public speeches to perform the party-trick of reading (or incorporating), unannounced, some material generated by ChatGPT, before revealing: ‘I didn’t write that, it was produced by AI!’ Such demonstrations would usually be followed with either a comment on the inaccuracies and problems in the text (Ciampa, Wolfe, and Bronstein 2023) or some expression of nervousness about how uncannily human-like the material seemed (Jimenez-Moreno 2023). Therefore, before proceeding, I note that this article reproduces extended conversations with ChatGPT, including my prompts (or inputs) and much of the output, and will indicate wherever these reproductions occur. These reproductions are necessary to unpack the metacognitive processes behind what this may mean for subject English in a contemporary critical and digital literacy. Indeed, it could even be argued that assessing the prompts that students enter into a Gen-AI is one way of diagnosing their awareness, knowledge, and understanding of a text they are analysing (Bender 2023). This view builds upon Lawrie (2023), who notes that the education response to ChatGPT follows the familiar pattern of existing digital technologies, with educators first opposing its introduction, then pushing for its inclusion, before finally finding a new ‘equilibrium’ where ‘many practitioners have begun to develop new approaches to assessment that are based on the inherent differences in thinking observed between a student and an AI response generator’ (393).

As in the party-tricks above, demonstrations of Gen-AI conversations simultaneously showcase its impressive capacities but also reveal its limitations and, more importantly, the human guidance and iterative process required. So, as an example, let us pretend a high school student has been set the task of writing a response to Coleridge’s 1798 poem ‘The Rime of the Ancient Mariner’ (Coleridge and Patton 1969). They decide to use Gen-AI to do the project, and prompt: ‘Write a two sentence summary of the poem “The Rime of the Ancient Mariner” and identify the themes’. The Gen-AI responds:

‘The Rime of the Ancient Mariner’ is a narrative poem by Samuel Taylor Coleridge that tells the story of a sailor who is punished for killing an albatross, a large sea bird, by being cursed to wander the earth telling his tale of guilt and punishment. The poem explores themes of sin, redemption, and the natural world.

They may then choose to ‘generate again’ to receive another version or alternatively they could provide some further prompting to continue the conversation. For instance, the additional prompt: ‘Okay, now write one sentence with a quote from the poem as an example of the theme of sin’ yielded the following:

One example of the theme of sin in ‘The Rhyme of the Ancient Mariner’ is when the Mariner shoots the albatross, an act of unnecessary violence that brings upon him and his crew a curse of death and suffering. This act of sin against nature ultimately leads to the Mariner’s personal redemption through his guilt and eventual spiritual growth.

But this may not be quite enough. So the student may prompt: ‘I wanted a quote. Rewrite the analysis including a direct quote from the poem’:

‘He prayeth best, who loveth best/All things both great and small’. This quote illustrates the theme of sin as the Mariner recognizes the value and goodness in all creatures, even the smallest, and shows remorse for his actions in killing the albatross.

Obviously, these results are impressive, particularly as they were created in less than a minute. The AI identifies the poem, its genre, and outlines its themes. Upon request, it also churns out a clear explanation (articulated in a manner that seems thoughtful: killing the albatross is described as ‘an act of unnecessary violence’) and then, with some additional prompting, identifies a relevant quote, linking it in a fairly superficial way to the thematic analysis. But consider the process of prompting required from this hypothetical student user to get to this point. Prompt 2 asks the AI not simply to analyse the poem, but to analyse the theme. Prompt 3 (‘I wanted a quote … ’) achieves a simple analysis, but with more careful and directed input this could no doubt be refined further to unpack the concept of sin and how the albatross becomes symbolic through the poem. This process is known as ‘prompt engineering’ (Greylock 2022) or, more playfully, a process of ‘cajoling’ the AI to produce the output you are looking for (Weinstein, interviewed in Rogan 2023).

I believe that understanding this requirement of cajoling and engineering the output will be the key to unlocking the potential of Gen-AI in the English classroom. The procedural knowledge this hypothetical student had to possess in order to reach this output is important. How would they get this final result if they did not at the very least know that the genre of poetry analysis a) benefits from quotes, b) requires those quotes to link to the theme being explicated, and c) proceeds by linking ideas coherently into a sensible paragraph? This is not even to mention that they must know that poems can be read for thematic meaning. Enter, the English teacher.

Concerns and opportunities for English

By now, the general concerns associated with AI (and Gen-AI) are well established (Healey 2020). For the purposes of this article, I avoid the technical processes of Gen-AI (Schmelzer 2021), although it is useful to be aware of what it is doing to produce this text material. ChatGPT, for example, does not understand what it is writing/producing; it is simply making predictions about what words would be used, based upon its internal dataset of existing real-world content (Weinstein, interviewed in Rogan 2023). It is also clear that AI will impact students’ future working lives. Predicting the future of employment in the age of AI is difficult; Sam Altman of OpenAI suggests that AI’s impact on the job market is defying traditional predictions and may affect creative roles sooner rather than later (Greylock 2022). Most relevant to English teachers, AI’s rapid infiltration into education and administrative tasks appears to be altering how students, after secondary education, will work or study in contexts that require them to use Gen-AI platforms and technologies for many tasks. It is therefore already an essential component of digital literacy for students to be capable of co-existing and co-working with AI (Baidoo-Anu and Ansah 2023; Bender 2023).

Gen-AI can exhibit built-in bias because its training data is drawn from real-world samples that carry inherited traits and latent political/ideological positions (Mehrabi et al. 2021). English teachers will recognise the importance of examining this as part of a critical literacy, given subject English’s commitment to social justice (Baldick 1983). Given many students’ and educators’ interest in social justice and sustainability (Wiltse, Johnston, and Yang 2014), these issues are contexts that could motivate participation in learning critical reading and writing skills by positioning them as a proximate form of empowerment.

Hallucinations are instances where a Gen-AI produces content that is not grounded in the input it was given, manifesting as highly creative but unrealistic or even nonsensical output (Dziri et al. 2022). These hallucinations can be difficult to detect, which is a major problem if the user mistakenly believes that Gen-AI is ‘producing’ information rather than simply generating statistically probable text (Ji et al. 2022). This is one of the key problems that students who want to use Gen-AI to cheat need to be aware of: oftentimes, the AI simply makes up ideas that do not exist, yet such material usually seems plausible (Reid, interviewed in Intercom 2022).

Plagiarism is likely the most obvious concern for Gen-AI in education. After all, any student can use Gen-AI to produce an essay or even a piece of creative writing and submit it for assessment, without their teacher easily being able to distinguish it from a legitimate piece of work created by the student (Baidoo-Anu and Ansah 2023). There are familiar strategies for reducing plagiarism: for instance, requiring students to produce assignments under supervision, using some form of similarity-checking software, or setting assignments that demand higher levels of critical thinking (Chia 2023; Darr 2019). However, as I show throughout the rest of this article, rather than fretting about plagiarism, I believe English teachers can find ways to use Gen-AI to facilitate exciting interpretation and critical reading/viewing activities.

Further demonstration: producing a personal response

To situate the discussion below, let us first explore another extended ‘conversation’ with ChatGPT to produce an interpretation of the film The Shawshank Redemption (Dir., Frank Darabont, 1994), a commonly used film text in subject English. This interaction consisted of six prompts and totalled 900 words of AI-generated text, with me simulating the role of a student, beginning with the prompt:

Write the first paragraph of an essay, with the question: ‘How has your personal context shaped the way you feel about a character in the movie Shawshank Redemption?’

ChatGPT produced the following, which is particularly remarkable for the way it has performed (or simulated) the requirements of what is usually thought of as a personal response:

Growing up in a small, economically disadvantaged town, I have always been able to relate to characters who struggle against seemingly insurmountable odds and long for a better life. This is one of the reasons why the character of Andy Dufresne in the movie Shawshank Redemption resonated with me so deeply. As someone who has also felt trapped and confined by my circumstances, I was able to understand Andy’s desire to break free from the constraints of his prison and society’s expectations. Additionally, my own experiences with injustice and corruption in my community have allowed me to relate to the themes of hope and perseverance that are central to Andy’s story.

Then, in addition to instructing the AI to rewrite this in the written style of a 15-year-old, I also asked for an example of a scene from the film that could represent the theme of ‘hope’, and it responded with a paragraph of text outlining Andy playing Mozart over the prison intercom. However, perhaps my teacher knows that I did not grow up in a ‘small, economically disadvantaged town’, so I prompt: ‘Now rewrite the essay introduction as if my personal context included a visit to Fremantle heritage prison on a school excursion’:

Having visited Fremantle heritage prison on a school excursion, I want to delve deeper into stories about the criminal justice system and the experiences of prisoners. This is why Andy Dufresne in the movie Shawshank Redemption particularly captured my attention. Visiting Fremantle prison helped me to understand the harsh realities of the prison system and allowed me to relate to Andy’s desire to escape its confines and society’s expectations. It also allowed me to appreciate the themes of hope and perseverance that are central to Andy’s story. Overall, my personal context has shaped the way I feel about Andy as a character, as I am able to see him not just as a fictional character, but as a representation of the real struggles and resilience of prisoners.

There are a number of things to be noted about this exchange. First, the obvious point is how comprehensible each of the Gen-AI’s responses sounds. It is tempting to be instantly paranoid about plagiarism (or, simply, cheating) in response to this. But that ignores the type of cajoling and guidance I had to do to get the AI to produce this kind of response. More than simply pushing a magic button, to get something useful my instructions required me to know what personal context could mean as a reading practice in English (Patterson 1993) and that the teacher expects me to connect what I view to what I feel in light of what I have experienced (Bender 2008). In addition, I needed to be able to ‘evaluate’ (McKnight 2021, 449) the usefulness of the theme and accompanying scene chosen. I also needed to understand that, rhetorically, the very first response from the AI did not convey the sort of written tone the teacher would expect of a 15-year-old. Ultimately, I could not have produced the final text without already understanding relevant subject knowledge. In a sense, although the words have come from the Gen-AI, the concepts and understandings demonstrated are from the person engineering the prompts.

Discussion

While Gen-AI is highly disruptive, particularly for education (Grassini 2023), it is important to incorporate Gen-AI into subject English without displacing the general aims of the subject (Hunter 1991; Wells 2017). As demonstrated above, aside from its multiple uses as a writing assistant (see McKnight 2021), it seems possible for this technology to be used to develop students’ conceptual understandings of texts, interpretation, and criticism.

Implications for practice

There are precedents to take into account when considering this kind of digital technology for students and teachers. Beavis, Muspratt, and Thompson’s (2015) cautions regarding the enthusiasm for introducing game-based learning into the curriculum bear reiterating here. Consolidating earlier work, the authors argue that research in digital technology should take into account ‘the realities of schools’ (Perrotta et al. 2013, iii) and that there can be a ‘mismatch’ between the enthusiasm of research and policy and the ‘conditions under which most teachers and students are operating in schools’ (Erstad and Sefton-Green 2013, 94). Teachers should not feel ‘alienated’ by this technology; alienation can occur whether one feels one lacks the technical capability, or if one is a ‘believer’ committed to using the technology in isolation from colleagues who are ‘unbelievers’ (Stieler-Hunt and Jones 2017). It is also crucial to acknowledge and address the equity and access issues that may arise, given that some platforms require payment for premium services, which limits access for students (and schools) from disadvantaged socioeconomic backgrounds.

Incorporating Gen-AI is not quite the same as having students use a word-processor. The technology does not simply automate or speed up learning processes that can be accomplished in traditional ways. A parallel case study exists in Mathematics’ introduction of graphics calculators in the mid-1990s. Two educators at the time outlined the following ‘balanced approach’ in advocating graphics calculators in schools:

Appropriate use of graphing calculators […] means the student:

  1. Solves analytically using traditional paper and pencil algebraic methods, and then supports the results using a graphic calculator.

  2. Solves using a graphing calculator, and then confirms analytically the result using traditional paper and pencil algebraic methods.

  3. Solves using graphing calculator where appropriate (because traditional analytic paper and pencil methods are too tedious and/or time consuming or there is simply no other way!) (Waits and Demana 1998, 3, emphasis in original).

It is easy to unpack a model of pedagogy here in which the student supports their learning with the technology rather than replacing maths skills and concepts. In English, the subject imparts fundamental literacy skills as well as the sophisticated critical engagement with texts that is important for citizens (Hunter 1988). Therefore, while Gen-AI can be exciting to use – particularly with its casino-like ‘push a button then wait to see what it produces’ interaction – English teachers should ensure that students’ use of the technology is learning with (and about) the technology rather than learning the technology.

As demonstrated by the example interactions above, producing some meaningful text about ‘The Rime of the Ancient Mariner’ or The Shawshank Redemption required the student to have a specific intention for the text output, and to work with the Gen-AI to execute this. These two demonstration encounters with the Gen-AI incorporated all four of Freebody and Luke’s ‘resources’ (2003) – text decoder, text participant, text user and text analyst:

Breaking the code of the text: The hypothetical student here knew that mise-en-scène and cinematography would have particular impact on the construction of a film such as The Shawshank Redemption, and continued prompting to identify a starting point for further analysis.

Participating in the meanings of the text: Prompting to identify themes in ‘The Rime’, and discerning that these themes made sense with their own reading of the poem, as demonstrated by prompting further on the specific theme of ‘sin’.

Using the texts functionally: The hypothetical student knew that ‘The Rime’ did not simply tell a narrative, but that there were symbolic and abstract ideas represented by the events depicted. Using this knowledge, the student then looked for a quote that convincingly related to the theme of ‘sin’.

Critically analysing and transforming the texts: The hypothetical student linked something from their personal context (having visited a historic heritage prison) to The Shawshank Redemption, recognising that this personal experience shaped their response to the film. Of course, the potential superficiality of this kind of personal response can be critiqued (Patterson 1993), and other types of response could instead have been prompted by the student if they were aware of the practices associated with, for instance, a feminist or critical race reading.

In each case the student guides the AI based upon their previous knowledge (of and about texts in general, and this text in particular). The greater the student’s pre-existing critical literacy skills, the more they will be able to interact with the AI to interrogate the text and their response to it. Therefore, none of this replaces the teacher or traditional literacy pedagogy. The teacher will need to set up the conditions for this student–Gen-AI encounter and will also guide the student through the iteration process as they workshop the generated draft further. For example, a teacher might ask the student: did you experience powerful emotions when you visited Fremantle prison? What were they? How can you link those to the starting points made by the AI? In this example, the Gen-AI can be understood as a form of scaffolding that gets the first draft of the student’s intention written out, leaving them to edit and refine it later, perhaps with a focus on style and aesthetic impact (Moon 2012).

This co-composition process will be an essential form of digital literacy as humans work with Gen-AI in the near future. Users will need to be articulate, knowledgeable about texts, art, and literary history, and creative thinkers. As Sam Altman, CEO of OpenAI, says directly: ‘What will matter is […] the quality of ideas and the understanding of what you want’ (Greylock 2022). Altman adds that the best results will come about because the person had a clear ‘vision’ of what they wanted the Gen-AI to produce and, most importantly, could ‘articulate’ this vision. Therefore, English teachers might be less concerned about whether or not students will want to use Gen-AI to produce text, and more interested in continuing to equip students to be creative thinkers with excellent articulation skills.

Classroom activities

For the purposes of this article, I bracket the most obvious incorporation of AI in English, which is to engage with it as a theme for a unit of study. For example, many teachers already use texts such as Ridley Scott’s 1982 film Blade Runner to explore the ethics of technology and the posthuman condition (Driver 2018). Instead, there are two broad areas I want to focus on in my suggestions for classroom practice: i) using Gen-AI to support students’ development of aesthetic engagement with reading and viewing activities, and ii) using Gen-AI to enhance students’ rhetorical skills, both in terms of writing and critical rhetoric. These are merely starting points that leverage the technology’s current capability, and teachers will no doubt develop many more use-cases for Gen-AI.

Supporting students’ aesthetic engagement with reading and viewing

Many typical tasks to engage students with reading and viewing texts can be supported with a Gen-AI, which might also be used effectively to achieve differentiation in learning/assessment (Tomlinson and Imbeau 2010). Curricula frequently try to disentangle a student’s understanding of a text’s content from their ability to write about it. Oftentimes, English separates the modes of Reading, Viewing, Listening, and Writing, and teachers are encouraged to diversify assessment tasks so that the appropriate mode is being assessed and not contaminated by a student’s higher or lower skills in another strand.

Gen-AI can assist with such aims by adhering to classic constructivist ideas of enhancing students’ active effort in their own learning process (Vygotsky 1962), particularly via the ‘interactive’ nature of the technology’s chat-like interface. For example, after reading a novel, students might prompt the AI for 750 words explaining the main character’s conflict and the decisions taken throughout the narrative. They can then work through the produced output, checking where the Gen-AI’s version aligns with what is actually (or implicitly) in the book, as well as where their own interpretations of the character’s decisions might differ. It is important that students do not see this as a process where the AI’s output is taken to be correct; rather, it is an opportunity to have their interpretation challenged and perhaps for creative cognitive conflict to occur as they justify their interpretations (Limon 2001). As students interrogate and ‘cajole’ the response from the Gen-AI to drill deeper, they might be guided with scaffolding questions by the teacher that address either traditional three-level questions (literal, interpretive, judgemental) or questions derived from Bloom’s taxonomy. Students would then be assessed on the quality and sophistication of the questions they ask the AI – perhaps supported with annotations of why they asked each question – rather than on the quality of the Gen-AI’s output per se.

Supporting students’ rhetorical skills and understandings

As Moon (2012) shows, it is commonplace to hear concerns about the quality of students’ written expression, with the primary complaint being that students’ grammar skills are poor. However, research suggests that the real problem is one of rhetorical skills (38), which has the broader effect of impacting students’ ability to critically engage with others’ arguments and media messages. Gen-AI could augment existing models of writing instruction both to aid student written expression and, more importantly, to assist their understanding of how rhetorical techniques can affect their responses to texts written by others. For instance, it is characteristic of Gen-AI to produce material with an authoritative and confident voice even when it is hallucinating (Intercom 2022). This is obviously something that students should be aware of as they encounter texts produced by Gen-AI in the real world, such as in marketing, news, and entertainment. As Ciampa, Wolfe, and Bronstein (2023, 191) suggest: ‘Teaching students about how the technology works and how it was created can also help them understand concerns such as misinformation, biases, and privacy within AI’. Consider the following simple classroom activity: students use Gen-AI to produce a news report about ideas and/or topics that have clear partisan viewpoints. The result is then examined for how it has used active or passive voicing, diction, and other language choices to create its particularly confident style. This approach could serve as an introduction to the idea of online misinformation, which is often written in highly authoritative and/or otherwise linguistically compelling ways (Acerbi 2019).

Conclusion

Developing practical and critical competency with Gen-AI will no doubt become a key component of digital literacy for students; arguably, it already has. It is essential, then, to consider issues of access and equity in these applications (Kim and Higgs 2022). Teachers, and education research, should also tread carefully and avoid racing to adopt any and every application of AI into schools. As with earlier forms of disruptive technology and their application to education, there exists both a future-shock paranoia (Toffler 1971) and an evangelising discourse of the technology’s potential (Bender and Broderick 2021), but as yet very scant research on the actual classroom and educational impact. This is important to consider in relation to many of the supposed challenges AI has raised for our understanding of education. For example, one of the often-touted reformations of education offered by AI is its potential to personalise learning experiences (Akgun and Greenhow 2022). While this sounds exciting and disruptive, such approaches arguably show direct continuity with traditional constructivist approaches to education, which rely upon the ‘More Knowledgeable Other’ (MKO) to scaffold the learning experience appropriately (Vygotsky 1978). In this case, the MKO happens to be Gen-AI. Although I have focused in this article on the opportunities for AI to contribute to reading, comprehension, and interpretation activities (effectively using the AI as an MKO), it is still likely that Gen-AI’s impact will mostly be felt in writing lessons, where the student may collaborate with the AI to draft, redraft, and manipulate their text.
It is worth recalling, however, that much of the productive value in collaborative writing activities also relies upon the social interaction of students with each other as they discuss, draft, redraft, and alter their written work through interactive workshopping (Humphris 2010; Thompson 2012). Therefore, future research on the incorporation of AI into the subject English context should take into account how the technology can be combined with what already counts as exemplary educational practice and experience.

In this article, I have presented an optimistic view of how Gen-AI can be incorporated as a ‘digital colleague’ in subject English. While writing skills are the most obvious use-case for Gen-AI (see McKnight Citation2021) – and presumably the first way students will expect to use it – this article has indicated some other exciting opportunities for Gen-AI to support student learning. Incorporating Gen-AI into students’ critical encounters with books, poetry, films and other digital media may offer tremendous leverage for contemporary English, in addition to establishing a context in which to engage with broader socio-cultural critical literacies around digital technology (Luke et al. Citation2018).

In the context of evolving digital literacies (Pötzsch Citation2019), it is also imperative for students to be aware of the capabilities of AI in image, video, and audio manipulation. Recent advancements have significantly improved the quality and realism of AI-generated photos, videos, and voice cloning (Göring et al. Citation2023). This progression raises significant concerns for personal digital safety, as even a few seconds of audio can be used to generate an AI clone of a person’s voice (Wells-Edwards Citation2022). Therefore, as a continually evolving digital literacy (Luke Citation2018, 185), critical engagement with AI can equip students with the skills to navigate these challenges. According to recent research (Bewersdorff et al. Citation2023), many people hold incorrect or incomplete understandings of what AI is and how it generates its content. This is especially relevant because many mistakenly believe that a platform like ChatGPT retrieves information, when in fact it produces statistically probable output. At the same time, teachers and future research must consider matters of equity and access, which continue to be an issue in digital technology and learning, exacerbated by the COVID-19 pandemic (Greenhow, Lewin, and Willet Citation2021).
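The distinction between retrieving information and producing statistically probable output can be made concrete with a minimal sketch. The vocabulary and probabilities below are invented purely for illustration – a real large-language model scores tens of thousands of possible tokens using billions of learned parameters – but the underlying principle is the same: given the text so far, the model samples a plausible continuation rather than looking up a fact.

```python
import random

# Toy next-token probabilities, invented for illustration only.
# Given a context, each candidate continuation has a likelihood.
NEXT_TOKEN_PROBS = {
    "Macbeth is a play about": {
        "ambition": 0.6,   # the statistically most likely continuation
        "power": 0.3,
        "weather": 0.1,    # unlikely, but never impossible
    }
}

def generate_next(context: str, rng: random.Random) -> str:
    """Sample the next token in proportion to its probability."""
    probs = NEXT_TOKEN_PROBS[context]
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

rng = random.Random()
print(generate_next("Macbeth is a play about", rng))
```

Running this repeatedly mostly yields ‘ambition’, sometimes ‘power’, and occasionally ‘weather’: the output is a probable-sounding continuation, not a verified claim about the play, which is precisely why a student still needs the subject knowledge to evaluate what the AI returns.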

The good news is simple, and it is the primary point of this article: in order to use Gen-AI successfully to produce an analysis of a text, a student first needs to understand the concepts involved in the topic they are writing about, and then to have the knowledge and written capability to evaluate whether the generated output is clear, effective and so on. How Gen-AI will figure in future digital literacy is foreshadowed clearly by a comment from machine-learning specialist Des Traynor: the Gen-AI is ‘giving you back what was probably in your head’ (Intercom Citation2022). Having ideas, thinking creatively and critically, and being able to articulate them in an AI prompt are therefore likely to become part of what constitutes digital literacy (Bender Citation2023). As such, I remain convinced that Gen-AI will not replace English teachers.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Acknowledgments

This project has benefited from workshopping with practising English teachers in Western Australian secondary schools. I thank these colleagues, and the members of the English Teachers Association of WA, for their kind input, discussion, and thought-provoking questions.

Correction Statement

This article has been corrected with minor changes. These changes do not impact the academic content of the article.

Additional information

Notes on contributors

Stuart Marshall Bender

Stuart Marshall Bender is Associate Professor at Curtin University in the School of Media, Creative Arts and Social Inquiry where he specialises in media production with emerging technologies and researches the effect and affect of high-emotion media. Previously he taught English in secondary schools in Western Australia and continues to research in the area of curriculum and pedagogy.

Notes

1. Throughout this article, all examples of material generated by ChatGPT are real responses from the AI in response to my prompts using the build of the platform with training data up to April 2023. ChatGPT accessed 10 January, 2024 at: https://chat.openai.com/

References

  • Acerbi, A. 2019. “Cognitive Attraction and Online Misinformation.” Palgrave Communications 5 (1): 1–7. https://doi.org/10.1057/s41599-019-0224-y.
  • Ahuja, A. S. 2019. “The Impact of Artificial Intelligence in Medicine on the Future Role of the Physician.” PeerJ 7 (October): e7702. https://doi.org/10.7717/peerj.7702.
  • Akgun, S., and C. Greenhow. 2022. “Artificial Intelligence in Education: Addressing Ethical Challenges in K-12 Settings.” AI and Ethics 2 (3): 431–440. https://doi.org/10.1007/s43681-021-00096-7.
  • Baidoo-Anu, D., and L. O. Ansah. 2023. “Education in the Era of Generative Artificial Intelligence (AI): Understanding the Potential Benefits of ChatGPT in Promoting Teaching and Learning.” Journal of Artificial Intelligence 7 (1): 52–62. https://doi.org/10.61969/jai.1337500.
  • Baldick, C. 1983. The Social Mission of English Criticism 1848-1932. Oxford: Clarendon Press.
  • Beavis, C., S. Muspratt, and R. Thompson. 2015. “Computer Games Can Get Your Brain Working.” Learning, Media and Technology 40 (1): 21–42. https://doi.org/10.1080/17439884.2014.904339.
  • Bender, S. M. 2008. “A Persistent Practice: The Problem of the Documentary Lesson.” English in Australia 43 (2): 27–37.
  • Bender, S. M. 2023. “Coexistence and Creativity: Screen Media Education in the Age of Artificial Intelligence Content Generators.” Media Practice & Education 24 (4): 351–366. https://doi.org/10.1080/25741136.2023.2204203.
  • Bender, S. M., and M. Broderick. 2021. Virtual Realities: Case Studies in Immersion and Phenomenology. Switzerland: Palgrave Macmillan.
  • Bewersdorff, A., X. Zhai, J. Roberts, and C. Nerdel. 2023. “Myths, Mis- and Preconceptions of Artificial Intelligence: A Review of the Literature.” Computers and Education: Artificial Intelligence 4: 100143. https://doi.org/10.1016/j.caeai.2023.100143.
  • Chia, O. 2023. “Teachers v ChatGPT: Schools Face New Challenge in Fight Against Plagiarism.” The Straits Times, January 2. https://www.straitstimes.com/tech/teachers-v-chatgpt-schools-face-new-challenge-in-fight-against-plagiarism.
  • Ciampa, K., Z. Wolfe, and B. Bronstein. 2023. “ChatGPT in Education.” Journal of Adolescent & Adult Literacy 67 (3): 186–195. https://doi.org/10.1002/jaal.1310.
  • Coleridge, S. T., and L. Patton. 1969. The Collected Works of Samuel Taylor Coleridge. London: Routledge and Kegan Paul.
  • Crawford, J., M. Cowling, and K. Allen. 2023. “Leadership is Needed for Ethical ChatGPT.” Journal of University Teaching & Learning Practice 20 (3): 02. https://doi.org/10.53761/1.20.3.02.
  • Darabont, F. dir. 1994. The Shawshank Redemption. Beverly Hills: Castle Rock Entertainment.
  • Darr, T. 2019. Combating Plagiarism: A Hands-On Guide for Librarians, Teachers, and Students. Santa Barbara, California: ABC-CLIO.
  • Driver, D. 2018. “Reflecting Windows: The ‘Blade Runner’ Films in the English Classroom.” Screen Education 91:44–49. https://search.informit.org/doi/10.3316/ielapa.198856774084097.
  • Dziri, N., S. Milton, M. Yu, O. Zaiane, and S. Reddy. 2022. “On the Origin of Hallucinations in Conversational Models.” arXiv [cs.CL] arXiv. http://arxiv.org/abs/2204.07931.
  • Erstad, O., and J. Sefton-Green. 2013. Identity, Community, and Learning Lives in the Digital Age. Cambridge: Cambridge University Press.
  • Freebody, P., and A. Luke. 2003. “Literacy as Engaging with New Forms of Life: The ‘Four Roles’ Model.” In The Literacy Lexicon, edited by G. Bull and M. Anstey, 51–16. Australia: Prentice Hall.
  • Göring, S., R. Rao, R. Merten, and A. Raake. 2023. “Analysis of Appeal for Realistic AI-Generated Photos.” Institute of Electrical and Electronics Engineers Access 11:38999–12. https://doi.org/10.1109/ACCESS.2023.3267968.
  • Grassini, S. 2023. “Shaping the Future of Education: Exploring the Potential and Consequences of AI and ChatGPT in Educational Settings.” Education Sciences 13 (7): 692. https://doi.org/10.3390/educsci13070692.
  • Greenhow, C., C. Lewin, and K. Willet. 2021. “The Educational Response to COVID-19 Across Two Countries: A Critical Examination of Initial Digital Pedagogy Adoption.” Technology, Pedagogy & Education 30 (1): 7–25. https://doi.org/10.1080/1475939X.2020.1866654.
  • Greylock. 2022. “OpenAI CEO Sam Altman | AI for the Next Era,” September. https://www.youtube.com/watch?v=WHoWGNQRXb0.
  • Healey, J. 2020. Artificial Intelligence. NSW, Australia: Spinney Press.
  • Hern, A. 2022. “AI-Assisted Plagiarism? ChatGPT Bot Says It Has an Answer for That.” The Guardian, December 31. https://www.theguardian.com/technology/2022/dec/31/ai-assisted-plagiarism-chatgpt-bot-says-it-has-an-answer-for-that.
  • Huff, S. 2023. “Student Builds ChatGPT Detection App to Fight AI Plagiarism.” January 5. https://www.entrepreneur.com/business-news/student-builds-chatgpt-detection-app-to-fight-ai-plagiarism/442253.
  • Humphris, R. 2010. “Developing Students as Writers Through Collaboration.” Changing English 17 (2): 201–214. https://doi.org/10.1080/13586841003787365.
  • Hunter, I. 1988. Culture and Government: The Emergence of Literary Education. Basingstoke England: Macmillan Press.
  • Hunter, I. 1991. “Learning the Literature Lesson: The Limits of the Aesthetic Personality.” In Towards a Critical Sociology of Reading Pedagogy: Papers of the XII World Congress on Reading, edited by Carolyn D. Baker and A. Luke, 47–82. Amsterdam: John Benjamins Publishing.
  • Intercom. 2022. “How ChatGPT Changed Everything: Thoughts from the Frontline of the AI/ML Revolution,” December. https://www.youtube.com/watch?v=SCsSpqZq_xA&t=2707s.
  • Ji, Z., N. Lee, R. Frieske, T. Yu, D. Su, Y. Xu, E. Ishii, Y. Bang, A. Madotto, and P. Fung. 2022. “Survey of Hallucination in Natural Language Generation.” ACM Computing Surveys 55 (12): 1–38. https://doi.org/10.1145/3571730.
  • Jimenez-Moreno, P. 2023. “This Article Was Written by ChatGPT.” Medium. July 12. https://medium.com/@jimenezmorenopablo/this-article-was-written-by-chatgpt-39e9744a744c.
  • Kasneci, E., K. Sessler, S. Küchemann, M. Bannert, D. Dementieva, F. Fischer, and U. Gasser. 2023. “ChatGPT for Good? On Opportunities and Challenges of Large Language Models for Education.” Learning and Individual Differences 103 (April): 102274. https://doi.org/10.1016/j.lindif.2023.102274.
  • Kim, G., and J. Higgs. 2022. “Exploring Equity Issues with Technology in Secondary Literacy Education.” Technology, Pedagogy & Education 32 (1): 1–16. https://doi.org/10.1080/1475939X.2022.2150288.
  • Lawrie, G. 2023. “Establishing a Delicate Balance in the Relationship Between Artificial Intelligence and Authentic Assessment in Student Learning.” Chemical Education Research and Practice 24 (2): 392–393. https://doi.org/10.1039/D3RP90003G.
  • Limon, M. 2001. “On the Cognitive Conflict as an Instructional Strategy for Conceptual Change: A Critical Appraisal.” Learning and Instruction 11 (11): 357–380. https://doi.org/10.1016/S0959-4752(00)00037-2.
  • Luke, A. 2018. “Digital Ethics Now.” Language and Literacy 20 (3): 185–198. https://doi.org/10.20360/langandlit29416.
  • Luke, A., J. Sefton-Green, P. Graham, D. Kellner, and J. Ladwig. 2018. “Digital Ethics, Political Economy, and the Curriculum: This Changes Everything.” In edited by K. A. Mills, A. Smith, J. Z. Pandya, and A. Stornaiuolo, 251–262. New York, NY: Routledge.
  • McKnight, L. 2021. “Electric Sheep? Humans, Robots, Artificial Intelligence, and the Future of Writing.” Changing English 28 (4): 442–455. https://doi.org/10.1080/1358684X.2021.1941768.
  • Mehrabi, N., F. Morstatter, N. A. Saxena, K. Lerman, and A. Galstyan. 2021. “A Survey on Bias and Fairness in Machine Learning.” ACM Computing Surveys 54 (6): 1–35. https://doi.org/10.1145/3457607.
  • Metzler, K. 2022. “How ChatGPT Could Transform Higher Education.” https://www.socialsciencespace.com/2022/12/how-chatgpt-could-transform-higher-education/.
  • Moon, B. 2012. “Remembering Rhetoric: Recalling a Tradition of Explicit Instruction in Writing.” English in Australia 47 (1): 37–52.
  • O’Mara, J., G. Auld, and N. Wood. 2021. “Necessary but Not Sufficient: Literacy Pedagogies for Changing Times.” Language and Literacy. https://www.erudit.org/en/journals/langlit/1900-v1-n1-langlit06269/1080301ar/abstract/.
  • Oravec, J. 2023. “Artificial Intelligence Implications for Academic Cheating.” Journal of Interactive Learning Research 34 (2): 213–237.
  • Patterson, A. 1993. “‘Personal Response’ and English Teaching.” In Child and Citizen : Genealogies of Schooling and Subjectivity, edited by D. Meredyth and D. Tyler, 61–86. Brisbane, QLD: Institute for Cultural Policy Studies.
  • Perrotta, C., G. Featherstone, H. Aston, and E. Houghton. 2013. “Game-Based Learning: Latest Evidence and Future Directions.” Slough: NFER. https://www.nfer.ac.uk/nfer/publications/GAME01/GAME01.pdf.
  • Pötzsch, H. 2019. “Critical Digital Literacy: Technology in Education Beyond Issues of User Competence and Labour-Market Qualifications.” TripleC: Communication, Capitalism & Critique Open Access Journal for a Global Sustainable Information Society 17 (2): 221–240. https://doi.org/10.31269/triplec.v17i2.1093.
  • Rogan, J. 2023. “What ChatGPT Could Mean for the Future of Artificial Intelligence.” Joe Rogan Podcast. Youtube. January 5. https://www.youtube.com/watch?v=kh5dN72GTQ8.
  • Schmelzer, R. 2021. “GPT-3.” Enterprise AI. TechTarget. June 11. https://www.techtarget.com/searchenterpriseai/definition/GPT-3.
  • Stieler-Hunt, C., and C. Jones. 2017. “Feeling Alienated – Teachers Using Immersive Digital Games in Classrooms.” Technology, Pedagogy & Education 26 (4): 457–470. https://doi.org/10.1080/1475939X.2017.1334227.
  • Thompson, I. 2012. “Planes of Communicative Activity in Collaborative Writing.” Changing English 19 (2): 209–220. https://doi.org/10.1080/1358684X.2012.680766.
  • Toffler, A. 1971. Future Shock. London: Bantam.
  • Tomlinson, C., and M. Imbeau. 2010. Leading and Managing a Differentiated Classroom. Virginia, USA: ASCD.
  • Vygotsky, L. 1962. Thought and Language. Cambridge, MA: MIT Press.
  • Vygotsky, L. S. 1978. Mind in Society. Harvard University Press.
  • Waits, B., and F. Demana. 1998. The Role of Graphing Calculators in Mathematics Reform. Columbus: Ohio State University.
  • Wells, S. 2017. “What is English now? The construction of subject English in contemporary textbooks for Australian secondary schools.” Master’s Thesis, Edith Cowan University. https://ro.ecu.edu.au/theses/1963.
  • Wells-Edwards, B. 2022. “What’s in a Voice? The Legal Implications of Voice Cloning.” Arizona Law Review 64 (4): 1213–1239. https://heinonline.org/HOL/P?h=hein.journals/arz64&i=1265.
  • Wiltse, L., I. Johnston, and K. Yang. 2014. “Pushing Comfort Zones: Promoting Social Justice Through the Teaching of Aboriginal Canadian Literature.” Changing English 21 (3): 264–277. https://doi.org/10.1080/1358684X.2014.929287.
  • Yu, H. 2023. “Reflection on Whether ChatGPT Should be Banned by Academia from the Perspective of Education and Teaching.” Frontiers in Psychology 14 (June): 1181712. https://doi.org/10.3389/fpsyg.2023.1181712.