
Online proctored exams: rhetoric vs reality

Pages 392-405 | Received 24 Nov 2022, Accepted 16 Jun 2023, Published online: 24 Jul 2023

ABSTRACT

Remote proctoring of exams is one of the most divisive issues in higher education. Critiques of remote proctoring abound, and there are a variety of perspectives, particularly in relation to the advantages and disadvantages of this type of assessment and the opportunities it presents for cheating. However, these perspectives are largely based on rhetoric, with limited empirical data to support or refute the value of remote proctoring. This study used mixed methods to investigate experiences of open-book online proctored exams and perceptions of cheating within them. An online questionnaire and interviews were conducted with students and academics. Data analysis revealed that the experience of online proctored exams was generally positive, although preferences for online versus on-campus exams were mixed. A variety of advantages and disadvantages of online proctored exams were also identified. Whilst both students and academics reported that they believed students would cheat, actual instances of cheating (as reported by students or academics) were minimal. This may have been because of the use of open-book exams. Based on these findings, we comment on the reality versus the rhetoric relating to online proctored exams and suggest a range of ways forward for universities.

Introduction

Remote proctoring of exams is one of the most divisive issues in higher education. In this type of assessment, students undertake an online examination on a computer of their choosing, at a location of their choosing, while they are monitored by a third party (Raman et al., 2021). Critiques of remote proctoring abound, from a variety of perspectives. Remote proctoring is said to be an invasion of student privacy (Langenfeld, 2020); racist or ableist (Logan, 2020); a ‘cop shit’ approach to assessment (Moro, 2020); or rooted in a culture of distrust (Logan, 2020). The problem is that some critiques are posed by researchers or commentators who have little first-hand experience of remote proctoring, and little-to-no empirical data in support of their arguments. The voices of the students and educators who are experiencing remote proctoring are somewhat absent from the literature. This paper seeks to understand experiences of remote proctoring for educators and students, in order to help disentangle the rhetoric from what actually happens in practice. In doing so, we hope to provide a more nuanced and grounded representation of some of the pros and cons of remote proctored exams.

Whilst universities may have utilised proctored online exams prior to COVID-19, use of this method of assessment became much more common during the pandemic, particularly in contexts where there were widespread, extended lockdowns. Post-COVID-19, it appears that many universities have made the decision to retain remote proctoring (Jha, 2022), although debate and discussion regarding the efficacy and ethics of this approach abound. Within the research literature, online proctored exams are a relatively new phenomenon, emerging in the early 2000s (Kitahara & Westfall, 2007); since then, the technology has expanded rapidly (Flaherty, 2020). Undertaking exams in an online environment is a very different experience for students when compared with sitting in-person, paper-based exams. Whilst online proctored exams might be considered advantageous by some students, others may experience difficulty finding a safe, quiet space to undertake an exam, or may not have the equipment or internet access that makes sitting the exam easy (Coghlan et al., 2021). There may also be concerns regarding privacy (Selwyn et al., 2021) and academic integrity (Dawson, 2020) in these environments, given that supervision is typically undertaken via webcam using either live proctoring or a video recording of the student (Coghlan et al., 2021).

Kharbat and Daabes (2021) published one of the few post-COVID studies exploring students' attitudes towards, and concerns about, proctoring. Using surveys (n = 106), exam results (n = 106) and focus groups, they found that students were typically not in favour of proctoring and had concerns about privacy. Lilley et al. (2016) conducted a pilot study with 17 students enrolled in online courses. They found that participants had concerns about data security, privacy and the intrusiveness of live invigilation prior to undertaking the assessment, but that these concerns eased once the assessment had been completed. James (2016) gave students in a first-year psychology unit the choice of sitting an online invigilated exam, with 6.3% (n = 29) of the cohort accepting the opportunity. Participants were challenged by the idea of sitting an online exam, with technical difficulties and timely support identified as two of the key challenges. Karim et al. (2014) randomly assigned 295 participants to either proctored or unproctored online tests and found that proctoring was associated with increased negative reactions. Participants reported concerns about invasion of privacy and about feeling self-conscious about the information collected via proctoring. Bedford et al. (2011) surveyed 31 students and 20 academics and found that, overall, participants held positive perceptions of online proctoring and that assessment results were not negatively impacted by proctoring. Selwyn et al. (2021) undertook interviews with students who had sat online proctored exams, activists who led campaigns opposing online proctored exams, and academic and technical staff. Whilst online proctoring was seen to address a number of risks for universities, concern was raised regarding the surrender of control to commercial proctoring companies, hidden workload for academics and the degree to which students were able to exercise informed consent to online proctoring.

In an ethical analysis of online proctoring, Coghlan et al. (2021) proposed that whilst online proctoring is not a ‘completely evil technology’ (p. 1583), its use should be thoughtfully considered by universities. Their philosophical analysis suggests a number of questions that universities should consider prior to using online proctoring, such as whether there are acceptable alternative forms of assessment and whether academic integrity would be compromised to an unacceptable level if proctoring were not in place.

In regard to student performance in online proctored exams, Alessio et al. (2017) compared the academic performance of 147 students enrolled in a medical terminology course, comparing results for proctored and unproctored online assessments. They found that students sitting proctored assessments scored an average of 17 points lower and took significantly less time to complete the assessments. However, Hylton et al. (2016) found no statistically significant difference in scores when comparing the performance of 186 students sitting the same exam who were either proctored or unproctored.

The primary claimed affordance of remote proctoring, above those provided by online exams in general, is that it can detect or deter cheating (D’Souza & Siegfeldt, 2017). However, literature relating to cheating in online proctored exams is sparse. Karim et al. (2014) indirectly determined levels of cheating by comparing student results for online proctored and unproctored tests. They suggested that proctoring did decrease cheating, although the effect size was small. Participants undertaking examinations in Bedford et al.'s (2011) study perceived that students would be less likely to cheat when online proctoring was in use. Harmon and Lambrinos (2008) compared proctored and unproctored exams in two economics courses and reported that cheating was taking place in unproctored exams. Fask et al. (2014) similarly compared student performance in traditional proctored versus unproctored online exams for 44 students studying statistics. They found that the disadvantages of sitting online exams somewhat offset the opportunities to cheat afforded by an unproctored environment. Janke et al. (2021) investigated cheating in a cohort of more than 1600 German students who sat closed-book online proctored exams. The students reported that they cheated more in these exams than in on-campus exams, suggesting that academic integrity was reduced. Whilst these studies provide some empirical data, there is a need for further research exploring the rhetoric versus the reality of online proctored exams.

This paper presents findings from one Australian university’s experience in using online open-book proctored exams. More specifically, the paper explores (i) student and academic preferences for online or located exams; (ii) the perceived advantages and disadvantages and the overall experience of online proctored exams for students and academics; and (iii) student and academic perceptions of cheating and academic integrity in online proctored exams. The paper concludes by discussing the key implications of the study and identifying areas for further consideration. These evidence-based insights may help to unravel the reality versus rhetoric in this space and will have implications for other Australian higher education institutions considering the use of online proctored exams as part of assessment offerings.

Materials and methods

Research approach

This research project adopted a mixed-methods approach using online questionnaires and in-depth interviews. Mixed-methods research involves the collection of both qualitative and quantitative data within a single study. In combining qualitative and quantitative research approaches, a mixed-methods study provides strengths that offset the limitations of each approach (Johnson & Onwuegbuzie, 2004). In this study, quantitative data were collected first, via questionnaire. These data reflected the most common responses and experiences of the larger group of participants. Qualitative data were then collected via interviews, allowing for deeper exploration of the perceptions and beliefs of academics and students in relation to experiences of, and academic integrity in, online proctored exams (Guest et al., 2012). Ethical approval for the study was granted by the Deakin University Human Ethics committee (HAE-20-143).

Study context

In the second half of 2020, students at an Australian university who were taking part in online open-book proctored exams as part of the response to COVID-19 lockdowns were invited to participate in the study. The online proctored exams involved students undertaking the examination on a computer of their choosing, at a location of their choosing, while they were monitored/proctored by a third party. The remote proctoring was provided by an external proctoring company and used recorded proctoring via the student's webcam; the company used artificial intelligence to detect potential instances of cheating. At this time, the use of online proctored exams was relatively new to the university, with only small-scale pilots previously undertaken. Students were undertaking qualifications in the disciplines of business, law, science, engineering, built environment and health, and exams covered various topics relating to their disciplines. To assist with anonymity, in the hope of eliciting honest answers relating to cheating, students were not asked within the survey or interview to identify their discipline. This limits the conclusions relating to disciplinary differences that can be drawn from this study. Academics who had management and teaching responsibility in units with an online proctored exam in Trimester 2, 2020 were also recruited.

Participants and recruitment

Five hundred and eighty-six students were invited to participate in the study. Four hundred and eighty-one students in this group consented to their answers to the questionnaire being used. Twenty-three students consented to participate in interviews. Twenty-three academics were also invited to participate. Thirteen of these staff consented to their answers to the questionnaire being used in this study; five academics participated in interviews. Demographic data were not collected from participants, to encourage participation given that the questionnaire and interviews included questions about cheating in exams.

Following the delivery of online proctored exams, both participant groups were emailed by an administrator who was independent of the research project to request participation in the study. A plain language statement and a link to the questionnaire were included. Consent was indicated by ticking a box within the questionnaire, agreeing that responses could be used for the purposes of research. At the end of the questionnaire, participants were asked to indicate whether they were willing to participate in an in-depth interview. Academics managing the units were also asked to post an invitation to participate in the study on their online unit sites.

Data collection

Data were collected using online questionnaires and in-depth interviews; students and academics were invited to participate in both data collections. The online questionnaire was developed by a team of academic and professional staff who were trialling the exams and had been modified following use in two previous years. The questionnaire was administered via Qualtrics. For students, the questionnaire covered various aspects of taking an online proctored exam, such as setting up and using the online exam technology, internet connectivity, the online proctoring experience, security and cheating, and preferences for taking exams in the future. For academics, the questionnaire covered perceptions of the appropriateness of online exams as an assessment approach, the student experience, and preferences for online or on-campus exams in the future. Both questionnaires included Likert scales, percentage scales, yes/no and open response questions. Interviews were conducted by two of the researchers via Zoom and were recorded and transcribed verbatim. The interview protocol was designed to elicit participants’ experiences and perceptions of online proctored exams, with particular attention to security and cheating (see Supplementary Information). Students who participated were provided with an incentive in the form of a gift voucher.

Data analysis

To enhance rigour and trustworthiness, multiple data sources (questionnaires and interviews) were used, both students and academics participated, and all three researchers individually reviewed the data. The quantitative findings and the qualitative data were considered together when establishing study findings (Onwuegbuzie & Leech, 2006). The qualitative data were considered the primary data set; the quantitative data were used to provide context and, where appropriate, refinement to the story told by the qualitative data. Quantitative data were taken to be ordinal and were analysed using descriptive statistics in Qualtrics, calculating percentages, means and standard deviations for responses to questionnaire questions. Qualitative data were analysed using inductive thematic analysis. Each researcher initially familiarised themselves with the data, generating initial codes and establishing themes (Braun & Clarke, 2006). The researchers then came together on three occasions to compare their findings and to name and define themes; differences in opinion were discussed until agreement was reached (Braun & Clarke, 2006). When conducting thematic analysis, we sought to establish semantic rather than latent themes, as we were more interested in what our participants explicitly said and wrote than in identifying unstated or underlying meanings (Braun & Clarke, 2006).

Results

The analysis of the combined qualitative and quantitative data (see Table 1 for a summary of the quantitative data) led to the articulation of two core, inter-connected themes and a range of sub-themes. Theme one was the online exam experience, with sub-themes of: preference for type of exam, benefits and challenges of online exams, and change in perception. Theme two was cheating, with sub-themes of: predictions of cheating, impact of peers cheating, accidental or deliberate cheating, and open-book exams and cheating. Together the themes and sub-themes reveal the tensions between the reality and the rhetoric of online proctored exams as experienced by staff and students at one Australian university. A description of each theme and sub-theme follows, together with illustrative quotations derived from the research interviews. Where quotations are shown, the corresponding participant number is provided at the end of each statement.

Table 1. Quantitative results for relevant questions – students and academics.

Online exam experience

Preference for type of exam

While the academics who participated in the online questionnaire had a strong preference for online proctored exams over paper-based exams (69%), students were more divided, with 37% preferring a paper exam, 39% preferring an online exam and 24% neutral on exam format. The following student quotation from the in-depth interviews illustrates why some students preferred the online format: ‘With everything else I have to fit in, I'd prefer to just be home, lock myself in, do the exams and then go right back into it [everyday life]. Logistically it's much, much easier to do’ (Student 16).

Reasons for preferring the paper-based or exam-hall approach included taking the exam more seriously and confidence that any cheating issues would be dealt with on the day, as revealed in the following student quotation:

I would say I'm old-school so I would prefer a physical exam. Because it's like written down on a paper so you know what you have written, instead of like digitised you're not sure whether you handed it in or like the info that you copied was all that you wanted. (Student 20)

Benefits and challenges of online exams

Regardless of their preference for online or paper-based exams, most students and academics interviewed spoke of the benefits of online proctored exams, including decreased stress and distraction, not having to travel, being able to use their own computer, having their own choice of location and being able to fit exams around other commitments. The following quotation supports this point: ‘You didn't have the stress of, “am I going to be late, am I going to get there on time, am I going to find a car park”, kind of thing. So that was convenient’ (Student 11).

Students also spoke of the challenges faced with online exams, including work and childcare being harder to navigate: ‘Because I could take the supervised exam at my home location, it became a little bit more challenging to try and navigate with things like work or childcare’ (Student 19). Several students also reported challenges in typing rather than handwriting, as illustrated by the following quotation:

Because my assessment involved maths and formulas and so forth which usually you would just sort of write out on paper and then calculate in your own way, but then now you had to find a way to translate that. (Student 17)

Students and academics also spoke of challenges with communication for online exams, reporting that they were unclear on what was and was not allowed during the online proctored exam and requesting improvements in communication: ‘I think it was just there was a lot of miscommunications at the beginning – like, prior to the actual exam starting’ (Student 11). For some students, this lack of clear communication led to pre-exam stress and anxiety caused by being unsure of the rules and the potential for technology to fail, as illustrated by the following quotation: ‘Once we got given the rules prior to it, a lot of people were nervous … if it's an assignment, you've got all the time in the world to mess around with things and figure them out’ (Student 16).

Change in perception

Students (55%) and academics (69%) reported that they had a positive experience of the online proctored exam overall, with some students noting their experience had been better than they thought it would be and that they would feel more comfortable doing an online exam in the future:

Yes [my perceptions have changed]. Now I know what to expect or sort of have an idea as to what I need to be doing. How organised I should be while attempting an exam … .I don't mind having another online supervised exam. (Student 14)

A small number of students did not have a positive proctoring experience, noting that their perception had not shifted because of their ethical stance, that it was a more invasive experience than expected, or that the exam platform was difficult to use. The following student quotations are evidence of this: ‘I still have this ethical stance [against online proctored exams]’ (Student 13). ‘I just found it a bit invasive. Like with them watching over you in your own home’ (Student 5).

Cheating

Predictions of cheating

The quantitative data gathered suggest that students were more suspicious of, and concerned by, cheating in online proctored exams than academics were. For example, the students who participated in the questionnaire predicted that 24% of students sitting an online proctored exam would attempt to cheat; in contrast, the academics predicted that only 12% of students would attempt to cheat. This difference in perception was also evident in the proportions inclined to view cheating as easier in online proctored exams: 46% of students compared with 31% of academics. Concerns about others cheating were also revealed in the in-depth interviews, as illustrated by the following student quotation: ‘I think people would cheat or try to cheat more’ (Student 3). This sentiment was also echoed by some academics: ‘The temptation is there because there's not many eyes in the room and the one webcam that is there, there's blind spots to that. I'm sure if someone's determined enough, they can cheat’ (Academic 4).

Impact of peers cheating

Although the quantitative data showed that students were more concerned about security and cheating (30%) than academics (8%), most students who participated in an in-depth interview were not concerned about their peers cheating, noting it was each student’s ‘own choice’ (Student 21). They were primarily worried about how they performed themselves and expressed the view that they were ‘not competing’ (Student 10) with their peers. Some students did, however, express concern about their peers cheating if it might impact their own marks or affect the standing of the degree, as illustrated by the following quotation: ‘If it becomes known that 90 percent of students at [University Name] have cheated, then that certificate becomes less worthy, and everybody will think that I cheated as well, and so in that sense it affects me’ (Student 3).

Students identified the limitations of cheating, observing that the real test was in the outside world, where it would soon become apparent who had cheated: ‘I feel like if you cheat your way through a degree, you get found out on the other side eventually’ (Student 2). Many students believed that recorded proctoring in online exams would deter cheating or make it harder to cheat. This view was echoed by academics, with both groups expressing that a focus on preventing cheating was important, as illustrated by the following quotations:

I think having your screen recorded as well as you as an individual being recorded, definitely makes it a lot harder. That temptation is taken away, because if you do go onto Google … , it is recorded so they've got evidence pretty much to say, “Yeah, you did cheat”. (Student 11)

‘Without a doubt [recording will reduce cheating]. Full stop, exclamation mark’ (Academic 1).

Most students believed that there would always be students who try to cheat, no matter the type of examination, online or on campus, proctored or unproctored: ‘If the student wants to cheat, there are multiple ways they can, with or without supervision [proctoring]. Even if they are sitting in the classroom, they have their own ways to cheat’ (Student 4).

Accidental or deliberate cheating

The chance that students might misunderstand the guidelines or forget the rules and accidentally cheat was noted by several students, as evidenced by the following quotation: ‘I think that there also could be a potential for students to misunderstand the guidelines and then cheat inadvertently’ (Student 13). This caused anxiety for students.

In interviews, two students reported breaking the rules, while 0.42% of the students who participated in the questionnaire were willing to admit attempting to cheat or break exam conditions during the exam. Of the two students who reported breaking the rules in interviews, one used headphones to deal with background noise (whilst not cheating, this was breaking the rules), and another reported cheating by passing notes to a peer sitting the same exam in the same room.

Well, I won't lie to you but we were two people sitting in the room and no matter how much we tried, we knew that it was supervised so I couldn’t afford to cheat. Yeah, I did write down a few answers for the other person because she was faltering … and she was confused as to what exactly the answer is, so probably I might have wrote down the answers and after that she felt comfortable. (Student 23)

A third student was ready to cheat if needed, with their iPad positioned close by and a detailed plan in place.

Open-book exams and cheating

Many students felt that open-book exams made it impossible to cheat or removed the need to cheat.

So, my two exams that I sat were both open-book exams. So, I think it alleviated that pressure if you potentially wanted to cheat. I don't know, if it was say a closed book exam, whether or not someone might try and put a little sticky note somewhere on their computer screen. (Student 11)

This view was echoed by academics, with one noting that open-book exams ‘level the playing field’ (Academic 2) between students who would otherwise have obeyed a closed-book requirement and students who would have cheated. Some students also reported liking the open-book approach because of its focus on application over memorisation and the challenge it posed, as illustrated by the following student quotation: ‘It [open book] meant that I got to prepare a lot of my digital notes on OneNote and that was good practice because I'm kind of new to this whole platform thing’ (Student 2). Others reported finding open-book exams harder, describing them as ‘nerve wracking’ because the exam used ‘more obscure questions, not the bread and butter ones’ (Student 21).

Students noted that open-book exams had both benefits and drawbacks: they make it harder to determine what to have at hand, and using notes and resources during an exam takes time. ‘But then also that impedes you because you take longer to answer questions because you're busy looking for the right answers through all the materials. So yeah, I think it's a bit of a double-edged sword’ (Student 9). A small number of students noted that they didn’t feel prepared for the open-book format, or that their preparation was impacted because they didn’t understand what open-book exams were or how best to prepare:

This exam wasn't supposed to be open-book like normally, so to have it open-book I actually failed it the first-time round because I just didn't prepare for it at all. Because I was like, it's open book, it's all good, it's going to be fine. (Student 1)

Academics also reported that the move to open-book exams opened a debate within the academic team on the most effective ways to assess students.

Discussion

This study sought to unravel the reality versus the rhetoric relating to online proctored exams by exploring student and academic perceptions of proctored exams and cheating. The findings of the study have implications for higher education institutions considering the use of online proctored exams as a long-term option and for those involved in the online proctored exam debate. Overall, the study findings demonstrate that our participants held much more moderate views of online proctored exams than the more extreme perspectives in the literature, especially after they had experienced one of these exams.

Objective one of this study was to explore student and academic preferences for online or located exams. Students expressed concerns about proctoring prior to exams but were more positive afterwards. This finding aligns with previous research by Lilley et al. (2016), where participant concerns eased having completed the assessment. Within the data, it appeared that students held a range of preconceived assumptions prior to sitting online proctored exams but found the reality less concerning; for many students, these concerns were less important after taking a proctored exam. This negative preconception may be attributable to the aforementioned rhetoric and the contagion of anti-proctoring movements, which were prevalent at the time of the study (Selwyn et al., 2021). In this study, some students reported that they forgot about the proctoring once the exam started, and after the exam many identified advantages of sitting an exam at home. The roughly equal preference for online and on-campus exams demonstrated in the quantitative data suggests that these advantages may make online proctored exams more appealing for many students. Online proctored exams offer a range of advantages for students (less travel, use of own device, greater flexibility) and universities (decreased cost, ability to enrol students from a greater range of places), as reflected in both this study and a recent systematic review (Butler-Henderson & Crawford, 2020). Student resistance, and the need for persuasion, is part of many assessment changes (Bearman et al., 2017); the shift to a more positive perception after sitting an online proctored exam suggests the need to encourage students to try these exams before forming an opinion of them.

The second objective of this study was to explore the perceived advantages and disadvantages, and the overall experience, of online proctored exams for students and academics. Both groups reported that stress and anxiety were prevalent in online proctored exams, which concurs with previous research on test anxiety and remote proctoring (Kharbat & Daabes, 2021; Woldeab & Brothen, 2019). However, most reported greater confidence following the exam. Some participants reported that sitting an exam in their own space was not possible. This may have been due to not having a private space, poor internet access, poor equipment, or reasons that were not disclosed. Whilst online proctored exams may be considered to offer greater inclusivity for students, considering all student groups and offering alternative options for sitting exams (e.g., on-campus) are key to reassuring and supporting students, so that all are proctored but individual circumstances do not result in disadvantage. As in assessment in general, inclusion should be built into the design process and not deployed in a reactive way (Tai et al., 2022).

The final aim of the study was to investigate student and academic perceptions of cheating and academic integrity in online proctored exams. Both students and academics appeared to over-estimate the degree to which cheating would occur compared with the degree of cheating observed. Whilst the majority of students thought that it would be easier to cheat in an online proctored exam than in a campus-based exam, the data suggest that few students attempted to cheat. However, this finding must be viewed with caution given that participants may have chosen not to admit to cheating when responding to questionnaire or interview questions. It is also likely that the open-book nature of these exams reduced the perceived value of cheating for students, as they had access to resources during the exams. This finding differs from that of Janke and colleagues, who found that approximately 48% of students accessed resources they were not supposed to, or communicated with people they were not allowed to, during online proctored exams (Janke et al., 2021). In Janke et al.'s study, exams were closed book, which may explain the difference in findings. It is also possible that rules relating to reporting academic integrity breaches were relaxed during the early stages of COVID-19 (when the study reported here took place), whilst both students and academics came to terms with the nature of the pandemic and this new world of online assessment.

Students expressed a lack of concern about their peers cheating. They believed that whilst there would always be a proportion of students who cheat, ultimately those students were only cheating themselves. This finding mirrors that of Arnold (2016), who explored cheating in online formative tests taken by economics students and found that whilst cheating may happen, it does not tend to pay off for students.

However, students’ fear of accidentally cheating was anxiety provoking and suggests that communication prior to exams about how cheating is assessed is imperative. Many students in this study appeared not to understand how the proctoring was being administered (by a live person or via recording) or what was and wasn’t allowed during the assessment, which contributed to their anxiety that they might inadvertently cheat. Improved communication that helps students understand what constitutes cheating, together with repeated experiences of online proctored exams, is likely to alleviate some of this concern.

All of the participants in this study were involved in exams that were open book, allowing students to access resources that would not typically be available in on-campus invigilated exams. Whilst the literature in the field is currently limited, given the low rate of reported cheating in our study and the current limitations of proctoring in detecting cheating (it provides only a limited view of the student’s workspace), it is suggested that open-book exams are a better fit for online proctoring. Administering closed-book exams effectively online may require greater use of live supervision (potentially increasing privacy concerns for students), room scans and multiple cameras. Open-book exams are also considered more authentic by many (Teodorczuk et al., 2018), given that they more accurately mimic the workplace in many disciplines. However, it is important to consider that open-book exams may require different, more application-based questions, which take more time and skill to write.

Our experience has also provided a range of lessons that may be of value to others who want to create a positive student experience whilst using proctored online exams. Based on our data, we encourage the exploration of ways for students to share their lived experiences of undertaking online proctored exams, to help with myth busting and to develop trust and reassurance in the broader student population. Helping students to develop an understanding of the differences between open- and closed-book exams, and how to prepare for each, also appears important. Mandating the use of practice exams may assist students to experience the mechanics of the exam prior to sitting the real exam. This, used alongside clear and repeated instruction, is likely to improve the student experience.

Based on our experiences, we encourage educators to embrace open-book exams as the preferred assessment design for online proctored exams wherever disciplinary requirements allow. Whilst this may require the development of academic awareness and capacity in writing suitable open-book exam questions, we believe that it likely reduces the incentive for students to cheat and may result in questions more applicable to the real world of work, where exam-like conditions are rare. Our data also suggest that if closed-book proctored online exams are used and more stringent monitoring (for example, live supervision, room scans or multiple cameras) is adopted, careful consideration of the potential invasion of student privacy is important. It may be that scheduling closed-book exams in on-campus settings is a more viable alternative post COVID-19. This work also leads us to advocate for a continued focus on academic integrity in the context of online proctored exams, including helping students to understand what constitutes cheating so that they are clear about what is and isn’t allowed.

Conclusion

The efficacy and ethics of online proctored exams are currently hotly debated in higher education. However, perspectives are often based on rhetoric rather than empirical data. Based on the findings of this project, we seek to question some of the negative rhetoric relating to online proctored exams. This study used interviews and questionnaires to investigate experiences of online open-book proctored exams (including preferences, advantages and disadvantages) and perceptions of cheating. Data analysis revealed that the experience of open-book online proctored exams was generally positive, although preferences for online versus on-campus exams were mixed. Whilst both students and academics reported that they believed students would cheat, there were few instances of cheating reported by either group. Based on these findings, we believe that online open-book proctored exams offer a viable form of ongoing assessment, and we suggest a range of ways forward for universities.

Supplemental material


Disclosure statement

No potential conflict of interest was reported by the author(s).

References

  • Alessio, H. M., Malay, N., Maurer, K., Bailer, J., & Rubin, B. (2017). Examining the effect of proctoring on online test scores. Online Learning, 21(1), 146–161. https://doi.org/10.24059/olj.v21i1.885
  • Arnold, I. J. M. (2016). Cheating at online formative tests: Does it pay off? The Internet and Higher Education, 29, 98–106. https://doi.org/10.1016/j.iheduc.2016.02.001
  • Bearman, M., Dawson, P., Bennett, S., Hall, M., Molloy, E., Boud, D., & Joughin, G. (2017). How university teachers design assessments: A cross-disciplinary study. Higher Education, 74(1), 49–64. https://doi.org/10.1007/s10734-016-0027-7
  • Bedford, D. W., Gregg, J. R., & Clinton, M. S. (2011). Preventing online cheating with technology: A pilot study of remote proctor and an update of its use. Journal of Higher Education Theory and Practice, 11, 41–58. https://search.ebscohost.com/login.aspx?direct=true&db=eue&AN=69631386&authtype=sso&custid=deakin&site=eds-live&scope=site
  • Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa
  • Butler-Henderson, K., & Crawford, J. (2020). A systematic review of online examinations: A pedagogical innovation for scalable authentication and integrity. Computers and Education, 159, 104024. https://doi.org/10.1016/j.compedu.2020.104024
  • Coghlan, S., Miller, T., & Paterson, J. (2021). Good proctor or “Big Brother”? Ethics and online exam supervision technologies. Philosophy and Technology, 34(4), 1581–1606. https://doi.org/10.1007/s13347-021-00476-1
  • Dawson, P. (2020). Defending assessment security in a digital world. Preventing e-cheating and supporting academic integrity in higher education. Routledge.
  • D’Souza, K. A., & Siegfeldt, D. V. (2017). A conceptual framework for detecting cheating in online and take-home exams. Decision Sciences Journal of Innovative Education, 15(4), 370–391. https://doi.org/10.1111/dsji.12140
  • Fask, A., Englander, F., & Wang, Z. (2014). Do online exams facilitate cheating? An experiment designed to separate possible cheating from the effect of the online test taking environment. Journal of Academic Ethics, 12(2), 101–112. https://doi.org/10.1007/s10805-014-9207-1
  • Flaherty, C. (2020, May 11). Online proctoring is surging during COVID-19. Inside Higher Ed. Retrieved August 31, 2021, from https://www.insidehighered.com/news/2020/05/11/online-proctoring-surging-during-covid-19
  • Guest, G., MacQueen, K. M., & Namey, E. E. (2012). Integrating qualitative and quantitative data. In G. Guest (Ed.), Applied thematic analysis (pp. 187–216). SAGE Publications, Inc.
  • Harmon, O., & Lambrinos, J. (2008). Is the cheating risk always higher in online instruction compared to face-to-face instruction? ResearchGate. Retrieved August 31, 2021, from https://www.researchgate.net/publication/24141676_Is_the_Cheating_Risk_Always_Higher_in_Online_Instruction_Compared_to_Face-to-Face_Instruction
  • Hylton, K., Levy, Y., & Dringus, L. P. (2016). Utilizing webcam-based proctoring to deter misconduct in online exams. Computers and Education, 92-93, 53–63. https://doi.org/10.1016/j.compedu.2015.10.002
  • James, R. (2016). Tertiary student attitudes to invigilated, online summative examinations. International Journal of Educational Technology in Higher Education, 13(19), 1–13. https://doi.org/10.1186/s41239-016-0015-0
  • Janke, S., Rudert, S. C., Petersen, Ä, Fritz, T. M., & Daumiller, M. (2021). Cheating in the wake of COVID-19: How dangerous is ad-hoc online testing for academic integrity? Computers & Education Open, 2, 1–9. https://doi.org/10.1016/j.caeo.2021.100055
  • Jha, M. (2022, July 4). If Unis stick with online assessment after COVID, they’ll have do more to stop cheating. The Conversation. Retrieved September 6, 2022, from https://theconversation.com/if-unis-stick-with-online-assessment-after-covid-theyll-have-do-more-to-stop-cheating-185762
  • Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14–26. http://www.jstor.org/stable/3700093.
  • Karim, M., Kaminsky, S., & Behrend, T. (2014). Cheating, reactions, and performance in remotely proctored testing: An exploratory experimental study. Journal of Business and Psychology, 29(4), 555–572. https://doi.org/10.1007/s10869-014-9343-z
  • Kharbat, F. F., & Daabes, A. S. A. (2021). E-proctored exams during the COVID-19 pandemic: A close understanding. Education and Information Technologies, 26(6), 6589–6605. https://doi.org/10.1007/s10639-021-10458-7
  • Kitahara, R. T., & Westfall, F. (2007). Promoting academic integrity in online distance learning courses. MERLOT Journal of Online Learning and Teaching, 3(3), 265–276.
  • Langenfeld, T. (2020). Internet-based proctored assessment: Security and fairness issues. Educational Measurement: Issues and Practice, 39(3), 24–27. https://doi.org/10.1111/emip.12359
  • Lilley, M., Barker, T., & Meere, J. (2016). Remote live invigilation: A pilot study. Journal of Interactive Media in Education, 1, 1–5. https://doi.org/10.5334/jime.408
  • Logan, C. (2020). Refusal, partnership, and countering educational technology’s harms. Hybrid Pedagogy, https://hybridpedagogy.org/refusal-partnership-countering-harms/
  • Moro, J. (2020). Against cop shit. Retrieved February 4, 2022, from https://jeffreymoro.com/blog/2020-02-13-against-cop-shit/
  • Onwuegbuzie, A. J., & Leech, N. L. (2006). Linking research questions to mixed methods data analysis procedures. The Qualitative Report, 11, 474–498. http://www.nova.edu/ssss/QR/QR11-3/onwuegbuzie.pdf
  • Raman, R. B., Vachharajani, H., & Nedungadi, P. (2021). Adoption of online proctored examinations by university students during COVID-19: Innovation diffusion study. Education and Information Technologies, 26(6), 7339–7358. https://doi.org/10.1007/s10639-021-10581-5
  • Selwyn, N., O’Neill, C., Smith, G., Andrejevic, M., & Gu, X. (2021). A necessary evil? The rise of online exam proctoring in Australian universities. Media International Australia, 149–164. https://doi.org/10.1177/1329878X211005862
  • Tai, J., Ajjawi, R., Bearman, M., Boud, D., Dawson, P., & Jorre de St Jorre, T. (2022). Assessment for inclusion: Rethinking contemporary strategies in assessment design. Higher Education Research and Development, 42(2), 1–15. https://doi.org/10.1080/07294360.2022.2057451
  • Teodorczuk, A., Fraser, J., & Rogers, G. D. (2018). Open book exams: A potential solution to the “full curriculum”? Medical Teacher, 40(5), 529–530. https://doi.org/10.1080/0142159X.2017.1412412
  • Woldeab, D., & Brothen, T. (2019). 21st century assessment: Online proctoring, test anxiety, and student performance. International Journal of E-Learning and Distance Education, 34. Retrieved September 20, 2022, from https://files.eric.ed.gov/fulltext/EJ1227595.pdf