Detecting contract cheating: examining the role of assessment type

Pages 263-278 | Received 04 Jun 2019, Accepted 14 Dec 2019, Published online: 16 Feb 2020
ABSTRACT

This article contributes to an emerging body of research on the role of assessment design in the prevention and detection of contract cheating. Drawing on the largest contract cheating dataset gathered to date (see cheatingandassessment.edu.au), this article examines the types of assignments and exams in which students self-reported having engaged in some form of third-party cheating, and compares this with the types of assignments and exams in which staff self-reported detection of third-party cheating. The article outlines three key findings. Firstly, students most commonly reported cheating in the context of exams (particularly multiple-choice exams), yet staff reported the detection of cheating in exams relatively rarely. Secondly, students reported cheating in traditional written assignments, such as reports and essays, at slightly lower rates than in exams; however, staff detection rates for these assignments were far higher than for exams. Thus, third-party cheating was reported by students as occurring most commonly in exams, yet it was detected most commonly by staff in assignments. Thirdly, staff detection rates relative to student cheating rates were typically highest for text-rich assessments, regardless of whether they were invigilated (e.g., essay under exam conditions) or not (e.g., essay). These findings challenge simplistic advice in the literature and public debate that universities should move away from text-based assignments and towards invigilated exams as a means to prevent contract cheating. While text-rich forms of assessment are not immune to contract cheating, exams are not inherently secure. While staff appear to be practised at detecting cheating in the context of student writing, greater awareness is needed to improve the detection of cheating in exams.

Acknowledgements

This project was funded by the Australian Government Department of Education and Training, Grant SP16-5383. We would also like to acknowledge and thank our project team members – Cath Ellis, Phil Newton, Karen van Haeringen, Pearl Rozenberg and Sonia Saddiqui – for their assistance with the development of the survey instrument used in this research.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1 The survey instruments are available here: www.cheatingandassessment.edu.au/surveys/. See Bretag et al. (Citation2018) for a full description of the survey development and piloting process.

2 The list of assessment and exam types was derived from a desktop survey of available Australian university task type lists. A long list was compiled, and then similar types collapsed to form a list of 15 assignment type options and 6 exam type options.

3 Data from ‘other’ and ‘I don’t remember’ together comprised no more than 20% of responses on any item. Due to the range of possible interpretations for ‘other’, these responses are not reported in this article.

4 Although the data presented here are not continuous, they have been presented as lines in order to clearly represent staff detection and to maintain consistency between the graphs in the article.

Additional information

Funding

This project was funded by the Australian Government Department of Education and Training, Office for Learning and Teaching, Grant SP16-5383.
This article is part of the following collections:
Higher Education Research & Development Best Article Award

