Why are some GCSE examination questions harder to mark accurately than others? Using Kelly's Repertory Grid technique to identify relevant question features

Pages 335-377 | Published online: 22 Jul 2009
Abstract

It has long been established that marking accuracy in public examinations varies considerably among subjects and markers. This is unsurprising, given the diverse cognitive strategies that the marking process can entail, but what makes some questions harder to mark accurately than others? Are there distinct but subtle features of questions and their mark schemes that can affect accuracy? Such features could potentially contribute to a broad rationale for designating questions to markers according to personal expertise. The aim of this study was to identify question features that can distinguish those questions that are marked highly accurately from those that are marked less accurately.

The study comprised an exploration of maths and physics questions from past GCSE examinations, which were marked in an experimental setting by groups of markers and yielded differing marking accuracies. The questions also varied in their difficulty for GCSE candidates, and in the cognitive strategies needed to mark them.

Kelly's Repertory Grid technique and semi‐structured interview schedules were used in meetings with highly experienced principal examiners, who had led the experimental marking of the questions. The data generated comprised ratings for each question on a number of question features (constructs). The ratings were analysed together with the marking accuracy data, enabling an investigation of possible relationships between each question feature and (i) marking accuracy, (ii) question difficulty for the candidate, and (iii) apparent cognitive marking strategy usage.

For both subjects, marking accuracy was found to be related to various subject‐specific question features, some of which were also related to question difficulty (for the candidate) and/or apparent marking strategy complexity. For both maths and physics, several other subject‐specific question features were found to be unrelated to accuracy. Overall, the findings have potential implications for the management of markers and for question design.

Acknowledgements

This research is based on examinations administered by Oxford, Cambridge and RSA Examinations (OCR) and was funded by Cambridge Assessment.

Notes

1. Matching is used when the candidate's response is a visually recognisable item or pattern (e.g. a letter, word or number). The marker looks at a pre‐determined location and compares the response with the correct answer, which is either held in working memory or recollected using the mark scheme.

Scanning occurs when the marker scans the whole answer space to identify whether a particular detail is present or absent in the candidate's response. The scanned‐for detail may be simple (e.g. a single number), entailing only pattern recognition, or complex (e.g. a phrase), entailing semantic processing.

Evaluating requires careful semantic processing: the marker considers the candidate's response for structure, clarity, factual accuracy, logic, or other characteristics specified in the mark scheme.

Scrutinising follows on from, or is used in combination with, other strategies. When a response is incorrect or unexpected, the marker attempts to identify where the problem lies and whether the response is a valid alternative to what is in the mark scheme. The overarching aim is to reconstruct the candidate's line of reasoning.

No response arises when nothing has been written in the answer space; the marker simply checks the space to confirm this.

2. Part of this research was presented by Alison Wood in two QCA seminars, in March 2006 and May 2006.

