Research Article

Mark scheme design for school- and college-based assessment in VTQs

Pages 454-474 | Received 02 Jul 2019, Accepted 17 May 2020, Published online: 20 Jun 2020

