
Examiner perceptions of a portfolio assessment process

Margery H. Davis & Gominda G. Ponnamperuma
Pages e211-e215 | Received 23 Dec 2008, Accepted 09 Feb 2010, Published online: 27 Apr 2010

Abstract

Background: The portfolio assessment process is important for assessing learner achievement.

Aims: To study examiner perceptions of Dundee Medical School's portfolio assessment process, in years 4 and 5 of the 5-year curriculum, in relation to: outcomes as a framework for the portfolio assessment process; portfolio content; the portfolio assessment process itself; end points of the portfolio assessment process; appropriateness of the two-part final exam format; and examiner training.

Methods: A questionnaire containing statements and open questions was used to obtain examiner feedback. Responses to each statement were compared over 3 years: 1999, 2002 and 2003.

Results: Response rates were 100%, 88% and 61% in 1999, 2002 and 2003, respectively. Examiners were positive about the ability of institutionally set learning outcomes (the Dundee 12 exit learning outcomes) to provide a framework for the portfolio assessment process. They found difficulties, however, with the volume of portfolio content and the time allocated to assess it. Agreeing with their co-examiner on a grade for each learning outcome did not present difficulties. The comprehensive, holistic picture of the candidate provided by the portfolio assessment process was perceived to be one of its strengths. Examiners were supportive of the final examination format and were satisfied with their briefing about the process.

Conclusions: The 12 exit learning outcomes of the Dundee curriculum provide an appropriate framework for the portfolio assessment process, but the content of the portfolio requires fine-tuning, particularly with regard to quantity. The time allocated to examiners for the portfolio assessment process needs to be balanced against practicability. The holistic picture of the candidate provided by the process was one of its strengths.

Introduction

Portfolio assessment for medical students' final examination was introduced at Dundee Medical School in 1997 to tackle the issues of assessing student attitudes, professionalism and performance in workplace settings. This article describes examiner perceptions of the portfolio assessment process and will be of interest to health professions teachers involved in portfolio assessment and those considering the introduction of a portfolio assessment process. Student perceptions of the portfolio assessment process have been described elsewhere (Davis et al. 2009).

Background

The curriculum at Dundee Medical School is a 5-year, integrated, systems-based course (Harden et al. 1997). During this study (1999–2003), it had three phases: phase 1 dealing with normal structure, function and behaviour; phase 2 with abnormal structure, function and behaviour; and phase 3 with clinical learning and a task-based approach (Harden et al. 2000). Examiner perceptions of the phase 3 portfolio assessment process are the subject of this study.

The portfolio assessment process has five steps: (1) collection of evidence of achievement of learning outcomes by the student; (2) student reflection on the evidence; (3) examiner evaluation of the evidence; (4) defence of the portfolio by the student with two examiners; and (5) an assessment decision taken by groups of examiners (Friedman Ben David et al. 2001). The 12 exit learning outcomes of Dundee Medical School (Harden et al. 1999) provide the framework for the portfolio in years 4 and 5 (i.e. phase 3). Material is included in the portfolio at regular intervals throughout the 2 years. The material is graded by teaching staff on an A–G scale, as required by the University of Dundee, and feedback on the material is returned to the students for formative purposes. The feedback follows each portfolio entry that the student submits; i.e. feedback is provided throughout the 2-year process.

Students sit part 1 of their finals at the end of year 4, comprising an objective structured clinical examination (OSCE), an extended matching item multiple choice examination and a constructed response paper. Part 2 of the final exam is the portfolio assessment process, which takes place towards the end of year 5. Once a student submits the completed portfolio, two examiners have approximately 1 week to read it and arrive at an initial judgement of the student's strengths and weaknesses in terms of the 12 learning outcomes, based on the gradings awarded to the student over the 2-year period and their own reading of the portfolio material. The examiners meet to discuss how they will probe the individual student's strengths and weaknesses. This initial judgement is later confirmed or refuted at a 40-minute interview with the student. A grade is awarded for each outcome and entered on the portfolio assessment summary sheet, which includes all the grades awarded over the 2-year period. Through a consensus approach, the examiners arrive at the final summative decision: whether the student has passed, has failed, or must provide further evidence before a decision can be reached. At the time of this study, students with two or more 'D' final outcome grades, or any single grade below 'D', were either requested to submit further evidence of achievement of the outcomes in which they were weak or failed the exam.
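The decision rule at the end of this process can be illustrated in code. The sketch below is a hypothetical illustration only, assuming grades are handled as single letters on the A–G scale; the function name and return values are invented for the example and do not represent the school's actual system.

    # Hypothetical sketch of the decision rule described above; the function
    # name and grade encoding are illustrative, not Dundee's actual system.
    def portfolio_decision(outcome_grades: list[str]) -> str:
        """One A-G grade per exit learning outcome (12 in total).

        Rule at the time of the study: two or more 'D' grades, or any
        single grade below 'D', meant the student was either asked to
        submit further evidence or failed the exam.
        """
        any_below_d = any(grade in ("E", "F", "G") for grade in outcome_grades)
        d_count = sum(1 for grade in outcome_grades if grade == "D")
        if any_below_d or d_count >= 2:
            return "further evidence required, or fail"
        return "pass"

    # Example: a candidate with one 'D' and nothing below 'D' passes.
    print(portfolio_decision(["A", "B", "B", "C", "D", "B",
                              "A", "B", "C", "B", "B", "C"]))  # -> pass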

The internal examiners were staff who taught at some point in the 5-year curriculum, but not necessarily in years 4 or 5. Basic scientists, laboratory-based staff and clinicians accepted invitations to examine the students' portfolios. All examiners participated in a 2-hour examiner training programme comprising: an initial briefing on the rationale of the portfolio assessment process; portfolio content material providing information about each of the 12 curriculum exit outcomes; and instructions on the rating scale used in the portfolio assessment process. Ten external examiners from outwith Dundee Medical School were invited to join the examiner group, in keeping with University of Dundee regulations requiring the contribution of external examiners. The external examiners either attended the internal examiner briefing session or were given an individual briefing. They took an active part as examiners in the portfolio assessment process and also observed internal examiners during discussion of portfolios with individual students.

The present study is based on an examiner evaluation questionnaire, completed by both external and internal examiners, designed to study examiners' perceptions of: the outcome framework of the portfolio process; the portfolio content; the portfolio assessment process; the end points of the portfolio assessment process; the appropriateness of the two-part final exam format; and the examiner training.

Methods

The examiner questionnaire was developed following discussions with the staff members who designed the portfolio assessment process, i.e. the phase 3 assessment working group, and with the internal and external examiners. The questionnaire contained both items to be rated by the examiners and open questions. A five-point Likert rating scale was used, where strongly agree = 5 and strongly disagree = 1, except for the asterisked items in the tables, where a 1–3 scale (1 = 'too little', 2 = 'about right', 3 = 'too much') was used. The questionnaire underwent some change each year to reflect the changes made to the portfolio assessment process, based on the student/staff feedback of the previous years (Davis et al. 2009). All the items are provided in the tables with the year in which they were used. The questionnaire was distributed immediately after the examiners' meeting, held at the end of the portfolio assessment process, in 1999, 2002 and 2003.

The yearly median ratings for each statement were compared using the Kruskal-Wallis test, with statistical significance set at p ≤ 0.05. To estimate how large the change was from year to year (i.e. to estimate the contribution of 'year' to the overall variability of average scores for a given questionnaire item), effect sizes were calculated using partial η². This measure was used in the absence of a suitable effect size measure for non-parametric comparisons involving more than two groups. All statistical analyses were carried out using SPSS version 15.0.
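For readers who wish to reproduce this style of analysis outside SPSS, the per-item computation can be sketched as follows. This is a minimal sketch with hypothetical ratings; scipy and numpy stand in for the SPSS 15.0 procedures the authors used, and with a single factor (year) partial η² reduces to SS_between/(SS_between + SS_within).

    # Minimal sketch of the per-item analysis; the ratings are hypothetical
    # examples, and scipy/numpy stand in for the SPSS 15.0 procedures.
    import numpy as np
    from scipy.stats import kruskal

    # Hypothetical 1-5 Likert ratings for one questionnaire item, per year.
    ratings = {
        1999: [4, 5, 4, 3, 4, 5, 4],
        2002: [3, 4, 4, 4, 3, 5, 4, 3],
        2003: [4, 4, 3, 4, 5, 4],
    }

    # Kruskal-Wallis test: do the ratings differ across the three years?
    h_stat, p_value = kruskal(*ratings.values())
    print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.3f}")

    # Partial eta squared via the one-way sums-of-squares decomposition.
    groups = [np.asarray(g, dtype=float) for g in ratings.values()]
    grand_mean = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    eta_p2 = ss_between / (ss_between + ss_within)
    print(f"partial eta squared = {eta_p2:.3f}")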

The number of subjects precluded factor analysis of the examiner perceptions. The examiner responses to the open questions of each year were grouped into categories by the first author and then compared between the years.

Results

The questionnaire response rates were 100% (33 out of 33) in 1999, 88% (58 out of 66) in 2002 and 61% (40 out of 66) in 2003.

All the questionnaire ratings are shown in Tables 1–4. Table 1 shows the ratings for statements about the outcome framework for the portfolio assessment process. Table 2 shows the ratings for statements about the portfolio content: though examiner opinion changed significantly from that of the first year, examiners were still unsure about the manageability of the volume of information (with a relatively substantial effect size). Table 3 shows the ratings for statements about the portfolio assessment process.

Table 1.  The comparison of the median (mean ± SD) ratings for the questionnaire statements on the outcome framework of the portfolio process, over the 3 years

Table 2.  The comparison of the median (mean ± SD) ratings for the questionnaire statements on portfolio content, over the 3 years

Table 3.  The comparison of the median (mean ± SD) ratings for the questionnaire statements on portfolio assessment process, over the 3 years

Table 4.  The comparison of the median (mean ± SD) ratings for the questionnaire statements on outcomes of the portfolio assessment process, over the 3 years

The examiners felt rushed in their contribution to the portfolio assessment process. Although there was a significant change in this opinion, with substantial effect sizes, in the last 2 years compared with the first year, the examiners were still uncertain about the adequacy of the time allocated to them for their part of the portfolio assessment process. Similarly, though they were generally satisfied with the training they received, the examiners were unsure whether more information about the course in advance would have been useful. Table 4 shows the ratings for statements about the end points of the portfolio assessment process.

The examiners consistently agreed with the statement ‘the combination of an OSCE and written examination in the fourth year and a portfolio exam in the fifth year is an effective approach to the assessment of the competence to graduate’.

There was little variability in assessors' perceptions (i.e. median questionnaire item scores) over the 3 years. Overall, the examiners held favourable opinions about portfolio assessment, and that opinion did not change over the years.

The open-ended response categories that had more than 10 responses in at least 1 year are shown in Table 5.

Table 5.  Comparison of open-ended responses

Discussion

The outcome framework

The examiners are confident in the ability of the institutionally set exit learning outcomes to provide a useful framework for the portfolio assessment process. This provides reassurance about the adoption of such an outcome framework for the portfolio assessment process. The students also thought that the outcomes provided a framework for portfolio building and assessment (Davis et al. 2009).

Portfolio content

The volume of information in the portfolio was an increasing problem for examiners over the years. Thus, both the examiners and students thought that there was too much material in the portfolio, despite a reduction in portfolio content over the years of this study (Davis et al. 2009). Too much portfolio material is a well-documented problem that seems hard to overcome (Athenases 1997). To address this issue, material that provides no new information about the student, and material that is insufficiently reliable, should be omitted from the portfolio.

Portfolio assessment process

The examiners were satisfied with most aspects of the portfolio assessment process, reflecting their confidence in the five-step process (Davis et al. 2001; Friedman Ben David et al. 2001). There were uncertainties over the amount of time allocated for their part of the portfolio assessment process. Portfolio assessment is well recognised to be a time-consuming undertaking for both examiners and students (Race 1996). These time demands should be balanced against the benefits of a portfolio assessment process. There was a realisation within the medical school that only the best and worst students needed to discuss their portfolios with the examiners, which would reduce both the staff time expended on the process and student anxiety (Davis et al. 2009). Attempts to streamline the portfolio assessment process in this way, however, had to be abandoned in the light of examiner and student opinion: having expended large amounts of time in compiling the portfolios, the students and examiners thought that all the students' portfolios should be read and discussed.

The majority of the examiners found little or no difficulty in agreeing upon the student grade with the co-examiner (Table 3, item 3). This finding contrasts with the evidence on inter-rater reliability by Pitts et al. (1999, 2001, 2002) and Roberts et al. (2002), but concurs with the more recent inter-rater reliability evidence related to rating scales (Olson et al. 2003).

The end points of portfolio assessment process

The strength of the portfolio assessment process, as identified by the examiners, revolves around its ability to assess candidates holistically, with comprehensive coverage of performance (Table 4, item 4). It is recognised that portfolios facilitate the assessment of integrated and complex abilities, taking into account the level and context of learning (Friedman Ben David et al. 2001), and that they combine information from both subjective and objective assessment procedures (Schuwirth et al. 2002) 'to see the whole picture' (Schuwirth 2002).

A few examiners were concerned by the inability of the portfolio to assess clinical skills (Table 4, item 5). This is not a deficit of the portfolio assessment process, but of the portfolio content. The inclusion in the portfolio of a workplace-based assessment, such as the mini-clinical evaluation exercise (mini-CEX), in appropriate numbers should provide reliable evidence of student performance of clinical skills in the workplace. Evidence of student competence in clinical skills can be provided by the inclusion of the year 4 OSCE results in the portfolio.

Examination format

The examiner support for the combination of the OSCE and the written examination in year 4, followed by the portfolio assessment in year 5, confirms the appropriateness of the present examination format to assess all four levels of Miller's pyramid (Miller 1990). The written examinations assess at the 'knows' and 'knows how' levels and the OSCE assesses at the 'shows how' level in year 4. The portfolio contains evidence that is assessed at the 'does' level in year 5. If the year 4 written exam and OSCE results are included in the portfolio, the portfolio assessment process itself is capable of assessing the students at all four levels of Miller's pyramid.

Examiner training

Though the examiners were satisfied with the briefing that they received, they would have preferred more information about the course in advance. Providing this would add to the logistic burden and cost of the examination. The cost of assessment, in terms of both expenditure and examiner time (Snadden et al. 1996; Roberts et al. 2002), is an accepted disadvantage of portfolio assessment. This, however, may well be the price that has to be paid for an authentic assessment of performance.

The evaluation process

Examiner participation in the evaluation was high in the first 2 years, although participation was lower in 2003. Examiners made use of the open questions in the evaluation and freely expressed their opinions, both for and against the portfolio assessment process. The data in this article present a clear picture of examiner opinion. The results of this study, however, need to be considered alongside other evaluation evidence, such as student perceptions, external examiner reports and student portfolio assessment results, to give a full picture of the evaluation of the portfolio assessment process.

Conclusions

Despite the drawbacks of increased examiner and student workload, the examiners acknowledged the suitability of portfolio assessment (together with an OSCE to assess student competence) for final year medical undergraduate assessment. Future studies need to be directed towards finding the right balance of portfolio content that will provide a manageable workload for the students and examiners, without compromising the comprehensive assessment of student performance that the portfolio assessment process offers.

Declaration of interest: The authors report no conflict of interest. The authors alone are responsible for the content and writing of this article.

Additional information

Notes on contributors

Margery H. Davis

MARGERY DAVIS is a professor in the Centre for Medical Education, University of Dundee, UK.

Gominda G. Ponnamperuma

GOMINDA PONNAMPERUMA is in the Medical Education Development and Research Centre (MEDARC), Faculty of Medicine, University of Colombo, Sri Lanka.

References

  • Athenases S. The promise and challenge of educational portfolios. In: Barton J, Collins A, editors. Portfolio assessment: A handbook for educators. Dale Seymour, New York 1997; 99–109
  • Davis MH, Friedman Ben David M, Harden RM, Howie P, Ker J, McGhee C, Pippard MJ, Snadden D. Portfolio assessment in medical students’ final examinations. Med Teach 2001; 23: 357–366
  • Davis MH, Ponnamperuma GG, Ker J. Student perceptions of a portfolio assessment process. Med Educ 2009; 43: 89–98
  • Friedman Ben David M, Davis MH, Harden RM, Howie PW, Ker J, Pippard MJ. AMEE Guide No. 24: Portfolios as a method of student assessment. Med Teach 2001; 23: 535–551
  • Harden RM, Crosby JR, Davis MH. AMEE Medical Education Guide No. 14: An introduction to outcome-based education. Med Teach 1999; 21: 7–14
  • Harden RM, Crosby JR, Davis MH, Howie PW, Struthers AD. Task-based learning: The answer to integration and problem-based learning in the clinical years. Med Educ 2000; 34: 391–397
  • Harden RM, Davis MH, Crosby JR. The new Dundee medical curriculum: A whole that is greater than the sum of the parts. Med Educ 1997; 31: 264–271
  • Miller G. The assessment of clinical skills/competence/performance. Acad Med 1990; 65: S63–S67
  • Olson L, Schieve AD, Ruit KG, Vari RC. Measuring inter-rater reliability of the sequenced performance inventory and reflective assessment of learning (SPIRAL). Acad Med 2003; 78: 844–850
  • Pitts J, Coles C, Thomas P. Enhancing reliability in portfolio assessment: ‘Shaping’ the portfolio. Med Teach 2001; 23: 351–356
  • Pitts J, Coles C, Thomas P. Educational portfolios in the assessment of general practice trainers: Reliability of assessors. Med Educ 1999; 33: 515–520
  • Pitts J, Coles C, Thomas P, Smith F. Enhancing reliability in portfolio assessment: Discussions between assessors. Med Teach 2002; 24: 197–201
  • Race P. The art of assessing: 2. New Academic 1996; 5: 3–6
  • Roberts C, Newble D, O’Rourke AJ. Portfolio-based assessments in medical education: Are they valid and reliable for summative purposes? Med Educ 2002; 36: 899–900
  • Schuwirth L. Commentaries: Professional development in undergraduate medical curricula from an assessment point of view. Med Educ 2002; 36: 312–313
  • Schuwirth LWT, Southgate L, Page GG, Paget NS, Lescop JMJ, Lew SR, Wade WB, Baron-Maldonado M. When enough is enough: A conceptual basis for fair and defensible practice performance assessment. Med Educ 2002; 36: 925–930
  • Snadden D, Thomas ML, Griffin EM, Hudson H. Portfolio-based learning and general practice vocational training. Med Educ 1996; 30: 148–152
