Web paper

Portfolio as a method for continuous assessment in an undergraduate health education programme

Göran Thomé, Hans Hovenberg & Gudrun Edgren
Pages e171-e176 | Published online: 03 Jul 2009

Abstract

A portfolio assessment system has been introduced into a biomedical science programme to promote both continuous learning and deep approaches to learning. Attention has been focused on creating harmony between the assessment system and the PBL curriculum of the programme, in which biomedicine and laboratory work are central. The portfolio included evidence of laboratory work, personal reflections and certificates from the PBL tutor. The portfolio was assessed on three occasions over 20 weeks; the grades were ‘pass’ or ‘fail’. The tutor certificate appeared to be a crucial part of the portfolio, since a ‘fail’ in this part usually led to an overall ‘fail’. Both students and teachers were concerned about ensuring that enough factual knowledge, as measured by a traditional test, had been achieved. Agreement between the portfolio assessment and the test was good enough at the pass/fail level, but some expected differences were found at a more detailed level. The course, including the portfolio, was evaluated orally during weekly whole-group meetings and by a questionnaire at the end. The students felt comfortable with the portfolio system and preferred it to a traditional test. The teachers felt that they needed to develop their teacher–student discussion skills and to improve their feedback on the reflections. Peer assessment between students is proposed as a way to enhance the credibility of the crucial tutor certificate. The portfolio might be an efficient tool for helping students concentrate their efforts on the most central concepts of medical laboratory work. The model will be developed through further discussions and better consensus among faculty.

Introduction

The portfolio has been used for both summative and formative assessment purposes in medical training, with a focus on final examinations in Dundee (Davis et al., Citation2001) and with a focus on introducing reflection on the role of a doctor early in the medical training in Maastricht (Driessen et al., Citation2003). A thorough description and guide to the method has been published (Friedman Ben David et al., Citation2001). Students who come to university directly from secondary school meet a new situation where they are expected to study for a profession. Functional knowledge has to be achieved (Bowden & Marton, Citation1998). Students in a professional programme often feel that there is a remarkable difference between the first and the last part of their studies and that they understand ‘what it is all about’ too late. Traditional knowledge tests that focus on reproduction and might lead students into superficial learning strategies are part of that problem. A recent broad discussion on students’ experiences and approaches can be found in Prosser & Trigwell (Citation1999). Assessment methods that stimulate deep approaches to learning are needed and, in this respect, some kind of portfolio system might be useful. In a problem-based learning (PBL) curriculum there might be tension between the self-directed philosophy of the method and a knowledge-focused assessment system. How can an assessment system that is in harmony with a PBL curriculum and that supports long-term functional knowledge-directed learning be arranged? In this paper we present the results of the students’ performance during their first semester judged by the portfolio method and by a formative end-of-semester written test. We also present the students’ opinions and the teachers’ experiences of the portfolio. 
We have found that the students are positive and gain considerable formative support from the portfolio, but we will discuss the problems that we have met, potential problems and future development of this application of the method. The study is part of a more extensive study, which aims at exploring and describing the design and development of assessments in five different study programmes at Lund University (Lindberg-Sand, Citation2003).

Method

The biomedical laboratory science programme in Lund

We have used the portfolio in the early part of the training of biomedical scientists. Most of the graduates from this three-year programme will find employment in hospital diagnostic laboratories or university research laboratories. The students in the programme come directly from secondary school, and most of them are 19–22 years old. The first two years encompass studies in biomedicine in a broad sense using PBL and laboratory training. In the third year the students have practical courses in hospital or university laboratories and a small research project. Under the old system, assessment in the first two years comprised mainly short-answer knowledge tests at the end of each semester, but practical tests as well as laboratory reports were also used. This approach meant that students focused on the knowledge tests, and concentrated their work in the final weeks of the semesters.

After a preliminary model had been tested with an earlier group of students, this portfolio model was introduced in 2002 as a tool to help the students to structure and survey their work during the course and as the main method of collecting data on students’ learning. Assessment was based on discussions with the student on his/her portfolio and was performed on three occasions during the first semester.

At the end of August 2002, 21 students (18 women, 3 men) were admitted to the programme. The portfolio was introduced and used during the first semester (20 weeks), which comprises a course that integrates cell biology, molecular biology and basic laboratory methods. PBL was used to help the students in their studies. The groups, with seven students and one tutor, met once a week for a two-hour session. The problems were collected from recent literature on cell and molecular biology, and both the theoretical content and the laboratory methods used were in focus during the discussions. Laboratory work was scheduled for two full days every week and was performed in two groups. To start with, the students had to structure and equip the laboratory by recalling prior knowledge, through study visits to laboratories at the nearby hospital and the research centre of the Faculty of Medicine, and through discussions with the teachers. Thereafter they practised the use of basic laboratory equipment such as centrifuges, pH-meters, spectrophotometers, pipettes etc. This training concluded with a ‘driver's licence’ test, which was documented in the form of written certificates and collected in the portfolio (Table 1). During the second half of the course the students worked on laboratory projects, which were reported in writing and presented as either posters or seminars. All laboratory work was to be documented in a logbook. What happened in the laboratory, and how laboratory work is best performed, was discussed in weekly whole-group sessions. The students met only three teachers during their first semester, since the same teachers were both tutors and supervisors in the laboratory. All teachers have scientific training and are experienced PBL tutors. These teachers and one additional teacher also carried out the portfolio discussions with the students.

Table 1.  Compulsory content of the portfolio

The portfolio assessment

On three occasions, after 4, 10 and 20 weeks respectively, each student had a half-hour discussion with a teacher about the contents of the personal portfolio. No teacher discussed with students from the PBL group he/she tutored. The students were asked to deliver a written reflection on a specified subject prior to each such discussion. The first reflection (week 4) was on personal strengths and weaknesses in connection with learning at the university level and especially in this study programme, the second (week 10) on personal performance and achievements so far and the third (week 20) on the student's present idea of the professional role of a biomedical scientist. The first discussion was of a purely formative nature. The student was supposed to become acquainted with the portfolio and with the one-to-one contact between student and teacher. On the second occasion, after 10 weeks, the student could achieve a pass grade for the first half of the course. Those who didn’t pass the first half were advised to remedy the situation by being more active and working harder during the second part of the course. If successful, they could obtain a pass grade for the entire course at the end of the semester. At the Faculty of Medicine in Lund the only grades given are pass or fail. Students who fail get further assessment opportunities to be able to pass. In this course the next opportunity was the third discussion at the end of the semester. On the second discussion occasion the portfolio was expected to contain ‘driver's licences’ in seven different basic laboratory areas, results from work in the laboratory, the laboratory logbook, a reflection on personal strengths and weaknesses and a reflection on personal performance and achievements so far. An important part of the portfolio was a certificate from the tutor on constructive engagement, knowledge displayed and presence during the PBL sessions (Table 1). 
On the third discussion occasion, a written report on the laboratory project, a certificate on the student's seminar or poster presentation and a reflection on the student's present idea of the professional role of a biomedical laboratory scientist were added to the portfolio. The seminar/poster certificate was based on the students’ ability to present and discuss within a given subject. The laboratory logbook should also have been expanded (Table 1). After all discussions and assessments were finished, the teachers agreed on four levels of student performance, based on activities in the PBL groups, seminars and in the laboratory as demonstrated in the portfolios, to facilitate comparison with a conventional test. The levels were excellent pass, pass, poor pass and fail.

Short-answer test

When the course was over, the students were given a conventional written test consisting of 52 short-answer questions in the area of basic biomedical laboratory science as a means of looking at their achievements from a different angle. The students knew that the test was to be used only for formative purposes, and they could choose not to participate. The test was broad and covered areas that will be further studied during the rest of the programme, inspired by the progress tests of Maastricht (van der Vleuten et al., Citation1996). Seventeen students took the test. The result was indexed with the best student as 100.

Course evaluation

The students evaluated the course orally during the whole-group meetings throughout the semester, and at the end they filled in a questionnaire in which they rated their agreement with a number of statements on a six-point Likert-type scale, from ‘do not agree’ (1) to ‘agree fully’ (6). Some of the statements had been judged by three earlier groups of students who had worked with a preliminary form of the portfolio the year before. The educational climate was assessed by a standardized questionnaire (DREEM, Roff et al., Citation1997).

Results

The teachers’ overall impression was that students compiled their portfolios very well and that they reflected in a constructive way on their learning situation and their future role in the laboratory. However, only seven students were given the pass grade after 10 weeks (Table 2) in connection with the second discussion. It had been agreed among the teachers that the early pass grade was to be limited to those who demonstrated excellent performance. The main obstacle for those who did not pass was the certificate from the PBL tutor, i.e. they did not demonstrate good enough constructive activity or good enough knowledge, or they were absent too often. The requirements in the laboratory part of the course were easier to fulfil. Six students did not reach the pass level even on the third discussion occasion (Table 2), again mainly because of poor performance in the PBL group. They were given an oral assessment at a later date, which four students passed. Two students who were given an early pass grade, and thus were regarded as excellent at that time, did not reach the excellent level at the end of the semester. In fact, one of them did not pass at all. The agreement between the results of the portfolio assessment and the results on the written test was fairly good, although there was a considerable overlap between the groups (Figure 1). The correlation between the indexed test result and the four levels shown in Figure 1 is –0.59. One student performed much worse and another much better in the test. One student who failed during the third discussion and was given an extra oral examination did very well on the test.
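The score indexing and correlation described above can be sketched as follows. This is a minimal illustration, not the study's analysis: the raw scores and level codes below are hypothetical, and the study does not state which correlation coefficient was used, so a plain Pearson coefficient is assumed here.

```python
# Sketch of the reported analysis: short-answer test scores are indexed
# with the best student as 100, then correlated against the four portfolio
# judgement levels coded 1 (excellent pass) to 4 (fail).
from math import sqrt

def index_scores(raw):
    """Index raw test scores so that the best student scores 100."""
    best = max(raw)
    return [100 * r / best for r in raw]

def pearson(x, y):
    """Plain Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical raw scores and portfolio levels (NOT the study's data).
raw = [48, 45, 40, 38, 35, 30, 26, 22]
levels = [1, 1, 2, 2, 3, 3, 4, 4]

indexed = index_scores(raw)
r = pearson(indexed, levels)
# r is negative because better test scores go with "better" (lower) level codes,
# matching the sign of the reported R = -0.59.
```

Because the portfolio levels are ordinal rather than interval data, a rank-based coefficient such as Spearman's rho would also be a defensible choice.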

Figure 1. Correlation between indexed result on the traditional test with the best student as 100 and the groups of judgement from the portfolios. Excellent pass (1), pass (2), poor pass (3) and fail (4) (R = −0.59).


Table 2.  Indexed results from traditional test, judgements after the second and third discussions and end of course judgements

The students felt somewhat insecure about the method, as indicated in the whole-group discussions during the course: “Is this enough, do we learn enough if there is not a written examination at the end?” On the other hand, when asked in the written part of the course evaluation if they preferred portfolio to conventional examination, 19 out of 21 students answered ‘yes’. The statements that the students had to rate in the questionnaire are listed in Table 3. Overall, the students were satisfied with the course, as indicated by the scores being in the upper half of the six-point scale. “Stimulated thinking about competences you need as a biomedical scientist” (4.1) was among the lowest scores but considerably higher than the opinion of the previous groups (3.2). The students felt informed about the portfolio (4.9) and they saw it as a good indicator of progress (4.8). The progress score was also considerably higher than that of the previous groups (3.8). The impact on learning of the discussion with a teacher on the portfolio also scored relatively low (4.1). Again the students preferred portfolio (4.4) to more traditional tests (3.0). The previous groups also preferred portfolio (4.1). Scores on PBL (4.9), laboratory work (5.1) and laboratory discussions (4.4) are included for comparison. In the free comments on what was especially good during the course, two students mentioned the portfolio and the reflection before the portfolio discussion, while nine students mentioned PBL. No student mentioned the portfolio in the free comments on what was especially bad during the course.

Table 3.  Students’ opinions on nine statements about the course

The result of the test of the educational climate showed that the climate was good or satisfactory (Table 4) (Jiffry et al., Citation2005). The maximum possible result in the five different domains is 4. Results from the local medical programme (unpublished) and from the medical programme in Nepal (Roff et al., Citation2001) are included in the table for comparison.

Table 4.  DREEM domain results for the present group of students (n = 21), a fourth-year medical group from Lund (n = 18; unpublished result) and a first-year group from Nepal (n = 29; Roff et al., 2001) included for comparison

In the teachers’ discussions one key issue was the fairness of the tutor evaluation of the students’ knowledge and activity in the PBL group. Another was how to carry out a good teacher–student discussion and a third was how to give constructive feedback on the students’ written reflections. Furthermore the teachers, as well as the students, discussed whether or not the students learned enough basic facts when this knowledge was not tested.

Discussion

Overall, the students were positive towards the portfolio model, although with a relatively low score on the student–teacher discussions. The students felt that the portfolio was an indicator of their progress during the course. Differences between this group of students and the former group that worked with a preliminary portfolio model (see Table 3) might be due to a more complete system with more emphasis on the reflection on the competences needed by a biomedical scientist. Growing experience among the teachers with the portfolio model, and thus better integration of the portfolio in the daily work, might explain why students felt to a higher degree that the portfolio was an indicator of progress. Fairly good agreement between the portfolio and a traditional test was found, although a small number of students were judged differently by the two methods. Facilitators’ assessments have been found by Whitfield & Xie (Citation2002) not to agree fully with results from written exams, especially for students at the bottom end of the scale. Major changes in the curriculum or assessment system often have a negative impact on the educational climate (McAleer et al., Citation1998). In the present case we found a positive climate, better than or similar to comparative settings (Roff et al., Citation2001). This indicates that the introduction of the portfolio did not negatively affect the educational climate in the study group.

In the local set of rules and regulations there are no grades, and the result of assessment is only pass or fail. An advantage of the portfolio is that it can be used for continuous assessment, where the student will either get a pass or a not-yet pass, i.e. improvement is needed and can be agreed upon. In principle, every student can thus obtain a pass, although it will take longer for some. Another advantage of the portfolio is that it can be used throughout the whole programme and the contents can change and develop as the student works through the programme.

The portfolio seems to be a good instrument to judge whether students have reached the pass level while it also reduces the assessment stress felt by the students. Assessment stress has been shown to promote superficial approaches to learning (Bowden & Marton, Citation1998). The students prefer portfolio assessment, which indicates that they feel comfortable with the method. Does that in turn indicate deep approaches to learning or is it due to the fact that the students felt that their workload was reduced? This question can only partly be answered today but will be answered more fully as these students enter the job market. Judged by the students’ performance in the laboratory, at seminars and from their written laboratory reports most of them seem to work hard. During the semester both students and teachers expressed worry that not enough factual knowledge was obtained. We therefore decided to use a traditional test for formative reasons. It did not give exactly the same result as the portfolio in detail but at the pass or fail level, which is the most important one in our system, there was sufficient agreement. This lack of detailed agreement is not surprising as factual knowledge is only one of the competences needed to be able to fulfil the demands for the portfolio contents.

One apparent problem in this setup was the judgement of constructive activity and displayed knowledge in the PBL group, which was included in the portfolio in the form of a certificate from the tutor. The tutors might not have used the same levels of judgement or even the same criteria on a detailed level. This difference could be the main reason why some low scorers in the traditional test were given a pass grade early in the course. In the future, we will either reduce the tutors’ judgement to activity in the PBL group or we will have a firmer discussion on levels and criteria among the tutors. The reliability of tutor assessment of students in problem-based learning has been questioned (Cunnington, Citation2002; Whitfield & Xie, Citation2002). If it is to be used, the detailed criteria have to be agreed in advance and these must be made available to the students. Peer assessment among the students in the PBL group could be included as part of the basis for the certificate to add credibility to the process. The strong influence of the tutor certificate is important as it can add significance to and enhance student performance in the PBL process. One of the aims in changing the assessment system was to find a method more in harmony with the PBL curriculum.

The students appreciated the discussions on the portfolio but to a lower degree than, for example, PBL sessions and laboratory work. This is partly due to the limited time, three half-hours, spent on discussions compared with several hours every week spent on PBL sessions and even more time spent in the laboratory. However, the teachers were also concerned about the discussions and regarded them as a new component in their work. They felt a need to develop their ability to conduct one-to-one discussions and to give constructive feedback. The portfolio is good for formative assessment since it gives a focus for the discussion between the student and the mentor. When the portfolio is used for summative assessment the contents must be available to the examiner, and there is a risk that the student may not include material that demonstrates weaknesses. Normally the student should have a mentor to help with formative discussions around the portfolio, but if the mentor is also the examiner this risk becomes obvious. It would probably be an advantage if the student could discuss the collection with a mentor who gives formative advice, and then make a selection that is assessed for summative purposes by someone who is not the mentor. We have realized that staff need extensive help with preparation and support in connection with these kinds of discussions. Studying and discussing the literature is part of that but personal guidance may also be needed.

The validity and reliability of portfolio assessment have been questioned (Roberts et al., Citation2002). However, it is difficult to find methods of assessment that are both valid and reliable and still have a positive effect on the approach to learning. The portfolio is more in line with a qualitative tradition, whereas biomedicine has in general relied heavily on a quantitative tradition. Perhaps an assessment based on qualitative methods could be used when the main objective of the assessment is to judge whether a competence has been reached. In our experience the portfolio is an efficient tool for the students to concentrate their efforts in a formative and reflective way on the most central concepts of medical laboratory work. An important factor is that the portfolio system can promote deep approaches to learning through reduced assessment stress and by spreading the workload over the whole of the course. We will continue to develop our local model and use it throughout all three years of the study programme.

Additional information

Notes on contributors

Göran Thomé

GÖRAN THOMÉ, PhD, is a senior lecturer at the Office of Medical Education, Medical Faculty, Lund University and former head of the Biomedical Laboratory Science programme within the Faculty of Medicine.

Hans Hovenberg

HANS HOVENBERG, PhD, is a senior lecturer at the Department of Cell and Molecular Biology, Medical Faculty, Lund University and responsible for the first semester of the Biomedical Laboratory Science programme within the Faculty of Medicine.

Gudrun Edgren

GUDRUN EDGREN, PhD MMEd, is a senior lecturer at the Office of Medical Education, Medical Faculty, Lund University, and has previously been senior lecturer in the Biomedical Laboratory Science programme within the Faculty of Medicine.

References

  • Bowden J, Marton F. The University of Learning: Beyond Quality and Competence in Higher Education. Kogan Page Limited, London 1998
  • Cunnington J. Evolution of student assessment in McMaster University's MD Programme. Medical Teacher 2002; 24: 254–260
  • Davis MH, Friedman Ben-David M, Harden RM, Howie P, Ker J, McGhee C, Pippard MJ, Snadden D. Portfolio assessment in medical students’ final examinations. Medical Teacher 2001; 23: 357–366
  • Driessen EW, Van Tartwijk J, Vermunt JD, Van der Vleuten CPM. Use of portfolios in early undergraduate medical training. Medical Teacher 2003; 25: 18–23
  • Friedman Ben-David M, Davis MH, Harden RM, Howie PW, Ker J, Pippard MJ. AMEE Medical Education Guide No. 24: Portfolios as a method of student assessment. Medical Teacher 2001; 24: 535–551
  • Jiffry MTM, McAleer S, Fernando S, Marasinghe RB. Using the DREEM questionnaire to gather baseline information on an evolving medical school in Sri Lanka. Medical Teacher 2005; 27: 348–352
  • Lindberg-Sand Å. Examinations as Practiced Views of Knowledge: From Testing Explicit Knowledge to Assessing Implicit Competence in Swedish Higher Education (Report No 2003:222). Office of Evaluation, Lund University, Sweden 2003 [in Swedish]. An English summary is available at: http://www.evaluat.lu.se/publ/SummaryExam.pdf (accessed 22 September 2005)
  • McAleer S, Roff S, Harden R, Al Qathani MU, Ahmed A, Deza H, Groenen G. The Medical Education Environment Measure: a diagnostic tool. Medical Education 1998; 32: 217
  • Prosser M, Trigwell K. Understanding Learning and Teaching: The Experience in Higher Education. Society for Research into Higher Education & Open University Press, Buckingham 1999
  • Roberts C, Newble D, O’Rourke A. Portfolio-based assessments in medical education: are they valid and reliable for summative purposes?. Medical Education 2002; 36: 899–900
  • Roff S, McAleer S, Harden R, Al-Qahtani M, Ahmed AU, Deza H, Groenen G, Primparyon P. Development and validation of the Dundee Ready Education Environment Measure (DREEM). Medical Teacher 1997; 19: 295–299
  • Roff S, McAleer S, Ifere OS, Bhattacharya S. A global diagnostic tool for measuring educational environment: comparing Nigeria and Nepal. Medical Teacher 2001; 23: 378–382
  • Van der Vleuten CPM, Verwijnen GM, Wijnen WHFW. Fifteen years of experience with progress testing in a problem based learning curriculum. Medical Teacher 1996; 18: 103–109
  • Whitfield CF, Xie SX. Correlation of problem-based learning facilitators’ scores with student performance on written exams. Advances in Health Science Education 2002; 7: 41–51
