Descriptive Account

The Introduction of Assessed Group Presentations as a Novel Form of In Course Assessment in Neuroscience

Pages 1-16 | Received 11 Apr 2003, Accepted 09 May 2003, Published online: 14 Dec 2015

Abstract

Increasing student numbers have led to enormous pressure on assessment. The challenge is to develop methods of assessment that are not too time-consuming for staff, yet still foster integration of knowledge, the development of independent learning skills and a deep approach to learning by the students. This article describes how we introduced a system of assessed group seminar presentations as the in-course assessment component of the first and second year neuroscience modules for medical students at the University of Birmingham. These assessed presentations are generally well liked by both students and staff. They have achieved a much more even distribution of the marking load amongst staff and a (modest) saving in overall staff time. Furthermore, they require the students to use oral presentation skills, foster group working and independent learning, and allow us to perform some assessment of these activities. The success of this assessment format has led to its adoption on various other courses.

Introduction

Widening participation in higher education is leading to large increases in student numbers, which puts increasing pressure on assessment. One solution is clearly to opt for computer-marked, multiple-choice assessments. However, while these are an excellent method of assessing factual knowledge, they can be very hard to write well, and may tend to foster a surface, information-“cramming” approach to learning. The ability to solve problems, integrate information from a variety of sources, use various resources for research, evaluate information, assemble a logical argument and present information clearly is key to all the biological sciences. We wanted to develop a method of assessment that would foster, and allow some assessment of, these attributes.

The largest course in the University of Birmingham Medical School is the MBChB course. This course underwent a marked expansion in 2000/1, when the intake went up from 250 to 320 students, and we currently have 380 students in the first year. All first and second year students follow an extensive biological sciences curriculum, with systems-based modules in, for example, cardiovascular science, respiratory science, and the immune and haematological system. We are responsible for a 20 credit “double” module, The Musculoskeletal System and Neuroscience 1, in the first year of the course, and the 10 credit module Neuroscience 2, in the second year. Assessment for all the first and second year biological science modules follows a similar pattern, which is determined by the Curriculum Development and Implementation Group (CDIG): 90% of the mark is obtained from unseen examination papers, which consist of a Multiple Choice Question (MCQ) paper, a very short answer “spotter” paper, and a structured short answer paper in which questions are usually based on a clinical scenario. The remaining 10% of the mark is available for in-course assessment, the format of which can be determined by individual module co-ordinators with the approval of CDIG. The traditional form of in-course assessment had been an essay, but increasing student numbers have rendered this impractical. Many module co-ordinators have opted for MCQ-based in-course assessments, but we wished to avoid this for the reasons outlined above. We decided to try out a system of assessed group presentations, based around clinical cases. We hoped that these would have the following advantages:

  1. be quicker and easier to mark than essays;

  2. allow us to assess (for the first time) some of the year and module learning outcomes, specifically those related to group work, independent learning and presentation skills;

  3. require an in-depth understanding on the part of the students; we felt that it would be harder to get away with simply regurgitating material that had been read but not understood than in an essay;

  4. require extensive research, while taking less time to prepare than writing an essay;

  5. facilitate real understanding through group work, rather than rote learning of facts (Brown and Atkins, 1991; Gibbs, 1992);

  6. promote understanding and integration of knowledge through the problem-based learning approach, rather than simply relying on memorising unconnected facts (Gibbs, 1992; Engel, 1997);

  7. provide a different assessment method, which may advantage or disadvantage different students from the standard exam format;

  8. provide practice in oral presentations, a key skill for all students and something that these particular students will require for numerous case presentations in their clinical years;

  9. engage students with material not necessarily covered in lectures.

Methods

Design of the problems

For each cohort of students, we devise 4 problems to form the basis of the in-course assessment. These problems require the students to engage in a true problem-based learning scenario (as defined by Savin-Baden, 2000), in which they have to identify for themselves the knowledge required to tackle the problems. Each problem is delivered as a case history, accompanied by 3–4 very general statements, which serve to guide (but not instruct) the students. These general statements also break the problem down into different sections, each of which will finally be presented by a different student, randomly allocated by the assessors on the day of the presentation (see below). Most problems are divided into 4 sections, for groups of 4 students, as a group of 4 is sufficiently large to reduce the risk of one student dominating the others, but small enough to make it difficult for “passengers” to hide (Race and Brown, 1998).

The format of case presentation and the degree of guidance appear to be critical in determining how well the students engage with the problems, and so we followed the advice of Glick and Armstrong (1996): the introduction to each problem takes the form of a very personal patient history, and the guidance points are very general, requiring the students to make decisions about which aspects of the case they feel are the most important to study. The questions cover all aspects of neuroscience and musculoskeletal biology, from basic ion channel physiology and 3-D anatomy in year 1 to the effects and management of specific brain lesions in year 2. In addition, the topics chosen have not been explicitly covered in lectures, although some aspects may be mentioned, albeit in a different way. This exercise also forces the students to start exploring and evaluating information resources for themselves, rather than relying on recommendations from teachers. The format of the problems in the first and second year is the same. However, we give rather straightforward problems to the first year students, with explicit guidance as to what to cover; the problems for the second years are rather more challenging and open ended, in an attempt to develop their research and analytical skills. Examples of the problems that we used this year are shown in Figure 1.

The students in a group may choose to break up the task of researching the problem. However, all students are required to have a good understanding of the whole of their problem, as they are not told until just before the presentation which part they are to present.

Designing sufficient, suitable problems can be a challenge, especially as new ones are required every year. However, this year (the 3rd year in which we have used this assessment method for the 1st year students) we tried “recycling” the problems used two years previously. The problems were modified cosmetically, so that they did not appear exactly the same. Also, the 1st year students, who are based in the medical school, have rather little contact with 3rd year students, who spend most of their time out in various hospitals. This approach appears to have been successful, in that there was no evidence of copying previous presentations or handouts, and no dramatic improvement in marks obtained compared with previous years (rather the reverse, if anything).

Timetabling and administrative arrangements

The biological science component of the first two years of the MBChB course consists of a mix of plenary lectures and small group teaching sessions (anatomy classes, tutorials, CAL and problem based learning). For these small group sessions, the year is divided into 22 “M groups” and each student remains in the same group for all the small group teaching sessions in a year.

Figure 1 Instructions to students and example problems used for this year’s assessed presentations. Figure 1a shows the instructions issued to the 1st year students; these are very similar to those issued to the 2nd year students.

These M groups are used for the assessed presentations. This allows us to run 11 presentations simultaneously, as this is the standard timetabling arrangement for small group teaching sessions, and also means that the students work with, and give their presentations to, a group with which they are very familiar.

An introductory session is timetabled for each M group early in each module. At this session, the students are issued with the instructions for the exercise (see Figure 1a) and the list of problems, and one of their usual tutors explains the task to them. The students then decide how they will split themselves into 4 subgroups of 3–5 students, and come to an agreement as to which subgroup will address which problem. A 2nd session is timetabled for 1 week later, to give students time to think about their approach to the task and to ask any questions at that stage. The students then have several weeks for preparation, and are encouraged to refer to their tutors if further guidance is required. No further formal sessions are timetabled, but the students are strongly encouraged to get together to discuss their findings and to plan and rehearse the presentation.

Figure 1b shows an example of a problem given to the 1st year students

The actual assessment takes place during a 2h session, timetabled towards the end of the semester. The presentations take place in the students’ usual anatomy small group teaching room; thus they are familiar with the room and the resources available, and are encouraged to make use of the anatomical models and other resources to supplement their presentations. The students are randomly assigned to a particular section just before the presentations begin; by this mechanism we aim to encourage the students to work together to produce a coherent presentation, and to understand the whole of it, rather than just the particular section that they may have prepared. Students are also required to produce a single-page summary handout, which further encourages group work and serves the additional purpose of providing a check that students from different M groups, working on the same problem, have not simply copied from one another.

Occasionally the number of students in the group does not match the number of sections in the problem. This is unfortunate, but inevitable, as different M groups may contain different numbers of students. The students will have been counselled on how they might deal with this when the problems are initially allocated, and on the day of the presentation they will be asked how they want to tackle the presentation. For example, they may decide to split an extra section between them, or for one person to elect to present that section. Alternatively, it may have been necessary for a group to “invent” an extra section (usually a summary) to accommodate an extra group member. Nevertheless, the students must still be prepared for the random allocation of sections.

Figure 1c shows an example of a problem given to the 2nd year students

Conduct of the assessment

During the 2h assessment session, each subgroup gives a 20-minute presentation, in which each student typically spends about 5 minutes presenting the section which has been allocated to them by the assessors at the start of the session. The presentation is followed by questions both from the “audience” (the rest of the M group) and the assessors. For each presentation, the students circulate copies of the handout that they have prepared; one for the assessors, which is marked and returned with the completed mark sheets, and one for each sub-group of students.

Two assessors mark each presentation, one from an anatomy background and one from a neuroscience background. This helps to ensure that the assessors have the relevant expertise to cover the integrated problems, and also helps to stop the students seeing these as exclusively “anatomy” or “pharmacology” presentations. Assessors are allocated to M groups that they do not normally see in small group sessions, in order to minimise any personal bias.

Assessors are provided with detailed mark sheets. These serve as a guide for marking, provide a standard format for the return of marks for processing and review, and should improve consistency between groups marked by different assessors. Students are marked according to their individual performance, both in the presentation and in answering questions, and each group is given a mark reflecting the presentation as a whole. Each student’s mark comprises the average of the individual and the group components; the assessment criteria are explained to the students in the preliminary information that they receive (see Fig. 1). At the end of the session, the two assessors are encouraged to compare notes and to discuss any major discrepancies in their marks. All the marks for the year are then scrutinised to check for any unexpected inconsistencies between groups, or between written comments and the marks. The handouts are also examined to check for any evidence of collusion.
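The averaging of the individual and group components can be sketched as follows. This is a minimal illustration only: the function name and the 0–100 mark scale are assumptions for the sake of the example, not details taken from the actual mark sheets.

```python
# Sketch of the mark calculation described above: each student's final
# mark is the average of their individual mark (presentation plus
# questions) and the mark awarded to the group as a whole.
# The 0-100 scale is assumed for illustration.

def student_mark(individual_mark: float, group_mark: float) -> float:
    """Average the individual and group components."""
    return (individual_mark + group_mark) / 2

# Example: a student marked 70 individually, in a group awarded 60 overall.
print(student_mark(70, 60))  # 65.0
```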

Monitoring of the process

This process was introduced, to first year students only, in 2000/1. In that year the process was monitored by conducting two meetings to assess the views of contributing staff, one before and one after the assessment session, and by informal interviews with students. The marks achieved were also scrutinised, although these have not been formally analysed, as we felt that it was not appropriate to compare the marks with those obtained using the traditional form of in-course assessment: standard, factual essays written by individual students.

We felt that the outcome of the initial exercise was sufficiently promising to continue with this method of assessment, and we introduced it for 2nd year students in 2001/2. We have continued to elicit informal views from students and staff, and this year we circulated a feedback questionnaire to 2nd year students immediately after their presentations, asking for their views on this method of assessment.

Results

The views of the staff

A major problem that we faced initially was in persuading enough of our colleagues to agree to be assessors; it seemed that many staff felt that their knowledge would be inadequate to assess the problems set properly. We addressed this by providing “crib sheets” to the assessors in advance of the presentations; these contain brief “answers” to the main points, and links to other resources, so that the assessors are aware of the broad issues that the students are likely to cover. We also reassure assessors that it is the process they are marking, and not necessarily the accuracy of every statement. We chose not to issue staff with standard questions to ask the students: it is very hard to predict which areas will be covered well, and which poorly, for any topic, and it is therefore extremely useful to be able to ask questions to clarify any misunderstandings or misconceptions. Furthermore, the presentations are done in two sets; it is highly likely that the second half of the year would be forewarned of any “standard” questions.

The first time we used the assessed presentations, all the staff involved were invited to a meeting after the presentations, where we had a frank and open discussion of their feelings regarding the exercise. We asked them to consider whether the exercise was deemed sufficiently successful to be repeated and whether any improvements for the future were required. Most members of staff were surprised and impressed by the effort that the students had put into their presentations. Also, despite their initial misgivings about the level of expertise required for marking, most reported being able to ask appropriate and searching questions of the students during the presentations. The assessors also felt that the students’ responses to questions illustrated that they had acquired a breadth of knowledge and understanding from their research beyond that required to simply address the problem. Many staff reported that, despite their initial pessimism, the presentations had proved enjoyable, stimulating and a positive experience, because it was obvious how well the students had engaged with the task as a result of a high degree of motivation. Several staff also remarked that students had used personal experiences to illustrate their presentations. This integration of real-life experiences with academic work suggests that the first year students, even at such an early stage in their training, were engaging in experiential learning, using information derived from a variety of sources to aid their problem-solving.

Student performance

Another potential problem initially raised by staff was that they felt it was unlikely that students’ performance would vary very much, as it would be easy to perform well whilst knowing nothing! However, it is clear when watching the presentations that this is not the case. There is a world of difference between students who clearly understand the material and have given thought to how best to present it, and those who have simply “lifted” the relevant section of a textbook and read it out. This difference is reflected in the grades: the full range is awarded across the whole year for this exercise. Most students achieve Bs and Cs, but a number of students obtain As for well structured and imaginatively presented work, and a few, who have clearly put in no effort whatsoever, fail. It seems that this form of assessment is able to differentiate student performance.

The views of the students

This year, the 2nd year students were issued with a questionnaire to assess their thoughts on the assessed presentations, now that they had been through the process twice: once in the 1st year and once in the 2nd. The questionnaire is shown in Figure 3. We achieved a response rate of 57%, obtaining completed questionnaires from 172 students.

Figure 3 The student feedback questionnaire issued to second year students after this year’s presentation.

Completed questionnaires were returned by 172 of the 300 students. The columns of figures indicate the number of students who reported “strongly agreeing”, etc., with each statement.

The first part of the questionnaire consisted of a series of statements; the students were asked to indicate the degree to which they agreed or disagreed with each statement by circling the appropriate symbol (see Figure 3). The responses to this section of the questionnaire are also shown in Figure 3. Overall, these responses are very pleasing, with the majority of students responding to the statements as we would have hoped. Most of the students enjoyed the seminars, with 57% agreeing or strongly agreeing with the statement, and 76% of respondents agreed or strongly agreed that the seminars were stimulating and challenging. A smaller majority (54%) agreed or strongly agreed that they learned a lot from listening to other groups’ seminars, and 60% agreed or strongly agreed that this was a fair and rigorous method of assessment. A very large majority (87%) agreed or strongly agreed that preparing the seminar reinforced and extended their knowledge.

Table 1 Student responses to the question on the student feedback questionnaire: What were the three best points about the assessed seminars?

Questionnaires were returned by 172 of the 300 students in the cohort. All comments made by more than two students are included and the number of students making each response is indicated. A full list of comments is available from the authors, if required.

In contrast, only 23% of respondents strongly agreed, agreed or felt neutral about the statement that they would have preferred to write an essay and we were pleased to see that only 16% of students strongly agreed, agreed or weren’t sure that they could get all the information that they needed from lectures. Most students (74%) agreed or strongly agreed that the topics covered were of obvious relevance to the module and 72% agreed or strongly agreed that access to learning resources was adequate for their needs. The overwhelming majority (84%) agreed or strongly agreed that they had a clear understanding of what they were meant to do but only 51% agreed or strongly agreed that the staff were responsive to student needs.

The second part of the questionnaire asked for student comments on the three best (and worst) aspects of the assessed presentations, how the exercise could be improved, and “any other comments”. Responses made by more than 2 students are summarised in Tables 1–3.

Table 2 Student responses to the question on the student feedback questionnaire: What were the three worst points about the assessed seminars?

Students cited a variety of factors as being among the “best points” of the assessed presentations (Table 1). The aspect cited most frequently was that the presentations provided an opportunity to practise and/or improve presentation skills (46 students); 11 students stated that the session improved their confidence. Quite a lot of students (22) stated that the exercise was interesting and/or informative; 5 students even found it “fun”! Many students seem to have appreciated the different style of learning involved in this exercise: 30 cited the group work, or the fact that their group worked well, as one of the best points, and 5 stated that the exercise had improved their group work; 17 enjoyed learning about something in depth, 16 liked the fact that the exercise had enabled them to learn their own topic extremely well, and 3 cited the fact that they had had to integrate information as an advantage. Four students even cited the random allocation of sections to present as an advantage, recognising that it required them to learn the whole presentation. Ten students stated that they liked the fact that the exercise forced or encouraged them to read and think around the subject, 9 stated that this was a useful method of learning and understanding material, 9 stated that they liked learning from a variety of resources, 5 enjoyed the fact that it was a different style of assessment, and 4 stated that it had improved their self-directed learning skills. Some students also seem to have appreciated the way the presentations were organised: 15 liked learning from their colleagues’ presentations, 10 enjoyed the act of presenting, 10 felt that the atmosphere was relaxed and supportive, 7 felt that they had had ample time to prepare, 3 felt that the tutors were interested and supportive, 3 found the topics interesting, 3 felt that the presentations were a good length, and 3 enjoyed the opportunity to prepare overheads and handouts.
Predictably, however, 6 students reported that the best thing about the exercise was getting it finished or “going to the pub afterwards”, and 3 felt that the best point was that they were only required to do the one presentation.

The most unpopular aspect of the assessed presentations was the “random” allocation of sections to present; 30 students cited this as one of the worst points (see Table 2). Clearly we are not going to change this, as we feel that this random element is essential to encourage the students to engage with the whole problem. Many (26 students) also stated that it took too long to prepare, or involved too much work, for only 5% of the module mark. Another 23 comments pointed out that the session coincided with other coursework deadlines, or was too close to the end of term. This is hard to address; it is true that many assessment deadlines are clustered at the end of the semester, but that is because this is the most appropriate time for them. Many students also expressed reservations about the group work: 12 felt that it was hard to split the work equally within the group, 6 felt that some members of the group had been poorly prepared or made unequal contributions, 3 felt that they had prepared “their part” really well but that it was spoiled by poor presentation by another member of the group, and 1 even reported that a bad point was having to “rely on someone else writing a good presentation that you have to present”. Some were dissatisfied with the organisation of the exercise: 14 felt that some presentations were more difficult than others, 7 were disappointed that they did not receive more immediate feedback, 6 felt that the 2h session was too long, 5 stated that it was difficult to know what was expected, 4 stated that they felt the questions were unfair or inappropriate and 3 more simply didn’t like the questions, 4 believed that some parts were not relevant to the course or to the exam, 3 felt that there had been insufficient time to prepare, and 3 felt that it was hard to include the use of models or diagrams in their particular presentation.
Sadly, quite a lot of students (20) stated that they didn’t learn much (or anything) from the other groups’ presentations. Furthermore, 5 said that they were too nervous to listen to the other presentations, and 4 thought that listening to the other groups was boring. A single student stated that this was an unfair method of assessment. One (very honest) student felt that a disadvantage was that “you are assessed on exactly what you have done”, and one did not like the fact that there were two assessors present.

Table 3 Student responses to the question on the student feedback questionnaire: Do you have any suggestions as to how this exercise could be improved?

Comments from students as to how the exercise could be improved are shown in Table 3. The most common suggestion (9 students) was to have PowerPoint facilities available for the presentations; while we are in favour of this in principle, it may prove logistically difficult to have 11 set-ups available simultaneously. Even fewer students made “any other comments”, and only one comment was made by more than one student: three students made the point that they had found access to the computer cluster a problem during the preparation period.

Discussion

After 3 years, the assessed presentations are now an established part of our neuroscience modules. To what extent do we feel that we have achieved our initial aims?

The presentations generally seem to be an exacting and rigorous method of assessment, which most students and staff feel to be fair. We assess presentations by the entire year in 22 × 2h sessions, each of which is attended by 2 members of staff; thus assessment requires 88 staff hours for 300–380 students. As a rough estimate, it would take about 10–15 min to read, mark and comment on a student essay, so it would take 50–95h to mark essays from all the students. On this analysis, then, we have achieved at worst no saving in staff time, and at best a very modest one. However, all the presentations are double marked; if this is taken into account, the time saved is substantial. Furthermore, the way that the sessions are timetabled means that no member of staff assesses for more than 4h for any group of students, and we achieve a much more equitable distribution of the marking load. It is also much less arduous, and certainly more enjoyable, to sit through a couple of hours of presentations than to face a pile of essays! An unexpected bonus of this method of assessment is that, in our School at least, the timetabling of the assessment sessions attracts finance from the School in a way that marking essays does not.
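The staff-time comparison above can be reproduced as a back-of-the-envelope calculation. The figures (22 sessions, 2 hours each, 2 assessors, 10–15 minutes per essay, 300–380 students) are taken from the text; the function and variable names are ours.

```python
# Back-of-the-envelope staff-time comparison from the figures in the text:
# 22 M-group sessions, 2 h each, 2 assessors per session, versus marking
# one essay (10-15 min each) for every student in a 300-380 student cohort.

SESSIONS = 22
HOURS_PER_SESSION = 2
ASSESSORS_PER_SESSION = 2

presentation_hours = SESSIONS * HOURS_PER_SESSION * ASSESSORS_PER_SESSION

def essay_marking_hours(n_students: int, minutes_per_essay: float) -> float:
    """Total staff hours to mark one essay per student."""
    return n_students * minutes_per_essay / 60

print(presentation_hours)            # 88 staff hours for presentations
print(essay_marking_hours(300, 10))  # 50.0 h: best case for essays
print(essay_marking_hours(380, 15))  # 95.0 h: worst case for essays
```

Note that the 88 hours already includes double marking, whereas the essay estimate assumes single marking, which is why the text argues the real saving is larger than the raw comparison suggests.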

The initial reluctance of staff to engage in the process appears to be a common problem with the introduction of problem-based approaches (Abrahamson, 1997; Savin-Baden, 2000). The role of staff involved in problem-based learning changes from instructor to facilitator, and staff may perceive this as somehow undermining their expertise (Little, 1997). However, we have been delighted by the response of the members of staff who have actually experienced the presentations, and we now rarely encounter difficulties in persuading members of staff to act as assessors.

These presentations assess not only factual knowledge, but also the ability of the students to work in groups, learn independently and present information effectively. This is well recognised by both staff and students, although the students do not always feel that this is a good thing (see Table 2).

We have been pleased by the in-depth understanding that many of the students seem to achieve through this exercise, and many of the students seem to appreciate this too. The majority of students would emphatically NOT have preferred to write an essay, so we suspect that our aim of reducing student workload, while retaining the need for extensive research, has been achieved.

It is hard to assess whether it is the use of group work, problem based learning or both, which encourages the students to take a deeper approach to their learning. However, we are content that most (although clearly not all) are doing so.

Several students cited the fact that this was a different method of assessment as one of the best things about the exercise. Some also stated that it was easier than an MCQ, because they did not have to revise the whole module (see Table 1). It seems that the students do appreciate the provision of this different form of assessment, albeit for a variety of reasons.

The presentations clearly provide practice in oral presentation skills, and this is something that is particularly appreciated by many students.

The responses from the students show that most cannot gain all the information that they require from their lecture notes, and thus are required to engage with material not presented in the standard lecture format. We were initially concerned that some students might have trouble finding sufficient resources from which to obtain the information, but this does not appear to be a major problem, with most students reporting that access to resources was adequate. However, some students did comment that it seemed easier to find information for some topics than for others.

We were particularly gratified that many students feel that they can, and do, learn from their own studies and from each other’s presentations. Early in the course we tend to find that many students are insecure about information that they have found for themselves, or that they obtain from their colleagues rather than from a “teacher”: they want to know if it is the “right answer”, or if there is any information that they have missed. The experience of doing the problem-based presentations seems to be going some way towards addressing this problem, probably by making the students more self-reliant and fostering a deeper approach to learning. Some students, however, would still like to be tested by an MCQ, to make sure that they have got all the “relevant” information, or to have the presentations as a “standard anatomy teaching session”. We hope that the integration of more student-centred and problem-based learning into the course will gradually wean these students off their dependence on the “teacher”.

While most students seem to have enjoyed the group work, some clearly feel disadvantaged by it, and some groups are obviously dysfunctional, with one or more members contributing very little. This naturally results in conflict and a breakdown in the group dynamic. We have mixed feelings about this situation: it is unfair for students to have to do extra work on behalf of other group members, who may then benefit from a good group mark. However, we do not wish to make the task more prescriptive by timetabling compulsory slots for groups to meet, as some of the element of responsibility for self-directed learning would then be lost. One possible solution would be to introduce a component of peer assessment into the group mark, as suggested by two students (see Table 3); after all, it is the members of the group who know best how much was contributed by each person.

The success of these problem-based, assessed presentations has led us to introduce them into other courses that we run. They are currently being used to good effect in a module on craniofacial biology for first year dentists and in the first year pharmacology module for science students.

References

  • Abrahamson, S. (1997) Good planning is not enough. In The challenge of problem-based learning, ed. Boud, D. and Feletti, G. London: Kogan Page.
  • Brown, G. and Atkins, M. (1991) Effective teaching in higher education. London: Routledge.
  • Engel, C.E. (1997) Not just a method but a way of learning. In The challenge of problem-based learning, ed. Boud, D. and Feletti, G. London: Kogan Page.
  • Gibbs, G. (1992) Improving the quality of student learning. Bristol: Technical and Educational Services.
  • Glick, T.H. and Armstrong, E.G. (1996) Crafting cases for problem-based learning: experience in a neuroscience course. Medical Education 30: 24–30.
  • Little, S. (1997) Preparing tertiary teachers for problem-based learning. In The challenge of problem-based learning, ed. Boud, D. and Feletti, G. London: Kogan Page.
  • Race, P. and Brown, S. (1998) Making small group teaching work. In The lecturer’s toolkit. London: Kogan Page.
  • Savin-Baden, M. (2000) Problem-based learning in higher education: untold stories. Buckingham, UK: SRHE and Open University Press.
