
Competency-based integrated practical examinations: Bringing relevance to basic science laboratory examinations

Pages e443-e447 | Published online: 20 Sep 2010

Abstract

Background: Practical examinations in subject-based curricula have been criticized for lack of relevance and clinical application. We developed competency-based integrated practical examinations (IPEs) for the first two years of our integrated curriculum, incorporating basic science principles with clinical relevance.

Aim: To bring relevance to basic science laboratory practical examinations by conducting competency-based IPEs.

Methods: IPEs were developed according to competency-based blueprinting for each integrated module. Clinical scenarios were used as triggers, followed by tasks pertaining to laboratory tests, relevant physical diagnosis, and ethical/professional aspects, utilizing standardized patients. Checklists were developed for standardized marking. A feedback questionnaire and two focus group discussions were administered to a random group of students from the first- and second-year classes. Faculty members’ feedback was also recorded on a questionnaire.

Results: Almost all the students agreed that the IPE was a useful experience. Eighty-nine percent agreed that it was a fair examination that caused less psychological stress. Eighty-two percent agreed that the IPE encouraged critical thinking and application of knowledge. However, students suggested better organization and longer station durations. Faculty members also valued the experience.

Conclusion: IPEs were well received and valued by both students and faculty members.

Introduction

Assessment is an essential part of the teaching and learning process (Harden 2001). Any teaching method needs to be matched by an appropriate assessment that relates to the objectives of the teaching. Students may or may not learn what is in the curriculum or what we teach, but they will learn what we assess them on, as “assessment drives learning” (Shimura et al. 2004). As medical curricula around the world move from discipline-based to integrated teaching and learning, the integration of assessment must necessarily follow (Hudson & Tonkin 2004). A change in instructional methods without changing assessment will not achieve the desired outcomes (Ahmed et al. 2007). Studies have shown that such a mismatch can eventually lead to failure of any adopted curriculum (Ghosh & Pandya 2008).

The limitations of clinical and practical examinations have long been recognized and have given rise to attempts at improvement (Stillman et al. 1977; Edelstein & Ruder 1990; Newble 1991). An early innovation in this regard is the objective structured clinical examination (OSCE), later extended to the objective structured practical examination (OSPE), described in 1975 and in greater detail in 1979 by Harden and his group from Dundee (Harden et al. 1975; Harden & Gleeson 1979). This method, with some modifications, has stood the test of time and has largely overcome the problems of conventional clinical examinations (Ananthakrishnan 1993). It appears that the assessment of problem-solving skills must occur in the context of a clinical problem, yet no unanimity exists on the tools to be used (Nendaz & Tekian 1999).

Skills assessment not only measures performance but also provides an indication of the effectiveness of learning strategies and the appropriateness of the content. A good assessment should be valid, reliable, and practicable, and should have educational value (Ahmed et al. 2007).

Miller acknowledged that no single method of assessment could provide all the data required for judgment of the delivery of professional services by a physician (Miller 1990).

The move from a discipline-based to a system-based integrated curriculum at Shifa College of Medicine, Islamabad, Pakistan in 2008 provided exciting opportunities to integrate the learning of basic and clinical sciences for medical undergraduates. After introducing integrated written assessments in the form of integrated multiple choice questions (MCQs) and short answer questions (SAQs), we aimed to develop a competency-based integrated practical examination (IPE) for the first and second years of our 5-year undergraduate curriculum. This type of assessment serves as a tool for testing multiple competencies related to skills (as in performance exercises), grounded knowledge (as in the OSPE; Howley 2004; Hudson & Tonkin 2004), and attitudes of the students.

The objective of this study is to share our experience of bringing relevance to basic science laboratory practical examinations by conducting competency-based IPEs, and to analyze their efficacy for the students.

Methods

The CanMEDS competency framework was utilized for developing the curriculum (Mickelson & MacNeily 2008). The level of achievement for these competencies was defined according to the year of undergraduate education. In our 5-year undergraduate curriculum, spirally integrated modules were developed in two spirals delivered over the first three years. The first spiral comprised system-based modules covering anatomy, physiology, biochemistry, ethics, professionalism, clinical relevance, and evidence-based medicine. The modules in the second spiral revisited the systems with emphasis on pharmacology, community health sciences, pathology, relevant clinical disciplines, ethics, professionalism, and evidence-based medicine. Modules revolved around longitudinal themes that were revisited with varying objectives throughout the curriculum. The last two years of the curriculum revolved around clinical clerkships in the disciplines of general internal medicine, family medicine, surgery, ophthalmology, otolaryngology, and obstetrics and gynecology.

Multidisciplinary modular teams developed objectives using the SMART (specific, measurable, attainable, relevant, and targeted) technique, linking the objectives to the CanMEDS competency framework. Longitudinal clinical themes covering the important clinical concepts were developed for all the modules. Appropriate learning strategies were employed, including small group discussions (mostly followed by a large group wrap-up session), problem-based learning, and self-directed learning. Modular delivery placed considerable emphasis on learning clinical skills for data gathering and physician–patient interaction. Students were given the opportunity to learn these skills in skills laboratory sessions, which faculty members from both the basic and clinical sciences facilitated utilizing models, simulations, and standardized and real patients.

In our traditional curriculum, practical examinations were discipline-based. The practical assessments revolved around laboratory techniques for biochemical and physiological testing, with very little relevance to the real-life practice of a physician.

IPEs were constructed for each module by team members from various disciplines, including anatomy, physiology, biochemistry, general medicine, and surgery. Content validity was ensured by developing a blueprint and through repeated discussions during planning meetings. Competencies related to performance skills, communication skills, reasoning skills, and humanistic qualities/professionalism were incorporated by developing an IPE construction template. A rating instrument listing these competencies was also developed, and the raters were given workshops on its use. Students were rated on all the competencies incorporated in an IPE on a three-category scale: unsatisfactory, satisfactory, and superior. Practical performance and clinical skills such as history taking, physical examination, and counseling were assessed in a clinical context, and relevant ethical and professional aspects were also addressed. Basic science knowledge was linked to clinical practice, and high construct validity was achieved by subjecting the draft of each station to extensive review. Faculty members involved in the assessment were briefed about the assessment process and the rating of students.

It was a multistation, objective, structured examination. A case scenario, video, image/photograph/model/specimen, or standardized patient was used as the trigger at each station, followed by three or four tasks relevant to the trigger. Each module examination had 8–12 stations. Each station was given equal time, after which the candidate was required to move on to the next station. The stations included tasks that were either interactive (for example, taking a history or performing an examination on a patient) or static (for example, identifying a slide or interpreting data).

A faculty member was present as an observer at each station where performance was required. A global rating scale was used to rate the students, with a checklist serving as a reminder.

Feedback

A feedback questionnaire was administered to a random group of 44 students from the first- and second-year classes, who were asked to give suggestions and to comment on the IPE. Their comments were recorded. A feedback pro forma was also administered to the faculty members.

Sample stations of IPE

Results

A Likert scale was used; for analysis, “strongly agree” and “agree” responses were merged, as were “strongly disagree” and “disagree” responses.

Of the students, 80–90% agreed with the organization and content of the tasks for the various stations. The structure and sequence of the IPE and the time given for each task were the areas where fewer than 50% of the students agreed (Table 1).

Table 1.  Students’ feedback on module organization and structure

Of the students, 70–80% agreed with the content of the tasks in each IPE station (Table 2).

Table 2.  Students’ feedback on the content of the tasks

Comments of the students

Most of the students liked the change from the traditional examination to the IPE but thought that it needed better organization; they also thought that the time given for each station was not enough for the performance of the tasks. Some suggested that the viva should be more extensive. Overall, they considered it an unbiased form of examination.

Some representative comments of the students regarding gains and concerns are shown in Table 3.

Table 3.  Students’ comments on IPEs

Comments of the faculty

Faculty found the IPE a good way of assessing the students’ application of knowledge and skills (Table 4).

Table 4.  Faculty members’ comments on IPEs

Discussion

In our study, the feedback from students was strongly positive. This is consistent with another study, in which a multistation IPE was ranked highly by the majority of students (Abraham et al. 2005).

The organization as well as the clinical relevance of the practical examination was highly appreciated by the students. Eighty-two percent of the students thought that it reflected relevance and helped create connections across various disciplines. This is consistent with a study from Kathmandu, although that study concerned the single discipline of pharmacology whereas our examination was integrated: when stations testing pharmacology skills were proposed for the Kathmandu University curriculum, the attitude of students toward this development was generally positive (Shankar & Mishra 2002). Another study by Boon et al. (2001), conducted at the University of Pretoria in South Africa, showed that the majority of students considered that the clinical case studies gave them a better understanding of the relevant basic sciences.

The majority of the students agreed that the IPE was a better way of assessing practical skills than the traditional practical examination. Eighty-nine percent thought it was a fair examination with less stress for the candidate. This is consistent with the results of a study by Hudson and Tonkin (2004), in which students admitted that the previous assessment method had encouraged test-directed studying. It was pleasing that a significant number of students thought that the examination was clinically relevant and a good preparation for later assessments and clinical practice. Students acknowledged that the integrated practical items had high clinical relevance.

The feedback on the knowledge areas covered in the IPE was strongly positive. Eighty-two percent of the students agreed that it covered a wide knowledge area, and seventy-nine percent appreciated that the clinical relevance helped to clarify concepts in the basic sciences. This is consistent with another study, in which students found that clinical cases and interactions with actual patients helped them synthesize and retain basic science knowledge (Muller et al. 2008), although we used standardized patients in the IPEs.

Our modules also introduced formal teaching of important competencies related to medical ethics, professionalism, and communication skills, which are generally ignored in traditional curricula (Verma et al. 1991; Nayar et al. 1995), using the CanMEDS competency framework (Mickelson & MacNeily 2008). History taking, physical examination, professionalism, and communication skills were assessed during IPEs, and ethics-related issues were also incorporated into the written assessment. The faculty in this study appreciated that the IPE assessed attitudes in integration with basic science knowledge. Students and faculty members both appreciated the competency-based integrated examinations; however, they felt that there was still room for improvement in the structure and organization of the IPEs and that the time allocated to each station was inadequate.

Running the IPE simultaneously for all the students would certainly be the best way to ensure a fair examination, but lack of space was a limitation, which we overcame by using the same venue for all the groups in succession. The stations and tasks were changed for each group to preserve the integrity of the examination, although this took longer and was hectic for the faculty and standardized patients. The concerns raised in the students’ feedback about the organization of the examination and the inadequate time at each station were rectified in subsequent examinations.

Students study more thoughtfully when they anticipate certain examination formats (Hakstian 1971), and changes in format can shift their focus to clinical rather than theoretical issues (Newble & Jaeger 1983). The introduction of competency-based IPEs is a novel way of assessing basic science practicals in the undergraduate medical curriculum, one that addresses the multiple competencies required of a good practicing physician.

In conclusion, competency-based IPEs were well received and valued by both students and faculty members.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the article.

References

  • Abraham RR, Upadhya S, Torke S, Ramnarayan K. Student perspectives of assessment by TEMM model in physiology. Adv Physiol Educ 2005; 29: 94–97
  • Ahmed A, Begum M, Begum S, Akhter R, Rahman N, Khatun F. Views of the students and teachers about the new curriculum (curriculum 2002) and their opinion on in-course assessment system. J Med 2007; 8: 39–43
  • Ananthakrishnan N. Objective structured clinical/practical examination (OSCE/OSPE). J Postgrad Med 1993; 39: 82–84
  • Boon JM, Meiring JH, Richards PA, Jacobs CJ. Evaluation of clinical relevance of problem oriented teaching in undergraduate anatomy at the University of Pretoria. Surg Radiol Anat 2001; 23: 57–60
  • Edelstein DR, Ruder HJ. Assessment of clinical skills using video tapes of the complete medical interview and physical examination. Med Teach 1990; 12: 155–162
  • Ghosh S, Pandya HV. Implementation of integrated learning program in neurosciences during first year of traditional medical course: Perception of students and faculty. BMC Med Educ 2008; 8: 44
  • Hakstian RA. The effects of type of examination anticipated on test preparation and performance. J Educ Res 1971; 64: 319–324
  • Harden RM. The learning environment and the curriculum. Med Teach 2001; 23: 335–336
  • Harden RM, Gleeson FA. Assessment of clinical competencies using an objective structured clinical examination (OSCE). ASME Medical Education Booklet No. 8. Dundee: ASME; 1979
  • Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. BMJ 1975; 1: 447–451
  • Howley LD. Performance assessment in medical education: Where we’ve been, and where we’re going. Eval Health Prof 2004; 27: 285–303
  • Hudson JN, Tonkin AL. Evaluating the impact of moving from discipline based to integrated assessment. Med Educ 2004; 38: 832–843
  • Mickelson JJ, MacNeily AE. Translational education: Tools for implementing the CanMEDS competencies in Canadian urology residency training. Can Urol Assoc J 2008; 2(4): 395–404
  • Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990; 65(9 Suppl): S63–S67
  • Muller JH, Jain S, Loeser H, Irby DM. Lessons learned about integrating a medical school curriculum: Perceptions of students, faculty and curriculum leaders. Med Educ 2008; 42: 778–785
  • Nayar U, Verma K, Adkoli BV, editors. Inquiry-driven strategies for innovation in medical education in India. New Delhi: AIIMS; 1995
  • Nendaz MR, Tekian A. Assessment in problem-based learning medical schools: A literature review. Teach Learn Med 1999; 11(4): 232–243
  • Newble DI. The observed long case in clinical assessment. Med Educ 1991; 25: 369–373
  • Newble DI, Jaeger K. The effect of assessments and examinations on the learning of medical students. Med Educ 1983; 17: 165–171
  • Shankar PR, Mishra P. Student feedback on the objective structured component of the practical examination in pharmacology. J Nepal Med Assoc 2002; 41: 368–374
  • Shimura T, Aramaki T, Shimizu K, Miyashita T, Adachi K, Teramoto A. Implementation of integrated medical curriculum in Japanese medical schools. J Nippon Med Sch 2004; 71(1): 11–16
  • Stillman PL, Brown DR, Redfield DL, Sabers DL. Construct validation of the Arizona clinical interview rating scale. Educ Psychol Meas 1977; 37: 1031–1038
  • Verma K, D’Monte B, Adkoli BV, editors. Inquiry-driven strategies for innovation in medical education in India. New Delhi: AIIMS; 1991
