Trend Articles

Progress testing 2.0: clinical skills meets necessary science

Article: 27769 | Received 02 Mar 2015, Accepted 10 Apr 2015, Published online: 05 May 2015

Abstract

Introduction

Progress testing has been widely used in medical schools to test scientific knowledge but has not been reported for assessing clinical skills.

Development

We designed a novel progress examination that included assessments of both clinical performance and underlying basic and social science knowledge. This Progress Clinical Skills Examination (PCSE) was given to 21 early medical students at the beginning and end of a 6-week pilot test of a new medical school curriculum.

Implementation

This examination was feasible for early students, easy to map to curricular objectives, and easy to grade using a combination of assessment strategies.

Future directions

Use of a PCSE is feasible for early medical students. As medical schools integrate clinical experience with underlying knowledge, this type of examination holds promise. Further data are needed to validate this examination as an accurate measure of clinical performance and knowledge.

Progress testing, with origins at the University of Maastricht, the Netherlands (Citation1), and the University of Missouri Kansas City, is defined as ‘a longitudinal, comprehensive examination of knowledge acquisition and retention’ (Citation2) and a ‘quality-controlled assessment tool for improving learning and teaching and the demonstration of educational standards’ (Citation3). It has also been described as an assay of ‘functional knowledge’ (Citation4). Progress tests sample the ‘complete knowledge domain’ (Citation3) expected of graduating medical students and are delivered multiple times over the course of the educational program.

Because progress testing assesses the whole knowledge domain, it can be used for unique curricula with unusual learning trajectories. The University of Maastricht needed a rational assessment strategy for its problem-based learning curriculum, one that would drive continuous, deep, learner-driven knowledge acquisition rather than test- or course-driven ‘binge learning’. The University of Missouri program, with entry into medical school directly from high school, required an assessment of students that took that uniqueness into account, and it sought to demonstrate the rigor of its educational strategy. Today, progress testing takes place all over the world and is described by a rich literature.

Advantages of progress testing described by that literature include the provision of integrated data for learners, faculty, and the overall curriculum over time (Citation5); early prediction of students requiring remediation (Citation2); and stability of the assessment process, as the same progress test can be administered as long as the same knowledge domain is desired, irrespective of the curricular strategy. Progress tests provide a ‘growth curve’ of knowledge that can illuminate curricular outcomes and support both learner and curriculum improvement. Albano et al. (Citation6) demonstrated the differing ‘kinetics’ of six different curricula and showed little difference in final knowledge acquisition through the use of a common progress test sequence. Importantly, Schuwirth et al. (Citation4) demonstrated that students experience less overall stress and anxiety when progress testing is employed as the major assessment strategy.

Progress testing has been applied to medical knowledge assessment but has not been described for clinical skills assessment. Nationally, medical educators are working to better integrate learning with actual performance through simulation or through work-based assessment. The movement toward entrustable professional activities (EPAs) (Citation7) requires the assessment of integrated performance and drives the need for integrated assessment strategies.

We have embarked on a curriculum renewal process characterized by the integration of ‘necessary science’ (defined as the foundational biological, psychological, and social sciences needed to function as a new resident) and clinical skills throughout the entire medical education program. For many of the same reasons that medical knowledge progress testing was developed in the United States and the Netherlands, we needed to envision an assessment system that would support integration as the highest value for both our faculty and our students.

Toward this end, we created a ‘Progress Clinical Skills Examination’ (PCSE) requiring demonstration of integrated clinical skills and the necessary science knowledge underpinning those skills. We administered our PCSE twice as a pilot with early learners. Little has been published about the use of a clinical skills exam with early medical students, and its feasibility in this population was not clear. We describe the development, implementation, lessons learned, and future directions of this PCSE.

Development

Setting

The student group for the PCSE was drawn from another pilot study, which included a 6-week trial of an early clinical experience (ECE). Our student sample was determined by the needs of this pilot. Of 21 students (9 male and 12 female), 7 had completed the first year of medical school, 9 had not yet matriculated, and 5 had completed a post-baccalaureate program. This pilot group was selected to represent a broad range of academic backgrounds and a range of previous clinical experience. All 21 students participated in the PCSE at the beginning and at the end of the ECE pilot test.

Our planned new curriculum is built on a framework of approximately 120 chief complaints and concerns (C3) topics, which define the competencies expected of our graduates. The end-competency template for each C3 is divided into three sections. The first section includes data gathering, problem identification and synthesis, and management. The second section includes the necessary science underpinning each clinical skill. The last section details the complexities and challenges we expect graduates to be able to identify and explore.
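
To make the three-part structure of the end-competency template concrete, the sketch below models a single C3 entry as a simple data structure. This is an illustration only; the class and field names are hypothetical, and the actual templates are curriculum documents rather than code.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class C3Template:
        """Hypothetical model of one chief complaint/concern (C3) end-competency template."""
        topic: str                                                   # the presenting complaint or concern
        clinical_skills: List[str] = field(default_factory=list)    # data gathering, problem identification/synthesis, management
        necessary_science: List[str] = field(default_factory=list)  # foundational biological, psychological, and social science
        complexities: List[str] = field(default_factory=list)       # challenges graduates should be able to identify and explore

    # The full curriculum framework would hold roughly 120 such templates.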

Design of PCSE

The PCSE was designed as a multistation objective structured clinical examination (OSCE). In each station, students had 20 min with a standardized patient, family member, and/or healthcare team member, followed by 10 min to answer associated essay questions. The PCSE was blueprinted so that each station was linked to a specific C3 and each included assessments linked to all three parts of the end-competency template (see Table 1).

Table 1 Blueprint for PCSE

Assessments for the PCSE included checklists completed by standardized patients and essay questions graded by faculty. We developed the essay questions to assess necessary science knowledge based on the relevant C3 end-competency template. The questions sampled a broad range of content, including physiology, anatomy, biochemistry, public health, safety science, ethics, pharmacology, and epidemiology. Grading rubrics for each essay question were developed by faculty using key concepts and/or key words.
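
As an illustration of how a key-word/key-concept rubric can be applied to an essay answer, the sketch below awards one point per rubric concept whose key words appear in the response. The function, rubric contents, and scoring rule are hypothetical simplifications; in the pilot, faculty applied the rubrics by hand.

    from typing import Dict, List

    def score_essay(answer: str, rubric: Dict[str, List[str]]) -> int:
        """Award one point for each rubric concept supported by at least one key word."""
        text = answer.lower()
        return sum(
            1 for keywords in rubric.values()
            if any(kw.lower() in text for kw in keywords)
        )

    # Hypothetical rubric for a necessary science question on a chest pain case.
    rubric = {
        "coronary perfusion": ["coronary artery", "myocardial oxygen"],
        "risk factors": ["hypertension", "smoking", "diabetes"],
    }
    print(score_essay("Smoking and hypertension reduce myocardial oxygen supply ...", rubric))  # -> 2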

We also received feedback from students about the PCSE as part of the pilot test evaluation.

The data from this project were reviewed by the Michigan State University Institutional Review Board and were determined to be exempt.

Implementation

A four-station clinical skills exam was given to all students at the beginning of the 6-week pilot. Students interviewed and examined standardized patients and then answered three to six essay questions at a computer kiosk. The first exam occurred in a single morning. At the end of the pilot, the same four cases were used, along with four new cases, and the number of necessary science questions was limited to three per case. The second exam occurred in a long afternoon.
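
As a rough feasibility check, the arithmetic below derives per-student seat time from the figures above (a 20-min encounter plus 10 min of written questions per station, four stations in the first exam and eight in the second), assuming the second exam kept the same per-station timing. It is illustrative only, not the actual exam schedule.

    ENCOUNTER_MIN = 20  # standardized patient encounter per station
    WRITTEN_MIN = 10    # essay questions per station

    def seat_time(stations: int) -> int:
        """Minutes a single student spends rotating through the exam."""
        return stations * (ENCOUNTER_MIN + WRITTEN_MIN)

    print(seat_time(4))  # 120 min: consistent with a single morning
    print(seat_time(8))  # 240 min: consistent with a long afternoon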

The students were able to participate in the exam without difficulty, even though several had not started medical school and few had any clinical experience. All students completed the essay questions in the time allotted.

Three faculty members graded the essay portion of the PCSE. Using the rubrics, a single faculty member could grade all 21 students’ essays for a single station in less than an afternoon.

Students valued the experience; one student commented ‘The PCSE helped me get ready for the clinic. I got less nervous. It was like practice. The SP's were really good’. A faculty member commented ‘The surprising thing was that many students felt that the PCSE prepared them for clinic – an unintended consequence’. Although the PCSE was designed as an assessment of knowledge and skills, the exam was seen by students as another opportunity to practice for clinical care.

The delivery of the exam differed between the two administrations. Not only did the second iteration have more stations, it was also delivered after a multiple-choice examination in the morning and after a busy 6-week pilot. Given these limitations, comparison of student performance data has limited utility. However, students had no problems completing the longer PCSE in the time allotted.

Future directions

As medical education moves toward a competency-based approach in which integrated clinical performance is the goal, the inclusion of a clinical skills component in progress examinations will be necessary to match assessments to curricular objectives. Our experience demonstrated that administration of the PCSE to early medical students is feasible. An important feature of this examination was the integration of the clinical encounter with essay questions that elaborate the underlying necessary science, reflecting the integrated nature of the curriculum itself. Since this initial PCSE pilot test, we have successfully administered the examination to a sample of first- through fourth-year students at our institution to demonstrate its validity and its response to curricular effort. These administrations are providing baseline data as we transition to our new curriculum. As medical education moves to a more integrated approach to curriculum and assessment, this type of examination holds promise.

Conflict of interest and funding

The authors have not received any funding or benefits from industry or elsewhere to conduct this study.

Ethical approval

This study was determined exempt by the Michigan State University Institutional Review Board.

Acknowledgements

Many thanks to the 21 students in the MSU ECE Pilot for their forbearance; Henry Barry, Francesca Dwamena, Kerry Polizzi, and James Harkema for their review of case content; Kim Lyth and Ann Moeller in the CSTA; Mary Kay Smith, Kimberly Patterson, Matt Koss, and Peter Redovich for their help with implementation; Julie Phillips for her review of the manuscript; and Aron Sousa for his leadership and support.

References

  1. Van der Vleuten CP, Verwijnen G, Wijnen W. Fifteen years of experience with progress testing in a problem-based learning curriculum. Med Teach. 1996; 18: 103–9.
  2. Arnold L, Willoughby TL. The quarterly profile examination. Acad Med. 1990; 65: 515–16.
  3. Wrigley W, Van der Vleuten CP, Freeman A, Muijtjens A. A systemic framework for the progress test: strengths, constraints and issues: AMEE Guide No. 71. Med Teach. 2012; 34: 683–97.
  4. Schuwirth LW, Van der Vleuten CP. The use of progress testing. Perspect Med Educ. 2012; 1: 24–30.
  5. Coombes L, Ricketts C, Freeman A, Stratford J. Beyond assessment: feedback for individuals and institutions based on the progress test. Med Teach. 2010; 32: 486–90.
  6. Albano MG, Cavallo F, Hoogenboom R, Magni F, Majoor G, Manenti F, et al. An international comparison of knowledge levels of medical students: the Maastricht Progress Test. Med Educ. 1996; 30: 239–45.
  7. Association of American Medical Colleges. Core entrustable professional activities for entering residency: curriculum developer's guide. AAMC; 2014. Available from: https://members.aamc.org/eweb/upload/Core EPA Curriculum Dev Guide.pdf [cited 1 December 2014].