Research Article

Using a Delphi process to establish consensus on emergency medicine clerkship competencies

Pages e333-e339 | Published online: 24 May 2011

Abstract

Background: Currently, there is no consensus on the core competencies required for emergency medicine (EM) clerkships in Canada. Existing EM curricula have been developed through informal consensus or local efforts. The Delphi process has been used extensively as a means for establishing consensus.

Aim: The purpose of this project was to define core competencies for EM clerkships in Canada, to validate a Delphi process in the context of national curriculum development, and to demonstrate the adoption of the CanMEDS physician competency paradigm in the undergraduate medical education realm.

Methods: Using a modified Delphi process, we developed a consensus amongst a panel of expert emergency physicians from across Canada utilizing the CanMEDS 2005 Physician Competency Framework.

Results: Thirty experts from nine different medical schools across Canada participated on the panel. The initial list consisted of 152 competencies organized in the seven domains of the CanMEDS 2005 Physician Competency Framework. After the second round of the Delphi process, the list of competencies was reduced to 62 (59% reduction).

Conclusion: This study demonstrated that a modified Delphi process can result in a strong consensus around a realistic number of core competencies for EM clerkships. We propose that such a method could be used by other medical specialties and health professions to develop rotation-specific core competencies.

Introduction

Undergraduate emergency medicine (EM) teaching in Canada is heterogeneous. There have been recommendations for national standards in this area (Frank et al. Citation2008). Recommendations for a fourth-year medical student EM curriculum have been published in the United States (Manthey et al. Citation2006), and the International Federation for Emergency Medicine recently described an international generic model curriculum for medical student education in EM (Hobgood et al. Citation2009). However, neither set of recommendations was developed using validated methods defined a priori, and both prescribe EM curricula for the overall training program rather than for a specific clinical experience. Canadian programs employ a clinical clerkship model wherein students rotate through various clinical placements in their senior year(s), so there is an imperative to select relevant learning objectives from the overall program curriculum for each specific, time-limited clinical rotation. We are unaware of any systematically derived curriculum for such a rotation in EM.

A core competency is defined as the essential knowledge, skill, or attitude needed to succeed in a given field. Defining learning objectives and core competencies for the clinical clerkship has become a priority in medical education (Burke & Brodkey Citation2006). The Liaison Committee on Medical Education (LCME) is the accrediting authority for programs leading to the MD degree in the United States and Canada. The LCME requires medical schools to specify the objectives of their educational program and to ensure that all those responsible for teaching medical students are aware of these objectives. These objectives are to be stated in terms that allow assessment of student progress in developing the competencies that the profession and the public expect of a physician (Liaison Committee on Medical Education Citation2010). There are already widely recognized definitions of the knowledge, skills, and attitudinal attributes appropriate for a physician, including those described in the ACGME Outcome Project (Accreditation Council for Graduate Medical Education Citation2010) and the CanMEDS 2005 Physician Competency Framework (Frank Citation2005). The Association of American Medical Colleges has defined clinical skills curricula for undergraduate medical education (Association of American Colleges Citation2005). Implementing these recommendations requires educators to decide which items from these general lists are appropriate for a given educational unit (such as a course or clinical rotation). Most undergraduate and postgraduate medical education programs have developed national core competencies for their respective overall programs, if not for specific experiences within them. Although all programs are encouraged to design curricula around local resources and constraints, a national reference list of core competencies for specific rotations can help ensure that nothing is missed, that realistic priorities are set, and that political discussions are well informed.

The CanMEDS 2005 Physician Competency Framework was developed by the Royal College of Physicians and Surgeons of Canada as a guide to the essential abilities physicians need for optimal patient outcomes. It consists of seven roles or thematic groups of competencies: medical expert, communicator, collaborator, manager, health advocate, scholar, and professional (Frank Citation2005). Each role can be further broken down into smaller components for teaching, learning, and assessment.

The Delphi process has been used extensively in social sciences and health-related research. It is a group facilitation technique that seeks consensus on the opinions of experts through a series of structured questionnaires (rounds) in an iterative, multistage process designed to transform individual opinion into group consensus (Hasson et al. Citation2000).

The purpose of this project was to define the core competencies for EM clerkships in Canada, to validate a Delphi process in the context of national curriculum development, and to demonstrate the adoption of the CanMEDS physician competency paradigm in the undergraduate medical education realm.

Methods

Using a modified Delphi process, we developed a consensus amongst a panel of expert EM undergraduate educators from across Canada utilizing the CanMEDS 2005 Physician Competency Framework (Frank Citation2005).

Questionnaire development

A convenience sample of experts in EM undergraduate education (the project investigators, n = 6) developed a comprehensive list of competencies for the EM clerkship. This list was developed by consulting multiple sources: a review of the published literature and the gray literature (internet) for existing EM curricula and course objectives, and a review of the course curricula of nine Canadian medical school EM clerkships until saturation was reached (Appendix 1). The list of competencies was organized into the domains described in the CanMEDS 2005 Physician Competency Framework (Frank Citation2005) by consensus of two study investigators (Rick Penciner and Glen Bandiera). This comprehensive list underwent a series of edits and additions, with a view to inclusivity, until there was consensus amongst the expert panel.

A questionnaire was developed utilizing a web-based survey tool (SurveyMonkey®, available at www.surveymonkey.com). Each item consisted of a description of the competency followed by a 7-point Likert scale indicating the strength of agreement that the competency should be included in a 4-week EM clerkship rotation (as distinctly opposed to what students should know about EM upon graduation). Detailed instructions for panel members on what to consider when rating each competency were developed (Appendix 2). In addition, panel members were given the opportunity to add competencies at their discretion. The questionnaire was piloted with nine EM educators, and final edits were made based on the feedback received.

Round 1

A larger, representative panel of expert EM undergraduate educators in Canada was identified. The EM undergraduate coordinator/director at each English-speaking medical school in Canada (n = 13) was invited by email to participate. Using a "snowball" technique (Valente & Pumpuang Citation2007), we asked each EM undergraduate coordinator/director to identify four additional "experts" at their university and to provide the principal investigator (PI) with their names and email addresses. An expert was defined as an "emergency medicine physician with expertise and/or interest in emergency medicine undergraduate education". The PI contacted potential participants by email to confirm interest and agreement to participate in the project. Two reminder invitations were sent at weekly intervals. Participation in the panel was voluntary. The process was conducted in a "quasi-anonymous" manner: respondents' identities were known only to the PI, to allow for reminders and the provision of feedback in subsequent rounds, while participants' judgments and opinions remained strictly anonymous to the other members of the expert panel. On March 4, 2010, an email was sent to each member of the panel with a link to an online questionnaire. Two reminder emails were sent at weekly intervals.

Round 2

Competencies rated 6 or 7 were categorized as "must include", 4 or 5 as "for consideration", and 1, 2, or 3 as "not include". The PI ranked the competencies from the highest to the lowest percentage of "must include" responses within each CanMEDS competency domain. Competencies rated "not include" by 75% or more of the respondents were eliminated from the list. Panel members were invited by email 22 days after the first questionnaire to complete a second online questionnaire, in which each panel member was asked to rate each competency on a binary scale of "must include" or "not include". The panel's aggregate responses for each competency from round 1 were noted beside each competency as percentages of "not include", "for consideration", and "must include". Two reminder emails were sent at weekly intervals.
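As a concrete illustration, the categorization, elimination, and ranking rules described above can be sketched in a few lines of Python. This is a hypothetical example: the competency names and ratings below are invented for illustration and are not study data.

```python
from collections import Counter

# Hypothetical round-1 ratings (7-point Likert scale) per competency.
ratings = {
    "Obtain a focused history": [7, 6, 7, 5, 6, 7],
    "Interpret an ECG": [6, 4, 5, 7, 6, 6],
    "Describe ED funding models": [2, 1, 3, 2, 3, 1],
}

def categorize(rating):
    """Map a 7-point Likert rating onto the three round-1 categories."""
    if rating >= 6:
        return "must include"
    if rating >= 4:
        return "for consideration"
    return "not include"

def summarize(scores):
    """Percentage of respondents in each category for one competency."""
    counts = Counter(categorize(r) for r in scores)
    n = len(scores)
    return {cat: 100 * counts[cat] / n
            for cat in ("must include", "for consideration", "not include")}

summaries = {name: summarize(scores) for name, scores in ratings.items()}

# Eliminate competencies rated "not include" by 75% or more of respondents.
retained = {name: s for name, s in summaries.items()
            if s["not include"] < 75}

# Rank survivors from highest to lowest percent "must include"
# (in the study, this ranking was done within each CanMEDS domain).
ranked = sorted(retained, key=lambda n: retained[n]["must include"],
                reverse=True)
```

The aggregate percentages in `summaries` correspond to the feedback shown beside each competency on the second questionnaire.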

Evaluation method

Using descriptive statistics, we rank-ordered the competencies by the percentage of "must include" responses. Competencies for which 75% or more of respondents responded "must include" were included in the final list.
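The final inclusion criterion amounts to a simple threshold filter on the round-2 binary responses; a minimal sketch, again with invented data rather than study data:

```python
# Hypothetical round-2 binary responses; True means "must include".
round2 = {
    "Obtain a focused history": [True] * 28 + [False] * 2,   # ~93% must include
    "Interpret an ECG": [True] * 20 + [False] * 10,          # ~67% must include
}

THRESHOLD = 75  # percent of respondents required for final inclusion

final_list = [name for name, votes in round2.items()
              if 100 * sum(votes) / len(votes) >= THRESHOLD]
```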

External review

Upon completion of data collection, the project was reviewed by four external reviewers. The reviewers (none of whom participated in the study, and who were selected by the investigators based on their national reputations as experts in curriculum development) included three EM educators/researchers and one family medicine undergraduate educator/researcher. Reviewers were asked to provide a brief narrative commenting on whether the project results were valid, useful, and applicable to the EM clerkship, and whether the methodology was appropriate to inform competency and curriculum development.

Results

Delphi process

Nine clerkship directors responded to the initial invitation, providing the names of 33 potential participants. Of these, 30 consented to participate on the expert panel; one panel member did not respond to the second questionnaire. There was representation from nine different medical schools across Canada (range = 2–5 participants per school), with six provinces represented.

The expert panel (n = 30) was diverse in terms of educational background and expertise (Table 1). The initial list consisted of 152 competencies organized in the seven domains of the CanMEDS framework (Table 2). After round 2 of the Delphi process, the list was reduced from 152 to 62 competencies (59% reduction).

Table 1.  Profile of expert panel (n = 30)

Table 2.  Number of competencies by Delphi round

The seven domains of the CanMEDS 2005 framework are: medical expert, communicator, collaborator, manager, health advocate, scholar, and professional. A sample of the "Medical Expert" competencies can be found in Tables 3 and 4.

Table 3.  Example of “Medical Expert” (General) competencies

Table 4.  Example of “Medical Expert” (presenting problem) competencies. This domain demonstrates an approach to patients presenting to the ED with the following problems (including basic differential diagnosis, initial investigations, and initial treatments)

External review

The external reviewers commented on four areas of the project: face validity of the results, applicability of the results, usefulness of the results, and whether the methodology was appropriate to inform competency and curriculum development.

The reviewers believed the project results had face validity because a comprehensive literature search had been conducted and the right experts had been identified for the panel. One reviewer commented that "the unique educational considerations of the emergency department learning environment have been considered". Two reviewers noted that there was no representation on the panel from outside EM educators (such as students and other medical educators).

The results were considered applicable to EM clerkships across Canada. The inclusion of educators from across Canada and from a variety of educational environments and backgrounds ensured the findings were generalizable. There was concern regarding the absence of panel members from the province of Quebec, given its unique language and culture within Canada, and regarding the limited representation on the panel from community emergency departments.

The reviewers commented that the results were useful since they were mapped onto the CanMEDS 2005 Framework, which is increasingly informing undergraduate medical education. The construct of adapting consultant-level competency models to design undergraduate curricula specific to given rotations thus seems to have robust face validity. The reviewers believed that the methodology employed was appropriate to inform competency and curriculum development, and that a Delphi process was ideal for developing consensus amongst experts. One reviewer commented that "there was no representation from learners or recent graduates from medical school."

Discussion

This study demonstrated that a modified Delphi process is an effective method of establishing a national consensus on core competencies for EM clerkships. Consensus methods in medical and health services research provide a means of harnessing the insights of appropriate experts to enable decisions to be made. Commonly used consensus methods include brainstorming, the nominal group technique, the consensus development conference, and the Delphi technique (Jones & Hunter Citation1995). We employed the Delphi process because it was considered the most rigorous and practical means of achieving consensus amongst a geographically dispersed group of experts. The Delphi process has been used extensively in social science and health-related research; it allows researchers to seek out information that may generate a consensus within the respondent group. It is increasingly being used in health professional education research, including curriculum and competency development in nursing (Barton et al. Citation2009), dentistry (Fried & Leao Citation2007), and pharmacology (Walley & Webb Citation1997). The Delphi process has also been used in curriculum development in medical education, including continuing education (Esmaily et al. Citation2008) and postgraduate education (Flynn & Verma Citation2008), as well as in undergraduate medical education in palliative medicine (Paes & Wee Citation2008), dermatology (Clayton et al. Citation2006), critical care (Perkins et al. Citation2005), and family medicine (Hueston et al. Citation2004). We found no evidence in the English-language literature of the Delphi process being used for curriculum and competency development in undergraduate EM.

There is no universally accepted, uniform process for the use of a Delphi technique (Hasson et al. Citation2000). We chose to develop a questionnaire with a finite list of options, based on the literature and existing curricula in Canada and developed by consensus of a smaller group of experts. The initial list of competencies was overly inclusive by design. The first questionnaire employed a 7-point Likert scale that required participants to rate the strength of their agreement that each competency should be included. This allowed a response from each expert without a final commitment to inclusion or exclusion of a given competency. We decided a priori to eliminate from the second questionnaire any competency rated 1, 2, or 3 by greater than 75% of respondents; no competencies met this criterion, and all were carried forward to the second questionnaire. However, the 7-point Likert scale ratings produced a range of responses, allowing for an initial ranking of the competencies. These aggregate rankings were fed back to the panel on the second questionnaire. One of the key elements of the Delphi process is to report the group's collective opinion after each round, allowing each panel member to consider revising their opinion in light of the forming group consensus. On the second questionnaire the rating scale was changed to a binary "not include" or "must include", because we ultimately wanted a commitment from each expert on this simple question. We believe that changing the rating scale allowed for a significant reduction of the list of competencies from 152 to 62 (59% reduction). The classic Delphi technique used four rounds; however, some evidence suggests that two or three rounds are preferable (Hasson et al. Citation2000). Given limitations of time and resources, we limited the Delphi process to two rounds.

There is no clear definition in the literature of an "expert" for the purposes of a Delphi process. We defined our experts as EM physicians with interest and/or expertise in EM undergraduate education. We employed purposive sampling to ensure a panel of experts from all regions of Canada with medical schools. By utilizing a "snowball" technique for recruitment, we ensured a representative panel was formed: each participant was identified as an expert either by virtue of their formal position (EM undergraduate education director at their respective medical school) or by a colleague's nomination. Furthermore, by agreeing to participate, panel members demonstrated a level of interest in the topic. An expert panel usually consists of 15–30 participants (Linstone & Turoff Citation1975), and increasing the group size beyond 30 has seldom been found to improve the results (Fink et al. Citation1984). Our study employed 30 experts. By pre-recruiting interested individuals prior to administration of the questionnaires, we achieved a 100% response rate to the first questionnaire, with only one panel member not responding to the second.

The Delphi process is an effective method of collecting opinion and determining consensus. However, there is no universally agreed level of consensus; the level used depends on the sample size, the aim of the research, and available resources. Reported levels of consensus in the literature range from 51% to 80% (Hasson et al. Citation2000). The level of consensus we sought was 75% of respondents supporting that a competency "must be included". We chose a high level of consensus to ensure a manageable, core list of competencies that a medical student could encounter during a time-limited EM clerkship rotation.

There has been an active debate in the literature about the validity of the Delphi process (Jones & Hunter Citation1995). The validity of our study was enhanced by recruiting a representative panel of experts from across Canada: our experts represented all but two of the provinces with medical schools, and their profiles were diverse in terms of educational background. Our high response rates and the use of successive rounds also increased the validity of the results.

There are several widely recognized definitions of the knowledge, skills, and attitudinal attributes appropriate for a physician, including those described in the AAMC's Medical School Objectives Project, the general competencies of physicians resulting from the collaborative efforts of the ACGME, and the physician roles summarized in the CanMEDS 2005 Physician Competency Framework (LCME). In Canada, the CanMEDS 2005 Physician Competency Framework describes the generic abilities required of a practicing physician. This framework of competencies was originally adapted for use at the postgraduate medical specialist level. We chose to utilize the CanMEDS Framework because it is increasingly being used to inform undergraduate medical education and is familiar to most educators in Canada. We demonstrated that the CanMEDS Framework is a valid and practical framework for structuring competencies and curricula at the undergraduate level.

It is no surprise that the number of competencies for the "Medical Expert" role (43) far exceeded that for any other role and, in fact, significantly outnumbered all the other roles combined (19). This reflects the traditional teaching paradigm of medical schools and the fact that medical knowledge and skills are still seen by teachers as the core of the curriculum for medical students. More programs are being developed for medical schools that address how to incorporate the non-"Medical Expert" roles into the curriculum. Of note, the expert panel did not include a single "Manager" competency in the final list. This may reflect a belief that the role is not appropriate for students in undergraduate education; it may also reflect an exercise of priorities in what should be included in a time-limited EM clerkship.

There are a number of limitations to this study. The Delphi process itself has limitations, including concerns about reliability (Hasson et al. Citation2000); inherent in the anonymity of the process is a danger of reduced accountability for the opinions expressed. Our expert panel had no representation from two provinces that have medical schools and did not include representation from French-language medical schools in Canada. The panel included only four physicians from community hospitals (teaching and non-teaching). This panel format was designed to include community representation while recognizing that most EM clerkships in Canada (and expertise in the practicalities of curriculum delivery) are concentrated in Academic Health Science Centres. Furthermore, since undergraduate training in EM is typically oriented toward acquiring general fundamental competencies, our panel included physicians who practiced in a variety of adult and adult/pediatric settings. Since emergency physicians practicing exclusively pediatric EM are almost always specialty or sub-specialty certified in this area and see a unique practice population, we chose to exclude them from this undergraduate study. Our study utilized the CanMEDS 2005 Physician Competency Framework (Frank Citation2005) to organize the competencies. Although this framework is increasingly informing undergraduate education, it has not been adopted by all medical schools in Canada; our study provides evidence of the face validity of such an approach.

Conclusions

This study demonstrated that a modified Delphi process is an effective method of establishing a national consensus in the development of core competencies for EM clerkships. These results are applicable and useful to EM clerkship programs in Canada. We propose that such a method could be used by other medical specialties and health professions to develop rotation-specific core competencies.

Acknowledgments

The authors are grateful to the external reviewers: Jonathan Sherbino, McMaster University; Farhan Banji, McGill University; Eddy Lang, University of Calgary and Risa Freeman, University of Toronto. They also gratefully acknowledge the members of the expert panel including: Simon Field, Connie LeBlanc, John Ross, Dalhousie University; Danielle Blouin, Jaelyn Caudle, Jim Landine, Queen's University; Randy Cunningham, Ryan Oland, Curtis Rabuka, Andrew Stagg, University of Alberta; Nancy Austin, Mike Mostrenko, Patrick Rowe, University of British Columbia; Laurie-Ann Baker, David Lendrum, University of Calgary; Albert Buchel, Zoe Oliver, University of Manitoba; Jason Frank, Brian Weitzman, University of Ottawa; Nadim Lalani, Patrick Ling, University of Saskatchewan, Shirley Lee, Rahim Valani, Stella Yiu, University of Toronto.

Declaration of interest: The authors report no conflicts of interest.

References

  • Accreditation Council for Graduate Medical Education 2010. ACGME Outcome Project. [Retrieved 2010 August 16]. Available from: http://www.acgme.org/outcome
  • Association of American Colleges 2005. Recommendations for Clinical Skills Curricula for Undergraduate Medical Education 2005. [Retrieved 2010 August 9]. Available from: https://services.aamc.org/Publications/showFile.cfm?file=version56.pdf&prd_id=141&prv_id=165&pdf_id=56
  • Barton AJ, Armstrong G, Preheim G, Gelmon SB, Andrus LC. A national Delphi to determine developmental progression of quality and safety competencies in nursing education. Nurs Outlook 2009; 57(6):313–322
  • Burke MJ, Brodkey AC. Trends in undergraduate medical education clinical clerkship learning objectives. Acad Psychiatry 2006; 30(2):158–165
  • Clayton R, Perera R, Burge S. Defining the dermatological content of the undergraduate medical curriculum: A modified Delphi study. Br J Dermatol 2006; 155(1):137–144
  • Esmaily HM, Savage C, Vahidi R, Amini A, Zarrintan MH, Wahlstrom R. Identifying outcome-based indicators and developing a curriculum for a continuing medical education programme on rational prescribing using a modified Delphi process. BMC Med Educ 2008; 8:33
  • Fink A, Kosecoff J, Chassin M, Brook RH. Consensus methods: Characteristics and guidelines for use. Am J Public Health 1984; 74:979–983
  • Flynn L, Verma S. Fundamental components of a curriculum for residents in health advocacy. Med Teach 2008; 30(7):e178–e183
  • Frank JR, (Ed) 2005. The CanMEDS 2005 physician competency framework: Better physicians, better care. Ottawa: The Royal College of Physicians and Surgeons of Canada. [Retrieved 2010 August 9]. Available from: http://rcpsc.medical.org/canmeds/CanMEDS2005/CanMEDS2005_e.pdf
  • Frank JR, Penciner R, Upadhye S, Nuth J, Lee C. State of the nation: A profile of Canadian emergency medicine clerkships 2007. Can J Emerg Med 2008; 10(3):266
  • Fried H, Leao AT. Using Delphi technique in a consensual curriculum for periodontics. J Dent Educ 2007; 71(11):1441–1446
  • Hasson F, Keeney S, McKenna H. Research guidelines for the Delphi survey technique. J Adv Nurs 2000; 32(4):1008–1015
  • Hobgood C, Anantharaman V, Bandiera G, Cameron P, Halperin P, Holliman J, Jouriles N, Kilroy D, Mulligan T, Singer A. International Federation for Emergency Medicine model curriculum for medical student education in emergency medicine. Can J Emerg Med 2009; 11(4):349–354
  • Hueston WJ, Koopman RJ, Chessman AW. A suggested fourth-year curriculum for medical students planning on entering family medicine. Fam Med 2004; 36(2):118–122
  • Jones J, Hunter D. Consensus methods for medical and health services research. BMJ 1995; 311:376–380
  • Liaison Committee on Medical Education 2010. LCME Accreditation Standards: Educational Program for the MD Degree: Educational Objectives. [Retrieved 2010 August 9]. Available from: http://www.lcme.org/functionslist.htm#educational program
  • Linstone HA, Turoff M. The Delphi method: Techniques and applications. Boston, MA: Addison-Wesley; 1975
  • Manthey DE, Coates WC, Ander DS, Ankel FK, Blumstein H, Christopher TA, Courtney JM, Hamilton GC, Kaiyala EK, Rodger K, et al. Report of the task force on national fourth year medical student emergency medicine curriculum guide. Ann Emerg Med 2006; 47:E1–E7
  • Paes P, Wee B. A Delphi study to develop the Association for Palliative Medicine consensus syllabus for undergraduate palliative medicine in Great Britain and Ireland. Palliat Med 2008; 22(4):360–364
  • Perkins GD, Barrett H, Bullock I, Gabbott DA, Nolan JP, Mitchell S, Short A, Smith CM, Smith GB, Todd S, et al. The acute care undergraduate teaching (ACUTE) initiative: Consensus development of core competencies in acute care for undergraduates in the United Kingdom. Intensive Care Med 2005; 31(12):1627–1633
  • Valente TW, Pumpuang P. Identifying opinion leaders to promote behaviour change. Health Educ Behav 2007; 34(6):881–896
  • Walley T, Webb DJ. Developing a core curriculum in clinical pharmacology and therapeutics: A Delphi study. Br J Clin Pharmacol 1997; 44(2):167–170

Appendix 1

Sources for first questionnaire

  1. Dalhousie University, Department of Emergency Medicine, Clinical Competencies/Encounters Documentation Form.

  2. Dalhousie University, Emergency Medicine Clerkship Objectives.

  3. Queen's University, Emergency Medicine Clerkship Objectives.

  4. University of Alberta, Medical Student Emergency Medicine Rotation Course Goals.

  5. University of British Columbia Emergency Medicine Core Year 3 Clerkship Objectives.

  6. University of Calgary, Emergency Medicine Clerkship, Rotation Objectives.

  7. University of Manitoba, Emergency Medicine Clerkship Academic Objectives.

  8. University of Ottawa, Emergency Medicine Clerkship Goals and Objectives.

  9. University of Saskatchewan Emergency Medicine Clerkship Core Content.

  10. University of Western Ontario Emergency Medicine Clerkship Objectives.

  11. Manthey et al. (Citation2006).

  12. Hobgood et al. (Citation2009).

Appendix 2

Instructions to expert panel

WHAT TO CONSIDER WHILE COMPLETING THE QUESTIONNAIRE:

As you rate each competency, consider the following:

  1. That the level of training of the learner is a senior medical student NOT a resident.

  2. Which competencies should be taught DURING a time-limited 4-week Emergency Medicine clerkship rotation (as opposed to other rotations).

  3. Which competencies should be acquired during an Emergency Medicine clerkship rotation that ALL medical students should have upon graduation regardless of chosen specialty or career path.

  4. This rating is not what is currently done at your site, but rather what you think should be included in a national curriculum.

  5. Competencies may be acquired through exposure to real or simulated patients (e.g. clinical activities, workshops, simulation, online learning).
