Which pedagogical principles should clinical teachers know? Teachers and education experts disagree

Peter McLeod, MD, Yvonne Steinert, PhD, Colin Chalk, MD, Richard Cruess, MD, Sylvia Cruess, MD, Sarkis Meterissian, MD, Saleem Razack, MD & Linda Snell, MD
Pages e117-e124 | Published online: 03 Jul 2009

Abstract

Background: In a previous study, a group of non-clinician medical education experts identified 30 pedagogical principles, knowledge of which might enhance clinical teaching effectiveness.

Aims: To assess expert teachers’ perceptions of which basic pedagogical principles, if known and understood, would enhance their teaching effectiveness.

Method: We conducted an on-line Delphi consensus-building study with 25 expert clinical teachers who rated the importance to teaching effectiveness of each of the 30 principles.

Results: There was agreement between clinicians and PhD education experts on the importance of several of the principles, but there was major disagreement between the 2 groups for many principles, including those related to assessment and those relevant to clinical teachers’ day-to-day teaching activities.

Conclusions: The lack of concordance between clinical teachers and education experts with respect to how the 30 principles rank in importance may have serious implications for faculty development and for the design, development, and assessment of educational programs. Program directors and curriculum designers should exploit the strengths of both clinician and non-clinician educators to assure the success of educational programs.

Introduction

Clinical teachers are usually regarded as competent practitioners of their craft. Many develop teaching behaviors based on a combination of their own experiences as learners as well as general conceptions of teaching derived from personal experience and observation (Irby 1994), but few have studied the art and science of medical education (Misch 2002). For many years there has been a tacit assumption that expertise in clinical practice will translate into competence in clinical teaching (Hesketh et al. 2001; McLean 2001; Seabrook 2001), but studies suggest that mere content expertise is insufficient to guarantee teaching excellence (Chen & Ennis 1995; Darling-Hammond & Youngs 2002).

Successful clinical teachers are said to possess a special form of content-specific pedagogical knowledge which develops through the apprenticeship model of observation and experience and which facilitates the translation of content expertise into a form that can be readily understood and learned by students (Reynolds 1992). Their expertise relies on effective behaviors and strategies; that is, they have practical, ‘how to do it’ knowledge of teaching, but few understand the basic principles, theories and concepts of the teaching and learning process, or the ‘why’ of pedagogic behaviors (McLeod et al. 2003).

Several studies have addressed the peculiar irony that most clinical educators have never studied the art and science of pedagogic principles and the learning process (Calderhead 1996; Bligh 2001; McLeod et al. 2003). Others indicate there is reason to believe that good knowledge and understanding of the basics of pedagogy can sensitize teachers to the process of learning, provide logic for understanding repeated successes and failures, and serve a critical function in informing teaching practice (Calderhead 1996; Fenstermacher 1978; Bransford et al. 2000; Parsell & Bligh 2001).

In a recent study, a group of medical education research experts, all of whom have advanced education degrees but none of whom is involved in clinical teaching with patients, identified a number of core pedagogical concepts, knowledge of which might enhance the success of clinical teachers who have not systematically studied the teaching and learning process (McLeod et al. 2003). As a follow-up to that study with non-clinician educators, we decided to explore specialist clinicians’ perceptions of which basic principles and concepts might have particular importance to their instructional endeavors, and to compare their perceptions to those of the education experts involved in the earlier study. We set out with the assumption that any mismatch in perceptions between the education experts and the clinical teachers who carry the bulk of clinical teaching is likely to have significant implications for both faculty development programs and the application of pedagogical principles in the clinical milieu.

Method

Recruitment of experienced clinical teachers

To recruit clinical teachers for the study, we solicited the aid of the residency program directors of the 5 largest postgraduate specialties, in terms of numbers of teachers and trainees, at our university. The specialties included: family medicine, internal medicine, pediatrics, psychiatry and surgery. We asked each director to send us the names of the 10 most highly rated clinical teachers in their departments. We asked that all had consistently received excellent teaching evaluations and that all had at least 5 years teaching experience. We felt that this process would provide a good, non-random sample of ‘expert’ clinical teachers.

When all lists were returned we selected the top 5 names from each list and sent letters to all 25 teachers inviting them to participate in an on-line Delphi consensus-building exercise designed to solicit their perceptions of which pedagogic principles are most important for clinicians to know. Twenty-one agreed to participate and 4 declined. We replaced these 4 with other teachers from the lists of 10 submitted by their respective program directors.

Glossary development

In the current study, the lead author and 6 of the co-authors undertook the task of developing a simple glossary which included the meaning or explanation of the 30 education principles identified as important for successful clinical teaching by the education experts in the previously mentioned study (McLeod et al. 2003). Our hope was to render the principles readily comprehensible to the clinician teachers by including examples of the applicability of each to clinical teaching (Appendix 1). Initially the lead author developed a rough draft of the glossary entry for each principle. Then 3 pairs of co-authors discussed, revised and clarified the concepts or principles, with each pair responsible for ten. Following this, all 30 glossary items underwent a final revision with input from all co-authors.

The Delphi consensus-building exercise

The Delphi technique is a versatile iterative method for the systematic collection of judgments and consensus on a particular topic, using sequential questionnaires interspersed with summarized information and feedback derived from earlier responses (Murray & Hammons 1995). We conducted the Delphi with an easy-to-use online instrument, SurveyMonkey (www.surveymonkey.com), an online survey creation, distribution, collection and analysis tool which allows responses to be tracked, collected and downloaded to a spreadsheet for statistical analysis. This process had the advantage of ease of use for the busy clinical teachers. The instructions to the surveyed clinical teachers recommended that they read the glossary item accompanying each pedagogical item before rating its importance in clinical teaching. Then, using their own broad clinical teaching experience, they were to rate each on a 4-point scale, based on their perception of the likely benefit to clinical teachers of knowing and understanding the basic concept or principle underlying the item. We asked them to rate each as:

  1. Clinical teachers must know this principle

  2. Clinical teachers should know this principle

  3. For clinical teachers it would be nice to know this principle

  4. For clinical teachers it is not important to know this principle

This process and the rating scale used were identical to those used in the previous pedagogical principles study with non-clinician medical education experts.

Following round one of the Delphi rating process, we calculated arithmetic means and standard deviations for the clinicians’ ratings of each of the 30 principles. We then sent these results to all participants for round two and asked them to again rate all 30 items, taking into account their own perceptions of the importance of each as well as the mean ratings reflecting the opinions of the other teachers involved in the exercise. When the round two ratings had been returned, we performed an analysis identical to that following the first round and again sent the results to the raters for the third round. We used the standard deviations of the items as a measure of convergence towards consensus (Smith & Simpson 1995). Thus the entire analysis process duplicated that used in the study which involved non-clinician education experts (McLeod et al. 2003).
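To make the per-round analysis concrete, the short Python sketch below (using invented ratings, not the study data) computes each item's mean and standard deviation for a round and tracks the mean standard deviation across rounds, in the spirit of the convergence check described above.

    import statistics

    def summarize_round(ratings):
        # ratings: one list per rater, each holding a 1-4 importance rating per item
        n_items = len(ratings[0])
        means, sds = [], []
        for item in range(n_items):
            item_ratings = [rater[item] for rater in ratings]
            means.append(statistics.mean(item_ratings))
            sds.append(statistics.stdev(item_ratings))
        return means, sds

    # Hypothetical ratings: 3 raters, 4 items, over two successive rounds
    round_1 = [[1, 2, 3, 2], [2, 3, 3, 1], [1, 1, 4, 2]]
    round_2 = [[1, 2, 3, 2], [1, 2, 3, 2], [2, 2, 3, 1]]

    for label, ratings in (("Round 1", round_1), ("Round 2", round_2)):
        means, sds = summarize_round(ratings)
        # a falling mean SD from round to round is read as convergence towards consensus
        print(label, "mean SD across items =", round(statistics.mean(sds), 2))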

Results

Twenty-one of the 25 expert clinical teachers completed all 3 rounds of the Delphi consensus-gathering exercise. For 27 of the 30 pedagogical items, the standard deviations decreased progressively over the 3 rounds. For the remaining 3 items they remained virtually unchanged through 3 rounds. The mean standard deviations for all 30 item ratings were: Round 1 = 0.74; Round 2 = 0.60; Round 3 = 0.54. The progressive decline in SD over three rounds was interpreted as good evidence of rating consensus.

As shown in Table 1, we used the clinical teachers’ final ratings of the 30 principles to create a rank order of their perceptions of the importance of each. For comparison, the second column of Table 1 shows the ratings accorded the same principles by the 13 non-clinician education experts, as described in the previous publication (McLeod et al. 2003).

Table 1.  Rank order of the importance to clinical teaching of 30 basic pedagogical principles, according to experienced clinical teachers (n = 21) and medical education experts (n = 13). The mean ratings of each item are shown in parentheses (1 = must know, 2 = should know, 3 = nice to know, 4 = not important to know)

It can be seen that the clinicians’ mean ratings of the importance of the 30 items were high, ranging from 1.15 to 2.72 on the 4-point importance scale. Overall the ratings by education experts in the previous study were even higher (range 1.0 to 1.62). Comparison of clinical teachers’ ratings to education experts’ ratings of the same principles demonstrates strikingly different rank ordering of the importance of the 30 principles, save for ‘goals and objectives’, which was ranked first by both groups. Thereafter there are major differences in rankings. It is particularly noteworthy that the pedagogical principles related to assessment are ranked very highly by the education experts but are very low on the teachers’ rank order list. Conversely, many of the items which might have relevance to teachers’ day-to-day interactions with learners in the clinical setting rank higher on the clinical teachers’ list. Examples of these include: role modeling; communication skills and concepts; and supervision of learners. Another noteworthy observation is that the clinical teachers rated fully one third of the principles above 2 on the 4-point scale, that is, below the ‘must know’ and ‘should know’ levels of importance.
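The comparison in Table 1 amounts to ranking the same 30 items separately by each group's mean rating and reading the two rank orders side by side. The sketch below illustrates the idea with a handful of invented item names and mean ratings (not the study data); lower means indicate greater perceived importance.

    # Hypothetical mean ratings for two rater groups (1 = must know ... 4 = not important)
    clinician_means = {"goals and objectives": 1.15, "role modeling": 1.30,
                       "assessment to drive learning": 2.40, "coaching": 1.50}
    educator_means = {"goals and objectives": 1.00, "role modeling": 1.55,
                      "assessment to drive learning": 1.10, "coaching": 1.45}

    def rank_order(mean_ratings):
        # sort from most to least important (lower mean = more important); rank 1 is top
        ordered = sorted(mean_ratings, key=mean_ratings.get)
        return {item: position + 1 for position, item in enumerate(ordered)}

    clinician_ranks = rank_order(clinician_means)
    educator_ranks = rank_order(educator_means)
    for item in clinician_means:
        print(f"{item:30s} clinicians: {clinician_ranks[item]}  educators: {educator_ranks[item]}")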

Discussion

The clinical teachers in this study rated several of the 30 pedagogical principles highly with respect to their importance for success in clinical teaching. This indicates that the clinical teachers believe that an understanding of the principles would likely enhance their teaching effectiveness. The same applied for the earlier study when education experts rated the same principles. Among the principles rated and ranked similarly by both the clinical teachers and the education experts were: the need for goals and objectives; the emphasis on self-directed versus teacher-directed learning; the importance of coaching; the utility of understanding adult learning theory; and idiosyncratic problem solving (a process by which individuals make clinical decisions).

Nevertheless there are striking mismatches between the opinions of the two groups of raters on many items. Of particular note are the education experts’ high ratings and rankings of the several items related to assessment. Conversely, those items were ranked very low on the clinical teachers’ list. Although we did not explore the reasons for those differences, it is possible that education experts were expressing their recognition of the critical role that assessment plays in the learning process (Schuwirth & van der Vleuten 2004). The clinical teachers, who may not be directly involved in designing and implementing learner assessment, may have been reflecting their experience as ‘informal’ or ‘workplace’ assessors. Furthermore, clinical teachers may have little awareness of how well their students are performing overall because they are not involved in collecting data and ordinarily do not have access to all of the other assessment data on their students. Hence they may not perceive the importance and impact of their individual assessments.

The pattern of the clinical teachers’ pedagogical principles rankings suggests that their ratings decisions were significantly influenced by their day to day teaching experiences in the clinical environment (e.g. communicating, role modeling, supervising, problem solving, mentoring). On the other hand, education experts who are not regularly involved in clinical teaching may not appreciate the practicality of applying these principles in the context of clinical teaching.

Our study has a number of limitations. First is the possibility that the relatively small sample size of 21 clinicians could produce skewed results as a consequence of selection bias. We attempted to minimize this bias by selecting competent teachers from 5 large specialties and including only those with significant experience. A second possible confounder is the clinicians’ access to the glossary. It may have helped many in coming to a decision about the importance of some principles but may have impaired their comprehension of others. The fact that the education experts did not have a glossary available when they rated the principles may also contribute to the lack of concordance between ratings by the 2 groups. All of the medical education experts are university-based researchers with advanced degrees and we assumed they would have a broad knowledge of pedagogical principles.

Our findings have potentially important implications for a wide range of activities in the medical school. Firstly, it is worrisome that the teachers in our study ranked many of the assessment items very low. This suggests that they may not recognize that assessment is more than just the completion of a form. Assessment should play a critical role in the learning enterprise and faculty development is badly needed to convince teachers of that fact. Secondly, the major disconnect between clinician teachers and educators may have implications for the PhD medical educators, many of whom are intimately involved in the design, implementation and assessment of courses for medical students and post-graduate trainees. These educators must recognize which important basic pedagogical concepts are applied daily in the clinical setting and act accordingly when dealing with those critical components of the curriculum. Finally, medical school administrators should be made aware of the perception differences between the 2 groups. They would be wise to take advantage of the ‘strengths and weaknesses’ of members of both groups when planning, designing and assessing curricula, courses and faculty development activities.

We recognize that neither of the rank order lists should be construed as the ‘correct’ order of importance of the pedagogical principles. Still, the disagreement calls for significant discourse between members of the 2 groups, both of which influence curriculum and course design, program implementation and assessment, and faculty development.

A useful starting point for a meeting of the minds in the discourse derives from the Miller pyramid of assessment (Miller 1990). Miller's model outlines the relationships between knowledge, skills, competence and performance in the context of assessment. Expert clinical teachers such as those in our study sample can be regarded as performing at the top tiers of the clinical teaching pyramid, since they have developed into competent educators who are performing at a high level, while the education experts possess the critically important pedagogical knowledge base supporting the pyramid. Undoubtedly the non-clinician educators, lacking medical training, are capable of performance at the top tiers of other ‘competence-performance pyramids’ in the broad domain of education. Both groups are fundamental to the structural integrity of the ‘clinical teacher competence pyramid’ and the education enterprise, and each can benefit from a dialogue designed to exploit the strengths of the other.

Conclusion

Experienced, highly rated clinical teachers and experienced non-clinician medical education experts differ substantially when asked to rate the importance of pedagogical principles, knowledge of which might enhance teaching effectiveness. This lack of concordance is likely to have significant implications for both faculty development programs and the application of pedagogical principles in the clinical milieu.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the paper.

Acknowledgements

We sincerely thank the Royal College of Physicians and Surgeons of Canada for their generous financial support. We are also indebted to the specialist clinicians at the McGill University teaching hospitals who unselfishly gave of their time to participate in the Delphi consensus building process. We thank Dr Peter Cantillon for his helpful suggestions related to the manuscript.

The protocol for this study was reviewed by the McGill University IRB which granted ethics approval. No potential conflicts of interest are declared.

Additional information

Notes on contributors

Peter McLeod

PETER MCLEOD, MD is Professor of Medicine and Pharmacology and a CORE member of the Centre for Medical Education at McGill University.

Yvonne Steinert

YVONNE STEINERT, PhD is Professor of Family Medicine, Associate Dean for Faculty Development and Director of the Centre for Medical Education at McGill University.

Colin Chalk

COLIN CHALK, MD is Associate Professor of Neurological Sciences and a CORE member of the Centre for Medical Education at McGill University.

Richard Cruess

RICHARD CRUESS, MD is Professor of Surgery and CORE member of the Centre for Medical Education at McGill University.

Sylvia Cruess

SYLVIA CRUESS, MD is Professor of Medicine and a CORE member of the Centre for Medical Education at McGill University.

Sarkis Meterissian

SARKIS METERISSIAN, MD is Associate Professor of Surgery, Associate Dean for Postgraduate Education and a CORE member of the Centre for Medical Education at McGill University.

Saleem Razack

SALEEM RAZACK, MD is Associate Professor of Pediatrics and a CORE member of the Centre for Medical Education at McGill University.

Linda Snell

LINDA SNELL, MD is Professor of Medicine and a CORE member of the Centre for Medical Education at McGill University.

References

  • Bligh J. Learning from uncertainty: A change of culture. Med Educ 2001; 35: 2–3
  • Bransford J, Brown A, Cocking R (Eds). How People Learn: Brain, Mind, Experience, and School. Committee on Developments in the Science of Learning, National Academy Press, Washington, DC 2000
  • Calderhead J. Teachers: Beliefs and knowledge. Handbook of Educational Psychology, D Berliner, R Calfee. Simon and Schuster Macmillan, New York 1996
  • Chen A, Ennis C. Content knowledge transformation: An examination of the relationship between content knowledge and curricula. Teach Teacher Educ 1995; 11: 389–401
  • Darling-Hammond L, Youngs P. Defining highly qualified teachers: What does scientifically-based research actually tell us? Educ Res 2002; 31: 13–25
  • Fenstermacher G. A philosophical consideration of research on teacher effectiveness. Review of Research in Education, L Shulman. Peacock, Itasca, IL 1978; 157–185
  • Hesketh EA, Bagnall G, Buckley EG, et al. A framework for developing excellence as a clinical educator. Med Educ 2001; 35: 555–564
  • Irby D. What clinical teachers need to know. Acad Med 1994; 69: 333–342
  • McLean M. Rewarding teaching excellence. Can we measure teaching excellence? Who should be the judge? Med Teach 2001; 23: 6–11
  • McLeod P, Steinert Y, Meagher T, McLeod A. The ABC's of pedagogy for clinical teachers. Med Educ 2003; 37: 638–644
  • Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990; 65(Suppl): S63–S67
  • Misch D. Andragogy and medical education: Are medical students motivated to learn? Adv Health Sci Educ 2002; 7: 153–160
  • Murray J, Hammons J. Delphi: A versatile methodology for conducting qualitative research. Rev Higher Educ 1995; 18: 423–436
  • Parsell G, Bligh J. Recent perspectives on clinical teaching. Med Educ 2001; 35: 409–414
  • Reynolds A. What is competent beginning teaching? A review of the literature. Rev Educ Res 1992; 62: 1–35
  • Schuwirth L, van der Vleuten C. Merging views of assessment (editorial). Med Educ 2004; 38: 1208–1211
  • Seabrook M. Learning to teach (editorial). Postgrad Med J 2001; 77: 361–362
  • Smith K, Simpson R. Validating teachers’ competencies for faculty members in higher education: A national study using the Delphi method. Innov High Educ 1995; 19: 223–234

Appendix 1. Glossary of pedagogical principles

A. Pedagogical principles related to curriculum

1. Curriculum structure and design

Description

The formal curriculum is a written educational plan outlining the goals and objectives to be pursued, which topics of knowledge, skills and attitudes should be covered, and which methods are to be used for learning, teaching and evaluation. In other words, it includes all the planned learning experiences of an educational institution, school or department.

The hidden curriculum refers to a critical set of influences that function at the level of organizational structure and culture. It recognizes the culture and moral communities which are usually not mentioned in the written, formal curriculum document but which are intimately involved in defining what is ‘good’ and ‘bad’ medicine.

Example

Patterns of collegial interactions based upon local factors (e.g., presence of many subspecialists in an institution leading to earlier decisions to obtain subspecialty consultations for patients than would happen in another practice environment). The informal curriculum is an unscripted, highly interpersonal form of teaching and learning that takes place among and between faculty and students.

2. Goals and objectives

Description

A goal is a general aim, object or end-effect that one strives to achieve. An example of a goal would be to become a good doctor. In medical education, an objective is what the learner will be able to know or do after taking part in educational activities. It includes the specific knowledge, skill, or attitude a learner should master and may include the standard or level to which it is mastered. For example, a final-year student should be capable of inserting, without difficulty, an intravenous line in the forearm of a healthy young person.

3. Learning environment

Description

The learning environment includes the social, emotional and physical environment in which the learning occurs. Many factors influence the environment. Among the most important are student interaction, emotional climate, meaningful learning, breadth of interest and organization. Other dimensions identified are academic enthusiasm, authoritarianism and intellectual maturity. There is undeniable evidence relating the quality of learning to the learning environment.

4. Lesson structure, planning

Description

The structure of a lesson should be determined by the objectives, the students’ previous experiences, the size of the class and the comfort of the teacher. The teacher or group leader must plan in advance and develop the necessary learning resources to assure that: the learning is perceived as relevant; it is participatory and actively involves the students; it is focused on problems; it is designed so that learners can take responsibility for their own learning; and it is based on mutual trust and respect. The learning should also involve cycles of action and reflection rather than a didactic lecture.

Example

Choosing a case-based discussion to illustrate respiratory physiology because the teacher is working with students in small groups of 15.

B. Pedagogical principles related to how adults learn

5. Active vs. passive learning

Description

Active learning implies that the student is intellectually and emotionally engaged in most or all aspects of the learning process. Passive learning occurs when the learning activity is teacher-centered with the student acting solely as a passive recipient of information. An example of active learning is a student researching information related to solving a problem posed by a tutor. The best example of passive learning is the classic didactic lecture accompanied by non-thinking students furiously writing notes.

6. Adult learning theory

Description

The critical elements of adult learning theory include the following:

  1. Learning is favored if the learning environment is safe and allows learners to express themselves as they are involved in mutual planning of relevant methods and curriculum content. The learning environment should also support them in linking new knowledge to their existing knowledge.

  2. Learners diagnose their own needs, and this motivates them to formulate their own objectives, to identify learning resources, and to devise strategies to achieve their objectives. Learners are encouraged to participate in evaluating their own learning. This can help in the development of self-reflection.

7. Problem solving for learning

Description

It is believed that students learn best by working through a clinical problem. In problem-based medical schools students identify issues raised by specific problems to help develop understanding about underlying concepts and principles. In all problem-solving activities new knowledge and understanding arise through working on a problem rather than in the traditional approaches in which the new knowledge is prerequisite for working on the problem.

8. Idiosyncratic problem solving

Description

Idiosyncratic problem solving implies that different experts often disagree about the way a specific problem case should be solved. There is often broad agreement about the solution, but disagreement on the way to proceed through the problem-solving process. Idiosyncratic problem solving may therefore compromise the validity of approaches used to educate and test students.

9. Relevance for Learning

Description

Relevance implies practicality and applicability. Research has indicated that a student's approach to learning (surface or deep) is a crucial determinant of the quality of learning outcomes. Courses or learning activities that foster deeper learning commonly provide a rich context which motivates learning. Advocates of teaching with patients and those in PBL schools who use paper cases as stimuli for learning recognize the motivating value of real or ‘near real’ (relevant) problems for learning in health care specialties.

10. Role modeling

Description

Role modeling entails demonstration of behaviors, attitudes and values that may be modeled or copied by learners. Role modeling may be conscious or unconscious and involve demonstration of desirable or undesirable characteristics. Educators view role modeling as a powerful method of teaching and learning.

11. Self-regulation of learning

Description

Self-regulation of learning relates to a form of education that involves the individual learner's initiative to identify and act on his or her learning needs (with or without assistance), taking increased responsibility for his or her own learning. For example, a student who recognizes gaps in his/her understanding of an acid-base disturbance in a seriously ill patient might take the initiative to review the topic using an online database. Self-regulation also refers to the monitoring of one's own thinking processes during learning activities.

12. Transfer of learning

Description

Transfer of learning is defined as the effective application by trainees, to their jobs, of the knowledge, skills and attitudes gained as a result of attending an educational program. The likelihood of transfer is enhanced when learning activities are authentic or similar to the activities required in the practice setting. An example of transfer of learning in surgery would be transfer to the operating room of the skills acquired in a simulation center.

C. Pedagogical principles related to helping adults learn

13. Case-based Learning

Description

Case-based teaching and learning activities are based on real, simulated or ‘paper’ cases. Using real patients allows for learning through formal teaching, observation and role modeling. It incorporates adult learning principles as it is meaningful and relevant and allows active involvement. In the absence of patients, healthy people simulating physical signs or other elements of a clinical case are powerful for learning. ‘Paper cases’ also allow for incorporating components of relevance, reality and enjoyment.

14. Coaching

Description

Coaching is the act of facilitating the performance, learning and development of another individual, usually via person-to-person interaction in a work environment. In a learning context, an instructor might coach a student by providing help as needed for the student to successfully diagnose a patient's problem.

15. Communication skills and concepts

Description

Communication skills, between physicians and patients and between teachers and students, require active listening, responding to listeners’ feelings, and communicating concern, understanding and respect. Successful negotiation, particularly in situations of disagreement, is also key. Courtesy, respect, explanation and confidence are other anchors of effective communication.

Example

Gauging trainee comfort with a particular procedure by planning with the trainee in order to provide adequate graded supervision during the procedure.

16. Mentoring

Description

Mentors provide learners, either those younger or of a similar age, with career-enhancing functions such as sponsorship, advice, and facilitation of exposure to learning while offering challenging work or protection, all of which help the learner to develop a role and acquire confidence. In learning situations mentors may be assigned by course directors or managers, or mentorship may develop serendipitously because of individual initiatives by the mentor or the mentee. Mentoring may include role modeling and coaching.

17. Motivation for learning

Description

Motivation refers to factors or circumstances that induce or impel a person to act in a particular way. Motivation for learning can be influenced by many features including incentives (e.g., pleasure of achieving a good mark on a test), the learning environment, and the organization of the instructional material. Many behaviors result from a combination of motives.

18. Pedagogical implications of learner differences

Description

Learners demonstrate differences in needs, learning styles, maturation, dedication to learning, test-taking ability, prior knowledge, experiences, independence in learning etc. These differences are relevant to the pedagogical choices made by instructors.

Example

When teaching or assessing different levels of trainees in a clinical teaching unit using the same clinical material, you will focus on different aspects according to the learner's needs. (e.g., a senior resident might need to develop higher level clinical reasoning around the case while a junior may need help with physical exam).

19. Peer and near-peer tutoring

Description

Peer and near-peer tutoring involve interactions between people from similar social groupings who are not professional teachers, helping each other to learn and learning themselves by teaching (e.g., students tutor fellow students). It encompasses a wide range of activities and strategies such as group projects and learning activities, mentoring and support, and tutoring. There is a large body of empirical evidence supporting the efficacy of peer and near-peer teaching and learning in medical settings.

Example

Resident study groups for credentialing exam preparation.

20. Self-directed, teacher-directed instruction

Description

Self-directed learning can be viewed as a method of organizing teaching and learning in which tasks are largely within the learner's control. The quality of learning is best when learners feel safe and comfortable expressing themselves and are involved in the planning of relevant teaching methods and curriculum content. Self-directed learning is also viewed as a goal towards which learners strive so they become empowered to accept personal responsibility for their own learning. In contrast, teacher-directed instruction implies that control of all elements of the educational process rests with the teacher.

21. Supervision of learners

Description

Clinical and educational supervision are essential at all levels of medical education. This supervision has been defined as ‘the provision of guidance and feedback on matters of personal, professional and educational development in the context of the learner's experience of providing safe and appropriate care.’ Direct supervision (e.g., observing the learner) and the quality of the supervisory relationship are key to effective supervision.

Example

Guiding a trainee through the insertion of a central line.

D. Pedagogical principles related to assessment

22. Assessment to drive learning

Description

All tests and assessments have an educational impact and can influence learning. For example, students’ study strategies and focus are often influenced by examinations; as a result, educators should capitalize on the phenomenon. When the examinations match the curriculum goals and objectives, preparing for the exam means preparing to become a doctor.

Example

National Credentialing exams can provide a way for residents to round out their learning prior to finishing training.

23. Characteristics of assessment instruments

Description

Five criteria are commonly used to characterize assessment instruments:

  1. Reliability refers to the accuracy with which a test score is determined. For example, a long case oral exam could be called unreliable if a student got widely different marks from different examiners observing the same performance.

  2. The validity of a test is the extent to which it measures what it purports to measure. A valid test of appendectomy skill of a surgeon would be observation of the surgeon doing the procedure. An invalid test of appendectomy skill would entail a multiple-choice question about the procedure.

  3. Educational impact of an assessment is a criterion because students tend to focus on learning what they believe will be on the examination.

  4. Cost effectiveness is critical. An expensive, resource- and time-intensive test will not be eagerly accepted by educators and administrators.

  5. Acceptability of the test vehicle to learners and examiners is critical. If either group does not accept the test, it will not survive.

24. Criterion vs. Norm-referenced Assessment

Description

Criterion-referenced assessment refers to testing against an absolute standard such as rating an individual's performance against a benchmark. In norm-referenced assessment, performance of a learner is compared to that of other learners. In health care, assessors often compare one student's performance to the performance of classmates or fellow group members. This is norm-referenced assessment.

Examples

Criterion-referenced assessment: Evaluating a learner's performance based upon predetermined educational objectives.

Norm-referenced assessment

Writing on an evaluation form that the learner ‘performed below average when compared to peers …’.

25. Key features for assessment

Description

Key features assessment entails using a realistic clinical case followed by multiple choice or open-ended questions related to the case. Successful completion of the question requires that the student outline essential decisions in investigation and/or management of the problem. Key feature questions seem to measure problem solving ability without losing validity.

26. Knowledge, skills and attitudes

Description

Knowledge is the relevant content of the subject being studied (e.g., the cardiac cycle). Skills are the acts or actions that a trainee should master. Examples of skills are: ability to interact empathetically with a patient (interpersonal skills); ability to interpret an X-ray (analytical skills); ability to perform a lumbar puncture (procedural skills). Attitudes are emotional and intellectual predispositions resulting from efforts to make sense of, organize, predict and shape our reactions to the world. Attitudes and values influence behaviors which are overt, modifiable and measurable.

27. Performance-based assessment

Description

Assessment of learners can occur at various levels, ranging from lower levels (‘knows’, ‘knows how’) to higher levels of competence (‘shows how’, ‘does’). Performance-based assessment (also called work-based assessment) aims to evaluate higher levels of competence. Multiple choice questions, simulations and objective structured clinical examinations target low levels of competence. There is an assumption that performance-based assessments are the best reflection of actual practice. In clinical medicine, the standard in-training evaluation is a performance-based assessment.

28. Reasons for assessing learners

Description

Assessment or evaluation is essential as a measure of the quality of an educational program. Of equal or greater importance is that assessment can provide evidence of how well students’ learning objectives are being achieved and whether teaching standards are being maintained. Evaluation or assessment can also serve as a check on whether the school curriculum is evolving in the desired way. It then contributes significantly to the academic development of an institution and its members. Assessment processes also convey to the learners what the teachers regard as important to learn.

29. Summative vs. formative assessment

Description

Assessment is a system of evaluation of professional accomplishments using defined criteria. It usually includes an attempt at measurement, either by grading on a rough scale or by assigning a numerical value. Summative assessment is testing which occurs at the end of a term or course, used primarily to provide information about how much the student has learned and how well the course was taught. Formative assessment is testing that is part of the ongoing teaching/learning process. It should include delivery of feedback to the student.

Examples

Formative Assessment: mid-rotation feedback provided to trainees on a clinical rotation.

Summative Assessment

End-of-rotation in-training evaluation report (ITER).

30. Unintended consequences of assessment

Description

Assessment strategies are designed to determine learner competence and as a quality assurance procedure to gauge the effectiveness of the curriculum. Other consequences of assessment include: (1) Assessment can influence student learning. (2) Undue emphasis on unimportant content can have a negative influence on learning. (3) Alienating learners may result if the assessment content is different from what is expected. (4) Learning facts rather than integration can favor surface and not deep learning.
