Research Article

Attributes of residents as teachers and role models – A mixed methods study of stakeholders

Pages e1052-e1059 | Published online: 08 Nov 2012

Abstract

Background: Residents are at the forefront of student education in the hospital, yet valid tools to assess their performance as teachers are lacking.

Aims: To develop a valid evaluation tool for assessing resident performance as educators for clerkship students.

Method: A mixed-methods design was used. Focus groups of residents and medical students explored desired behaviors in resident educators. Using grounded theory, a list of behaviors was generated inductively through iterative review and categorized into themes. After thematic saturation, behaviors were rated on a Likert scale by stakeholders based on “importance” and “accuracy of measurement.” Items which were both important and accurate were used in the final tool.

Results: Eighty-five desirable behaviors for resident educators were identified. Twenty met both “importance” and “accuracy” criteria, falling under themes of respect, a safe learning environment, balancing supervision with autonomy, relevant teaching, and feedback; these were consolidated into a 14-item tool. Nineteen “important” behaviors deemed not accurately measurable fell under themes of professionalism, communication, management skills, and leadership.

Conclusions: Evaluation of residents as teachers and development of resident-as-teacher curricula should emphasize the aforementioned areas. Professionalism and organizational skills may not be reliably measurable by learners. Complementary tools to assess these aspects of resident performance are therefore necessary.

Introduction

Residents are often at the forefront as teachers for clerkship medical students on inpatient services. The quality of residents as teachers is associated with medical student outcomes. Better resident teachers have been linked to improved medical student academic performance during the clerkship (independent of the influence of the attending physician) (Griffith et al. 1998) and also influence medical student career choices (Wright et al. 1997). Not only is the quality of residents as teachers important from the perspective of student learning, but teaching is also a required skill (included within practice-based learning and improvement) in which competence is expected of all graduating residents by the Accreditation Council for Graduate Medical Education (ACGME).

However, medical school and residency training often do not prepare residents to become effective teachers. Residents struggle to balance their own educational needs with the needs of their learners, while simultaneously attempting to provide competent patient care. Clear and effective feedback to residents on their teaching has been hampered by the lack of a valid and reliable evaluation tool to assess educator quality.

Previous measurement tools developed using theoretical constructs alone (Stalmeijer et al. 2008) fail to adequately incorporate the perspectives of the various stakeholders (faculty, residents, and medical students). Tools developed principally with faculty rather than resident input (Litzelman et al. 1998; Copeland and Hewson 2000) are inadequate, since residents differ from faculty in what they teach and how they teach it (Tremonti and Biddle 1982; Wilkerson et al. 1986). Tools developed in a mix of clinical and non-clinical teaching settings (Litzelman et al. 1998; Copeland and Hewson 2000) may also be less reliable (Beckman et al. 2004) than a tool developed specifically for the inpatient setting.

The aim of our study was to develop a valid and reliable tool, informed by the perspectives of the stakeholders who are involved with and assess resident teaching, to measure pediatric resident performance in an inpatient setting. Accurate assessment of resident performance is the cornerstone of success for any outcomes-based paradigm of residency education, including the ACGME Outcome Project and the CanMEDS initiative (Swing 2002; Frank and Danoff 2007). Such assessment would enhance resident learning from evaluations and assist residents in incorporating teaching goals into individualized learning plans. We hypothesized that a valid tool would also facilitate more accurate monitoring of, and provision of feedback on, resident-as-educator performance in the future. On a broader scale, such assessment could allow curricula to be designed to impart more effective teaching strategies and ultimately improve learning for students.

Methods

We used a mixed-methods design for our study. All parts of this study were reviewed and the protocol approved by the University of California, Davis Institutional Review Board. Participants provided written informed consent prior to participation or consented through return of completed study survey materials.

Focus group interviews

In the qualitative phase of the study, focus group interviews with groups of medical students and residents explored the qualities of residents as good clinician role models and educators. Semi-structured interviews were guided by a single investigator (DAP) experienced in qualitative research methodology. The focus of the interview questions varied with the educational composition of the four groups, each a homogeneous cohort of third year medical students, fourth year medical students, or pediatric residents. For example, pediatric clerkship students on the first day of their rotation (in their second clerkship) were asked to reflect on their previous interactions with residents and their general beliefs about resident role models (focus group #1). The same group was asked, at the end of their pediatric clerkship, to discuss positive and negative interactions and experiences with residents during the pediatrics clerkship (focus group #2). The focus group with fourth year medical students was asked about their experiences with residents across their entire third year clerkship experience (focus group #3). The resident focus group was asked to reflect on their experiences interacting with medical students during the inpatient pediatric clerkship or on their experiences with residents when they themselves were medical students (focus group #4). Table 1 provides an outline of the questions that guided the focus group interviews.

Table 1.  Sample of questions used to guide the focus group interviews

We purposefully sampled participants from the three strata of medical education of interest (third year medical students, fourth year medical students, and residents) to maximize diversity within each group. Participants were targeted for recruitment to achieve diversity with respect to gender (for all focus groups), expressed interest in choice of residency training (for the fourth year focus group), and year of training (for the resident focus group).

Third year medical students were contacted by e-mail prior to the start of their pediatrics clerkship for recruitment into one of two focus group discussions. Each of the 12 students approached consented to participate in the first of the two discussions; nine of the 12 also participated in the second focus group session. Residents and fourth year medical students were likewise invited by e-mail to participate in a focus group discussion. Of the 20 residents contacted, nine participated in the group discussion, and 10 of the 27 fourth year students contacted participated in the focus group. Interviews were digitally recorded and transcribed verbatim. Three investigators concurrently and inductively analyzed the transcripts using iterative review and grounded theory (Strauss and Corbin 1998) to abstract lists of behaviors, including contexts and relevant positive and negative examples, for a good resident educator. Thematic saturation was reached when the investigators independently noted the recurrence of behaviors already discussed and no new behaviors were mentioned during group discussion.

Focus group characteristics

Each focus group lasted 60–90 minutes and included nine to 12 participants. Forty students and residents participated in one of the four focus groups; 67% of the participants were women. Verbatim transcription yielded 276 pages of double-spaced text. A list of 63 behaviors was extracted from the combined transcripts. Through an additional process of review, which included detailed contexts as well as positive and, where available, negative examples of the characteristics, the investigators classified the 63 behaviors into 13 thematic categories.

Comprehensiveness of items

We sought respondent validation for the behaviors and themes identified from the focus groups by sending an anonymous structured survey to a mixed population of 11 learners (third and fourth year students and residents). Five of the 11, selected at random, had previously participated in one of the focus groups. All 11 completed the survey.

We asked the learners to comment on whether the list included all behaviors they felt were important in residents as clinical role models and teachers. None of the respondents suggested additional themes, but three made recommendations that led to adding two behaviors and altering the labels given to two others, yielding a list of 65 behaviors. Since these behaviors could be incorporated into the existing thematic framework, we did not attempt to further validate our findings against the literature.

National educator panel review of behaviors

We then e-mailed the list of 65 behaviors to a convenience sample of 17 national educators whom we had previously recruited based on their expertise in student and resident education, and asked them to assess the list for comprehensiveness and to review each item for clarity of meaning. The educators were recruited during the joint annual conference of pediatric residency and clerkship directors and were approached based on their local, regional, or national leadership in the field of medical education.

Twelve of the 17 educators returned the survey; seven were female, six were involved in clerkship education, four were residency program directors (or associate directors), and three held positions in Deans’ offices. Respondents added another 20 behaviors to the existing list, generating a final list of 85 desirable behaviors for residents in their role as teachers and role models for students.

National educator panel rating of items

We sent this comprehensive 85-item list of behaviors back to the educators (14 of 17 responded) and to another mixed group of 13 learners at different levels (third and fourth year students and residents; all 13 responded) to rate each item. Five of the learners were previous focus group participants. We asked respondents to rate each item on a five-point Likert scale, using two separate criteria: “importance of the item” (5 = extremely important, 4 = important, 3 = somewhat important, 2 = of little importance, 1 = not at all important, or unable to assess) and perceived “accuracy of measurement” of the item by a third year clerkship student (5 = extremely accurate, 4 = accurate, 3 = somewhat accurate, 2 = minimally accurate, 1 = not at all accurate, or not sure). Respondents were informed that clerkship students would be using a five-point scale (unable to assess, strongly agree, agree, disagree, and strongly disagree) to evaluate resident performance on the selected behaviors.

Evaluation tool development

Behaviors were selected if the distribution of sample scores provided strong evidence that ratings were consistently high, using preset thresholds for both importance and accuracy of measurement. The threshold chosen for “importance” was ≥ 4 and for “accuracy of measurement” was ≥ 3.5. We computed 95% asymptotic confidence intervals for the item mean ratings, assuming a normal distribution for the Likert scale ratings (Nunnally and Bernstein 1994). We then selected only those items whose confidence intervals for the sample mean were entirely at or above the corresponding thresholds for both importance and accuracy.
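
The selection step amounts to keeping an item only when the lower bound of its 95% confidence interval clears the preset cut-off on both criteria. The analysis itself was performed in SAS 9.3 (see below); the following Python sketch is only an illustration of that logic, assuming a hypothetical `ratings` table with one row per rater-item pair and columns named `item`, `importance`, and `accuracy`.

```python
# Illustrative sketch (not the authors' SAS code) of the confidence-interval
# screening step. Assumed input: a pandas DataFrame `ratings` with one row per
# (rater, item) pair and 1-5 Likert scores in "importance" and "accuracy".
import numpy as np
import pandas as pd

THRESHOLDS = {"importance": 4.0, "accuracy": 3.5}  # preset cut-offs from the Methods
Z = 1.96  # two-sided 95% normal quantile (asymptotic CI)

def ci_lower_bound(scores: pd.Series) -> float:
    """Lower bound of the asymptotic 95% CI for the mean rating."""
    scores = scores.dropna()  # "unable to assess"/"not sure" responses coded as missing
    return scores.mean() - Z * scores.std(ddof=1) / np.sqrt(len(scores))

def select_items(ratings: pd.DataFrame) -> list[str]:
    """Keep items whose CI lies entirely at or above the cut-off on both criteria."""
    return [
        item
        for item, grp in ratings.groupby("item")
        if all(ci_lower_bound(grp[col]) >= cut for col, cut in THRESHOLDS.items())
    ]
```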

For each of the 85 items, we also compared mean importance and measurability ratings between educators and learners using two-sample t-tests. To account for the multiple comparisons and reduce the risk of false discoveries in the absence of real differences between the two populations, we adjusted the raw p-values using the Benjamini-Hochberg method (Benjamini and Hochberg 1995).
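
As a companion to the sketch above, the comparison and false-discovery-rate adjustment could look like the following; again this is only an illustration (the published analysis was done in SAS), and the `educator` and `learner` tables, with one column of ratings per behavior, are assumptions for the example.

```python
# Illustrative sketch of the educator-vs-learner comparison with
# Benjamini-Hochberg adjustment (not the authors' SAS code).
# Assumed inputs: DataFrames `educator` and `learner` with one column of
# 1-5 ratings per behavior (85 columns), missing responses left as NaN.
import pandas as pd
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

def compare_groups(educator: pd.DataFrame, learner: pd.DataFrame, alpha: float = 0.05) -> pd.DataFrame:
    items = list(educator.columns)
    raw_p = [
        ttest_ind(educator[item].dropna(), learner[item].dropna()).pvalue
        for item in items
    ]
    # The Benjamini-Hochberg step-up procedure controls the false discovery rate
    reject, adj_p, _, _ = multipletests(raw_p, alpha=alpha, method="fdr_bh")
    return pd.DataFrame(
        {"item": items, "raw_p": raw_p, "adjusted_p": adj_p, "significant": reject}
    )
```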

Data analyses were performed using SAS 9.3 software. Statistical significance was defined as a two-sided (adjusted) p value < 0.05.

Results

Focus group data

We categorized the 65 behaviors identified by the participants into 13 themes pertaining to residents’ interpersonal attributes, their teaching strategies, their patient care skills, and a large overlap group that transcended these categories. Table 2 displays the categories, themes, behaviors, and representative quotes derived from the focus group interviews. The five themes pertaining to residents’ interpersonal attributes were sensitivity, team dynamics, communication skills, professionalism, and passion. The two themes under teaching strategies were feedback and programmatic awareness. Clinical competence was the only theme pertaining to patient care skills. Five themes fell into the overlap category: accountability, assesses and addresses needs, critical thinking skills, teaching style, and organizational skills.

Table 2.  Behaviors associated with good resident teachers and role models derived from focus group interviews with medical students and residents

Table 3.  Fourteen item instrument for use by clerkship students to assess resident performance

Results from national educator panel review of behaviors

The national panel of educators modified and added other behaviors to the list of resident-teaching behaviors generated by the focus groups to create a final list of 85 resident behaviors. Input from national educators included changing “values the student as a colleague” to the more objective “solicits opinions and input from students and other members of the team” (theme: team dynamics) and changing “makes teaching relevant to patient care” to “makes teaching relevant to patient care and to student's educational goals” (theme: assesses and addresses needs). Behaviors added included “asks for feedback and ways to improve self as a teacher” (theme: team dynamics), “uses vocabulary understandable to patients/families” (theme: communication skills), “is honest with students and patients” and “treats students and patients in a kind manner” (theme: professionalism) and “identifies feedback as such when offering it” (theme: feedback).

National educator panel rating of behaviors

The lower bounds of the 95% confidence intervals for the mean “importance” rating of behaviors ranged from 3.11 to 4.81, while for “accuracy of measurement” the range was 2.81 to 4.32. Thirty-nine (46%) of the behaviors met the “importance” cut-off, while 42 (49%) met the accuracy cut-off. As can be seen in Figure 1, behaviors that were ranked higher on “importance” also tended to be ranked higher on “accuracy of measurement.”

Figure 1. Accuracy vs. Importance ratings (lower bounds, 95% CI for mean).

The behavior ranked highest in “importance” was “is honest with students and patients” (score of 4.7); the lowest scoring was “facilitates student learning in small groups” (2.7). For “accuracy of measurement,” the highest scoring item was “directly observes student performance” (4.01), and the lowest was “situates problems in larger context” (2.5).

Nineteen behaviors were rated as being important for residents to demonstrate, but were perceived to be poorly measurable from the perspective of clerkship students. These behaviors fall predominantly under the ACGME competency of professionalism (“takes steps to get to know patient as an individual,” “engages patient in shared decision making,” “demonstrates empathy,” “is honest with students and patients,” “addresses patient concerns and needs,” “demonstrates patient ownership,” “is non-judgmental towards students and families,” “uses active listening skills,” and “demonstrates intellectual curiosity”). Other behaviors rated as difficult to accurately measure fell under the competency of interpersonal and communication skills (“uses vocabulary understandable to patients/students” and “models effective communication style”), and a miscellaneous group of behaviors pertaining to residents’ management skills (“effectively balances teaching and patient care responsibilities” and “manages time effectively”) and their teaching/leadership style (“encourages application of knowledge,” “makes attempts to understand student's knowledge base and build on it,” “empowers student to take responsibility for own education,” “assigns students tasks with educational value,” “demonstrates willingness to incorporate others’ contributions into management plans,” and “provides a balance of supervision and autonomy”).

Twenty behaviors were rated as both important and accurately measurable and were incorporated into the final evaluation tool. For brevity and ease of practical application, these were consolidated into a final list of 14 behaviors, which are outlined in Table 3. Items in the final evaluation form included showing respect towards (1) patients and families and (2) the health care team, (3) maintaining patient confidentiality, (4) creating a safe learning environment, (5) showing enthusiasm about teaching, (6) soliciting input from team members as appropriate, (7) soliciting feedback on the resident's own performance, balancing supervision with autonomy by (8) providing clear expectations to the student, (9) directly observing their performance, and (10) promoting student ownership of patients/families, (11) making teaching relevant to patient care and the student's educational goals, (12) demonstrating how to solve clinical problems, and providing feedback that is (13) timely and (14) constructive.
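
For readers who wish to pilot the form electronically, the 14 items and the five-point response scale described in the Methods could be captured in a simple structure such as the following sketch; the item wording below is paraphrased from the list above, and the published Table 3 remains the authoritative version.

```python
# Sketch of the 14-item clerkship evaluation form as plain data
# (item wording paraphrased from the text; Table 3 is authoritative).
RESPONSE_SCALE = ["unable to assess", "strongly disagree", "disagree", "agree", "strongly agree"]

RESIDENT_TEACHER_ITEMS = [
    "Shows respect towards patients and families",
    "Shows respect towards the health care team",
    "Maintains patient confidentiality",
    "Creates a safe learning environment",
    "Shows enthusiasm about teaching",
    "Solicits input from team members as appropriate",
    "Solicits feedback on own performance as a teacher",
    "Provides clear expectations to the student",
    "Directly observes student performance",
    "Promotes student ownership of patients/families",
    "Makes teaching relevant to patient care and the student's educational goals",
    "Demonstrates how to solve clinical problems",
    "Provides timely feedback",
    "Provides constructive feedback",
]

def blank_form() -> dict:
    """Return an empty response sheet mapping each item to an unanswered slot."""
    return {item: None for item in RESIDENT_TEACHER_ITEMS}
```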

Comparison of educator versus learner ratings

We observed no differences in mean “accuracy of measurement” scores for any of the 85 behaviors when learner ratings were compared with educator ratings. For “importance,” educators rated two of the 85 items higher on average than did the learners (p < 0.05): “provides clear expectations to students” (all educators rated it 5, while 31% of learners rated it 5 and the rest 4) and “gives feedback” (93% of educators rated it 5 and 7% rated it 4, while only 31% of learners rated it 5 and the rest 4).

Discussion

Using stakeholder input and grounded theory via focus groups, we successfully developed a measurement tool to evaluate resident skills in their role as clinical role models and educators for clerkship medical students in an inpatient setting. Stakeholders in our study agreed on the qualities that residents need in order to be good educators. These qualities overwhelmingly related to the display of professional skills (accountability, respect, and humanism), fostering a safe learning environment, an empowering teaching style and technique (showing enthusiasm, providing clear expectations, directly observing learners, demonstrating problem solving, and both providing and asking for feedback), and leadership skills (soliciting input from the team and balancing supervision with autonomy).

The preponderance of professional attributes is not surprising and reinforces the powerful role of the hidden curriculum in influencing learner behaviors and attitudes (Hafler et al. 2011). The importance given to the learning environment fits the theoretical construct of situated learning theory, which holds that learning is contextually “situated” in an authentic environment and is therefore influenced not only by the activity, but also by the context and climate in which the activity is performed (Brown et al. 1989). This is reflected in the behaviors pertaining to clarifying expectations, giving effective feedback, articulating problem-solving techniques, progressively providing developmentally appropriate autonomy to learners, building on their previous knowledge (scaffolding), and making teaching and learning relevant. Other important behaviors included leadership skills such as effective communication and organization (time management). The themes identified as important by our stakeholders matched the items developed using grounded theory and those included in the Postgraduate Hospital Educational Environment Measure (PHEEM) (Boor et al. 2007). The PHEEM is a questionnaire validated for use by medical students and residents to assess the learning climate within a department or hospital; it includes items pertaining to the quality and content of teaching (such as item 12: “I am able to participate actively in educational events”), the role of autonomy and supervision (such as item 5: “I have the appropriate level of responsibility in this post”), and the support system within which learners work (such as item 25: “There is a no-blame culture in this post”).

However, in the quantitative phase of our study, we found that stakeholders felt clerkship students were limited in their ability to accurately assess some of these valuable and desired resident attributes. This likely stems from a variety of factors, including learners’ limited contact with residents and their inability to directly observe or know about the occurrence of some of these behaviors (such as use of shared decision making and honesty). Many of the important but poorly measurable behaviors also fell into the domain of professional attributes, and a large number into the category of organizational skills. It is widely accepted that medical “professionalism” is a systems issue affected profoundly by situational and contextual factors; it is therefore understandable that students with limited contact with residents may not be in a position to assess this competency in a robust manner (Arnold 2002).

From the standpoint of measuring and assuring resident competency in these areas, some of these limitations can be overcome with multi-source feedback, in which more “appropriate” evaluators (such as health care professionals, patients, and peers), who are more likely to observe whether residents engage in these behaviors, can weigh in (Lynch et al. 2004).

Compared with previously published tools used to evaluate faculty teaching, we observed differences in the behaviors that stakeholders desired in residents. Some expectations that learners held for faculty members in other studies, but that did not appear important for residents in our study, included teaching diagnostic skills, teaching cost-effective care, effectively incorporating research and practice guidelines into patient care (Copeland and Hewson 2000), and “control of teaching sessions” (Litzelman et al. 1998). Behaviors that our study found to be uniquely important for residents to demonstrate, and that were not emphasized in the aforementioned faculty evaluation tools, included most of the humanistic attitudes and behaviors that learners desired to see in residents. This discrepancy is likely a product of the study methodologies and the theoretical constructs used in designing the evaluation forms.

Overlapping themes and expectations that learners have for both faculty and resident educators (and which therefore could be addressed in combined resident and faculty development activities) were the emphasis on learning climate and teaching style (including giving feedback and teaching clinical reasoning skills).

Limitations of our study include the possibility that our tool may not transfer well to other centers or even other learning environments, given the unique cultural differences among institutions and teaching environments with varying amounts of contact time between residents and students. In addition, since this tool was developed for use in the pediatric inpatient setting, it is unknown how valid it would be if used in other specialties (e.g., surgery). However, a strength of our study is that our tool was developed with the help of a national group of educators, including pediatric clerkship directors, residency program directors, and faculty in Deans’ offices, and hence included the perspectives of the different stakeholders involved in clerkship education in any setting and branch of medicine. Moreover, the mixed-methods approach of our study design (using focus groups, educator input, anonymous surveys, and rating of items by importance and accuracy of measurement) adds to the face validity of our tool. Our next steps are to establish the validity and reliability of this evaluation form in a clinical context at our institution and at other centers.

In summary, based on stakeholder input, we developed an evaluation tool consisting of 14 behaviors that clustered around themes of professionalism and the learning environment. Comparing our study with the published literature, there appear to be differences in learner expectations of teachers depending on the level of the teacher, probably due to a combination of logistic and developmental reasons. Once we have established the validity and reliability of our tool, our future plans include investigating how useful residents perceive the tool to be for self-improvement (Practice-Based Learning and Improvement), and designing and studying the effect of resident-as-teacher curricula on resident performance using this tool. While data support that resident attitudes towards teaching improve after resident-as-teacher curricula, studies conflict as to whether resident teaching skills and student learning improve in a consistent manner (Morrison and Hafler 2000; Wamsley et al. 2004) and how sustained these skills might be (Edwards et al. 1988). Part of this may stem from the lack of a valid evaluation tool to assess resident performance as teachers. Our evaluation tool is the first step towards answering these questions.

Acknowledgements

The authors wish to thank our national educator panel (Drs Miriam Bar-On, Julie Byerley, Robert Drucker, Leslie Fall, Larrie Greenberg, Christine Johnson, Lindsey Lane, Christopher Maloney, Heather McPhillips, Bruce Morgenstern, Mary Ottolini, Judith Rowen, Richard Shugerman, Stephanie Starr, Daniel West and Jerold Woodhead), and the pediatric residents and medical students who participated in this study.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the article.

References

  • Arnold L. Assessing professional behavior: Yesterday, today, and tomorrow. Acad Med 2002; 77: 502–515
  • Beckman TJ, Ghosh AK, Cook DA, Erwin PJ, Mandrekar JN. How reliable are assessments of clinical teaching? A review of the published instruments. J Gen Intern Med 2004; 19: 971–977
  • Benjamini Y, Hochberg Y. Controlling the false discovery rate – A practical and powerful approach to multiple testing. J Roy Stat Soc B Met 1995; 57: 289–300
  • Boor K, Scheele F, van der Vleuten CP, Scherpbier AJ, Teunissen PW, Sijtsma K. Psychometric properties of an instrument to measure the clinical learning environment. Med Educ 2007; 41: 92–99
  • Brown JS, Collins A, Duguid P. Situated cognition and the culture of learning. Educ Res 1989; 18: 32–42
  • Copeland HL, Hewson MG. Developing and testing an instrument to measure the effectiveness of clinical teaching in an academic medical center. Acad Med 2000; 75: 161–166
  • Edwards JC, Kissling GE, Brannan JR, Plauche WC, Marier RL. Study of teaching residents how to teach. J Med Educ 1988; 63: 603–610
  • Frank JR, Danoff D. The CanMEDS initiative: Implementing an outcomes-based framework of physician competencies. Med Teach 2007; 29: 642–647
  • Griffith CH, 3rd, Wilson JF, Haist SA, Ramsbottom-Lucier M. Do students who work with better housestaff in their medicine clerkships learn more?. Acad Med 1998; 73: S57–S59
  • Hafler JP, Ownby AR, Thompson BM, Fasser CE, Grigsby K, Haidet P, Kahn MJ, Hafferty FW. Decoding the learning environment of medical education: A hidden curriculum perspective for faculty development. Acad Med 2011; 86: 440–444
  • Litzelman DK, Stratos GA, Marriott DJ, Skeff KM. Factorial validation of a widely disseminated educational framework for evaluating clinical teachers. Acad Med 1998; 73: 688–695
  • Lynch DC, Surdyk PM, Eiser AR. Assessing professionalism: A review of the literature. Med Teach 2004; 26: 366–373
  • Morrison EH, Hafler JP. Yesterday a learner, today a teacher too: Residents as teachers in 2000. Pediatrics 2000; 105: 238–241
  • Nunnally JC, Bernstein IH. Psychometric theory, 3rd ed. McGraw Hill, New York, NY 1994
  • Stalmeijer RE, Dolmans DH, Wolfhagen IH, Muijtjens AM, Scherpbier AJ. The development of an instrument for evaluating clinical teachers: Involving stakeholders to determine content validity. Med Teach 2008; 30: e272–e277
  • Strauss A, Corbin J. Basics of qualitative research: Techniques and procedures for developing grounded theory. Sage, Thousand Oaks, CA 1998
  • Swing SR. Assessing the ACGME general competencies: General considerations and assessment methods. Acad Emerg Med 2002; 9: 1278–1288
  • Tremonti LP, Biddle WB. Teaching behaviors of residents and faculty members. J Med Educ 1982; 57: 854–859
  • Wamsley MA, Julian KA, Wipf JE. A literature review of “resident-as-teacher” curricula: Do teaching courses make a difference?. J Gen Intern Med 2004; 19: 574–581
  • Wilkerson L, Lesky L, Medio FJ. The resident as teacher during work rounds. J Med Educ 1986; 61: 823–829
  • Wright S, Wong A, Newill C. The impact of role models on medical students. J Gen Intern Med 1997; 12: 53–56
