Conversation Starters

Developing Professionalism via Multisource Feedback in Team-Based Learning

Pages 362-365 | Published online: 27 Oct 2015
 

Abstract

CGEA 2015 CONFERENCE ABSTRACT (EDITED)

A Novel Approach to Assessing Professionalism in Preclinical Medical Students Using Paired Self- and Peer Evaluations

Amanda R. Emke, Steven Cheng, and Carolyn Dufault

Construct: This study sought to assess the professionalism of 2nd-year medical students in the context of team-based learning.

Background: Professionalism is an important attribute for physicians and a core competency throughout medical education. Preclinical training often focuses on individual knowledge acquisition, with students working only indirectly with faculty assessors. As such, the assessment of professionalism in preclinical training continues to present challenges. We propose a novel approach to the preclinical assessment of medical student professionalism to address these challenges.

Approach: Second-year medical students completed self- and peer assessments of professionalism in two courses (Pediatrics and Renal/Genitourinary Diseases) following a series of team-based learning exercises. Assessments used nearly identical 9-point rating scales. Correlational analysis and linear regression were used to examine the associations between self- and peer assessments and the effects of predictor variables. Four subgroups were formed based on deviation from the median ratings, and logistic regression was used to assess the stability of subgroup membership over time. A missing-data analysis was conducted to examine differences in average peer-assessment scores as a function of selective nonparticipation.

Results: There was a significant positive correlation (r = .62, p < .0001) between self-assessments completed alone and those completed at the time of peer assessment. There were also significant positive correlations between average peer assessment and self-assessment alone (r = .19, p < .0002) and self-assessment at the time of peer assessment (r = .27, p < .0001). Logistic regression revealed that subgroup membership was stable across two measurement points (T1 and T2) for all groups except the high self-assessment/low peer-assessment subgroup at T1, whose members were significantly more likely to move to a new group at T2, χ²(3, N = 129) = 7.80, p < .05. Linear regression revealed that self-assessment alone and course were significant predictors of self-assessment at the time of peer assessment (F for self-assessment alone = 144.74, p < .01; F for course = 4.70, p < .05), whereas average peer rating, stage (T1, T2), and academic year (2013–2014, 2014–2015) were not. Linear regression also revealed that students who completed both self-assessments had significantly higher average peer-assessment ratings (both self-assessments = 8.42, neither = 8.10, self-assessment at peer assessment only = 8.37, self-assessment alone only = 8.28) than students who completed one or no self-assessments (F = 5.34, p < .01).

Conclusions: When used as a professionalism assessment within team-based learning, stand-alone and simultaneous peer and self-assessments are highly correlated within individuals across different courses. However, although self-assessment alone is a significant predictor of self-assessment made at the time of assessing one's peers, average peer assessment does not predict self-assessment. To explore this lack of predictive power, we classified students into four subgroups based on relative deviation from median peer- and self-assessment scores. Group membership was stable for all groups except those initially sorted into the high self-assessment/low peer-assessment subgroup. Members of this subgroup tended to move into the low self-assessment/low peer-assessment group at T2, suggesting they became more accurate at self-assessing over time. A small group of individuals remained in the subgroup that consistently rated themselves highly while their peers rated them poorly. Future studies will track these students to see whether similar deviations from accurate professional self-assessment persist into the clinical years. In addition, given that students who did not complete self-assessments received significantly lower peer-assessment scores in this study than their counterparts who did, these students may also be at risk for similar professionalism concerns in the clinical years; follow-up studies will examine this possibility.
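A minimal computational sketch may help readers picture the paired-assessment analysis described above. The Python code below uses hypothetical data, column names, and cut points (it is not the authors' analysis code); it shows one way to correlate self- and peer-assessment scores and to sort students into four subgroups defined by deviation from the median of each measure.

import pandas as pd
from scipy.stats import pearsonr

# Hypothetical ratings on the 9-point professionalism scale.
ratings = pd.DataFrame({
    "self_alone":   [8, 9, 7, 8, 6, 9],               # self-assessment completed alone
    "self_at_peer": [8, 9, 6, 8, 7, 9],               # self-assessment at time of peer assessment
    "peer_mean":    [8.4, 7.1, 8.0, 8.6, 7.9, 6.8],   # average rating received from peers
})

# Correlations analogous to those reported in the Results.
r1, p1 = pearsonr(ratings["self_alone"], ratings["self_at_peer"])
r2, p2 = pearsonr(ratings["peer_mean"], ratings["self_alone"])
print(f"self alone vs. self at peer assessment: r = {r1:.2f}, p = {p1:.4f}")
print(f"average peer vs. self alone:            r = {r2:.2f}, p = {p2:.4f}")

# Four subgroups defined by deviation from the median self- and peer-assessment scores.
self_median = ratings["self_at_peer"].median()
peer_median = ratings["peer_mean"].median()

def classify(row):
    self_side = "high self" if row["self_at_peer"] >= self_median else "low self"
    peer_side = "high peer" if row["peer_mean"] >= peer_median else "low peer"
    return f"{self_side}/{peer_side}"

ratings["subgroup"] = ratings.apply(classify, axis=1)
print(ratings[["self_at_peer", "peer_mean", "subgroup"]])

Classifying students this way at both T1 and T2 would then allow the stability of subgroup membership to be examined, for example with the logistic regression reported in the abstract.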

Additional information

Notes on contributors

EXPERT COMMENTATOR BIOGRAPHIES

Anna T. Cianciolo

Anna T. Cianciolo, Ph.D., is Assistant Professor, Department of Medical Education, at Southern Illinois University School of Medicine and Editor of Teaching and Learning in Medicine. Her research focuses on the influence of context (interpersonal and environmental) on the nature of learning and instruction. Areas of emphasis include small-group collaborative instruction, diagnostic strategy, and professional self-development.

David Musick

David Musick, Ph.D., is Professor of Medicine and Assistant Dean for Faculty Development at Virginia Tech Carilion School of Medicine and Research Institute. He also is Director of the Department of Continuing Professional Development at Carilion Clinic. His research interests include medical education, social and cultural aspects of medicine, medical professionalism and the learning environment, biomedical ethics, and clinical education in rehabilitation.

Boyd Richards

Boyd Richards, Ph.D., is Assistant Vice President of the Center for Education Research and Evaluation and Professor of Medical Education (in Pediatrics) at the Columbia University Medical Center. His research interests include medical education, team-based learning, and pediatrics.

Claudio Violato

Claudio Violato, Ph.D., is Professor of Medicine and Psychology, Professor of Medical Education, and Director of Assessment and Evaluation at Wake Forest School of Medicine. His research interests include the topics of medicine, psychology, education, medical education, psychometrics, and structural equation modeling.
