Research Article

Involving users in the refinement of the competency-based achievement system: An innovative approach to competency-based assessment

Pages e143-e147 | Published online: 30 Jan 2012

Abstract

Background: Competency-based assessment innovations are being implemented to address concerns about the effectiveness of traditional approaches to medical training and the assessment of competence.

Aim: Integrating intended users’ perspectives during the piloting and refinement process of an innovation is necessary to ensure the innovation meets users’ needs. Failure to do so results in no opportunity for users to influence the innovation, nor for developers to assess why an innovation works or does not work in different contexts.

Methods: A qualitative participatory action research approach was used. Sixteen first-year residents participated in three focus groups and two interviews during piloting. Verbatim transcripts were analyzed individually and then across all transcripts using a constant comparison approach.

Results: The analysis revealed three key characteristics that influenced residents’ acceptance of the innovation as a worthwhile investment of time and effort: access to frequent, timely, and specific feedback from preceptors. Findings were used to refine the innovation further.

Conclusion: This study highlights the necessary conditions for assessing the success of implementation of educational innovations. Reciprocal communication between users and developers is vital. This reflects the approaches recommended in the Ottawa Consensus Statement on research in assessment published in Medical Teacher in March 2011.

Introduction

In this article, we report findings from the development and pilot implementation of the competency-based achievement system (CBAS), an innovative framework that emphasizes assessment for learning (Note 1) in the development of competencies for Family Medicine residents. The primary tool for learning and assessment in CBAS is formative feedback. Because this approach represents a fundamental conceptual shift in medical education, we used participatory action research (detailed in Ross et al. 2011) to seek user feedback during the development and initial implementation of CBAS. This allowed us to ensure that the system we developed would meet the learning needs of our residents. Our findings allowed us to identify the key characteristics that contributed to acceptance or non-acceptance of CBAS, and we were thus able to modify our innovation so that its perceived weaknesses were ameliorated.

Background

Medical training has undergone a shift toward competency-based models of education due to calls for greater accountability in all aspects of the profession (Frank & Danoff 2007; Frank et al. 2010). Competency-based medical education (CBME) approaches physician training by focusing on outcomes, emphasizing abilities, de-emphasizing time-based training, and promoting greater learner-centeredness (Frank et al. 2010). Competencies are defined as “the knowledge, skills, attitudes, and personal qualities essential to the practice of medicine” (Albanese et al. 2008, p. 250). Competencies are dynamic, developing or receding over time, and grounded in the learning environment.

In response to the need for a valid and reliable means of measuring, tracking, and documenting competency, we developed CBAS, which supports the learning process of residents and advisors in addition to documenting achievement of required competencies (for a further program description, see Ross et al. 2011). Specifically, CBAS focuses on three objectives: to generate competent physicians by capturing the habitual demonstration of competencies, to identify and support “trouble” cases early in the process, and to provide a scholarly defence of the utility of a competency-based program. Various components of CBAS were piloted during prior years, including the use of field notes, paper portfolios, and regular progress reviews with advisors. This research report focuses on the 2009–2010 year, when the electronic CBAS tools were piloted at three sites representing both rural and urban clinical settings.

The theoretical foundation for CBAS is assessment for learning (Black & Wiliam 1998, 2009). Within this framework, assessment becomes part of learning, acting as a periodic check on learning. Assessment is formative, and the feedback given clearly tells students where their strengths and weaknesses lie. Cumulative assessments for learning show progress – or lack thereof – over time. Using this theoretical framework as the foundation of our innovation, we aim to make assessment an integral part of learning while also using assessments to track progress. Assessment is done with the learner, rather than to the learner, always with the intent of providing as much information as possible so that the learner can incorporate the formative assessment into the next stage of learning.

This approach to competency-based assessment in medical education is a marked conceptual shift. With this in mind, we wanted to be mindful in our approaches to evaluating the impact of CBAS on resident learning. What is often missing from studies reporting the impact of new educational innovations is a discussion of the programmatic decisions and consequences that arise during initial implementation. These details are important for two reasons: they inform further development and refinement of the program, and they support transferability of the program to new contexts.

The purpose of this study was to examine the value of documenting the development process: identifying the core principles of the program and, through qualitative feedback about the innovation, assessing the fidelity of the program to those principles following pilot implementation. We explored the impact of CBAS on residents’ learning experiences and outcomes beyond what satisfaction ratings alone can capture. In doing so, this article moves beyond student satisfaction in examining the impact of CBAS in our program and generates information that is important for decision-making.

Our work has important implications for informing the training and support needed to implement educational innovations across residency programs. Despite the best intentions of innovation developers, there will be variety in how an innovation is used once it moves into the reality of different training environments. To better understand the realities of implementing an innovation in various contexts (and the differences in how users employed the tools of the innovation), we sought to understand how residents’ experiences led to either acceptance or non-acceptance of CBAS as a worthwhile investment of time and effort. To that end, our intention was to identify conditions that contribute to effective implementation. We hypothesized that increased communication between program developers and end-users provides important information for the successful implementation of medical education innovations.

Methods

A grounded theory approach with a participatory action framework guided the data collection and analysis procedures. First-year residents at a large North American Family Medicine Residency Program were invited to share their experiences from the development and pilot implementation of CBAS during the first year of their two-year residency. Interviews and focus groups were used to collect data because of their usefulness for eliciting rich and descriptive data and as a well-documented means of obtaining access to individuals’ worldviews, past experiences, group perspectives, and information that is not directly observable (Merriam 1998; Mertens 2005). The focus groups and interviews were semi-structured and guided by a protocol focused on four areas, with unstructured time for comments (Table 1). This approach was used to keep the focus groups exploratory in nature; the researchers did not want to lead the participants. Interviews were used to accommodate residents who could not attend the focus groups because of scheduling conflicts.

Table 1.  Areas of focus group and interview focus

Each interview and focus group lasted between 45 and 60 min, took place in a location where the conversation could occur in private, and was audiotaped and transcribed verbatim. Inductive thematic analysis was used: each transcript was first read, and initial codes and definitions were developed using a constant comparison method (Strauss & Corbin 1990). Codebooks were compared across two independent raters; similarities and differences in coding were then reviewed until consensus was reached. The codes were then refined and applied across all other transcripts, and recurring themes across transcripts were developed. To enhance the credibility of the findings through member-checking, a summary of themes for each individual transcript was created and distributed to the participants (Côté & Turgeon 2005).

In the final stage of the study, the results were examined by the CBAS development team. Changes were made to the CBAS tools and procedures to incorporate residents’ needs into the further development of CBAS.

Results

The analysis revealed three key characteristics that influenced residents’ perception of CBAS as a worthwhile investment of time and effort: access to specific, timely, and frequent feedback from preceptors. When those characteristics were reported to occur together, residents were more likely to report motivation to engage in self-assessment. By contrast, residents were less likely to describe the CBAS program as supportive of their learning when access to feedback was limited and less specific; in these cases, residents described CBAS as having increased their workload and infringed on clinical learning opportunities.

Specificity of feedback was extremely important to residents. Residents were more likely to report a positive perception of the utility of CBAS (and a desire to continue using the tools) when they received feedback from preceptors that was specific to the observed clinical event. This category of feedback was characterized by residents as being “detailed” with respect to strengths and weaknesses, “constructive” in terms of pointing to next steps, and “legitimate” as an accurate reflection of the observation. The specificity of feedback enabled residents to “see (their) learning experiences in a more structured way” and “think more about the feedback that (they) get”. As a result, residents identified specificity as a key characteristic of the individualized feedback they received. Residents attributed the lack of change in preceptor behavior in giving specific feedback to differing teaching styles among preceptors: “Each physician or doctor has their own way of giving feedback and they don’t change their style because CBAS is there.”

Timeliness of feedback was also seen as crucial to the value of CBAS as a learning tool. It was important that feedback be received as soon as possible after the observation to ensure that the feedback was meaningful. Residents were more likely to report greater understanding of their strengths and weaknesses when preceptors used CBAS to provide timely feedback. As one participant commented, “I think maybe for me it's just helped me think about getting feedback, or think about the feedback that I do get, and maybe recognize it a little bit more.” The result was a level of communication between preceptors and residents that was described as an “open discussion of cases within the workplace.” Increased access to immediate, timely feedback contributed to increased resident motivation to engage in self-assessment of learning objectives, educational goals, and areas of concern. An area of concern for the CBAS development team was the observation that forced discussion of feedback at clinical sites was perceived as placing residents in the uncomfortable situation of having to specifically request feedback from preceptors who did not provide it of their own accord.

Finally, frequency of feedback was essential to residents’ perception of CBAS as a worthwhile tool for their own learning. Residents identified increased engagement in their own learning process as the result of frequent, ongoing feedback that was specifically informative about their progression to competency. However, frequency of feedback needed to occur in conjunction with the other two key characteristics (specificity and timeliness). When residents became too concerned with getting as much feedback as possible, the result was often high frequency but low quality of feedback. Some residents focused on the number of field notes they felt they needed to obtain, regardless of the quality of the feedback contained in those notes. When this occurred, residents viewed the CBAS program as a burden that increased their workload. These residents described the program as an “infringe(ment) on the already-limited time available to medical residents,” with few added benefits, and commented that the feedback requirement was excessive and should be tailored to more realistically suit the needs of both residents and preceptors. In these cases, the CBAS system was seen more as a “log” than as a mechanism for assisting progress to competency.

In terms of CBAS as an overall tool to assist and improve learning, residents reported being more motivated and more likely to remediate their areas of concern when they had a record of feedback about gaps in knowledge or skills. This was particularly true when residents recognized the organizational scheme of the CBAS tools and the alignment of that scheme with program objectives. A greater commitment to self-learning and self-evaluation was thus fostered. As one resident reported: “A really positive thing about CBAS is that it helped me to … see my learning experience in a more structured way and … to be a little bit more proactive in trying to achieve them (objectives).”

One of the most important findings from this study was the need to explain to residents how and why an innovation was developed to help their learning. Some residents claimed that the CBAS tools had little educational relevance and that they did not experience CBAS as benefiting their learning. These residents did not connect the tools of CBAS to the two benefits they gained from those tools: an increase in documented feedback, and a way to categorize that feedback. Some residents reported that the system lacked simplicity and was not “user-friendly,” and that little direction or guidance was provided with the implementation of CBAS. As one participant commented, “The ambiguity associated with (CBAS) makes it difficult to meet the mandatory requirements. So if those are more clearly spelled out, at the very least those objectives can be met.” Residents who expressed this view also experienced a lack of preceptor interest in the program, which reinforced their own perception of CBAS as a “burden.” The high degree of effort these residents perceived was required of them was not matched by an equal perception of program value; instead, CBAS was seen as infringing on residency training.

Resident perceptions of the value of CBAS as a learning tool varied between sites. It was noted that clinical site differences can pose problems; for example, rural sites have unique clinical set-ups and preceptors are more removed from the ‘ivory tower.’ One participant commented: “the priority of the (rural) clinic is not to be a huge clinical teaching unit.” As a result, residents experienced a lack of specific feedback. Observations from these residents included a need for greater preceptor training on how to deliver specific feedback focused on resident strengths and weaknesses.

Changes to CBAS as a result of user feedback

It is not surprising that residents reported varied experiences during the CBAS pilot implementation, given the diversity of ways in which the CBAS tools were used. This was the first target for change as a result of this study (Table 2).

Table 2.  Changes to the CBAS program in response to user feedback

Discussion

The analysis revealed the key characteristics that contributed to CBAS being a worthwhile investment of residents’ time and effort: specific, timely, and frequent feedback. CBAS was credited with providing structure for the ongoing identification of strengths and weaknesses, giving clear direction for progress, and facilitating self-reflection on the achievement of educational goals and objectives. CBAS was thus regarded as a program whose tools allowed for greater awareness of progression to competency through continuous dialog between teachers and students. To that end, interactions between preceptor and resident that were timely to the clinical context and provided ongoing feedback specific to the resident were essential for program acceptance by residents. This finding may not be surprising given the research reported by Tetzlaff et al. (2008), who highlighted the importance of real-time feedback focused on observed and ongoing trainee performance in a realistic clinical setting. Furthermore, training environments that promote frequent and timely feedback encourage residents to engage in a continuous approach to learning that evolves as progression toward competency occurs (Bierer et al. 2008). Our findings suggest that the CBAS tools structured the interactions between resident and preceptor, providing the resident with access to information about their level of clinical skill. In doing so, residents were more likely to engage in self-assessment: they reflected upon their progress toward achievement of competencies and identified next steps.

Environments characterized by frequent and close contact with supervisors within the clinical setting optimize learning opportunities (Tetzlaff et al. 2008). The benefits of such an environment include reinforcing learning objectives, fostering retention of the learning experience, and equipping residents to recognize and address gaps in performance; the latter represents a core component of reflective practice. Individualized feedback that reinforces strengths also encourages the remediation of problems early in the training process and enables residents to work at their own pace to achieve educational objectives (Marple 2007; Albanese et al. 2008). Our study provides evidence for the goal of increasing timely and ongoing feedback on progress to competency, and supports medical education's shift toward greater transparency and accountability during residency training (Frank & Danoff 2007).

The changes made to CBAS as a result of the pilot residents’ feedback allowed the development team to build upon the perceived strengths of the tools and address the perceived weaknesses that led to user frustration and disinclination to use the tools. Preliminary data from the subsequent cycle of implementation indicate greater acceptance of CBAS and an increase in positive perceptions of its value as a learning tool for informing progress toward competency.

Conclusion

The implications of our study have the potential to be far reaching and suggest that increased communication between program developers and end-users provides important information for the successful implementation of medical education innovations. In particular, accessing and incorporating end-users’ suggestions during the piloting and refinement process provides information about the extent of users’ program acceptance. By moving beyond sole reliance on users’ satisfaction ratings, program developers can identify, during a pilot implementation, the program features that users consider worth their investment of time and effort, and use this information to refine the program while maintaining alignment with program goals.

This study provides critical insights regarding the necessary conditions for successful implementation of educational innovations. Innovations require a change in thinking and an openness to new experiences, and they take time to become accepted and integrated (Rogers 2003). End-users are likely to hold hesitant views, particularly if the innovation represents a significant shift from current practice. As Rogers (2003) found, it is critical for program developers to invest time and attention in intended users’ experiences and to maintain frequent channels of communication. Program benefits should be experienced in the short term, and end-users should not experience a significant increase in their workload. Opportunities for training are required to establish knowledge and understanding of the innovative program, especially its key characteristics. This study illuminates end-users’ experiences following the pilot implementation of a CBME innovation and is thus limited by context and sample size. Further research is needed to explore the relationship between end-users’ program acceptance and the likelihood of sustained change in practice. Doing so requires an iterative implementation process and the use of qualitative methods to capture the experiences and perspectives of those involved.

Acknowledgments

The authors thank the residents who took part in this study and the administrative staff who helped with scheduling the data collection. This study was funded by the Teaching and Learning Enhancement Fund at the University of Alberta. S. Ross, C.-A. Poth, and M.G. Donoff had the original concept. C. Papile wrote the first draft of the article, which was then revised by C.-A. Poth. S. Ross rewrote and did all final revisions on the manuscript, with conceptual assistance from M.G. Donoff. C.-A. Poth led the data collection and qualitative analysis. All authors contributed to the data collection and the final manuscript draft.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of this article.

Notes

1. Assessment that contributes to and is a part of learning, rather than simply an evaluation of progress at one point in time.

References

  • Albanese MA, Mejicano G, Mullan P, Kokotailo P, Gruppen L. 2008. Defining characteristics of educational competencies. Med Educ 42:248–255.
  • Bierer SB, Dannefer EF, Taylor C, Hall P, Hull AL. 2008. Methods to assess students’ acquisition, application and integration of basic science knowledge in an innovative competency-based curriculum. Med Teach 30:e171–e177.
  • Black P, Wiliam D. 1998. Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan 80:139–144.
  • Black P, Wiliam D. 2009. Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability 21:5–31.
  • Côté L, Turgeon J. 2005. Appraising qualitative research articles in medicine and medical education. Med Teach 27:71–75.
  • Frank JR, Danoff D. 2007. The CanMEDS initiative: Implementing an outcomes-based framework of physician competencies. Med Teach 29:642–647.
  • Frank JR, Snell LS, Cate OT, Holmboe ES, Carraccio C, Swing SR, Harris P, Glasgow NJ, Campbell C, Dath D, Harden RM, Iobst W, Long DM, Mungroo R, Richardson DL, Sherbino J, Silver I, Taber S, Talbot M, Harris KA. 2010. Competency-based medical education: Theory to practice. Med Teach 32:638–645.
  • Hilliard RI, Tallett SE. 1998. The use of an objective structured clinical examination with postgraduate residents in pediatrics. Arch Pediatr Adolesc Med 152:74–78.
  • Kim S, Kogan JR, Bellini LM, Shea JA. 2005. A randomized-controlled study of encounter cards to improve oral case presentation skills of medical students. J Gen Intern Med 20:743–747.
  • Leach DV. 2008. Competencies: From deconstruction to reconstruction and back again, lessons learned. Am J Public Health 98:1562–1564.
  • Marple BF. 2007. Competency-based resident education. Otolaryngol Clin North Am 40:1215–1225.
  • Merriam SB. 1998. Qualitative research and case study applications in education. San Francisco, CA: Jossey-Bass.
  • Mertens DM. 2005. Research and evaluation in education and psychology. 2nd ed. Thousand Oaks, CA: Sage.
  • Pearson DJ, Heywood P. 2004. Portfolio use in general practice vocational training: A survey of GP registrars. Med Educ 38:87–95.
  • Rogers EM. 2003. Diffusion of innovations. 5th ed. New York, NY: Free Press.
  • Ross S, Poth C, Donoff MG, Humphries P, Steiner I, Schipper S, Janke F, Nichols D. 2011. The competency-based achievement system (CBAS): Using formative feedback to teach and assess competencies with family medicine residents. Can Fam Physician 57:e323–e330.
  • Schuwirth L, Colliver J, Gruppen L, Kreiter C, Mennin S, Onishi H, Pangaro L, Ringsted C, Swanson D, van der Vleuten C, et al. 2011. Research in assessment: Consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach 33:224–233.
  • Strauss A, Corbin JM. 1990. Basics of qualitative research: Grounded theory procedures and techniques. Thousand Oaks, CA: Sage.
  • Tetzlaff JE, Dannefer EF, Fishleder AJ. 2008. Competency-based assessment in a medical school: A natural transition to graduate medical education. In: Ollington GF, editor. Teachers and teaching strategies: Innovations and problem solving. New York, NY: Nova Science Publishers. p. 229–243.
