Original Research

Are they ready? Organizational readiness for change among clinical teaching teams

Pages 807-815 | Published online: 14 Dec 2017

Abstract

Introduction

Curriculum change and innovation are inevitable parts of progress in postgraduate medical education (PGME). Although implementing change is known to be challenging, change management principles are rarely drawn on for support. Change experts contend that organizational readiness for change (ORC) is a critical precursor for the successful implementation of change initiatives. Therefore, this study explores whether assessing ORC in clinical teaching teams could help to understand how curriculum change takes place in PGME.

Methods

Clinical teaching teams in hospitals in the Netherlands were requested to complete the Specialty Training’s Organizational Readiness for curriculum Change, a questionnaire to measure ORC in clinical teaching teams. In addition, change-related behavior was measured by using the “behavioral support-for-change” measure. A two-way analysis of variance was performed for all response variables of interest.

Results

In total, 836 clinical teaching team members were included in this study: 288 (34.4%) trainees, 307 (36.7%) clinical staff members, and 241 (28.8%) program directors. Overall, items regarding whether the program director has the authority to lead scored higher compared with the other items. At the other end, the subscales “management support and leadership,” “project resources,” and “implementation plan” had the lowest scores in all groups.

Discussion

The study brought to light that program directors are clearly in the lead when it comes to the implementation of educational innovation. Clinical teaching teams tend to work together as a team, sharing responsibilities in the implementation process. However, the results also reinforce the need for change management support in change processes in PGME.

Introduction

Curriculum change and innovation are inevitable parts of progress in postgraduate medical education (PGME). Even though implementing change is known to be challenging,Citation1 little support has been sought from change management principles.Citation2 In view of the resources invested and the increasing regulatory and social demands, it is necessary to acquire knowledge on how curriculum change takes place in PGME and which factors either support or impair these implementation processes.

In general, innovation involves the introduction of new ideas into a product or service to create value and, by doing so, satisfy a specific need.Citation3,Citation4 Innovation may be driven by visionary ideas, quality requirements, or the need for higher efficiency. For an innovation to add value to its context, it needs to be adopted and routinized into standard practice;Citation5 the latter in particular proves to be challenging.Citation5

In the last decade, the medical profession has faced a major innovation in PGME with the introduction of competency-based medical education (CBME). This innovation is driven by, among other factors, changes in healthcare needs and in the expectations of the public.Citation6,Citation7 PGME needs to show greater accountability to society by being more transparent about the content and quality of medical education.Citation8 CBME added value by introducing a broader definition of the competencies future medical specialists need to meet the needs of their patientsCitation7 as well as requirements for teaching and assessment strategies.Citation6 In essence, the introduction of CBME requires a paradigm shift: rather than focusing solely on attaining medical expertise, trainees must become medical experts who also acquire the other competencies needed to fulfill their roles in meeting societal needs. For instance, in the case of CanMEDS, this means trainees also need to become, among other roles, competent “managers,” “collaborators,” and “scholars.” In addition, faculty development is essential to ensure an adequate uptake of CBME in daily practice, such as the proper use of feedback and reflection on learning.Citation7 However, as mentioned earlier, adopting and routinizing an innovation can be difficult, and the transition from theory to practice does not necessarily lead to the intended changes.Citation6,Citation9

In practice, CBME has indeed led to more conscious attention to other competencies besides the role of the medical expertCitation10,Citation11 as well as to more frequent direct observation and increased documented feedback about a trainee’s performance.Citation12 However, details of generic models for CBME are not always explicitly outlined, which leads to a lack of clarity about its content, meaning, and relevance.Citation13–Citation17 In addition, the implementation of CBME frameworks is further complicated by a lack of support from expert facilitators such as educationalists who can help with understanding the educational concepts and relating them to the clinical work environment.Citation6,Citation18

Despite the challenges that PGME is facing with the implementation of CBME, attention to a change management perspective on supporting these processes is still rather limited.Citation2 One of the potentially beneficial change management strategies for PGME could be the assessment of organizational readiness for change (ORC).Citation2 ORC is a comprehensive construct that reflects the degree to which members of an organization are collectively primed, motivated, and capable of adopting and executing a particular change initiative to purposefully alter the status quo.Citation19 Change experts contend that ORC is a critical precursor for the successful implementation of change initiatives.Citation20,Citation21 It is believed that when change leaders establish insufficient readiness, a range of predictable and undesirable outcomes would occur: change efforts make a false start from which they might or might not recover, change efforts stall as resistance grows, or the change fails altogether.Citation21 Actions to create readiness include, among others, establishing a sense of urgency, empowering team members, and creating an appealing vision for the future as well as fostering a sense of confidence that this vision can be realized.Citation20,Citation21

Change readiness can be assessed at several stages of the change process, ie, before or during the change, as a way to diagnose any possible or current hurdles in the implementation process in order to facilitate corrective interventions. In addition, readiness can be assessed repeatedly to explore the effects of the interventions.Citation2 Team members will commit to a change because they want to, have to, or ought to. Regardless of the reason, this commitment will lead to behavioral compliance with the requirements for change. Some, however, might show resistance, either active or passive, and fail to comply. At the other end of the spectrum are those who show behaviors that go beyond what is formally required to ensure success, and who enthusiastically promote change to others.Citation22

When looking at the current transition in PGME, many of the components relevant for the establishment of ORC can also be recognized in the implementation processes of CBME, such as proactive knowledge management to increase clarity about the content and meaning of CBME,Citation5 establishment of a need for change toward CBME,Citation1,Citation20 training of staff,Citation12 availability of resources,Citation5 access to expert facilitators such as educationalistsCitation5 to support relating educational concepts to the clinical work environment, and so on. This suggests that ORC could potentially play a vital role in the implementation processes of PGME. By understanding change management principles, educational leaders may improve the clinical teaching team’s ability to implement the planned change.Citation19 Therefore, this study explores whether assessing ORC in clinical teaching teams could help to understand how curriculum change takes place in PGME and, as a result, could provide support in overcoming the challenges in the implementation processes.

Methods

Setting and selection of participants

In the Netherlands, legislation has formalized the requirement that all postgraduate medical training programs must be reformed according to the competency-based framework of CanMEDS.Citation8 As a result, clinical teaching teams of all medical specialties registered at the Dutch Federation of Medical Specialists were eligible for participation as they had to implement this new curriculum in their local settings. In general, educational science found its way into practice relatively rapidly: assessment methods such as simulation and multisource feedback have such obvious advantages that they were readily accepted. Furthermore, attention to generic skills is growing. However, reflective practices have not been institutionalized everywhere, and patient feedback has not been used sufficiently.Citation14

In the Netherlands, there are 8 academic medical centers, each of which coordinates PGME within a geographical region. Each geographical region contains multiple affiliated nonacademic teaching hospitals. Within each teaching hospital, at least one clinical teaching team offers residency training. Both academic medical centers and nonacademic teaching hospitals provide PGME and patient care and participate in research projects. However, academic medical centers are larger, provide more specialized patient care, and both develop and participate in research projects on a much larger scale. In addition, they are responsible for providing undergraduate medical education.

In general, training programs in PGME are 4–6 years in duration, depending on the specialty. Trainees complete several years of their training in an academic medical center and several years in one of the affiliated nonacademic teaching hospitals. In daily practice, trainees are supervised and trained by a clinical teaching team (ie, clinical staff members), which is led by a program director appointed by the Dutch Federation of Medical Specialists.

All teaching hospitals have a separate educational department that supports and assists the clinical teaching teams with their educational tasks. Between February and November 2015, we asked these educational departments to contact the individual clinical teaching teams in their teaching hospital and discuss our study with the program directors. If a program director agreed to participate, an official invitation was sent, including an information letter and a link to the web-based questionnaire. In addition, we sent a direct invitation to the program directors within our own network.

Subsequently, the program directors were responsible for inviting the other members of their clinical teaching team (ie, trainees and clinical staff members) to participate. During the study period from February to November 2015, several reminders to complete the questionnaire were sent to the program directors of the participating clinical teaching teams.

Ethical approval

This study was approved by the Ethical Review Board of the Dutch Association for Medical Education. All the participants received an information letter explaining study purpose, confidentiality, and voluntary participation. Written informed consent was obtained from all the participants for this study.

Materials

Specialty Training’s Organizational Readiness for curriculum Change (STORC)

ORC was measured using STORC (Table 1).Citation2,Citation34 This questionnaire was designed to measure readiness for change in clinical teaching teams at a team level, rather than at an individual level. It was developed in an international Delphi study,Citation2 followed by a confirmatory factor analysis validating the clustering of items within the 10 subscales.Citation34 A generalizability study showed that 5–8 responses are needed for a reliable outcome.Citation34 Participants were asked to rate their level of agreement with the 43 items of STORC on a 5-point Likert scale (1= strongly disagree and 5= strongly agree). Alternatively, they had the option to choose “not applicable.”
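
The article itself contains no scoring code; purely as an illustration, the Python sketch below shows one way subscale scores of this kind could be computed, assuming a hypothetical data layout with one column per item and “not applicable” answers stored as missing values. The item and subscale names are placeholders, not the actual STORC items.

    import pandas as pd

    # Hypothetical layout: one row per respondent, one column per Likert item
    # (1-5), with "not applicable" stored as a missing value (None/NaN).
    responses = pd.DataFrame({
        "formal_leader_1": [5, 4, 5],
        "formal_leader_2": [4, 5, None],   # one "not applicable" answer
        "staff_culture_1": [4, 3, 4],
        "staff_culture_2": [3, 4, 4],
    })

    # Placeholder mapping of items to subscales (the real STORC has 43 items
    # across 10 subscales; see Table 1 for the topics covered).
    subscales = {
        "formal_leader": ["formal_leader_1", "formal_leader_2"],
        "staff_culture": ["staff_culture_1", "staff_culture_2"],
    }

    # Average the available items within each subscale per respondent, then
    # average across respondents for a team-level score on the original 1-5 scale.
    per_respondent = pd.DataFrame(
        {name: responses[items].mean(axis=1) for name, items in subscales.items()}
    )
    team_scores = per_respondent.mean()
    print(team_scores)

Averaging only the items a respondent actually answered keeps each subscale score interpretable on the original 1–5 scale even when some items were marked “not applicable.”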

Table 1 Subscales and topics covered by the STORC questionnaire

Behavioral support-for-change

In addition, change-related behavior was measured using the “behavioral support-for-change” measure reflecting the 5 types of resistance and support behavior described by Herscovitch and Meyer:Citation22 active resistance (score =0–20), passive resistance (score =21–40), compliance (score =41–60), cooperation (score =61–80), and championing (score =81–100). These 5 types of behavior were made visible along a behavioral continuum of 101 points (ie, from 0 to 100). Participants were provided with a written description of each of the behaviors and were asked to indicate the score that best represented their own reaction as well as their clinical teaching team’s reaction to the introduction of competency-based medical education.
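
As a small illustration of the continuum described above, the following sketch maps a 0–100 score onto the five behavior types using the cut-offs stated in the text; the function name is ours and not part of the original instrument.

    def behavior_category(score: float) -> str:
        """Map a behavioral support-for-change score (0-100) onto the five
        behavior types of Herscovitch and Meyer (illustrative helper)."""
        if not 0 <= score <= 100:
            raise ValueError("score must lie on the 0-100 continuum")
        if score <= 20:
            return "active resistance"
        if score <= 40:
            return "passive resistance"
        if score <= 60:
            return "compliance"
        if score <= 80:
            return "cooperation"
        return "championing"

    print(behavior_category(72))  # -> cooperation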

Statistical analysis

Statistical analyses were conducted using IBM SPSS Version 24.0 (IBM Corp., Armonk, NY, USA). Because respondents are nested within hospitals, intraclass correlation frequently requires multilevel analysis, in which hospital (upper level) and respondent (lower level) are treated as hierarchical levels.Citation23 However, in the current study, as the intraclass or intrahospital correlation for all response variables of interest was very small (ie, ranging from 0 to about 0.065), a two-way analysis of variance (ANOVA) was performed for each of them: the individual score on change-related behavior (0–100), the team’s score on change-related behavior (0–100), and the 10 separate subscales of STORC (Table 1).
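
The analyses themselves were run in SPSS; the Python sketch below is only a rough analogue of the reasoning in this paragraph, using statsmodels on synthetic data (all variable names and values are assumptions). It first estimates the intraclass correlation from a random-intercept model to check whether nesting within hospitals matters, and then fits a two-way ANOVA of the kind used once the ICC proved negligible.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Synthetic long-format data standing in for the study data: one row per
    # respondent with a subscale score, respondent group, hospital type, and
    # the hospital the respondent is nested in (column names are assumptions).
    rng = np.random.default_rng(0)
    n = 300
    df = pd.DataFrame({
        "group": rng.choice(["trainee", "clinical staff", "program director"], n),
        "hospital_type": rng.choice(["academic", "nonacademic"], n),
        "hospital": rng.choice([f"hosp_{i}" for i in range(23)], n),
    })
    df["score"] = rng.normal(3.5, 0.6, n).clip(1, 5)  # Likert-style subscale score

    # 1) Intraclass correlation from a random-intercept model:
    #    between-hospital variance / (between-hospital + residual variance).
    mixed = smf.mixedlm("score ~ 1", df, groups=df["hospital"]).fit()
    between = mixed.cov_re.iloc[0, 0]
    icc = between / (between + mixed.scale)
    print(f"ICC = {icc:.3f}")  # in the study the ICCs were at most ~0.065

    # 2) With a negligible ICC, a two-way ANOVA with respondent group and
    #    hospital type as factors (including their interaction) is a
    #    reasonable simplification of a multilevel model.
    model = smf.ols("score ~ C(group) * C(hospital_type)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))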

Results

In total, 836 clinical teaching team members were included in this study: 288 (34.4%) trainees, 307 (36.7%) clinical staff members, and 241 (28.8%) program directors (Table 2). Respondents were working either at an academic medical center (49%) or at a nonacademic teaching hospital (51%), and about one third of the respondents were working in a surgical specialty. In total, the respondents represent 221 clinical teaching teams in 23 teaching hospitals, thereby representing 30.0% of clinical teaching teams (n=736) and 37.1% of teaching hospitals (n=62) in the Netherlands, respectively. About half of the respondents were female.

Table 2 Descriptive characteristics of the respondents

Statistical analysis

A two-way ANOVA was performed for the individual score on change-related behavior (0–100), the team’s score on change-related behavior (0–100), and the 10 separate subscales of STORC. The two factors in the ANOVA were group of respondent and type of hospital. The group-by-type interaction was very small for all response variables (partial η2 values <0.01) and, after correction for multiple testing, not statistically significant at the conventional α=0.05 significance level. Table 3 presents the main effects of group and type.
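
To make the reported quantities concrete, the sketch below shows how partial η2 can be derived from an ANOVA table (partial η2 = SS_effect / (SS_effect + SS_residual)) and how a correction for multiple testing could be applied across the 12 response variables. The sums of squares and p-values are invented purely for illustration, and the article does not state which correction procedure was used; Holm’s method is shown as one common choice.

    import pandas as pd
    from statsmodels.stats.multitest import multipletests

    # Invented Type II ANOVA table (sums of squares) for one response variable,
    # purely to demonstrate the computation.
    anova_table = pd.DataFrame(
        {"sum_sq": [12.4, 3.1, 0.4, 410.7]},
        index=["C(group)", "C(hospital_type)", "C(group):C(hospital_type)", "Residual"],
    )

    # Partial eta-squared per effect: SS_effect / (SS_effect + SS_residual).
    ss_resid = anova_table.loc["Residual", "sum_sq"]
    partial_eta_sq = (anova_table["sum_sq"] / (anova_table["sum_sq"] + ss_resid)).drop("Residual")
    print(partial_eta_sq)

    # Holm correction of the interaction p-values across the 12 response
    # variables (values invented for illustration).
    interaction_pvals = [0.32, 0.21, 0.35, 0.50, 0.07, 0.61,
                         0.18, 0.44, 0.09, 0.30, 0.26, 0.53]
    reject, p_adjusted, _, _ = multipletests(interaction_pvals, alpha=0.05, method="holm")
    print(list(zip(interaction_pvals, p_adjusted.round(2), reject)))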

Table 3 Main effects of group and type per response variable

Group of respondent

In general, looking at the three groups of respondents in our study, program directors gave higher scores on almost all of the subscales of STORC (Figure 1A). Their scores on 7 subscales differed significantly from the scores of clinical staff members and trainees (Table 3). Studying the scores on the different subscales of STORC in more depth revealed a similar pattern for all groups of respondents (Figure 1A). The subscale “formal leader,” consisting of items regarding whether the program director has the authority to lead and accepts responsibility for the success of the change process, scored higher than the other subscales. High scores were also given on the subscale “staff culture,” which includes items about teamwork and clinical staff’s receptiveness to changes, as well as on the subscale “appropriateness.” At the other end, the subscales “management support and leadership,” “project resources,” and “implementation plan” had the lowest scores in all respondent groups.

Figure 1 Subscale scores on the STORC questionnaire per respondent group (A) and type of hospital (B).

Notes: This figure shows the average subscale scores and standard deviation on the STORC questionnaire divided by groups of respondents (A) and type of hospital (B). A–J subscales of the STORC questionnaire. A = formal leader; B = appropriateness; C = staff culture; D = involvement; E = clarity of mission and goals; F = necessity to change; G = pressure to change; H = management support and leadership; I = implementation plan; J = project resources.
Abbreviation: STORC, Specialty Training’s Organizational Readiness for curriculum Change.

Type of hospital

When comparing responses from nonacademic teaching hospitals and academic medical centers, respondents in nonacademic teaching hospitals showed higher scores on almost all of the subscales (Figure 1B). For 7 subscales, their scores were significantly higher than the scores of respondents in academic medical centers (Table 3). Further analysis of the different subscales of STORC showed that a similar pattern can be recognized when comparing respondents based on hospital type and on group of respondent: again, high scores were given on the subscales “formal leader” and “staff culture” and low scores on “implementation plan” and “project resources” (Figure 1B).

Change-related behavior

When comparing change-related behavior between the groups, program directors judged their own reaction to change more positively than trainees and clinical staff members. In addition, when asked to judge their team’s change-related behavior, program directors were significantly more pessimistic than their colleagues (Figure 2A; Table 3). Looking at the different types of hospitals, respondents in nonacademic teaching hospitals judged their own reaction to change as well as their clinical teaching team’s reaction to change significantly more positively than their academic counterparts (Figure 2B; Table 3).

Figure 2 Scores on change-related behavior per respondent group (A) and type of hospital (B).

Notes: This figure shows the average scores and standard deviation for change-related behavior divided by groups of respondents (A) and type of hospital (B). The individual score shows how respondents judged their own reaction to change, whereas the group score shows the score they gave to their clinical teaching team as a whole. The scores represent the 5 types of change-related behavior: 0–20 active resistance, 21–40 passive resistance, 41–60 compliance, 61–80 cooperation, 81–100 championing.

Discussion

In this study, we used a change management perspective to understand how clinical teaching teams deal with a curriculum change such as the introduction of CBME. By looking at the team’s “state” of readiness for change as well as their change-related behavior, insights into leadership roles, teamwork, shared commitment, perceived support, and behavioral reactions to change were gathered.

Results show that the program directors are clearly seen as the “doctor in the lead” of the educational change, both by their own judgment and by that of their colleagues. First, this is supported by high scores on the subscale “formal leader” throughout the entire sample. One of the core components of ORC is the belief that formal leaders are committed to the success of the change and take responsibility for it.Citation19 Previous research in PGME has shown that the implementation process is accelerated in the presence of good leaders who are seen as role models and as entrepreneurs and who are able to inspire their team.Citation18 Previous research on general innovations in healthcare service and organization has also shown that strong leadership may be especially helpful in encouraging clinical team members to break through convergent thinking and routines.Citation24 Second, program directors judge their own change-related behavior as significantly more supportive of the change than that of their other team members. However, according to the program directors, team members do tend to comply and therefore show commitment to change.Citation22 At the least, this gives the impression that program directors feel that they invest more effort than their colleagues. Whether they actually think this is appropriate or not cannot be determined by the present study.

Besides the role of the program director, the subscale “staff culture” was highly rated as well, which is reassuring for two reasons. First, team dynamics such as motivation, teamwork, and visionary staff are factors that contribute to successful change, as was previously shown in healthcare innovations.Citation24 More specifically, these factors combined with strong leadership increase the capacity to absorb knowledge, ie, to identify, interpret, and share new knowledge and subsequently link it to the team’s existing knowledge base in order to put it to appropriate use.Citation5,Citation24 Second, in the philosophy of CBME, teaching is not the responsibility of the program director alone, but rather of the entire clinical staff.Citation25 The results show that clinical staff members do feel and share a sense of responsibility for the improvement of training and that they work together as a team.

Based on the scores on the subscale “appropriateness,” which reflects the belief that a specific change is correct for the situation being addressed,Citation19 CBME indeed seems to meet the needs of PGME and is therefore accepted as a necessary and correct innovation. This is also supported by the fact that most team members at the least showed behavioral compliance or, in other words, were committed to this change. It is unclear whether this commitment is based merely on a desire to provide support or on a sense of obligation. However, scores representing actual resistant behavior, either passive or active, were rarely seen. In other words, respondents did comply and were supportive of the current curriculum change.

However, the lowest scores were found on the subscales “management support and leadership,” “project resources,” and “implementation plan.” These subscales represent components that are clearly recognizable as being related to change management. As was stated earlier, the knowledge and use of change management strategies are lacking in change processes in PGME.Citation2 Not surprisingly, these subscales affirm this shortcoming, which becomes evident in, for instance, the absence of descriptions of tasks and timelines, and the shortage of evaluation cycles, training facilities, and financial resources.

When looking at the differences between responses from academic medical centers and nonacademic teaching hospitals, the latter seem to have an advantage. Possible reasons could be differences in department size, in culture, and in the balance between education, patient care, and research. First, in nonacademic teaching hospitals, departments are usually smaller, which might lead to more efficient communication and decision-making processes.Citation18 It might also cause team members to feel a stronger sense of shared responsibility for implementing the proposed change,Citation18 thus promoting teamwork when implementing change. Earlier research looking at readiness for organizational change in healthcare has also shown that a socially supportive workplace may play an important role in the team members’ ability to cope with stress resulting from change.Citation26 This underscores the importance of shared responsibility and teamwork.

Second, the academic cultural environment tends to be more individualistic and competitive and rewards individual accomplishments.Citation27–Citation29 In addition, the primary focus tends to be on pursuing an active career in research rather than in medical education.Citation29–Citation32 Potentially, this could impede gaining sufficient support and shared effort to implement educational change.

In sum, clinical teaching teams appear to comply with the implementation of curriculum change if the proposed change is seen as a correct innovation. In that case, program directors receive and take the responsibility for the job that needs to be done, but they lack a fully equipped toolbox of change management principles to actually get that job done as efficiently as possible. Too little guidance from appropriate change models and implementation strategies slows down the implementation process, mainly because opportunities for advanced assessment and planning are missed.Citation18,Citation24

Strengths, limitations, and future research

Our findings extend the existing literature about implementation processes in medical education,Citation6,Citation18,Citation33 since this study was the first to explore implementation processes in PGME specifically from a change management perspective. The inclusion of 836 medical doctors from 39 hospitals allowed for a thorough assessment of change readiness and change-related behavior in this field. This sample size allowed us to assess differences between program directors, clinical staff members, and trainees. However, due to our method of recruitment, we were not informed of how many colleagues each program director had invited to participate. Another limitation is that both questionnaires used in this study were distributed in English, as we assumed that all participants’ comprehension of English would be sufficient to participate. We cannot rule out the possibility that some participants might have misunderstood the items due to a language barrier. Nevertheless, these effects can be expected to be minimal since our findings show clear trends and are in accordance with change management principles deduced from other fieldsCitation19 and healthcare settings.Citation24,Citation26

A more in-depth analysis of the implementation of curriculum change in PGME is justified to further explore the way these changes occur in clinical teaching teams, which will strengthen our understanding of these processes and improve implementation efforts in this field.

Conclusion

The present analysis of readiness for change in clinical teaching teams brought to light that program directors are clearly in the lead when it comes to the implementation of educational innovation. Clinical teaching team members tend to work together as a team, sharing responsibility in the implementation process. However, results also reinforce the need for change management support in change processes in PGME in order to enhance the efficiency of the process itself as well as to improve the chances for success.

Author contributions

LB and MJ participated in data collection, after which LB and JL performed the data analysis. LB drafted the manuscript. All authors contributed toward data analysis, drafting and revising the paper and agree to be accountable for all aspects of the work.

Acknowledgments

The authors wish to thank all the clinical teaching teams that participated for their input and the time they invested, and Lisette van Hulst for her editing assistance.

Disclosure

The authors report no conflicts of interest in this work.

References

1. Weiner BJ, Amick H, Lee SY. Conceptualization and measurement of organizational readiness for change: a review of the literature in health services research and other fields. Med Care Res Rev. 2008;65:379–436.
2. Bank L, Jippes M, van Luijk S, den Rooyen C, Scherpbier A, Scheele F. Specialty Training’s Organizational Readiness for curriculum Change (STORC): development of a questionnaire in a Delphi study. BMC Med Educ. 2015;15:127.
3. dictionary.cambridge.org [homepage on the Internet]. Cambridge: Cambridge University Press; 2017. Available from: https://dictionary.cambridge.org/dictionary/english/innovation. Accessed July 03, 2017.
4. businessdictionary.com [homepage on the Internet]. Fairfax: WebFinance Inc; 2017. Available from: http://www.businessdictionary.com/definition/innovation.html. Accessed July 03, 2017.
5. Williams I. Organizational readiness for innovation in health care: some lessons from the recent literature. Health Serv Manage Res. 2011;24:213–218.
6. Lillevang G, Bugge L, Beck H, Joost-Rethans J, Ringsted C. Evaluation of a national process of reforming curricula in postgraduate medical education. Med Teach. 2009;31:e260–e266.
7. Frank JR, Danoff D. The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teach. 2007;29:642–647.
8. Scheele F, Teunissen P, Van Luijk S. Introducing competency-based postgraduate medical education in the Netherlands. Med Teach. 2008;30:248–253.
9. Renting N, Dornan T, Gans RO, Borleffs JC, Cohen-Schotanus J, Jaarsma AD. What supervisors say in their feedback: construction of CanMEDS roles in workplace settings. Adv Health Sci Educ Theory Pract. 2016;21:375–387.
10. Carraccio C, Englander R, Van Melle E. Advancing competency-based medical education: a charter for clinician-educators. Acad Med. 2016;91:645–649.
11. Borleffs JC, Mourits MJ, Scheele F. CanMEDS 2015: nog betere dokters? [CanMEDS 2015: better doctors?]. Ned Tijdschr Geneeskd. 2016;160:D406. Dutch.
12. Schultz K, Griffiths J. Implementing competency-based medical education in a postgraduate family medicine residency training program: a stepwise approach, facilitating factors, and processes or steps that would have been helpful. Acad Med. 2016;91:685–689.
13. Chou S, Cole G, McLaughlin K, Lockyer J. CanMEDS evaluation in Canadian postgraduate training programmes: tools used and programme director satisfaction. Med Educ. 2008;42:879–886.
14. Scheele F, Van Luijk S, Mulder H. Is the modernisation of postgraduate medical training in the Netherlands successful? Views of the NVMO Special Interest Group on Postgraduate Medical Education. Med Teach. 2014;36:116–120.
15. Hassan IS, Kuriry H, Ansari LA. Competency-structured case discussion in the morning meeting: enhancing CanMEDS integration in daily practice. Adv Med Educ Pract. 2015;6:353–358.
16. Ringsted C, Hansen TL, Davis D, Scherpbier A. Are some of the challenging aspects of the CanMEDS roles valid outside Canada? Med Educ. 2006;40:807–815.
17. Zibrowski EM, Singh SI, Goldszmidt MA. The sum of the parts detracts from the intended whole: competencies and in-training assessments. Med Educ. 2009;43:741–748.
18. Jippes E, Van Luijk SJ, Pols J, Achterkamp MC, Brand PL, Van Engelen JM. Facilitators and barriers to a nationwide implementation of competency-based postgraduate medical curricula: a qualitative study. Med Teach. 2012;34:e589–e602.
19. Holt DT, Helfrich CD, Hall CG, Weiner BJ. Are you ready? How health professionals can comprehensively conceptualize readiness for change. J Gen Intern Med. 2010;25:50–55.
20. Kotter JP. Leading Change. Boston, MA: Harvard Business School Press; 1996.
21. Weiner BJ. A theory of organizational readiness for change. Implement Sci. 2009;4:67.
22. Herscovitch L, Meyer JP. Commitment to organizational change: extension of a three-component model. J Appl Psychol. 2002;87:474–487.
23. Leppink J. Data analysis in medical education research: a multilevel perspective. Perspect Med Educ. 2015;4:14–24.
24. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82:581–629.
25. Frank JR; Royal College of Physicians and Surgeons of Canada. The CanMEDS 2005 Physician Competency Framework: Better Standards, Better Physicians, Better Care. Ottawa, ON: Royal College of Physicians and Surgeons of Canada; 2005.
26. Cunningham CE, Woodward CA, Shannon HS. Readiness for organizational change: a longitudinal study of workplace, psychological and behavioural correlates. J Occup Organ Psychol. 2002;75:377–392.
27. Pololi L, Conrad P, Knight S, Carr P. A study of the relational aspects of the culture of academic medicine. Acad Med. 2009;84:106–114.
28. Krupat E, Pololi L, Schnell ER, Kern DE. Changing the culture of academic medicine: the C-Change learning action network and its impact at participating medical schools. Acad Med. 2013;88:1252–1258.
29. Lowenstein SR, Fernandez G, Crane LA. Medical school faculty discontent: prevalence and predictors of intent to leave academic careers. BMC Med Educ. 2007;7:37.
30. Pololi L, Kern DE, Carr P, Conrad P, Knight S. The culture of academic medicine: faculty perceptions of the lack of alignment between individual and institutional values. J Gen Intern Med. 2009;24:1289–1295.
31. Fairchild DG, Benjamin EM, Gifford DR, Huot SJ. Physician leadership: enhancing the career development of academic physician administrators and leaders. Acad Med. 2004;79:214–218.
32. Beasley BW, Simon SD, Wright SM. A time to be promoted. The Prospective Study of Promotion in Academia. J Gen Intern Med. 2006;21:123–129.
33. Jippes M, Driessen EW, Broers NJ, Majoor GD, Gijselaers WH, van der Vleuten CP. A medical school’s organizational readiness for curriculum change (MORC): development and validation of a questionnaire. Acad Med. 2013;88:1346–1356.
34. Bank L, Jippes M, Leppink J. Specialty training’s organizational readiness for curriculum change (STORC): validation of a questionnaire. Adv Med Educ Pract. 2017.