
Providing quality improvement workplace-based professional development to Australian general practice clinical educators: findings from a feasibility study

Pages 242-262 | Received 11 Apr 2022, Accepted 16 Jan 2023, Published online: 02 Feb 2023

ABSTRACT

In Australia, doctors undertaking advanced training to become general practitioners work under the supervision of clinical educators. Primarily clinicians, these clinical educators participate in one day of teaching-related professional development annually, generally a workshop. Shortcomings with this form of professional development led to the development of a quality improvement-focused intervention that is facilitated in the workplace and based on the clinical educator’s self-identified concerns. Drawing on teacher action research and feasibility study methodologies, we explored this novel form of professional development by asking ‘Can this intervention work?’ We employed an embedded multiple case study design, trialling the intervention in four settings. The intervention’s constituent activities were acceptable to participants, including a preparedness to experience discomfort for the purpose of improving their teaching practices. They perceived the intervention as convenient and the workload as manageable. Participants were motivated to experiment with new teaching behaviours, but measuring outcomes was challenging. Medical educators facilitating the intervention perceived their tasks as cognitively ‘hard work’. The findings support this workplace-based intervention as a feasible form of professional development, which should be implemented on a larger scale and evaluated for impact.

Introduction

As long-standing facilitators, recipients, and evaluators of professional development for general practice clinical educators in Australia, we had accumulated a list of ‘problems’ with how their professional development was being provided, the content, and its effect on participants’ teaching practice. Such ‘problems’, reflecting our interests and our concerns, are grist for action research (Mills Citation2018).

‘We’ are multifariously general practitioners (GPs), general practice clinical educators (GP-supervisors), medical educators (educators who teach clinical educators how to teach), and medical education researchers. GI and KA are experienced GPs, GP-supervisors, medical educators, and medical education researchers. TC is a social scientist who has worked in the general practice training environment for ten years in quality assurance, education, and research roles. We are all invested in the GP-supervisor role, described as the backbone of general practice training by the Royal Australian College of General Practitioners (Citation2021), and how best the role holders can be supported.

Our specific working context is the Australian General Practice Training (AGPT) programme, where registrars – qualified doctors – undertake advanced training in accredited training posts in order to become GPs. These trainees work under the supervision of clinical educators, the aforementioned GP-supervisors (hereafter supervisor(s)), who are experienced GPs, ‘on hand’ to observe, support, and teach. GPs gain accreditation as supervisors on the basis of their clinical expertise. Following an orientation programme to the supervisory role, they must participate in a minimum amount of teaching-related professional development each year, organised by the nine training organisations that are contracted by the Australian Department of Health to deliver the AGPT programme (Australian Government Department of Health Citation2022).

Supervisors typically have to attend one day of professional development each year, which is usually organised as a face-to-face workshop away from their workplaces. For small training posts, releasing a doctor to attend a one-day workshop may be problematic, depleting the clinical workforce, and for those in rural and remote settings, travel time may be impractical and unsafe. Where there is more than one supervisor in a training post, only one supervisor is required to attend the workshop, and the content is expected to ‘trickle down’ to colleagues. The ‘transfer of training’ problem from workshops to workplaces has long been recognised (Ford and Weissbein Citation1997) and the hoped-for flow of information to stretched colleagues is likely to be more of a drip than a trickle. The ideal of the self-directed learner that underpins this mode of professional development also overlooks that registrars are supported by a supervisory team. In failing to recognise that ‘it takes a village’ to educate a registrar, professional development workshops for supervisors marginalise important others, such as the allied health professionals who work in the clinic and are involved in registrars’ training.

A workshop’s content typically includes significant operational matters, rather than an exclusive focus on teaching practice. In the last decade, the focus on operational matters (e.g. changes in training policy) has been exacerbated by the constant reorganisation of the training programme, watering down the time that can be allocated to improving as an educator. As there is no national curriculum for supervisors that the training organisations can reference, the content is ‘ad hoc’ (Morgan et al. Citation2015), directed by well-intentioned medical educators who have limited information about supervisors’ diverse training needs. For long-standing supervisors, the reported repetitive nature of the educational content means that the principal value of attending workshops comes from interacting with peers.

Supervisors comment that they do not know how they are performing as educators. Feedback relating to what they are actually doing in the workplace is unlikely to come from a workshop. As for most teachers (Hattie Citation2009), clinical educators’ teaching practice is a relatively private undertaking. Although supervisors are reaccredited every three years, the process does not use observational data, meaning that what supervisors actually ‘do’ is seldom questioned. Although a small number of recent research studies have promoted the value of real-time data over self-reports (for example, Brown et al. Citation2018), this practice has not been picked up by the training organisations.

Registrars provide feedback about their supervisors to the training organisations, but as this tends to be given ‘in confidence’ it is of little utility to the supervisor. Supervisors can and do ask registrars for direct feedback, but this is typically viewed as having questionable validity due to the registrar’s employment relationship with the training post. Both the professional development and the accreditation processes are failing to give supervisors dependable performance information that they can use to evaluate their practice.

All these ‘problems’ are likely contributors to the reported variability in supervisors’ teaching practices (Farrell et al. Citation2013, Ingham et al. Citation2019). As medical educators we were motivated to address these problems, which we thought would require new forms of professional development. To this end we designed an intervention for high impact that would relieve external attendance pressures, negate the transfer of training problem, focus exclusively on educational content that is targeted at supervisors’ needs, provide information about actual performance, and embrace the possibility of including members of the supervisory team.

In brief, we designed an intervention based on a generic quality improvement (QI) cycle to be facilitated in a training post by a medical educator, which gave scope for the principal supervisor to include other members of the supervisory team. The intervention would take no more than seven hours (i.e. one day), but play out longitudinally (six to eight weeks). The QI focus – educational – would be identified by the supervisor and, when involved, members of the supervisory team. As data collection and outcome measures are integral to a QI cycle, participants were assured of receiving performance information that they could use to evaluate their practice.

Designing such an intervention brought our identities as medical educators to the fore, suggesting teacher action research as the means of studying our practice whilst we trialled the intervention (Mills Citation2018). We also thought that conceptualising the research as a feasibility study was pertinent, which is undertaken when one wants to examine whether a novel intervention can be accomplished (Tickle-Degnen Citation2013). Although we thought that our initial educational design had a sound rationale and drew on best-practice evidence from research on professional development, there was the possibility that supervisors might not find it acceptable. For example, given our earlier comment about the private nature of supervisors’ teaching practice, we were not certain that we would find participants willing to expose their practice to us or their colleagues.

A feasibility study has congruence with teacher action research, being small scale, and also endorsing a developmental learning process whereby the intervention can be adapted during the study. Drawing on both teacher action research and feasibility study methodology, we set out to explore this novel form of professional development for Australian supervisors. We used Orsmond and Cohn’s (Citation2015) guidance for conducting a feasibility study to shape the research questions, using a broad overarching question: ‘Can this intervention work?’ and five subsidiary questions:

  1. Can we recruit supervisors to participate in this form of professional development?

  2. Are the QI intervention and the embedded activities suitable and acceptable to participants?

  3. How appropriate are the data collection procedures and outcome measures to the medical educator and supervisory team participants and the purpose of the professional development intervention?

  4. Do the medical educators have the resources and ability to manage the QI intervention?

  5. Does the QI intervention show promise of being successful with the supervisors and medical educators?

Materials and methods

Context

The intervention was trialled in 2021 in the training footprint of Murray City Country Coast (MCCC) General Practice Training, which covers four-fifths of Victoria, the second smallest state in Australia (227,444 km²) and similar in size to the United Kingdom. MCCC, which is divided into four sub-regions, has 242 accredited training posts, 612 supervisors, and 568 registrars enrolled in the AGPT training programme (MCCC GP Training Citation2017). GI and TC were MCCC employees.

Approach, design, and philosophical assumptions

As we have signposted, we adopted a teacher action research approach in order to critically look at how we were implementing the intervention, understand the effects it was having on the participants, and develop our professional disposition as medical educators (Mills Citation2018). We framed the implementation of the intervention in a training post as a case, drawing on case study design (Yin Citation2014), and elected to implement the intervention in four training posts. This number of cases was influenced by the funding body’s 12-month timeline and our desire to recruit a training post in each of MCCC’s four sub-regions. We thought that four action research cycles would be sufficient to answer our overarching question, ‘Can this intervention work?’ In essence, the research is an embedded multiple case study design, with each intervention being the ‘unit of analysis’. The interventions were implemented sequentially, consistent with action research methodology, so that the learnings from one intervention could inform the next (Figure 1).

Figure 1. Research design – embedded multiple case study.


As educators and teacher researchers we hold on to an ontological realism and an epistemological constructivism (Maxwell Citation2012). Our intervention was designed to engender particular outcomes, and we acted in ways to produce particular effects. For example, we negotiated ground rules with participants in order to create a safe learning climate, which we believed would allow participants to interact with us in an open and authentic manner. At the same time, people actively construct knowledge and we cannot predict what they will learn from their experiences. It must also follow that our findings and related understandings are similarly our constructions.

Recruitment

An email describing the intervention was sent by MCCC’s Communications Manager to MCCC’s training posts and supervisors. Interested supervisors could contact either GI or TC to discuss opting-in or to have their queries answered. We hoped to employ a limited form of maximum variation sampling (e.g. urban and rural training posts; newer and long-standing supervisors) but were prepared to fall back on convenience sampling (i.e. anyone who volunteered) (Miles and Huberman Citation1994), as we had no idea whether we would receive any interest. The invitation was sent in March 2021, when Victoria had had a long run of days with no locally acquired cases of COVID-19, a situation that would not last.

We invited four of MCCC’s medical educators to participate in the project after we had been awarded the research grant. As well as being a response to a funding goal – that of developing MCCC’s research capacity – it was also an opportunity to incorporate additional medical education expertise and perspectives into the project. Medical educators received an email from MCCC’s Director of Education and Training about the project and were invited to reply with an ‘expression of interest’. These responses were forwarded to GI and TC. Other than to have a medical educator from each sub-region, the recruitment criteria were an interest in the intervention and a willingness to embrace the aforementioned action research goals. Thus, each case was facilitated by a different teacher research team (GI, TC, and another medical educator), with GI and TC participating in all four cases. KA, GI, and TC also met as a separate, longitudinal research team, able both to dig down into the details of a particular case and to take a ‘helicopter view’. The four medical educators met together twice, at an orientation meeting and a final case analysis synthesis meeting.

Incentives

Supervisors were paid for participating in the intervention at the same rate that they would have received for attending a workshop. Other training post participants attended in their contracted hours. We negotiated with MCCC that participating in the intervention would count as supervisors’ mandated professional development for the calendar year. Supervisors could choose to use the intervention to apply for Continuing Professional Development points from either of the Australian general practice Colleges. The medical educators were paid at their salaried rate.

The intervention

Table 1 expands the outline of the intervention given in the Introduction. A comprehensive description of the intervention, including details of our pedagogical approach, is provided in an open access guidebook that we have written for medical educators who wish to support supervisors through the process (Clement et al. Citation2022). Iterations of the guidebook were produced during the project’s action research cycles. Together with this paper, these two documents cover the requisite information for reporting an intervention outlined by Phillips et al. (Citation2016) and Hoffman et al. (Citation2014).

Table 1. Intervention stages.

The medical educator’s role is essentially to support the participants through the phases of a QI cycle, helping them to identify an educational problem that they want to explore, plan actions in response to the problem, conceive of measurements of the actions, and reflect on the outcomes of those actions. The scoping, planning, and review meetings are the major ‘events’ in the process. To kickstart the process, we developed a preliminary activity for the principal supervisor: a priming task to encourage the supervisor to reflect on their practice, the practice of the supervisory team, and the practice learning environment. The primer, available in the guidebook and consisting of 46 open-ended questions organised in three domains (Support, Education, and Team), was developed from tools used to measure clinical learning environments (for example, Shochet et al. Citation2015).

It should be noted that training posts are remunerated for providing a set number of teaching hours each week, which includes one hour of protected teaching and additional time to answer ad hoc calls for support. These hours are not part of the formal professional development time, but are integral to the educational design, for it is these teaching hours that allow supervisor participants to experiment with their practice and seek feedback from registrars.

Modes of delivery

Each case had a unique teacher research team (GI, TC, and a medical educator), with the medical educator assigned the lead facilitator role; they not only led the major events but also undertook any pre- and post-meeting liaison. With minor variations, TC and the medical educator attended all the major events, whilst GI attended all the scoping meetings, thereafter taking an ‘arm’s length’ position that involved reviewing the recordings of subsequent major events.

Our original intention was for some stages of the intervention to be facilitated in-person by the medical educator at the training post. Using a medical educator from the sub-region was intended to reduce the impost of travel time. The need for in-person visits would be determined on a case-by-case basis, influenced by the needs of the particular QI project and the quality of the relationships between the parties. In one instance, the medical educator had a pre-existing relationship with the principal supervisor. The trajectory of the pandemic in Victoria and the restrictions imposed by the State government required a pivot to online facilitation in the later cases.

Data collection and analysis

We collected data from multiple sources, which is consistent with both action research and case study methodology (Pine Citation2009, Yin Citation2014). Our data comprised audio- or video-recordings of research team meetings; reflective research journal entries; audio- or video-recordings of meetings with training post participants; fieldnotes arising from other interactions with the training post participants (e.g. telephone calls); artefacts created for and used in each intervention (e.g. written responses to the priming activity, surveys, draft versions of the guidebook); documents; and post-intervention interviews with participants.

We employed a variety of data analytic techniques, described by Miles and Huberman (Citation1994), Saldaña (Citation2013), and Yin (Citation2014), which were related to distinct phases of the project. We used event listing to record all the events related to a particular case in chronological order, which could be further demarcated into time periods (e.g. recruitment). This included both minor and major events (e.g. a telephone call and the scoping meeting), and events that only included the teacher research team (e.g. a case analysis meeting). After each major event with the training post we reviewed the audio- or video-recording and wrote individual reflective notes that summarised our impressions, issues, questions, and so on, about that meeting, which was closely followed by a case analysis meeting where we collectively discussed what we thought was happening in the case at that point in time. After we had withdrawn from a training post, we held a final case analysis meeting to agree on an overall account of the case, critique our performance, and plan for the next iteration of the intervention. We used memoing to expand upon ideas that had surfaced as brief notes or in meetings and to write about ideas that we had been struggling with as a means of clarifying them (e.g. our use of measures). This process of peer debriefing within each case was repeated within the longitudinal research team meetings, with KA adopting the important role of critical friend (Costa and Kallick Citation1993), challenging GI and TC about our interpretations of each case, changes to the educational design, our next steps, and so on.

We transcribed all the interviews but were selective in transcribing the other audio- or video-recordings (e.g. a quote that supported a topic), as we had already transformed that data into reflective notes and written case summaries. We chose an elemental approach to coding data, which we thought was appropriate for the study, as our aim was to chunk segments of text into a small number of broad topics that would provide a descriptive account of ‘what happened’. That is, the data did not need to be further reconfigured into a higher-level conceptual or theoretical abstraction in order to answer our research questions.

We conducted a rudimentary cross-case analysis using the techniques of charting, comparison, and triangulation; that is, organising the data, comparing data across the cases, and using the different data sources to crosscheck findings. This is primarily an inductive process that involves looking for patterns and insights in the entire data set (Yin Citation2014).

Ethics

Approval for the project to proceed was granted by Monash University Human Research Ethics Committee (Project Identification Number 26806). All participants provided written informed consent.

Findings

We begin the Findings section by highlighting some rhetorical choices that we have made in writing it. Our findings are presented primarily as a synthesis of the four cases, that is, a cross-case analysis (Yin Citation2014), which is organised in relation to the five subsidiary research questions. As we have merged teacher action research together with a feasibility study, the findings embrace these two strands; that is, what we have learned about studying our practice (e.g. the challenges of measurement) and the acceptability of the intervention to the participants. As a qualitative paper, we have also chosen to interlace description, analysis, and interpretation (Richardson and St. Pierre Citation2005). As word restrictions limit the amount of qualitative data that can be included to support our claims, we have, in general, only included one quote to illustrate key ideas.

Can we recruit GP-supervisors to participate in this form of PD?

We were able to recruit participants from one email advertisement without difficulty and hold training posts in reserve. This ran contrary to our experience: an email invitation has typically been only an opening gambit when recruiting supervisors to research projects. We consider not having to mount a stronger campaign, or make direct approaches to supervisors, to be a positive indicator that there is a ‘market’ for this form of professional development, although we learnt nothing about its true size.

As indicated in Table 1, recruitment to the intervention was through the principal supervisor and it is they who determine the QI focus, which in turn governs whether other training post employees are invited to participate. Table 2 provides demographic details of the six supervisor participants, from which it can be inferred that three of the supervisors elected to focus on an individual ‘problem’, whilst in Case 3 the principal supervisor invited her supervisor colleagues to participate in the intervention.

Table 2. Demographic details of the supervisor participants.

The participants were all experienced GPs (12 to 30 years) and supervisors (5 to 31 years), working in predominantly rural areas. Four had completed their medical training in Australia. None had a formal teaching qualification.

Registrars were involved in all four cases, as recipients of the supervisors’ teaching, but were more directly involved in Cases 1 and 3. TC interviewed a registrar twice in the former, whilst three registrars attended the review meeting with the three supervisors in Case 3.

Participants with the ‘right stuff’

In articulating ‘why’ they had put themselves forward for the professional development, participants spoke in dispositional terms about why the intervention might not appeal to everyone. Supervisors had to be willing to expose their practice to others, be prepared to assess their own practice and be critiqued by others, and be able to redress any unsettling after-effects, if ‘how’ they needed to teach required a rethink.

This supervisor underscores a point made in the Introduction, that clinical educators’ teaching practice is a relatively private undertaking, which one needs to be willing to de-privatise:

When you’ve been practicing twenty-five, thirty years and you’re used to not being under much surveillance and just do what you want to do, pretty much, you’re the little lord of your domain and not that answerable really. (C2/S2/180821)

(Quotations are tagged by a data identifier, with ‘C’, ‘S’, and ‘ME’ referring to a particular Case, Supervisor, or Medical Educator, followed by the date on which the data was collected.)

He goes on:

I was talking about it with another GP, she quite likes the camaraderie and bouncing-off ideas you get in the larger group. And she wasn’t as keen to do something like this. And maybe shyer ones, or ones who didn’t want to share their shortcomings … Yeah, not everyone’s cup of tea … but I think there’d be the odd person or two who’d really like it … If you’re really up for it and you don’t mind being closely watched and closely analysed, almost like you’re being videoed; I’d say it’s really incredibly valuable and worthwhile having a go at it. Yeah, it shakes you out of your comfort zone, I suppose, as a thing, quite challenging. (C2/S2/180821)

His response alludes to potential obstacles to recruitment and also foregrounds the experiences that participants must be prepared to put themselves through (discussed below), which may be further barriers. Speaking with hindsight he implies that he possessed the necessary qualities to engage in the requisite tasks.

Are the QI intervention and the embedded activities suitable and acceptable to participants?

Supervisors opted in to the professional development with preconceived ideas about what they were committing to. What looked acceptable at the start would either be upheld or challenged by direct experience. From a feasibility perspective the participants needed to affirm that the activities they undertook were acceptable to them. Table 3 summarises in a time-ordered matrix (Miles and Huberman Citation1994) ‘what happened’ in the phases of the QI cycles by case. The participants’ reported experiences of the cycles provide evidence for the suitability and acceptability of the intervention itself and its constituent parts.

Table 3. Outline of the four QI projects: questions, interventions, actions, evidence, and outcomes.

Convenience of work-based professional development

Our original conceptualisation of the intervention removed the burden of travel from GP-supervisors and placed it on medical educators. Participants viewed this as an agreeable arrangement.

It’s a really good option, it’s convenient, easy to do, and with the busy days that we have with general practice and everything, getting to do it in your own comfort zone where you work, as part of work, is really convenient … So, good of you to come up with this initiative. (C1/S1/060821)

As foreshadowed above, pandemic-related work restrictions meant that medical educators were not able to travel to the training posts in some cases, removing the impost of travel completely. The advantages and disadvantages of a purely online intervention and when this might be a reasonable option are discussed below.

Acceptable workload

Even without the responsibility of supervising registrars, GPs are time poor (Pearce et al. Citation2007). Participants highlighted that the seven hours of professional development was being squeezed into a crowded schedule, but overall, the impost was manageable.

Everything is adding to what is already, you know, everything else is just that one drop into a bucket, which is already full, so it’s never easy. But it wasn’t that much of a burden. I didn’t feel it so, because of how you structured it … That’s what we would be doing if we went to a workshop anyway. And the advantage was, there was no travel time. (C1/S1/060821)

Although we advocated that the supervisors should carve out protected space for the major events, in order to create the best learning environment, most tried to fit the meetings into the workday, typically over lunchtime. Although this did not interrupt the day-to-day business of the clinic, and it worked for these supervisors, there may be consequences for the learning climate and the participants’ well-being.

A timeline that sustains momentum

In contrast to a one-day workshop, the seven hours of the intervention play out over weeks. Entering the first action research cycle we had anticipated a six- to eight-week timeline from the supervisor receiving the priming activity to the evaluation. The average timeline was nine weeks across the four iterations (range 5 to 12 weeks).

The timeline needs to realistically reflect the nature of the specific QI ‘problem’ and the actions that the participants identify as needing to be taken in relation to it. Participants made the important point that a timeline has a ‘sweet spot’, where they can keep the project in focus, neither feeling overburdened by a compressed timeline, nor losing sight of the project if the timeline is overlong.

I think it’s probably good if it’s not too drawn out. I don’t think you’d want it any longer, maybe like eight weeks or something. I think you really want to maintain the focus on it. And yes, I don’t think it’s useful to just give people lots of time to get it done. I think it’s probably you know, ‘This is a project we’re doing now, and we’ll keep the momentum happening’. We’re teaching the registrars every week. So, there’s always a time when we can fit in. (C3/S3/281021)

In our cases, the timelines also accommodated participant and registrar leave and adjusted to the ebb and flow of the COVID-19 pandemic. The specific focus of Case 4 – feedback conversations in relation to adverse incidents, which are thankfully infrequent – meant that the ‘taking action’ phase would have benefitted from playing out over a longer period, but by this time we were experiencing our own time pressures to finish the project. This point was picked up by the supervisor:

I think coming to the intervention part so late meant I didn’t get a lot of, didn’t get many opportunities to be able to practice it before coming back to the outcome interview. (C4/S6/221121)

Supervisors’ genuine concerns

Effective professional development should stay close to the questions that supervisors ask about their own teaching, that is, their genuine concerns (Lord Citation1994). Focusing on self-identified QI ‘problems’ that were grounded in their day-to-day experiences was flagged as a welcome characteristic of the intervention.

It directly addresses an issue you’re having as an individual rather than a general populace curriculum … and it makes it more appliable … and there’s not a lot of time we spend one-on-one with anybody that’s about how we do things, it’s usually us giving our energy one-to-one to other people. (C4/S6/221121)

Discomfort for the purpose of learning

Participants experienced discomfort of the sort that they suggested may deter some supervisors from opting into the intervention. Their reflections suggested that the level of discomfort they experienced was acceptable and that an initial preparedness to experience discomfort was weighed against their motivations to improve as educators (Molloy and Bearman Citation2019).

If you ask my wife, she would say, ‘You were quite stressed about it. You’d wake up in the night thinking about it’, and a whole lot of stuff like that. But I don’t think it was, I mean, it wasn’t that negative experience or anything. And it doesn’t do any harm to, how do you say, rattle your cage, or shake you out of your comfort zone, because you do get a bit complacent over the years and bit set in your ways. So, I don’t think it was any bad thing at all, really. (C2/S2/180821)

A further consideration was whether participants were only exposing their practice to the facilitating medical educators or were sharing it more widely with members of the supervisory team, as in Case 3. There are likely to be individual differences as to whether people find exposing their practice to a medical educator or a peer more or less daunting.

Acceptance of online delivery

Our original educational design included provision for the medical educators to spend some of their time at the training post. If they had little or no familiarity with the locale, the building, or its personnel, we thought that face-to-face meetings could provide important sensitising context and accelerate the development of rapport. So as not to overburden the medical educators, it was always expected that some administrative tasks or short follow-up meetings could be completed by telephone, email, or videoconferencing.

The COVID-19 pandemic and the tightening restrictions in Victoria meant that face-to-face visits were increasingly replaced by videoconferencing as the year progressed. The first case was completed as envisaged, the second as a hybrid, and the final two cases were completed entirely by videoconference. The participants stated that the online mode of delivery was acceptable, whilst at the same time acknowledging potential advantages of meeting in-person in some circumstances.

I think it’s fine by the videoconferencing, actually … I think it worked; it worked well. Maybe that meeting when we got together with the registrars; the last one, all together, at the end. That might have been quite nice if we’d all been able to be together in the same room. (C3/S4/281021)

We still think that our original rationale for meeting unacquainted participants and visiting an unfamiliar training post in person has merit. We gained useful insights about local geography and the physical settings through our visits (e.g. the proximity of the registrar’s consulting room to the supervisors’) and valued the in-person contact. In-person visits may not be necessary if the process is repeated on another occasion, and in the context of a training programme, medical educators and supervisors may already have an existing relationship. In Case 4, which was conducted entirely online, the medical educator had known the supervisor since she was a registrar and had met her at professional development workshops. Commenting on that relationship, she suggested that the rapport might have been missing if they had been unknown to one another.

It’s a level of comfort and familiarity … I think that familiarity with [ME4] adds to the flow through into it being one-on-one therapy. I don’t think you would get that if you were meeting the person for the first time. You might, but you’re less likely to. (C4/S6/221121)

Acceptance of responsibility to self and others

The educational design circumvents the transfer of training problem that we highlighted in the Introduction. In addition to the aforementioned characteristics that facilitate this (e.g. workplace-based learning, self-identified questions), the participants identified and accepted the accountability that is built into the process.

I’d like to say that because we’re focusing on important practical skills it’s good, and most importantly that you’re giving us a bit of homework [meaning actions that they have to undertake] and then organising follow-up, so that’s going to make sure that we, or hopefully make sure that we keep it at the forefront of our minds and keep polishing our skills. (C3/S4/030921)

For example, supervisors were asked to complete the priming activity by a certain date, undertake particular actions, and report back at the review meeting. Clarity of purpose, what has been agreed, and so on, are emphasised in the guidebook for medical educators and were enacted in the four cases.

How appropriate are the data collection procedures and outcome measures to the medical educator and supervisory team participants and the purpose of the professional development intervention?

As the main goal of a feasibility study is to assess whether a new intervention can work, it is not expected to provide a rigorous examination of outcomes (Orsmond and Cohn Citation2015). Yet, as a QI cycle is at the heart of the intervention and definitions of QI emphasise data-driven activities that show whether there have been improvements in particular settings (Foster Citation2013, Merrill Citation2015), outcomes are an integral part of the intervention and were a regular focus in our research team meetings. Indeed, being able to demonstrate an improvement in processes, practices, or other outcomes is likely to be an important factor in deciding whether to scale out the intervention, and for individual participants being able to perceive improvements might be important in judging whether this form of professional development is a worthwhile use of their time.

For each case, we summarise the changes in practice (QI actions) that were intended to lead to the hoped-for improvements, the related data that were collected (evidence), and the reported outcomes. In this context, we are using outcome to refer to the specific short-term effects of the quality improvement actions. As each intervention was unique, the actions and evidence were conceived afresh in each case. A variety of data were collected across the four cases as outcome evidence, in the form of surveys, medical records, interviews, debriefing conversations, self-reports, and direct requests for feedback from learners.

As we foreshadowed above, what constituted ‘appropriate’ evidence was an ongoing discussion amongst the research team, where ‘appropriate’ had dual meanings: firstly, that the data were a credible indicator of improvement, and secondly, that participants endorsed the data collection plan as manageable. In a small-scale intervention, it was necessary to balance the need to collect meaningful data against the risk of participants perceiving the task as burdensome. Donohoo’s (Citation2013) point that QI is not research was a valuable reminder to set limits on the requisite data, prompting us to think in terms of collecting evidence that the QI actions worked, rather than trying to prove beyond doubt that they worked.

We came to the view that determining whether changes in practices resulted in improvements is a ‘big ask’ in a tightly constrained intervention for a number of reasons. Firstly, it can be difficult to measure the outcomes of complex educational practices and show unequivocally that they resulted in changed knowledge, new skills, or an improved learning environment (Cook and West Citation2013). Secondly, we came to see it as naïve to expect that the agreed actions will have an immediate positive impact, because the clear benefits may take time to manifest themselves. A supervisor must learn to use new teaching and learning strategies effectively and a registrar may find any change of approach alien at first. Therefore, the supervisor may have to persist with the agreed actions for weeks or months to develop expertise and see the hoped-for improvements. Thirdly, we believe that teaching is not characterised by ‘right’ answers. Teaching has an artistry to it, and as it is not possible to prepare for every eventuality in advance, supervisors are required to be flexible and respond in-the-moment (Loughran Citation2006). So, activities, procedures, and strategies that worked today may not work tomorrow.

We came to see changes in supervisors’ behaviours, reported both by the supervisors themselves and by other stakeholders, as the important outcome, as this indicated that they were enacting new approaches to teaching and learning; that is, experimenting with their teaching practices. These actions generated discussions in the review meetings about whether the approaches worked or not, indicating that people were inquiring into their practice. This more process-oriented approach, in which it is impossible to separate out the participants, the process, and the outcomes, conflicts with the post-positivist assumptions that underpin dominant views of QI, which favour quantifying performance and measuring outcomes as objectively and independently as possible (Foster Citation2013, Dennhardt et al. Citation2016).

In Case 2, for example, the supervisor’s reflections reveal much more about the effect of his actions than his rating of four out of five on a survey item that we had created to assess his confidence in assessing the registrar’s ‘patient safety’.

It really influenced my behaviour. It made me think a heck of a lot more about what I was teaching and how much preparation I was doing, and about my relationship with the registrar, how closely I was following his progress, and looking over his shoulder, and a whole lot of stuff like that. And it’s probably had some ongoing effect. I think I have more rapport with [the registrar] and we work more closely and better together as a result. The tutes [tutorials] I’ve done since, they’re more structured and you get to the point better with things. And [the registrar] seems really engaged with them … And [he] prepared a bit too, he identified cases and everything. So, he put in a bit more background work for it, too. So, it was really good, actually … it has been valuable and useful. (C2/S2/180821)

Selecting outcome measures for any intervention is taxing (Coster Citation2013) and there is undoubtedly more to be learned about the development and use of appropriate outcomes in this particular intervention. As the same supervisor remarked, referring to the post-intervention survey:

That series of questions … Yeah, they were very relevant and good talking points and good things to think about. As to how well we achieved on any of them, I couldn’t break it down into which ones were really good or anything like that … It is damn hard to measure and assess, I reckon … Not really nice firm end points are there. (C2/S2/180821)

It is likely that, when it comes to ‘proof’, medical doctors feel more comfortable with the conventional metrics of the post-positivist paradigm, such as the aforementioned quantification of performance and objective measures, than with the ‘soft’ data of their reflections. Yet we claim, in relation to the four cases, that the data collection plans were suitable for both the time-poor supervisors and the different QI ‘problems’. The comments about the overall acceptability of the workload in the previous section suggest that the amount of data collection was apposite and that the data are sensitive to the effects of the intervention, at least as we frame it: we consider an intervention a success if individual supervisors or the supervisory team are trying out new ways of teaching and looking for evidence about the consequences of those changed practices. In addition to being sensitive to the effects of the intervention, the processes of specifying actions to take, gathering related evidence, and engaging in a reflection-on-action conversation provided the participants with information that they could use to appraise their performance.

Do the medical educators have the resources and ability to manage the QI intervention?

In this section, we focus on ourselves as a key resource in implementing the intervention, reserving our thoughts about requisite financial resources until the Discussion. We claim that how we enacted our medical educator role was critical to achieving successful outcomes.

The previous sections imply that we were able to manage the more technical aspects of the intervention, such as the administrative tasks, managing the timeline, using videoconferencing proficiently, and so on. As designers of the intervention, we believed that it needed to be facilitated by educators with particular knowledge, skills and attitudes, which we hoped we possessed, in order to enact its important processual aspects. Our pedagogical approach, explicated in the accompanying guidebook (Clement et al. Citation2022), draws on ‘inquiry learning’ (Golding Citation2013), which requires medical educators who are comfortable with an open-ended approach, who can articulate the stance to supervisor participants, are confident in their ability to respond capably ‘in the moment’, and, due to the intervention’s bespoke character, can undertake independent research to inform each particular QI activity. A key part of the action research project was inquiring into the manner in which ‘we’ were facilitating the intervention. This ‘we’ includes the four medical educators who were recruited to work with us, who represent the ‘type’ of medical educator that might facilitate the intervention in the future.

Table 4 provides some demographic details for GI and the four medical educators, all of whom are Australian trained, with a range of general practice experience (2–43 years), supervisor experience (0–28 years), and medical education experience (4–25 years).

Table 4. Demographic details of the medical educator participants.

In our context, medical educators are a diverse group which, if the label is used broadly, could include supervisors. Both are generally GPs who teach. (There are some medical educators who are not GPs, but they are the exception. There are also some medical educators who are GPs but not supervisors.) However, at the risk of over-generalising, medical educators can be distinguished from supervisors: both have strong subject knowledge and skills, but medical educators have a heightened interest in education, more general pedagogical knowledge, and a willingness to become involved in a wider range of educational activities. Whereas supervisors might consider each other ‘near-peers’, we position medical educators as ‘far-peers’, whom we believe are more likely to push supervisors to stretch their thinking (see Lord Citation1994).

In addition, most of MCCC’s (approximately) 100 medical educators are involved in delivering the general practice curriculum to registrars (e.g. delivering online or classroom-based education for registrars, managing remediation programmes), while a much smaller number are involved in the professional development of their colleagues. This sub-set of medical educators is likely to have an even more heightened interest in education, although few have a relevant award-bearing qualification; in comparison to teachers, they are less likely to consider themselves teaching professionals. Thus, the finite pool of medical educators who are ‘willing’ and able to faithfully implement the steps in the QI cycle and embrace the process – implementation fidelity (Patton Citation2008) – is a consideration for successfully ‘scaling out’ the intervention.

The following medical educator’s reflection underscores why many medical educators do not want to teach supervisors, naming concerns about credibility and a lack of confidence in their skill set. It also reinforces earlier points about needing to be comfortable with the challenges of the pedagogical approach: a preparedness to expose one’s practice and embrace any feelings of vulnerability; confidence in articulating pedagogical reasoning; and an ability to follow the inquiry where it leads.

‘I think that a lot of the skills and knowledge [that medical educators have] are there, it’s just applying it in that different context, and along with that is a greater sense of vulnerability. So, to expand upon that, I think that one of the reasons that not many medical educators necessarily work with supervisors is that sense of, ‘Okay, I need to be a supervisor, and if I’m not, I don’t know what I’m talking about, or they’re not going to accept what I’m saying’. It’s surprising to hear that from some very accomplished medical educators who aren’t supervisors, ‘I can’t do that’. I think this [intervention] probably ramps that up further, and then the fact that we’ve made that deliberate decision to unpack what I was doing, to use that to build the relationship and as a teaching opportunity. Once we’re at that meta-level, not every medical educator that I know would necessarily feel comfortable with that. ‘Just give me a nice simple brief, a topic I can teach. That way I know the content well. I can deal with the questions that are going to come up’. Here, the sheer breadth and depth that it could potentially go to, and that confidence, being comfortable enough to just roll with that and go, ‘It’s not about the destination, it’s about the journey, and that’s actually okay’. I think that that’s a mindset thing as much as anything’. (C3/ME3/071021)

The inclusion of the four medical educators gave rise to a ‘train the trainer’ approach; that is, using our expertise to prepare potential medical educators to guide the intervention independently. A different medical educator took the lead facilitator role in each case, with GI scaffolding their learning by varying his direct involvement in the major events according to the medical educator’s experience. For example, ME1, a relatively new GP who is not a supervisor, received the most support.

Our teacher research team meetings functioned like a teacher educator professional community (Hadar and Brody Citation2021), deepening our understanding of the intervention, providing support, and affirming the vision of what we were trying to achieve. We worked collaboratively to consistently enact the pedagogical approach that underpins the intervention, which was occasionally challenged by tensions that arose from participants’ different approaches to teaching and learning. The supervisor in Case 1, for example, initially expressed a view of teaching that was skewed towards the transmission of information. In her final interview she contrasted her initial expectations about what would be delivered with what happened:

I expected a bit more of analysis of what was right and what was not. I thought that there would be a lot more evaluation, or a lot more guidance in how to do this better … I actually thought that you would say that ‘This is what you have to do’ [laughs]. And so, when that didn’t happen, I was, I felt a bit short of what I wanted out of it. But like I said, the discussions that we had, the understanding that I got off what others do, and a couple of pointers that I got from the last meeting was really good. (C1/S1/060821)

Holding the pedagogical line – not telling this supervisor what to do – was made easier by being part of a teacher research team. We countered competing views of teaching and learning by sharing our pedagogical approach with supervisors. Indeed, this was an important part of the intervention, so that they could understand its underpinning ideas (Loughran Citation2014). Teacher research team meetings were an important forum for discussing the challenges we felt as individual educators. Medical educators may have to work doubly hard to counter an intuitive ‘fix the problem’ mindset that is an important part of their work as doctors, especially when learners appear to be struggling (Loughran Citation2006).

We agreed that facilitating the intervention was cognitively ‘hard work’ as this medical educator’s post-meeting reaction suggests:

It feels really tricky to navigate, just the process and content of what we are trying to do … I felt, not on edge, but constantly switched-on during it. I’m engaged in whatever I’m doing, but some things you can almost do on autopilot. This clearly isn’t one of those things. (C3/ME3/01121)

Taken as a whole, the findings imply that, collectively, we had the personal resources to enact the pedagogical approach and to create the learning climate that was critical in enabling the reported outcomes. This required medical educators with the right knowledge and skills, probably better than average pedagogical knowledge, who embraced the pedagogical approach and its inherent challenges.

‘Sitting next to Nellie’ emerged as a good approach to guiding medical educators through the intervention and building confidence to implement it independently. All four MEs reported that they would be confident to do this in the future. Yet, we all identified the value of being partners in learning, of having a colleague ‘in the room’ for part of the QI cycle, particularly the scoping meeting, and being able to talk through a particular intervention with medical educator colleagues. Some form of medical educator support should be part of any scaling out, so that the medical educator’s practice does not become ‘private’, a recipe for isolation and a drift away from core processes.

Does the QI intervention show promise of being successful with the supervisors and medical educators?

In a feasibility study, the participants’ appraisal of the intervention is crucial to informing any decision to scale out. We suggest that the earlier more targeted quotations about particular aspects of the intervention indicate that it is acceptable to a sub-population of supervisors. Each of the four principal supervisors affirmed that the intervention has the potential to be a successful professional development offering in the future, to both novice and experienced supervisors.

I think it’s a really useful exercise. I would recommend it for sure … .the whole process, it was good. It just made me stop and make that space to look at things a bit more, and I’d like to do more of that. (C3/S3/281021)

For an intervention to show promise it also needs to show positive changes in the outcomes of interest. We flagged in the Introduction that our intent was to design an intervention for high impact. Although we highlighted the selection of outcome measures as challenging, we claimed that the data collected were sensitive to the effects of the intervention: participants were trying out new ways of teaching, looking for evidence about the consequences of changes made, and reflecting on their practice. We have provided evidence to support these claims.

In all cases, we stressed the value of the supervisors genuinely inviting the registrars to give feedback about their performance. We were generally reliant on supervisors to relay the content of these conversations to us, except in Case 3, where three registrars were invited to the review meeting. Registrars were able to confirm that supervisors changed their teaching practices and to comment on how they experienced these changes. The quotation that follows comes from an interview conducted to validate the supervisor’s narrative; it affirms that the supervisor made multiple changes to her practice that were perceived positively, providing further evidence of the intervention’s positive impact.

[The protected teaching session] was completely different to the previous one and multiple previous … She asked me what topic I would like to cover … and then she said that it had been suggested to do some more case-by-case discussion. So, if we could both bring some cases about the particular topic to the meeting … I was also able to do a little bit of pre-reading … so I was probably in a better position to get more out of it because I knew what I was coming in for … I also had specific patients in mind to be able to ask more questions, which I found really helpful … The interactive case-based discussion I find a lot more helpful because I can actually ask the specific questions. So, it was a lot more interactive … that’s always a good thing, being more engaged, rather than it just being that didactic learning … you’re just going to get more out of it if you’re actually engaged in the conversation rather than just sitting and listening … She also brought some cases to the table as well, she was, ‘I did X Y Z, but this arose and this is what I did, and this was the end result’, and so that expert knowledge instead of a textbook … I couldn’t tell you exactly how long it went for, but it certainly didn’t feel rushed … I prefer that sort of conversational case discussion … I found it really helpful and a lot better than previously, which I did tell [S1]. (C1/R1/020721)

Discussion and conclusion

We begin this final section by returning to our overarching question, ‘Can this intervention work?’, which we answer by summarising the core messages about its suitability and acceptability.

We learnt that some supervisors wanted to participate in this form of professional development and were prepared to experience discomfort for the purpose of improving their teaching practices, and that it was possible to include additional members of the supervisory team. As Grossman (Citation1990) points out, one case demonstrates that something is achievable. Participants thought that both the intervention as a whole, based on the authentic questions that they had about their practices, and the intervention’s constituent activities were acceptable to them. They perceived the intervention as convenient, something they could fit in with their daily activities. They found the workload manageable, suggested that it should be kept within a demarcated timeline, and thought that some facilitation activities could be completed remotely. The intervention supported the supervisor participants to experiment with new teaching behaviours, but measuring outcomes was challenging. For medical educators, facilitating the intervention was ‘hard work’, which required the application of specific knowledge, skills, and attitudes that were perceived as critical to creating the requisite learning climate. In short, medical educators’ and supervisor participants’ appraisals of the intervention imply that it can work; that is, it is a feasible form of professional development for supervisors and additional members of the supervisory team, and thus should be implemented on a larger scale and evaluated for impact (Sandars et al. Citation2021).

A larger study would also allow both new and unresolved questions to be investigated. More work needs to be done to establish the intervention’s ‘reach’, given that some potential participants might perceive it as overly challenging and its activities unacceptable. Although Case 3 showed that it was possible to include additional members of the supervisory team, that is, to move beyond the principal supervisor, we need to know whether it is possible to engage members of the broader supervisory team (i.e. not just supervisors). Responses to the priming activity did indicate team-based concerns, but they were not prioritised by the participants in this study. Further work also needs to be done to investigate the impact of in-person and remote modes of delivery, to conceptualise outcome measures that are meaningful without being burdensome, and to establish how to optimise the in-practice learning climate so that participants carve out dedicated time rather than squeezing meetings into the workday. A new, larger study could also investigate the experiences of single medical educators implementing the intervention: their characteristics, how best to orientate them to the intervention and support them over time, and how to maintain ‘fidelity of implementation’ in the long term.

A ‘knee-jerk’ reaction may be to dismiss this form of professional development as prohibitively expensive. ‘Cost’ will obviously be a factor in any decision to scale out the intervention. Although we were successful in keeping supervisors to their seven hours of professional development time, quantifying medical educator time was confounded by our team-based action research context, where every major event was followed by long teacher research team meetings to discuss the case, reflect on our performance and the educational design, and plan the next steps. Less time will be required by a solitary medical educator working through the same steps, albeit without the productive exchange of ideas with medical educator colleagues that we experienced.

Medical educator costs are, of course, only one outlay. The intervention saves on the significant travel, accommodation, food, and room-hire expenditure incurred by the current programme of workshops. In addition, QI projects that engage additional members of the supervisory team are likely to generate more ‘bang for buck’, and medical educators are likely to develop efficiencies over time. Our experience was that each case required a bespoke intervention, but over time medical educators will develop a bank of resources that can be drawn on and tailored to new contexts. Probably more important are the weaknesses of the extant professional development programmes that we listed in the Introduction, reflecting the ineffectual ‘spray on’ approach to professional development critiqued by Mockler (Citation2005). At present, we think that the bulk of the supervisor professional development budget is being spent on activities that are having little impact on the teaching environments where the future general practice workforce are learning their craft. Our intervention was designed to be high impact and to overcome these problems. However, comparing the efficacy of our QI intervention with an alternative programme that has similar goals would require a methodologically challenging cost-effectiveness analysis (Levin Citation2005b), while converting professional development outcomes into monetary estimates would require a cost-benefit analysis (Levin Citation2005a).

Riddell and Moore’s (Citation2015) distinction between scaling out, up, and deep is helpful in framing some final considerations about the systemic changes that are likely to be necessary to support this form of professional development in the future. ‘Scaling out’ simply requires the intervention to be ‘replicated’ in larger numbers, which is the minimum necessary to undertake the suggested impact study. ‘Scaling deep’ requires the transformation of people’s values and cultural practices. For some supervisors this may mean a commitment to de-privatise their teaching practices and to embrace discomfort for the purpose of learning. For some medical educators this may mean targeted professional development to develop the requisite knowledge and skills, and a preparedness to move away from modes of delivery that they feel comfortable with. ‘Scaling up’ refers to institutional changes at the level of policy, rules, and laws, which are likely to support durable cultural change. This is likely to require the two Australian general practice Colleges to endorse, promote, and fund the intervention. We are not suggesting that the intervention reported in this paper is ‘the answer’ to supervisor professional development, but we believe that our findings support the inclusion of practice-based quality improvement professional development amongst the range of professional development activities that supervisors can choose to participate in, and that the intervention is worthy of further support and investigation.


Acknowledgments

We would like to thank Dr Ashley Hayes, Dr Denise Ruth, Dr Kayty Plastow, and Dr Wendy Connor; the medical educators who took this journey with us and contributed so much. We would also like to thank the supervisors who embraced the intervention and provided valuable feedback.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Supplementary material

Supplemental data for this article can be accessed online at https://doi.org/10.1080/19415257.2023.2174162

Additional information

Funding

This work was supported by the Royal Australian College of General Practitioners through an Education Research Grant (ERG2021-10) with funding from the Australian General Practice Training Program: An Australian Government initiative.

References

  • Australian Government Department of Health, 2022. About the Australian General Practice Training (AGPT) Program. Australian Government Department of Health. Available from: https://www.health.gov.au/initiatives-and-programs/australian-general-practice-training-agpt-program/about-the-australian-general-practice-training-agpt-program [Accessed 7 February 2022].
  • Bearman, M., Eppich, W., and Nestel, D., 2019. How debriefing can inform feedback: practices that make a difference. In: M. Henderson, et al., eds. The impact of feedback in higher education: improving assessment outcomes for learners. London: Palgrave Macmillan, 165–188.
  • Brown, J., et al., 2018. The supervisory encounter and the senior GP trainee: managing for, through and with. Medical Education, 52 (2), 192–205. doi:10.1111/medu.13468
  • Clement, T., Ingham, G., and Anderson, K., 2022. Taking GP supervisor professional development to teaching practices: a guidebook for medical educators. MCCC GP Training. Available from: http://hdl.handle.net/11343/297246 [Accessed 2 February 2022].
  • Cook, D.A. and West, C.P., 2013. Reconsidering the focus on “outcomes research” in medical education: a cautionary note. Academic Medicine, 88 (2), 162–167. doi:10.1097/ACM.0b013e31827c3d78
  • Costa, A.L. and Kallick, B., 1993. Through the lens of a critical friend. Educational Leadership, 51, 49–51.
  • Coster, W.J., 2013. Making the best match: selecting outcome measures for clinical trials and outcome studies. American Journal of Occupational Therapy, 67 (2), 162–170. doi:10.5014/ajot.2013.006015
  • Dennhardt, S., et al., 2016. Rethinking research in the medical humanities: a scoping review and narrative synthesis of quantitative outcome studies. Medical Education, 50 (3), 285–299. doi:10.1111/medu.12812
  • Donohoo, J., 2013. Collaborative inquiry for educators: a facilitator’s guide to school improvement. Thousand Oaks, CA: Corwin.
  • Farrell, E., et al., 2013. In-practice teaching resource. Melbourne: General Practice Registrars Australia.
  • Ford, J.K. and Weissbein, D.A., 1997. Transfer of training: an updated review and analysis. Performance Improvement Quarterly, 10 (2), 22–41. doi:10.1111/j.1937-8327.1997.tb00047.x
  • Foster, J., 2013. Differentiating quality improvement and research activities. Clinical Nurse Specialist, 27 (1), 10–13. doi:10.1097/NUR.0b013e3182776db5
  • Golding, C., 2013. The teacher as guide: a conception of the inquiry teacher. Educational Philosophy and Theory, 45 (1), 91–110. doi:10.1080/00131857.2012.715387
  • Grossman, P.L., 1990. The making of a teacher: teacher knowledge and teacher education. New York: Teachers College Press.
  • Hadar, L.L. and Brody, D.L., 2021. Interrogating the role of facilitators in promoting learning in teacher educators’ professional communities. Professional Development in Education, 47 (4), 599–612. doi:10.1080/19415257.2020.1839782
  • Hattie, J., 2009. Visible learning: a synthesis of over 800 meta-analyses relating to achievement. Abingdon: Routledge.
  • Hoffmann, T.C., et al., 2014. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ (Clinical research ed.), 348, g1687. doi:10.1136/bmj.g1687
  • Ingham, G., et al., 2019. Tell me if there is a problem: safety in early general practice training. Education for Primary Care, 1–8. doi:10.1080/14739879.2019.1610078
  • Levin, H.M., 2005a. Cost-benefit analysis. In: S. Mathison, ed. Encyclopedia of Evaluation. Thousand Oaks, California: Sage, 87–90.
  • Levin, H.M., 2005b. Cost effectiveness. In: S. Mathison, ed. Encyclopedia of Evaluation. Thousand Oaks, California: Sage, 90.
  • Lord, B., 1994. Teachers’ professional development: critical colleagueship and the role of professional communities. In: N.K. Cobb, ed. The future of education: perspectives on National Standards in America. New York, NY: College Entrance Examination Board, 175–204.
  • Loughran, J., 2006. Developing a pedagogy of teacher education: understanding teaching and learning about teaching. Abingdon: Routledge.
  • Loughran, J., 2014. Professionally developing as a teacher educator. Journal of Teacher Education, 65 (4), 271–283. doi:10.1177/0022487114533386
  • Maxwell, J.A., 2012. A realist approach for qualitative research. Thousand Oaks, CA: Sage.
  • MCCC GP Training, 2017. Growing the next generation of GPs for our diverse communities: checkpoint 2017. Warrnambool: Murray City Country Coast GP Training.
  • Merrill, K.C., 2015. Is this quality improvement or research? American Nurse, 10 (4), 1–3.
  • Miles, M.B. and Huberman, A.M., 1994. Qualitative data analysis: an expanded sourcebook. 2nd ed. Thousand Oaks, CA: Sage Publications.
  • Mills, G.E., 2018. Action research: a guide for the teacher researcher. 6th ed. New York, NY: Pearson.
  • Mockler, N., 2005. Trans/forming teachers: new professional learning and transformative teacher professionalism. Journal of In-Service Education, 31 (4), 733–746. doi:10.1080/13674580500200380
  • Molloy, E. and Bearman, M., 2019. Embracing the tension between vulnerability and credibility: ‘intellectual candour’ in health professions education. Medical Education, 53 (1), 32–41. doi:10.1111/medu.13649
  • Molloy, E., Borrell-Carrió, F., and Epstein, R., 2013. The impact of emotions in feedback. In: D. Boud and E. Molloy, eds. Feedback in higher and professional education. London: Routledge, 50–71.
  • Morgan, S., et al., 2015. Towards an educational continuing professional development (EdCPD) curriculum for Australian general practice supervisors. Australian Family Physician, 44 (11), 854–858.
  • Morgan, S. and Ingham, G., 2013. Random case analysis: a new framework for Australian general practice training. Australian Family Physician, 42 (1–2), 69–73.
  • Orsmond, G.I. and Cohn, E.S., 2015. The distinctive features of a feasibility study: objectives and guiding questions. OTJR: Occupation, Participation, and Health, 35 (3), 169–177. doi:10.1177/1539449215578649
  • Patton, M.Q., 2008. Utilization-focused evaluation. 4th ed. London: Sage.
  • Pearce, R., et al., 2007. The challenges of teaching in a general practice setting. Medical Journal of Australia, 187 (2), 129–132. doi:10.5694/j.1326-5377.2007.tb01161.x
  • Phillips, A.C., et al., 2016. Development and validation of the guideline for reporting evidence-based practice educational interventions and teaching (GREET). BMC Medical Education, 16 (1), 237. doi:10.1186/s12909-016-0759-1
  • Pine, G.J., 2009. Conducting teacher action research. In: G.J. Pine, ed. Teacher action research: building knowledge democracies. Thousand Oaks, CA: Sage Publications, Inc, 234–263.
  • Richardson, L. and St. Pierre, E.A., 2005. Writing: a method of inquiry. In: N.K. Denzin and Y.S. Lincoln, eds. The sage handbook of qualitative research. 3rd ed. Thousand Oaks, CA: Sage Publications, 959–978.
  • Riddell, D. and Moore, M., 2015. Scaling out, scaling up, scaling deep: advancing systemic social innovation and the learning processes to support it. Montreal: The JW McConnell Family Foundation.
  • Royal Australian College of General Practitioners, 2021. Standards for general practice training. 3rd ed. East Melbourne: The Royal Australian College of General Practitioners.
  • Saldaña, J., 2013. The coding manual for qualitative researchers. 2nd ed. London: Sage Publications Ltd.
  • Sandars, J., et al., 2021. Avoid ‘running before we can walk’ in medical education research: the importance of design and development research. Medical Teacher, 43 (11), 1335–1336. doi:10.1080/0142159X.2020.1854452
  • Shochet, R., Colbert-Getz, J., and Wright, S., 2015. The Johns Hopkins learning environment scale: measuring medical students’ perceptions of the processes supporting professional formation. Academic Medicine, 90 (6), 810–818. doi:10.1097/ACM.0000000000000706
  • Tickle-Degnen, L., 2013. Nuts and bolts of conducting feasibility studies. American Journal of Occupational Therapy, 67 (2), 171–176. doi:10.5014/ajot.2013.006270
  • Yin, R.K., 2014. Case study research: design and methods. 5th ed. Thousand Oaks, CA: Sage Publications, Inc.