
Interprofessional population health advocacy: Developing and implementing a panel management curriculum in five Veterans Administration primary care practices

Pages S75-S85 | Received 22 Jun 2017, Accepted 23 Apr 2018, Published online: 10 May 2018

ABSTRACT

Health care systems expect primary care clinicians to manage panels of patients and improve population health, yet few have been trained to do so. An interprofessional panel management (PM) curriculum is one possible strategy to address this training gap and supply future primary care practices with clinicians and teams prepared to work together to improve the health of individual patients and populations. This paper describes a Veterans Administration (VA) sponsored multi-site interprofessional PM curriculum development effort. Five VA Centers of Excellence in Primary Care Education collaborated to identify a common set of interprofessionally relevant desired learning outcomes (DLOs) for PM and to develop assessment instruments for monitoring trainees’ PM learning. Authors cataloged teaching and learning activities across sites. Results from pilot testing were systematically discussed, leading to iterative revisions of curricular elements. Authors completed a retrospective self-assessment of curriculum implementation for the academic year 2015–16 using a 5-point scale: contemplation (score = 0), pilot (1), action (2), maintenance (3), and embedded (4). Implementation scores were analyzed using descriptive statistics. DLOs were organized into five categories (individual patients, populations, guidelines/measures, teamwork, and improvement) along a developmental continuum and mapped to program competencies. Instruction and implementation varied across sites based on resources and priorities. Between 2015 and 2016, 159 trainees (internal medicine residents, nurse practitioner students and residents, pharmacy residents, and psychology post-doctoral fellows) participated in the PM curriculum. Curriculum implementation scores for guidelines/measures and improvement DLOs were similar for all trainees; scores for individual patients, populations, and teamwork DLOs were more advanced for nurse practitioner and physician trainees.
In conclusion, collaboratively identified DLOs for PM guided development of assessment instruments and instructional approaches for panel management activities in interprofessional teams. This PM curriculum and associated tools provide resources for educators in other settings.

Introduction

In response to calls for improved population health management, health care systems expect primary care clinicians to manage panels of patients and improve evidence-based metrics (Institute of Medicine, Citation2003). Yet, most primary care clinicians and associated teams are ill-prepared to carry out such panel management (PM) responsibilities (Dixon et al., Citation2017; Strauss et al., Citation2015). A panel management curriculum is one possible strategy to address this training gap and supply future primary care practices with clinicians prepared to work collaboratively in team-based primary care systems to improve the health of individual patients and populations.

National organizations such as the Interprofessional Education Collaborative and Institute of Medicine have released reports promoting integration and improvement of population health education in health professions training (Institute of Medicine, Citation2003; Interprofessional Education Collaborative, Citation2016). Some have described strategies and frameworks for integrating population health into health professions training, although most efforts are small pilots or focus on profession-specific curricula (Ahluwalia, de Silva, & Chana, Citation2014; Koo & Thacker, Citation2008; Shermock, Citation2017). Others emphasize the importance of interprofessional approaches to population health management (Association for Prevention Teaching and Research, Citation2015; Garr, Margalit, Jameton, & Cerra, Citation2012; Interprofessional Education Collaborative, Citation2016; Zomorodi, Zerden, Alexander, & Nance-Floyd, Citation2017). Missing from these calls to action is a practical translation from knowing about population health to knowing how to implement panel management collaboratively in primary care teaching practices.

There is significant variability and ongoing evolution in the definitions and conceptualization of population health, population health management and panel management (Kindig & Stoddart, Citation2003; Steenkamer, Drewes, Heijink, Baan, & Struijs, Citation2017; Swarthout & Bishop, Citation2017). Nested within the larger goal of improved population health, we define panel management as the tools and processes for identifying patients in a primary care practice with unmet preventive and chronic care needs and working systematically as a team to address these needs (Batalden et al., Citation1997; Chen & Bodenheimer, Citation2011; Godfrey, Nelson, Wasson, Mohr, & Batalden, Citation2003; Neuwirth, Schmittdiel, Tallman, & Bellows, Citation2007). A panel, or group of patients assigned to a primary care provider or team, will include subgroups of patients with similar characteristics, for example, within particular age groups or with specific health conditions.

Increased availability of computerized patient data systems makes routine monitoring of patients’ chronic disease and preventive care outcomes possible. However, traditional clinical training, focused on individual patient care delivered by individual clinicians, may not adequately prepare health care professionals to use such data to collaboratively identify and address care gaps for patient panels as a team (Garr et al., Citation2012). Further, traditional approaches ignore the critical role that population outcomes ought to play in alerting clinical practices to the need for system redesign to achieve improved outcomes. As Bodenheimer (Citation2011) describes, a culture change is needed to shift clinicians away from focusing on “patients scheduled for this week’s appointments” toward assuming “responsibility for the health of a panel of patients, whether or not they seek care” (p. 1558). Training in PM is necessary to prepare interprofessional primary care teams to meet these expectations of improving population health outcomes in a proactive manner (Chen & Bodenheimer, Citation2011; Neuwirth et al., Citation2007). Even though not all team members are assigned a primary care patient panel, all team members make important contributions to the care of those patients. When viewed as an interprofessional collaborative endeavor, team members from different professions provide the diverse perspectives necessary to address health care gaps for individuals and populations (Zenzano et al., Citation2011).

As a coordinated set of workplace activities, panel management can be modeled for interprofessional trainees informally or implicitly. Alternatively, panel management education can be formally integrated into primary care training as an interprofessional practice-based curriculum where working together merges with learning together (Mertens et al., Citation2017). Furthermore, dedicated resources including time, space, and strong facilitation have been shown to improve learner and patient outcomes in interprofessional education (Brewer, Flavell, & Jordon, Citation2017). This paper describes a five-site national collaborative process of developing and implementing a panel management curriculum that makes explicit the learning goals, learner assessment, and instructional strategies needed to better prepare trainees from medicine, nursing, pharmacy, and psychology to work together to improve the health of Veterans. We report results from the five-site curriculum implementation as stages of change that can be used to guide ongoing curricular improvement.

Methods

We used curriculum models (Fink, Citation2003; Kern, Thomas, & Hughes, Citation2009) that advocated for tight linkages between objectives, instruction, and assessment to prioritize the relationships between three key questions during our development process: 1) What do we want trainees to know and do differently?, 2) How will we know?, and 3) What learning activities lead to these outcomes? We outline our approach next.

Settings

In 2010, the U.S. Veterans Administration (VA) Office of Academic Affiliations established the Centers of Excellence in Primary Care Education (CoEPCE), a collaborative demonstration project whose purpose is to prepare future health care professionals to work in, lead, and improve patient-centered interprofessional teams within the VA primary care setting (Gilman, Chokshi, Bowen, Rugen, & Cox, Citation2014). A description of the CoEPCE program is reported elsewhere (Rugen et al., Citation2014). From the outset, all sites included nurse practitioner (NP) and physician trainees. Pharmacy and psychology trainees were variably engaged across sites until 2015, when they also became core learners at all five Centers.

One of the goals of the CoEPCE was to engage Centers individually and collaboratively in performance improvement that addressed gaps in clinical care and/or education. To this end, a cross-site performance improvement work group (PIWG) was established. Members of the PIWG represented the professions of medicine, nursing, pharmacy, and psychology, as well as associated educators and evaluators. A member of the CoEPCE coordinating center, a physician with advanced training in education and quality improvement, facilitated the work group activities. In 2014, the PIWG selected panel management as a focus of its improvement efforts based on a multi-site needs assessment that revealed high variability among faculty, staff, and trainees in carrying out PM responsibilities and a paucity of resources to support sites in developing robust PM curricula. The authors of this paper, all members of the PIWG, were local champions of PM and national collaborators who shared the goals of 1) improving PM education, 2) engaging faculty, staff, and trainees across professions to address care gaps together, 3) establishing PM as a core collaborative practice, and 4) developing and sharing practices across sites. The PIWG met monthly by phone from October 2014 through February 2017. Between meetings, site team members gathered information, tested concepts, piloted assessment instruments, and cataloged implementation efforts of the PM curriculum. Meetings were used to discuss progress, share ideas, and resolve differences in interpretation of learning objectives and assessment approaches. Interaction among PIWG members at annual face-to-face CoEPCE meetings facilitated collaboration.

These activities were categorized as operations under VA Handbook 1058.05, in which information generated for business operations and quality improvement is not subject to oversight by a Human Subjects Institutional Review Board.

Participants in the curriculum

Members of the PIWG and interprofessional team members (e.g., clinical faculty and staff) from each Center participated in the curriculum development process. Trainees who piloted the curriculum included internal medicine physician residents, nurse practitioner (NP) students, post-licensure NP residents (Rugen, Speroff, Zapatka, & Brienza, Citation2017), pharmacy residents, and psychology post-doctoral fellows. Post-licensure NP residents are licensed and credentialed nurse practitioners who seek an additional year of mentored clinical training (Brown, Poppe, Kaminetzky, Wipf, & Woods, Citation2015; Sciacca & Reville, Citation2016). Table 1 shows the numbers of trainees who participated in 2015–2016, the year we rated our curriculum implementation scores.

Table 1. Number and type of trainees engaged in interprofessional PM curriculum across five sites (July 2015-June 2016).

Curriculum development

Goals and objectives

(What do we want trainees to know and do differently?) To better prepare the interprofessional health care workforce of the future, the goal of this curriculum was to engage trainees in learning how to do PM in collaboration with interprofessional primary care team members, including primary care providers (e.g., NPs, physician assistants, physicians), medical assistants, nurses and nurse care managers, pharmacists, psychologists, and social workers. Our literature review found three articles describing key attributes of PM; none described curricula (Chen & Bodenheimer, Citation2011; Neuwirth et al., Citation2007; Savarimuthu et al., Citation2013). We used existing health professions competencies to guide the creation of PM-relevant competencies (Englander et al., Citation2013; Interprofessional Education Collaborative, Citation2016). We then wrote learning objectives as desired learning outcomes (DLOs) to indicate what we thought trainees should be able to demonstrate at the completion of their program.

We organized DLOs along a developmental continuum, including novice, advanced beginner, competent, and proficient categories (Dreyfus, Dreyfus, & Athanasiou, Citation1986), and deliberately avoided assigning year-in-program to these developmental levels. Generally, we considered novice DLOs foundational and proficient DLOs appropriate for clinicians in established practice.

Learner assessment

(How will we know what is learned?) Because we defined learning success as the ability to participate in and, as appropriate, lead PM activities, our approach to monitoring trainee learning was based on workplace-based assessment (WBA) (Norcini & Burch, Citation2007), which requires direct observation. We developed two instruments: a direct observation version to guide faculty members’ ratings of trainees’ PM performances and a trainee self-appraisal version, which clarified for trainees the learning outcomes we wanted them to achieve. To determine WBA feasibility, the instruments were pilot tested and iteratively revised (Massie & Ali, Citation2016). The initial versions, which included all 32 DLOs, proved too cumbersome. To reduce the number of items, we selected the eight competent-level DLOs. We asked trainees to report both their ability and performance frequency for each item using a 5-point rating scale (see sample assessment tool Table 2—online supplementary file). We asked faculty members to similarly rate trainees’ PM performances (see Appendix 1—online supplementary file). These shorter instruments proved feasible. Web-based versions are under development to facilitate direct data entry.

Instructional strategies

(What learning activities lead to desired outcomes?) We designed our PM curriculum to have common learning objectives (DLOs) and assessments but allowed Centers to vary their approaches to instruction. Centers had already partially implemented approaches to teaching PM, with some early innovators and others who developed curricula later. Structure varied in intensity from periodic formal sessions (i.e., one or more hours per week, or 2–3 hours every 2 months) to ad hoc activities within primary care team meetings. Facilitation was interprofessional, provided by NPs, pharmacists, physicians, psychologists, registered nurse care managers, and technology specialists, depending on the site or PM session theme. Sites focused on different chronic disease or prevention themes, although all addressed diabetes and hypertension performance measures. Sites utilized established VA dashboards and registries and developed site-specific worksheets, checklists, and resource guides. After DLOs were finalized, Centers cataloged their instructional approaches for each DLO and shared strategies and materials with each other.

Implementation

We used a retrospective self-assessment method to determine the level of implementation of the PM curriculum at each site. Because we conceptualized curriculum implementation as a change process, we modeled the rating scale after Prochaska (Prochaska, Redding, & Evers, Citation2002). Interprofessional teams from each site retrospectively rated implementation efforts for the 2015–2016 academic year for each DLO along a 5-point scale: 0 = Contemplation: Not teaching or facilitating learning yet; 1 = Pilot: Beginning to pilot a teaching/learning approach; 2 = Action: Teaching this for some trainees for at least one academic year; 3 = Maintenance: Routinely teaching this for most trainees for more than one academic year; 4 = Embedded: This teaching/learning approach is established for all relevant trainees, regularly recurring as appropriate. Centers rated curriculum implementation for the physician, NP, pharmacy, and psychology trainees by noting the extent to which their PM curriculum included each of these trainee groups on the 5-point scale. For each Center and each DLO, we calculated a mean implementation score for NP and physician trainees and a separate mean implementation score for pharmacy and psychology trainees. For each DLO with a score spread of 3 or more points across sites, we discussed the findings to calibrate sites’ interpretations. In the process of rating implementation, we identified some DLO terminology that translated poorly across professions, which required revising those DLOs to improve clarity of meaning and appropriate application across the four targeted professions. Sites then re-scored curriculum implementation with the updated DLOs. Using the raw scores, we calculated the mean, range, and standard deviation for each DLO across sites, separately for NP/physician and pharmacy/psychology implementation.
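The cross-site aggregation described above can be sketched in a few lines of code. This is an illustrative sketch only, not the authors' analysis software: the site ratings below are hypothetical, and the use of the sample (rather than population) standard deviation is an assumption.

```python
# Illustrative sketch of the descriptive statistics described in the text.
# Scores use the 5-point stage-of-change scale: 0 = contemplation,
# 1 = pilot, 2 = action, 3 = maintenance, 4 = embedded.
from statistics import mean, stdev

# Hypothetical ratings for a single DLO: one score per site (5 sites),
# rated separately for the two trainee groupings.
scores = {
    "np_physician": [3, 4, 3, 2, 3],
    "pharmacy_psychology": [1, 2, 0, 1, 1],
}

def summarize(ratings):
    """Mean, range (max - min), and sample SD across sites for one DLO."""
    return {
        "mean": mean(ratings),
        "range": max(ratings) - min(ratings),
        "sd": round(stdev(ratings), 2),
    }

for group, ratings in scores.items():
    stats = summarize(ratings)
    # Per the text, a spread of 3 or more points across sites triggered
    # a calibration discussion among the sites.
    needs_calibration = stats["range"] >= 3
    print(group, stats, "discuss to calibrate:", needs_calibration)
```

A mean of 3 or above would place a DLO in the "maintenance" stage, mirroring the thresholds used in the Results.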

Results

The stepwise and iterative approach to our curriculum development and a broad overview of our results are shown in Figure 1.

Figure 1. Interprofessional PM curriculum development.


Key attributes of PM (Chen & Bodenheimer, Citation2011; Neuwirth et al., Citation2007; Savarimuthu et al., Citation2013) guided our iterative development of DLOs in five categories: individual patients, populations, guidelines and measures, teamwork, and improvement. The relationship between key attributes and the curriculum’s DLO categories is shown in Table 3.

Table 3. Key attributes of interprofessional PM and associated learning outcome categories. Key attributes synthesized from (Chen & Bodenheimer, Citation2011; Neuwirth et al., Citation2007; Savarimuthu et al., Citation2013).

Competencies and objectives

One of the many uses of a curriculum is to demonstrate to accrediting organizations how learners are meeting expectations for graduation or program completion. Following Fink’s (Citation2003) recommendation that DLOs should map to higher level competencies and domains, we show linked competency domains, our revised set of competencies for PM, and the DLOs we developed in Appendix 2—see the online supplementary file. For example, in the competency domain ‘interprofessional collaboration’, we revised the competency statement “Use the knowledge of one’s own role and the roles of other health professionals to appropriately assess and address the health care needs of the patients and populations served” to read, “Use the knowledge of one’s own role and the roles of other health professionals to leverage the expertise of the team to appropriately assess and address the health care needs of the patients and populations served.” Two of our DLOs mapped to this competency: ‘Identifies members of the panel management team’ and ‘Describes profession-specific panel management responsibilities for each interprofessional team member.’

Our iteratively revised and piloted DLOs are shown in Table 4, column one, along with each DLO’s developmental designation. For example, in the category of Teamwork, the DLOs begin with the “novice” provider, who can adequately identify their team members, and progress to the “proficient” provider, who not only effectively performs profession-specific roles and responsibilities at the team level but also engages in PM at the system level. Some DLOs overlapped with sites’ existing teamwork, interprofessional communication, and quality improvement curricula. Because panel management provides a structure for engaging trainees in teamwork and system redesign using panel data to monitor and sustain improvements, we elected to keep these DLOs in the PM curriculum and recognize that DLOs may be partly learned, practiced, or observed in conjunction with other curricula.

Table 4. Results from Centers’ retrospective, self-assessment of their implementation of instructional strategies to support achievement of desired learning outcomes for 2 groupings of trainees.

Implementation

Results of Centers’ self-rated curriculum implementation scores are shown in Table 4. Sites reported more robust curriculum implementation for physician and NP trainees, with 43% of DLOs (14 of 32) in the “maintenance” stage (mean score ≥ 3) and 16% (5 of 32) in the “contemplation” or “pilot” stage (mean score ≤ 2). For these learners, instructional strategies for DLOs written for novice and advanced beginner stages of development were more likely to be implemented, and instruction was more robust for the populations and guidelines/measures categories. Curriculum implementation for pharmacy and psychology trainees, who were included in all sites by 2015, was in the “contemplation” or “pilot” stage for 69% of DLOs (22 of 32), and only 2 DLOs reached the “maintenance” stage (DLO 15: Locates current chronic disease and preventive measure guidelines for common conditions in health care setting and DLO 17: Interprets current chronic disease and preventive measure guidelines for common conditions in health care settings and applies guidelines to the care of individual patients). Average implementation in the categories of improvement and guidelines/measures reached the action to maintenance stages. Scores were lowest for pharmacy and psychology trainees in the category of teamwork. Two sites selected PM as an early focus for their Centers’ implementation, and their implementation scores were higher overall (data not shown).

Examples of instructional strategies

Listed here are representative examples of educational strategies that Centers used for each DLO category.

Individual patients

An interprofessional mix of trainee primary care providers (PCPs) (NP and physician) and pharmacy and psychology trainees meet in a computer lab weekly. Following brief didactics (on topics such as evidence-based performance metrics for hypertension or smoking cessation resources), session facilitators show trainees how to access panel data (or observe trainees doing this later in the year) and trainees practice in real time. Pharmacy and psychology trainees are aligned with specific PCP trainees or are given relevant performance measures to review. Using structured worksheets or checklists, trainees review their data, identify patients whose care is not meeting specific performance measures, and propose interventions to address these quality gaps for individual patients. Trainees may work independently, in interprofessional pairs, or in consultation with a facilitator. Following independent and small group work, trainees share with the larger group a brief summary of what they accomplished, action plans, and key lessons learned. Weekly sessions allow trainees and their facilitators to continue to track the success of the teams’ interventions with specific patients.

Populations

Trainees participate in half-day sessions every two months in the clinic precepting room, allowing for easy collaboration with nearby clinic staff and sufficient computers for each participant. PCP trainees review their panel registry data to identify patterns related to gaps in care in a specific subpopulation (e.g., patients with diabetes). Pharmacy and psychology trainees review the registry data for all PCP trainees in attendance, looking for root causes that could be addressed from their profession-specific perspective. Facilitators ask trainees to identify at least one system-level intervention. Trainees verify the current practice or discuss the feasibility of interventions with staff team members. For example, they may review processes for systematically completing foot checks at each clinic visit for the population of patients with diabetes. Proposed interventions are shared with the group at the end of the session and serve as triggers for quality improvement projects related to practice re-design.

Guidelines and Measures

Using small interprofessional group sessions, trainees share responsibility with faculty for leading reviews of current evidence-based guideline recommendations as they relate to the clinic’s chronic disease and preventive health performance measures and discussion of interprofessional roles and expertise. Trainees then utilize dashboards or performance reports to analyze how their facility, clinic and/or primary care team is meeting performance measures set forth by current evidence-based guidelines.

Teamwork

All sites utilize an interprofessional team approach to model professional roles and responsibilities and demonstrate the scope of practice for PM. For example, when working collaboratively on care plans, pharmacists make recommendations for medication management and psychologists discuss behavioral health strategies or approaches to patients’ multiple co-morbidities. Regularly scheduled, faculty-coached team huddles take place in the primary care clinic and serve to promote team cohesion, effective communication, role clarity and shared PM goals among trainees, supervisors, nursing, and clerical staff.

Improvement

In close coordination with quality improvement sessions, trainees use performance reports from their patient panels to design practice improvements. Trainees complete the Institute for Healthcare Improvement’s Open School modules (Institute for Healthcare Improvement, Citation2017) to gain a foundation in improvement tools and methods, specifically the Model for Improvement and Plan Do Study Act (PDSA) cycles (Langley et al., Citation2009). Trainees apply tools like cause and effect diagrams and process mapping to analyze practice performance gaps and make evidence-based improvements to care for their panels using iterative change cycles.

Discussion

In response to a PM curricular gap, five VA-sponsored Centers of Excellence in Primary Care Education collaborated to develop competencies, desired learning outcomes, assessment instruments, and educational activities with the aim of preparing interprofessional trainees to conduct PM with primary care team members. Through a collaborative, interprofessional, iterative, multi-site process, we successfully identified DLOs appropriate for trainees across professions. We organized them into five categories: individual patients, populations, guidelines and measures, teamwork, and improvement. These DLOs, in turn, guided the development of workplace-based assessment instruments and the identification of teaching strategies and learning activities, and allowed us to evaluate and begin refining curriculum implementation across sites. Curriculum implementation was more advanced for trainees engaged in the CoEPCE program from the beginning, which may in part be related to their roles as primary care providers with assigned patient panels. Implementation of curricular elements addressing ‘guidelines and measures’ and ‘improvement’ reached similar stages for trainees from all four professions, while higher variability remained in the other three categories. Although we have not yet formally assessed trainees’ learning progress on PM DLOs or determined best instructional practices, we developed processes and instruments to guide future evaluation of PM curricular results. Implementing a multi-site PM curriculum within practices that are developing and optimizing team-based care led to many lessons learned.

Our DLO implementation scores reflect a more established curriculum for PCP trainees than for pharmacy and psychology trainees; however, there were some exceptions where mean scores were low for all professions. This may reflect several possibilities. First, low scores may indicate more advanced learning objectives. For all trainees, DLOs assigned a “proficient” level had lower implementation scores, suggesting these DLOs were more aspirational and likely require more time and experience to implement consistently. Second, low scores may represent curricular elements that are more difficult to implement due to logistical constraints, such as scheduling restrictions among professions or a lack of local expertise or resources. Clinicians involved in interprofessional education (IPE) programs have noted that additional time and effort for curricular design are needed to collaborate successfully across professions (Kent et al., Citation2018), which can also slow implementation. Third, low scores may indicate practice factors and cultural challenges in bringing interprofessional trainees together in a new curriculum. For example, the relatively low mean implementation score for the advanced beginner DLO “describes profession-specific PM responsibilities for each interprofessional team member” may reflect ongoing role confusion within practice sites, rather than simply an instructional gap. Role confusion may extend beyond trainees, as staff and faculty may still be developing skills and protocols for interprofessional PM. We make three recommendations. First, programs should start with the DLOs that our sites found easier to implement (higher implementation scores) for some professions and consider postponing more challenging DLOs (lower implementation scores) until the foundation of PM practice is established.
Second, programs should assess for a PM training gap across all levels of experience (Dixon et al., Citation2017; Strauss et al., Citation2015) and encourage faculty and clinic staff to pursue their own proficiency in all DLOs, for example by learning together with trainees (Clay et al., Citation2013). Third, organizing DLOs along a developmental continuum provides a road map for focusing curricular efforts at the level of trainees’ prior experience and can offer a mechanism for experienced learners to “test out” of more basic instruction and focus on higher levels of competency development.

As our implementation results suggest, ongoing work is needed to integrate pharmacy and psychology trainees into PM curricular activities. Our sites were often in the contemplation and pilot phases of curriculum implementation for these trainees. This may reflect the relatively recent addition of these learners to some Centers, challenges in curriculum implementation, or Centers’ prioritizing of other curricula. Developing PM instructional strategies for team members without individual patient panels introduced challenges: un-empaneled providers may not feel responsible for panel management, or may take a broader approach to population health at the level of the entire clinic or health care setting. These factors necessitated adaptation of existing instructional strategies initially targeted to PCP trainees. Even so, team members without assigned patient panels bring professional expertise to the work of improving the care of individual patients, populations, and clinic-wide functions. To improve engagement, we found that formally aligning trainees with specific primary care teams or patient populations facilitates interprofessional instructional strategies, trainee collaboration, and role clarity.

In the process of determining which competencies and DLOs trainees should demonstrate when conducting PM activities, we discovered some overlap with existing curricula, for example, quality improvement (QI). We elected to keep these DLOs in the PM curriculum; some sites found it helpful to merge aspects of PM activities with existing QI curricula. In our effort to have this curriculum mirror what happens (or ought to happen) in practice, it became clear that trainees should be able to recognize that a pattern of suboptimal panel outcomes over time may require a systematic practice redesign effort, not more individual effort for each patient. Ideally, PM informs QI, which, in turn, improves population health. Redesign may lead to practice enhancements, such as incorporating reminders for recommended care into electronic health records for simple, routine care (e.g., vaccinations) (Ruffin et al., Citation2015) or creating collaborative appointment types with additional time to increase implementation of recommendations for more nuanced decisions that involve shared decision-making (e.g., osteoporosis screening or health care proxy designation) (Loo et al., Citation2011). Linking QI competencies to a panel management curriculum makes explicit the systems perspective needed for practice transformation and a culture of continuous improvement (Batalden et al., Citation1997).

We conducted many improvement cycles to revise and refine our assessment instruments based on field testing, which is ongoing. Piloting both a direct observation tool and a self-appraisal tool highlighted several challenges. Others have documented difficulties with workplace-based assessments (WBAs) (Massie & Ali, Citation2016). Self-assessments, while easier to implement and able to provide insight into learners’ perceptions, may not accurately reflect learners’ skills (Eva & Regehr, Citation2005). Conversely, it can be difficult for an individual faculty member to observe the full continuum of a trainee’s PM activities. It can also be time-consuming and burdensome for trainees to “show their work” and make their thought processes explicit. Our early experiences with assessment led sites to add more reflective practice to their curricula and to pilot having faculty perform PM tasks in parallel with trainees’ data to “double check” trainees’ analyses and action plans, akin to a preceptor joining a trainee and patient to verify parts of a patient’s history or physical examination during a clinic visit. Formalizing faculty and trainee time for reflective practice activities is important for curricular success (D’Eon, Citation2005). Because PM is a team-based endeavor, future development of a team assessment tool may be warranted.

The iterative, longitudinal curriculum development process gave sites time and space to augment or alter existing curricula to address each of the agreed-upon DLOs. With this foundation, sites drew on local innovation, site-specific factors, and the instructional strategies that worked best for them. Variability in instruction will create a richer toolkit of examples (currently in development) for others to consider adopting. It is too soon to draw conclusions about the best teaching approaches. Some DLOs may be best taught in classrooms as foundational learning for practice, while others may be best embedded in the workplace for applied learning. Further work is needed to explicate the relationships between DLOs and instruction.

Our work is done in the context of significant variability in the understanding and practice of PM in primary care. Debate exists as to whether PM should be under the direction of the PCP (Neuwirth et al., Citation2007; Savarimuthu et al., Citation2013) or should simply involve any needed tasks carried out by the appropriate health care team member to meet patients’ care needs (Chen & Bodenheimer, Citation2011; Zenzano et al., Citation2011). PM activities focused on optimizing care for the most complex, challenging patients may require PCP engagement, while other clinic staff may be able to address PM activities with guideline-driven protocols (Bodenheimer, Willard-Grace, & Ghorob, Citation2014; Rogers et al., Citation2015). Our curriculum reflects the view that PM is an interprofessional activity requiring the expertise and perspective of all primary care team members. We favor a shared or distributed leadership model of PM activities within teams, as this model has been positively associated with team identification and performance (Bergman, Rentsch, Small, Davenport, & Bergman, Citation2012; Forsyth & Mason, Citation2017; Wang, Waldman, & Zhang, Citation2014). The generalizability of our DLOs and assessment tools across professions reflects that leadership model. Future iterations should consider more explicitly including all members of the primary care team—administrative staff, medical assistants, nurse care managers—in curricular implementation and evaluation (Clay et al., Citation2013).

Our innovation has limitations. Our sites, although diverse in geographic region, size, and staffing model, are limited to primary care settings in VA educational sites. Further, these sites received additional funding for innovation, which makes them unique within the VA primary care system. We relied on sites’ local engagement and self-appraisal to guide this work, which limits our ability to draw conclusions about the best educational approaches. Because sites were actively developing and continuously improving instructional strategies throughout the academic year, assigning implementation scores to this moving target was challenging. We chose to analyze instructional implementation for pharmacy and psychology trainees together, as both were added as core learners in 2015. Future evaluations should analyze learner types separately, as doing so may better highlight the unique needs, strengths, and contributions of different professions in achieving the shared goals of panel management.

Ongoing iterative implementation and assessment of learning will continue to build an evidence base of the effectiveness of this curriculum, including a robust toolkit of teaching and learning activities. Application in broader clinical settings and training programs will add further evidence of effectiveness. Demonstration of improved patient outcomes with interprofessional population health curricula would meet an important need for evidence of the impact of IPE on patient care (Brandt, Lutfiyya, King, & Chioreso, Citation2014).

Concluding comments

Training in PM will prepare primary care teams to proactively address the unmet health care needs of patients and populations. Our PM curriculum, developed through the application of curriculum frameworks and improvement science, provides a guide and resources for educators to initiate or build upon existing population health curricula. Many academic clinical sites may still be in the early stages of developing panel management processes and tools, building high-functioning teams, and cultivating expertise among faculty and staff, but these challenges should not delay curriculum implementation. Trainees, faculty, and staff can learn PM and work on related systems improvements in collaboration and in parallel. Notably, improving clinic-wide PM capabilities may reduce burnout (Willard-Grace et al., Citation2015). Only by building a future workforce of competent health care professionals can we leverage panel management to its fullest potential to efficiently and effectively improve population health outcomes.

Declaration of Interest

The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the article.

Acknowledgments

Krista Gager, AGNP-BC; Natasha McEwan, MSN, APRN; Bridget O’Brien, PhD; Mamta Singh, MD, MS; Elena Speroff, DNP.

Supplemental data

Supplemental data for this article can be accessed on the publisher’s website.

Additional information

Funding

The Centers of Excellence in Primary Care Education are funded by the Office of Academic Affiliations, US Department of Veterans Affairs.

References

  • Ahluwalia, S., de Silva, D., & Chana, N. (2014). Evaluation of a programme in population health management for GP trainees. Public Health, 128, 925–933. doi:10.1016/j.puhe.2014.08.009
  • Association for Prevention Teaching and Research. (2015). Clinical prevention and population health curriculum framework: Version 3. Retrieved from http://c.ymcdn.com/sites/www.aptrweb.org/resource/resmgr/HPCTF_Docs/Revised_CPPH_Framework_2.201.pdf
  • Batalden, P. B., Mohr, J. J., Nelson, E. C., Plume, S. K., Baker, G. R., Wasson, J. H., & Wisniewski, J. J. (1997). Continually improving the health and value of health care for a population of patients: The panel management process. Quality Management in Health Care, 5, 41–51. doi:10.1097/00019514-199705030-00005
  • Bergman, J. Z., Rentsch, J. R., Small, E. E., Davenport, S. W., & Bergman, S. M. (2012). The shared leadership process in decision-making teams. The Journal of Social Psychology, 152(1), 17–42. doi:10.1080/00224545.2010.538763
  • Bodenheimer, T. (2011). Lessons from the trenches–A high-functioning primary care clinic. New England Journal of Medicine, 365, 5–8. doi:10.1056/NEJMp1104942
  • Bodenheimer, T., Willard-Grace, R., & Ghorob, A. (2014). Expanding the roles of medical assistants. Who does what in primary care? JAMA Internal Medicine, 174, 1025–1026. doi:10.1001/jamainternmed.2014.1319
  • Brandt, B., Lutfiyya, M. N., King, J. A., & Chioreso, C. (2014). A scoping review of interprofessional collaborative practice and education using the lens of the Triple Aim. Journal of Interprofessional Care, 28(5), 393–399. doi:10.3109/13561820.2014.906391
  • Brewer, M. L., Flavell, H. L., & Jordon, J. (2017). Interprofessional team-based placements: The importance of space, place, and facilitation. Journal of Interprofessional Care, 31(4), 429–437. doi:10.1080/13561820.2017.1308318
  • Brown, K., Poppe, A., Kaminetzly, C., Wipf, J., & Woods, N. F. (2015). Recommendations for nurse practitioner residency programs. Nurse Educator, 40, 148–151. doi:10.1097/NNE.0000000000000117
  • Chen, E. H., & Bodenheimer, T. (2011). Improving population health through team-based panel management. Archives of Internal Medicine, 171, 1558–1559. doi:10.1001/archinternmed.2011.395
  • Clay, M. A., Sikon, A. L., Lypson, M. L., Gomez, A., Kennedy-Malone, L., Bussey-Jones, J., & Bowen, J. L. (2013). Teaching while learning while practicing: Reframing faculty development for the patient-centered medical home. Academic Medicine, 88, 1215–1219. doi:10.1097/ACM.0b013e31829ecf89
  • D’Eon, M. (2005). A blueprint for interprofessional learning. Journal of Interprofessional Care, 19(sup1), 49–59. doi:10.1080/13561820512331350227
  • Dixon, B. E., Barboza, K., Jensen, A. E., Bennett, K. J., Sherman, S. E., & Schwartz, M. D. (2017). Measuring practicing clinicians’ information literacy. An exploratory analysis in the context of panel management. Applied Clinical Informatics, 8, 149–161. doi:10.4338/ACI-2016-06-RA-0083
  • Dreyfus, H., Dreyfus, S. E., & Athanasiou, T. (1986). Mind over machine: The power of human intuition and expertise in the era of the computer. New York, NY: The Free Press (Macmillan).
  • Englander, R., Cameron, T., Ballard, A. J., Dodge, J., Bull, J., & Aschenbrener, C. A. (2013). Toward a common taxonomy of competency domains for the health professions and competencies for physicians. Academic Medicine, 88, 1088–1094. doi:10.1097/ACM.0b013e31829a3b2b
  • Eva, K. W., & Regehr, G. (2005). Self-assessment in the health professions: A reformulation and research agenda. Academic Medicine, 80(Suppl.), S46–S54. doi:10.1097/00001888-200510001-00015
  • Fink, L. D. (2003). Creating significant learning experiences. San Francisco, CA: Jossey-Bass.
  • Forsyth, C., & Mason, B. (2017). Shared leadership and group identification in healthcare: The leadership beliefs of clinicians working in interprofessional teams. Journal of Interprofessional Care, 31(3), 291–299. doi:10.1080/13561820.2017.1280005
  • Garr, D. R., Margalit, R., Jameton, A., & Cerra, F. B. (2012). Educating the present and future health care workforce to provide care to populations. Academic Medicine, 87, 1159–1160. doi:10.1097/ACM.0b013e3182628d59
  • Gilman, S. C., Chokshi, D. A., Bowen, J. L., Rugen, K. W., & Cox, M. (2014). Connecting the dots: Interprofessional health education and delivery system redesign at the Veterans Health Administration. Academic Medicine, 89, 1113–1116. doi:10.1097/ACM.0000000000000312
  • Godfrey, M. M., Nelson, E. C., Wasson, J. H., Mohr, J. J., & Batalden, P. B. (2003). Microsystems in health care: Part 3. Planning patient-centered services. Joint Commission Journal on Quality and Patient Safety, 29, 159–170. doi:10.1016/S1549-3741(03)29020-1
  • Institute for Healthcare Improvement. (2017). Open school improvement capability course (Modules QI 101 through QI 105). Retrieved from http://www.ihi.org/education/IHIOpenSchool/Courses/Pages/default.aspx
  • Institute of Medicine. (2003). Who will keep the public healthy? Educating public health professionals for the 21st century. Washington, DC: National Academy Press.
  • Interprofessional Education Collaborative. (2016). Core competencies for interprofessional collaborative practice: 2016 update. Washington, DC: Author.
  • Kent, F., Nankervis, K., Johnson, C., Hodgkinson, M., Baulch, J., & Haines, T. (2018). ‘More effort and more time.’ Considerations in the establishment of interprofessional education programs in the workplace. Journal of Interprofessional Care, 32(1), 89–94. doi:10.1080/13561820.2017.1381076
  • Kern, D. E., Thomas, P. A., & Hughes, M. T. (2009). Curriculum development for medical education: A six-step approach (2nd ed.). Baltimore, MD: The Johns Hopkins University Press.
  • Kindig, D., & Stoddart, G. (2003). What is population health? American Journal of Public Health, 93, 380–383. doi:10.2105/AJPH.93.3.380
  • Koo, D., & Thacker, S. B. (2008). The education of physicians: A CDC perspective. Academic Medicine, 83, 399–407. doi:10.1097/ACM.0b013e3181667e9a
  • Langley, G. J., Moen, R. D., Nolan, K. M., Nolan, T. W., Norman, C. L., & Provost, L. P. (2009). The improvement guide: A practical approach to enhancing organizational performance (2nd ed.). San Francisco, CA: Jossey-Bass Wiley.
  • Loo, T. S., Davis, R. B., Lipsitz, L. A., Irish, J., Bates, C. K., Agarwal, K., & Hamel, M. B. (2011). Electronic medical record reminders and panel management to improve primary care of elderly patients. Archives of Internal Medicine, 171, 1552–1558. doi:10.1001/archinternmed.2011.394
  • Massie, J., & Ali, J. M. (2016). Workplace-based assessment: A review of user perceptions and strategies to address the identified shortcomings. Advances in Health Science Education, 21, 455–473. doi:10.1007/s10459-015-9614-0
  • Mertens, F., de Groot, E., Meijer, L., Wens, J., Cherry, M., Deveugele, M., & Pype, P. (2017). Workplace learning through collaboration in primary healthcare: A BEME realist review of what works, for whom and in what circumstances: BEME Guide No. 46. Medical Teacher, 40(2), 117–134. doi:10.1080/0142159X.2017.1390216
  • Neuwirth, E. B., Schmittdiel, J. A., Tallman, K., & Bellows, J. (2007). Understanding panel management: A comparative study of an emerging approach to population care. The Permanente Journal, 11, 12–20. doi:10.7812/TPP/07-040
  • Norcini, J., & Burch, V. (2007). Workplace-based assessment as an educational tool: AMEE Guide No. 31. Medical Teacher, 29, 855–871. doi:10.1080/01421590701775453
  • Prochaska, J. O., Redding, C. A., & Evers, K. (2002). The transtheoretical model and stages of change. In K. Glanz, B. K. Rimer, & F. M. Lewis (Ed.), Health behavior and health education: Theory, research, and practice (3rd ed.). San Francisco, CA: Jossey-Bass.
  • Rogers, E. A., Hessler, D., Dube, K., Willard-Grace, R., Gupta, R., Bodenheimer, T., & Grumbach, K. (2015). The panel management questionnaire: A tool to measure panel management capability. The Permanente Journal, 19, 4–9.
  • Ruffin, M. T., Plegue, M. A., Rockwell, P. G., Young, A. P., Patel, D. A., & Yeazel, M. W. (2015). Impact of an electronic health record (EHR) reminder on human papillomavirus (HPV) vaccine initiation and timely completion. Journal of the American Board of Family Medicine, 28, 324–333. doi:10.3122/jabfm.2015.03.140082
  • Rugen, K. W., Speroff, E., Zapatka, S. A., & Brienza, R. (2017). Veterans affairs interprofessional nurse practitioner residency in primary care: A competency-based program. The Journal for Nurse Practitioners, 12(6), e267–e273. doi:10.1016/j.nurpra.2016.02.023
  • Rugen, K. W., Watts, S. A., Janson, S. L., Angelo, L. A., Nash, M., Zapatka, S. A., & Saxe, J. M. (2014). Veterans affairs centers of excellence in primary care education: Transforming nurse practitioner education. Nursing Outlook, 62, 78–88. doi:10.1016/j.outlook.2013.11.004
  • Savarimuthu, S. M., Jensen, A. E., Schoenthaler, A., Dembitzer, A., Tenner, C., Gillespie, C., & Sherman, S. E. (2013). Developing a toolkit for panel management: Improving hypertension and smoking cessation outcomes in primary care at the VA. BMC Family Practice, 14, 176. doi:10.1186/1471-2296-14-176
  • Sciacca, K., & Reville, B. (2016). Evaluation of nurse practitioners enrolled in fellowship and residency programs: Methods and trends. The Journal for Nurse Practitioners, 12, e275–e280. doi:10.1016/j.nurpra.2016.02.011
  • Shermock, K. M. (2017). Population health management: Challenges and opportunities for pharmacy. American Journal of Health-System Pharmacy, 74(18), 1398–1399. doi:10.2146/ajhp170530
  • Steenkamer, B. M., Drewes, H. W., Heijink, R., Baan, C. A., & Struijs, J. N. (2017). Defining population health management: A scoping review of the literature. Population Health Management, 20(1), 74–85. doi:10.1089/pop.2015.0149
  • Strauss, S. M., Jensen, A. E., Bennett, K., Skursky, N., Sherman, S. E., & Schwartz, M. D. (2015). Clinicians’ panel management self-efficacy to support their patients’ smoking cessation and hypertension control needs. Translational Behavior Medicine, 5, 68–76. doi:10.1007/s13142-014-0287-7
  • Swarthout, M., & Bishop, M. A. (2017). Population health management: Review of concepts and definitions. American Journal of Health-System Pharmacy, 74, 1405–1411. doi:10.2146/ajhp170025
  • Wang, D., Waldman, D. A., & Zhang, Z. (2014). A meta-analysis of shared leadership and team effectiveness. Journal of Applied Psychology, 99(2), 181–198. doi:10.1037/a0034531
  • Willard-Grace, R., Dube, K., Hessler, D., O’Brien, B., Earnest, G., Gupta, R., & Grumbach, K. (2015). Panel management, team culture, and worklife experience. Families, Systems, Health, 33, 231–241. doi:10.1037/fsh0000113
  • Zenzano, T., Allan, J. D., Bigley, M. B., Bushardt, R. L., Garr, D. R., Johnson, K., & Stanley, J. M. (2011). The roles of healthcare professionals in implementing clinical prevention and population health. American Journal of Preventive Medicine, 40, 261–267. doi:10.1016/j.amepre.2010.10.023
  • Zomorodi, M., Zerden, L., Alexander, L., & Nance-Floyd, B. (2017). Engaging students in the development of an interprofessional population health management course. Nurse Educator, 42(1), 5–7. doi:10.1097/NNE.0000000000000298