Brief Report

Programmatic evaluation of interprofessional education: a quality improvement tool

Received 16 Oct 2023, Accepted 17 Apr 2024, Published online: 09 May 2024

ABSTRACT

Robust demonstration of high-quality, fit-for-purpose interprofessional education (IPE) is essential for today’s health professional students, staff, curricula, and regulatory bodies. As IPE moves from discrete “events” to fully embedded spirals of learning across degree programme curricula, effective mechanisms for monitoring continuous quality improvement are paramount. An accreditation tool was therefore developed for all learning activities contributing to the IPE curriculum of a university in Aotearoa New Zealand. We worked over 15 months, introducing a user-friendly tool to collect data, managing accreditation processes, and integrating with wider systems. We identified key levers to monitor, adjust, and continuously improve quality in IPE teaching and learning at individual-activity and programmatic levels.

Introduction

Tertiary education institutions internationally are increasingly integrating interprofessional education (IPE) into their pre-registration health professional curricula to prepare graduates for effective collaborative practice (Reeves et al., Citation2016). Interprofessional competency-based accreditation standards (Cox et al., Citation2016) and student assessment are becoming required by accrediting bodies (Grymonpré et al., Citation2021; Health Professions Accreditation Collaborative Forum Australia, Citation2022) and in legislation (Ministry of Health New Zealand, Citation2019).

Ensuring ongoing quality improvement (Thistlethwaite, Citation2021) for embedded, programmatic IPE across uniprofessional curricula presents challenges for institutions running various multi-year health professional degree programmes for many students. IPE programmes need mutually developed quality frameworks (Institute of Medicine, Citation2015), and practical means to monitor the range of discrete Learning Activities (LAs; including in clinical workplaces) that typically constitute a complete IPE curriculum, and to continuously evaluate programme outcomes as a complex whole.

In this paper, we share how a cross-professional IPE Accreditation Register was established at the University of Otago, Aotearoa New Zealand; and describe the resultant data collection and quality improvement. Work was undertaken by the Division of Health Sciences Centre for Interprofessional Education (IPE Centre; then 2.65 full-time-equivalent [FTE] academic and 2.5FTE administrative staff).

Background

Following growth of IPE LAs across health professional programmes at the University of Otago from 2016, an IPE Curriculum and Quality Framework was developed and ratified (2018–2020; Pullon & Symes, Citation2019). The framework aimed to assure the quality of IPE LAs singly and programmatically, with students progressing through a spiral of interprofessional learning. Interlocking components of the Framework are captured in Table 1.

Table 1. Curriculum and quality framework for Interprofessional Education (IPE) at Otago.

An associated online “register” was needed (i.e., a continuous record of all IPE LAs, and student participation in them). Such a register would serve to:

  1. initially, register and annually (re-)accredit all IPE LAs in the university (our focus in this paper)

  2. subsequently, track students’ participation in LAs and their progressive attainment of IPE competencies recorded as IPE “LA credits”

  3. ultimately, monitor and evaluate the outcomes of IPE LAs, student and staff experiences, and the entire IPE curriculum.
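The three purposes above imply a simple underlying data model: accredited activities, student participation records, and aggregated competency attainment. As an illustrative sketch only (the class and field names below are our assumptions, not the Register's actual REDCap schema):

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the register's core records; all names are
# illustrative assumptions, not the actual REDCap data dictionary.

@dataclass
class LearningActivity:
    la_id: str
    title: str
    competency_domains: list[str]          # e.g. ["communication", "teamwork"]
    intended_learning_outcomes: list[str]
    accredited_years: list[int] = field(default_factory=list)  # purpose 1

@dataclass
class Participation:                        # purpose 2: one student, one LA
    student_id: str
    la_id: str
    year: int
    la_credits: float

def competencies_attained(participations, activities):
    """Purpose 3 (simplified): aggregate competency-domain coverage
    per student across every LA they have completed."""
    by_id = {a.la_id: a for a in activities}
    attained: dict[str, set[str]] = {}
    for p in participations:
        domains = by_id[p.la_id].competency_domains
        attained.setdefault(p.student_id, set()).update(domains)
    return attained
```

In practice the register also records delivery sites, teaching teams, and assessment approaches; the sketch shows only the minimal linkage between activities and student records that the three purposes require.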

Methods

From 2020, an IPE Accreditation Register was developed to serve purpose (1) above. IPE LAs needed to be recorded by four local campuses, including 12 health professional degree programmes based in 11 Faculties/Schools/Centres of Excellence, and clinical placement sites in urban, regional, and rural locations. Students external to Otago were also engaged as cross-institution partners.

Previously, there was no way to track where or when IPE LAs had been developed and delivered, or to record their aims, intended learning outcomes (ILOs), participating programmes, and articulation with degree programmes. The IPE Accreditation Register aimed to answer such questions without the delays of multi-part scoping and funding approvals that would be required to insert IPE over time into existing student management systems.

From October 2020, over 15 months, we:

  1. assessed proprietary software options

  2. selected REDCap (Research Electronic Data Capture; Harris et al., Citation2009) for its capabilities (REDCap is a secure, web-based software platform supporting data capture for research studies and is free to not-for-profit organizations that join the REDCap Consortium)

  3. designed a customized tool using these capabilities (complex questions, repeatability, scheduling, pre-population)

  4. ran a pilot accreditation using the tool

  5. analyzed preliminary data to evaluate the tool’s suitability for quality improvement purposes

  6. adjusted and launched the tool for routine use from January 2022.

Results

Academic and administrative staff of the IPE Centre enlisted teachers offering IPE LAs into an accreditation pilot in June 2021. Completion of the tool takes 20–60 minutes, as teacher applicants work through nine sections – see Table 2.

Table 2. Features of accreditation tool (using REDCap).

Of 13 LAs in the pilot, seven immediately met all quality and sustainability criteria. Data for these seven accredited LAs were analyzed in August 2021 using the Quality Framework. Six other LAs needed revision to meet the criteria: one large LA (1,000 students) was comprehensively redesigned; one reviewed its competency domains/ILOs; one deferred accreditation until its next delivery; three small activities were discontinued as unsustainable. In 2022, all 10 remaining LAs satisfied annual reporting for reaccreditation, and two new LAs were accredited to begin addressing identified IPE competency gaps.

Case-based, event-type LAs enrolled most of the eligible early-years students. For students in later years, there were fewer available small-group simulation- or workplace-based LAs. Evidence supported the accuracy and resilience of LA credits assigned to activities of differing complexity and levels of learning. Review of accreditation applications validated the system of points accumulated across three indicators contributing to individual LA credits value (student workload hours, level of learning, competencies) to be applied in due course – see .
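The three-indicator points system described above lends itself to a short arithmetic sketch. The band thresholds, level labels, and point values below are hypothetical assumptions for illustration; the actual values used at Otago are not published in this report:

```python
# Illustrative sketch of a three-indicator LA-credits calculation.
# All numbers, bands, and level names below are hypothetical assumptions,
# not the values actually used in the Otago accreditation tool.

WORKLOAD_BANDS = [(2, 1), (6, 2), (12, 3)]   # (max student hours, points)
LEVEL_POINTS = {"exposure": 1, "immersion": 2, "mastery": 3}

def la_credits(workload_hours: float, level: str, n_competencies: int) -> int:
    """Sum points across the three indicators named in the text:
    student workload hours, level of learning, and competencies addressed."""
    # Band the workload hours; activities above the top band get 4 points.
    workload_points = next(
        (pts for max_h, pts in WORKLOAD_BANDS if workload_hours <= max_h), 4
    )
    # Cap competency points to discourage over-claiming many domains.
    competency_points = min(n_competencies, 3)
    return workload_points + LEVEL_POINTS[level] + competency_points
```

For example, under these assumed values a half-day immersion activity addressing two competency domains (`la_credits(4, "immersion", 2)`) would accumulate six points; the banding keeps credits comparable across activities of differing complexity.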

Key features of the tool (Table 2) elicited improvements for IPE LAs. In particular, by using the tool, we were able to refine constructive alignment between an LA’s stated aim, competency domains, ILOs, and chosen assessment approach. We could encourage attention to sustainability: funding, workload, teaching continuity, and supply. Some teaching teams had aimed unrealistically for many ILOs across all or most competency domains; accreditation discussions prompted more judicious selections. Stratified competencies emerged organically: interprofessional communication and role clarification were most consistently addressed in early-years LAs; teamwork, collaborative leadership and followership, and shared decision-making were profiled in later-years LAs; reflective practice was consistent throughout. Gaps appeared (e.g., conflict negotiation was rarely addressed in the teamwork domain). Most importantly, the tool had value for the spiral of IPE learning by facilitating tracking of gaps, duplications, and incoherence across the IPE curriculum.

Discussion

Improved articulation and understanding have streamlined individual LAs to benefit students’ learning and the evolving IPE programme at Otago, with further progress made in 2023. Although the IPE Accreditation Register is only one critical element of a comprehensive information management system still to be completed, the iterative design effort succeeded – at low cost – in testing the Quality Framework that recommended its development, and in producing a simple, effective quality assurance tool. Consolidated data collection is more efficient than multiple uniprofessional schools/programmes replicating effort to demonstrate the same interprofessional competency-based capability of their thousands of students over time. Also, for discipline-specific professional accreditation, the IPE Accreditation Register offers strong evidence that students have successfully undertaken accredited IPE.

Pilot and implementation data from the accreditation tool have highlighted uncertainty around the robustness of IPE evaluation and its tools (e.g., use of well-known but unsuitable evaluation instruments), as well as variability of IPE assessment and its tools (different opinions of the value of formal assessment, or reluctance to formally assess IPE LAs; Khalili et al., Citation2022). These themes continue to be explored as the IPE curriculum evolves at Otago, and as the debate around programmatic forms of assessment in uniprofessional programmes transfers to assessment debates in IPE.

Conclusion

In response to emerging imperatives upon IPE to meet quality standards that satisfy the needs of students, staff, curricula, and regulation, the University of Otago developed and piloted a tool to internally accredit its IPE LAs. Through this work, we confirmed the value of the newly adopted IPE Quality Framework; discovered levers for continuous quality improvement in practice; and created the foundation for every student’s LA credits to be tracked across their degree, and for competency-based programmatic evaluation. Effective application of an accessible tool suggests a pathway available to institutions with similar objectives.

Acknowledgments

We acknowledge funding and support from the Division of Health Sciences Centre for Interprofessional Education, University of Otago; and contributions by IPE Centre colleagues and partners, Vanderbilt University, and the REDCap Consortium.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

The work was supported by the Division of Health Sciences Centre for Interprofessional Education, University of Otago.

Notes on contributors

Ashley Symes

Ashley Symes is the Manager of the Centre for Interprofessional Education, Division of Health Sciences, University of Otago. She coordinates policy and strategy development, quality assurance, systems, and logistics for the Centre.

Susan R. Pullon

Susan Pullon is Emeritus Professor at the Centre for Interprofessional Education, and Department of Primary Health Care, Division of Health Sciences, University of Otago. Sue has wide teaching and research experience in interprofessional education and collaborative practice, in primary care, and youth health. She was a general practitioner for over 30 years and is a Distinguished Fellow of the Royal New Zealand College of General Practitioners.

Eileen McKinlay

Eileen McKinlay is Associate Professor and Director of the Centre for Interprofessional Education, Division of Health Sciences, University of Otago. She leads the delivery of IPE learning activities to health sciences students across their training programmes and campus sites. Eileen is a nurse by background, and an experienced researcher in interprofessional education, and primary health care.

References

  • Cox, M., Cuff, P., Brandt, B., Reeves, S., & Zierler, B. (2016). Measuring the impact of interprofessional education on collaborative practice and patient outcomes. Journal of Interprofessional Care, 30(1), 1–3. https://doi.org/10.3109/13561820.2015.1111052
  • Grymonpré, R. E., Bainbridge, L., Nasmith, L., & Baker, C. (2021). Development of accreditation standards for interprofessional education: A Canadian case study. Human Resources for Health, 19(12). https://doi.org/10.1186/s12960-020-00551-2
  • Harris, P. A., Taylor, R., Thielke, R., Payne, J., Gonzalez, N., & Conde, J. G. (2009, April). Research electronic data capture (REDCap) – a metadata-driven methodology and workflow process for providing translational research informatics support. Journal of Biomedical Informatics, 42(2), 377–381. https://doi.org/10.1016/j.jbi.2008.08.010
  • Health Professions Accreditation Collaborative Forum Australia. (2022, September 6). Communiqué: Developing a collaborative practitioner through accreditation processes – interprofessional education (IPE) project. hpacf.org.au.
  • Institute of Medicine. (2015). Measuring the impact of interprofessional education on collaborative practice and patient outcomes. The National Academies Press. https://doi.org/10.17226/21726
  • Khalili, H., Lackie, K., Langlois, S., Wetzlmair, L. C., & Working Group. (2022). Global IPE situational analysis result final report. InterprofessionalResearch.Global publication (ISBN: 978-1-7366963-2-3). https://interprofessionalresearch.global/
  • Ministry of Health New Zealand. (2019). Health practitioners competence assurance amendment act 2019. https://legislation.govt.nz/act/public/2019/0011/latest/whole.html#LMS12004
  • Pullon, S., & Symes, A. (2019). A curriculum and quality framework for interprofessional education at Otago: Strategic plan 2020-2024. Full report (discussion paper). Centre for Interprofessional Education, Division of Health Sciences, University of Otago. http://hdl.handle.net/10523/15182
  • Reeves, S., Fletcher, S., Barr, H., Birch, I., Boet, S., Davies, N., McFadyen, A., Rivera, J., & Kitto, S. (2016). A BEME systematic review of the effects of interprofessional education: BEME guide no. 39. Medical Teacher, 38(7), 656–668. https://doi.org/10.3109/0142159X.2016.1173663
  • Thistlethwaite, J. E. (2021). Curriculum development in interprofessional education in health. In I. Darmann-Finck & K. Reiber (Eds.), Development, implementation and evaluation of curricula in nursing and midwifery education. Springer. https://doi.org/10.1007/978-3-030-78181-1_12