
The evidence base for interprofessional education within health professions education: A protocol for an update review

Pages 515-518 | Received 21 Jan 2022, Accepted 28 Jun 2022, Published online: 28 Aug 2022

ABSTRACT

Interprofessional education (IPE) interventions aiming to promote collaborative competence and improve the delivery of health and social care processes and outcomes continue to evolve. This paper reports on a protocol for an update review that we will conduct to identify and describe how the IPE evidence base has evolved in the last 7 years. We will identify literature through a systematic search of the following electronic databases: Medline, Embase, CINAHL, Education Source, ERIC, and BEI. We will consider all IPE interventions delivered to health professions students and accredited professionals. Peer-reviewed empirical research studies published in any language from June 2014 onwards will be eligible for inclusion. The outcomes of interest are changes in reaction, attitudes/perceptions, knowledge/skills acquisition, behaviors, organizational practice, and/or benefits to patients. Each task of screening, critical appraisal, data abstraction, and synthesis will be performed by at least two members of the review team. The review will provide an updated and comprehensive understanding of the IPE evidence base to inform future IPE developments, delivery, and evaluation across education and clinical settings.

Introduction

The authors of two previous comprehensive reviews systematically searched the interprofessional education (IPE) literature up to June 2014, identifying a total of 46 high-quality studies (Hammick et al., Citation2007; Reeves et al., Citation2016). These reviews have been key to informing the ongoing development of IPE initiatives and policies internationally, and have been cited by international bodies including the World Health Organization and the World Bank (Altmetric, n.d.). Echoing the conclusions of the 2007 review (Hammick et al., Citation2007), Reeves and colleagues’ 2016 update found that central issues concerning the context of the organization in which IPE was implemented, the characteristics of participants, and IPE teaching and learning processes continued to resonate in the delivery of IPE. Learners still had positive reactions to IPE, with evidence of improvements in attitudes and collaborative competence (Reeves et al., Citation2016); however, as also noted in the initial review, there remained sparse evidence on changes in learners’ behavior, organizational practice, and benefits to service users (Hammick et al., Citation2007; Reeves et al., Citation2016).

We report on a protocol for updating the previous reviews by Hammick et al. (Citation2007) and Reeves et al. (Citation2016), in which we aim to identify and describe how the IPE evidence base has evolved in the intervening period. The objective in our update review is to consider the effectiveness of different types of IPE interventions on a range of outcomes. These outcomes include impact on the modification of learner attitudes and perceptions, acquisition of knowledge and skills, subsequent change in organizational practice, and/or benefits to patients/clients. We expect that the update review will encourage and inform curriculum planners in designing future IPE interventions. We also expect that the update review will help policy makers, researchers, and grant funders to discern priorities for development in this field.

Background

Since the publication of the two previous reviews, the IPE field has continued to grow internationally, and this is reflected in the increasing number of publications and regular international conferences (“Interprofessional Practice and Education Center,” Citation2021; Bulcke et al., Citation2016; Cardarelli et al., Citation2018; Collins et al., Citation2017; Djukic et al., Citation2015; Naumann et al., Citation2020). Our review update is therefore timely given the continued interest and investment in IPE by researchers, educators, practitioners, and policymakers. Despite this positive trend, the evidence base continues to show signs of fragmentation, which introduces uncertainty about the direction and magnitude of the effects of IPE (Reeves et al., Citation2013).

IPE refers to “occasions when two or more professions learn with, from and about each other to improve collaboration and the quality of care” (CAIPE, Citation2002, para. 3). It is a specific kind of educational intervention whose uptake is evident worldwide, adopted with a view to strengthening the collaborative capacity and practice of health professionals (Barr et al., Citation2005; Kitto, Chesters et al., Citation2011). Moreover, through consequent improvements in the efficiency and quality of clinical practice, IPE is also regarded as having the potential to improve the safety and quality of patient care (Reeves et al., Citation2011).

Over recent years, IPE has been a key feature of pre- and post-qualification health and social care education (Bulcke et al., Citation2016; Naumann et al., Citation2020), and of continuing professional education offered to qualified clinicians (Cardarelli et al., Citation2018). Although it is generally understood that IPE has strong potential to improve learners’ collaborative attitudes, knowledge, skills, and behaviors, it remains a relatively young field with a rapidly growing evidence base (Collins et al., Citation2017; Djukic et al., Citation2015; Naumann et al., Citation2020). In our update review, we are particularly interested in assessing whether impacts on higher-level outcomes, such as clinical behavior and patient and organizational outcomes, have been evidenced over the last 7 years.

Methods

This paper reports on a protocol of a systematic review in accordance with the reporting guidance provided by the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P) criteria (Moher et al., Citation2015).

Inclusion and exclusion criteria

Types of participants

We will include study interventions that target the following professional health and social care groups: chiropodists/podiatrists, complementary therapists, dentists, dieticians, hygienists, midwives, nurses, occupational therapists, paramedics, pharmacists, physicians, physician associates, psychologists, psychotherapists, physiotherapists, radiographers, social workers, speech therapists, sports and exercise medicine professionals, assistant practitioners, care or case coordinators, medical assistants/aides, and managers. Studies may evaluate IPE delivered to undergraduate health professions students, postgraduate students, practitioners at the in-service continuing professional development (CPD) level, or a mixture of learners such as pre-qualification students and qualified staff.

Types of intervention

An IPE intervention will be defined as an occasion when members of more than one health and/or social care profession learn interactively together, for the explicit purpose of improving the health or well-being of patients/clients (Reeves et al., Citation2013). Interactive learning requires active learner participation and active exchange between learners from different professions. We will consider all designs of IPE interventions that fall within the scope of this definition.

Types of comparison

Where a comparison group is included, we will accept studies that compare the IPE intervention against other forms of IPE or other types of learning.

Types of outcome measures

The outcome measures will be based on Barr and colleagues’ extended version of Kirkpatrick’s classic educational outcomes model: Level 1 – Reaction; Level 2a – Modification of attitudes/perceptions; Level 2b – Acquisition of knowledge/skills; Level 3 – Behavioral changes; Level 4a – Change in organizational practice; Level 4b – Benefits to patients/clients (Barr et al., Citation2005).
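
Purely as an illustration (the coding scheme and function below are ours, not part of the published protocol), these six levels could be represented as a simple lookup used when tagging reported outcomes during data abstraction:

```python
# Illustrative sketch only: a minimal coding scheme for Barr and colleagues'
# expanded Kirkpatrick outcome levels, as listed above. The structure and
# function are ours and are not part of the published protocol.
OUTCOME_LEVELS = {
    "1": "Reaction",
    "2a": "Modification of attitudes/perceptions",
    "2b": "Acquisition of knowledge/skills",
    "3": "Behavioral changes",
    "4a": "Change in organizational practice",
    "4b": "Benefits to patients/clients",
}


def classify_outcome(level_code: str) -> str:
    """Return the outcome label for a level code such as '2a'."""
    try:
        return OUTCOME_LEVELS[level_code]
    except KeyError as exc:
        raise ValueError(f"Unknown outcome level: {level_code!r}") from exc


# Example: an included study reporting improved teamwork skills would be tagged at Level 2b.
assert classify_outcome("2b") == "Acquisition of knowledge/skills"
```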

Types of studies

We will consider all research designs identified by Patton (Citation2014): applied, evaluation, basic, and participatory action research (i.e., experimental studies, quasi-experimental studies, action research, case studies, and qualitative studies published in the peer-reviewed literature). Studies already analyzed in the previous reviews (Hammick et al., Citation2007; Reeves et al., Citation2016) will not be re-screened in the update search; however, they will be combined with the newly identified studies at the analysis phase.

Information sources and searching

Our review will search the following electronic databases for publications from June 2014 onwards to update the latest review: Medline and Embase on the Ovid platform; CINAHL, ERIC, Education Source, and BEI on the EBSCO platform. The search strategies for the databases are included in Appendix 1 (online supplement). Due to variations in how each of the electronic databases employs key terms (subject headings, keywords), the search strategy for Ovid MEDLINE has been adapted for each electronic database. In addition to database searching, we will conduct hand searches of interprofessional journals, such as the Journal of Interprofessional Care, Journal of Research in Interprofessional Practice and Education, and Health and Interprofessional Practice. We will also perform a manual search of the reference lists of the relevant articles to consider additional studies for potential inclusion.

Screening and selection process

We will import all the database search results into Covidence (Veritas Health Innovation Ltd.), where duplicate records will be removed. In the first level of screening, at least two PhD-level screeners will independently screen all titles and abstracts. The full-text article will be obtained if the abstract suggests the following: (a) the intervention resulted in interprofessional exchange; (b) learning took place; (c) learner, professional practice, organizational practice, patient care process, or patient health and satisfaction outcomes are reported; and (d) the intervention was evaluated using an appropriate design. In the second level of screening, at least two senior members of the review team will independently screen the full text of articles deemed relevant from the first level of screening to determine eligibility. We will not exclude papers based on language of publication. We will use professional translation software and/or translators to translate non-English documents into English when necessary. All conflicts from level 1 screening will be resolved by AWF; all conflicts from level 2 screening will be resolved by SK or AX.
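
As a minimal sketch of the intended logic only, the four first-level criteria could be expressed as a single decision rule; the field and function names below are hypothetical, and in practice these judgments are made independently by the two screeners in Covidence:

```python
# Illustrative sketch only: the first-level (title/abstract) screening rule.
# Field and function names are hypothetical; in practice at least two reviewers
# apply these judgments independently in Covidence.
from dataclasses import dataclass


@dataclass
class AbstractScreen:
    interprofessional_exchange: bool   # (a) intervention resulted in interprofessional exchange
    learning_took_place: bool          # (b) learning took place
    relevant_outcomes_reported: bool   # (c) learner, practice, organizational, or patient outcomes reported
    appropriate_design: bool           # (d) intervention evaluated using an appropriate design


def obtain_full_text(record: AbstractScreen) -> bool:
    """Advance a record to full-text screening only if all four criteria are met."""
    return all(
        (
            record.interprofessional_exchange,
            record.learning_took_place,
            record.relevant_outcomes_reported,
            record.appropriate_design,
        )
    )
```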

Quality assessment

Two dyads (four reviewers working in independent pairs) will assess the methodological quality of each of the studies that pass the second level of screening. For the purposes of advancement and transparency, we will employ a new tool to perform the quality appraisals in this review and future update reviews. In particular, we will assess the quality of each study using the Mixed Methods Appraisal Tool (MMAT) V.2018 (Hong et al., Citation2018). The MMAT is designed for systematic reviews that include various study designs (Hong et al., Citation2019). It contains criteria to assess the quality of qualitative research, quantitative research (subdivided into randomized controlled trials, non-randomized studies, and descriptive studies), and mixed-methods studies (e.g., sequential explanatory designs). The ability to concurrently appraise the various study designs using a single tool will improve the efficiency and consistency of our appraisal process (Hong et al., Citation2019; Pluye, Citation2015).

For all study designs, only articles categorized as high quality will be selected for data abstraction, analysis, and synthesis. This approach aims to identify the most rigorous IPE studies available. The MMAT V.2018 includes five separate questions for each category of study design [see Appendix 2, online supplement]. These questions will be answered with Yes (1 point), No (0 points), or Can’t tell (0.5 points). Studies that receive a score of at least 4/5 in the appropriate section of the MMAT will be judged as high quality and included in the review. Any disagreements during appraisal will be settled through discussion. In the absence of consensus, disagreements will be reviewed and resolved by SK or AX.
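
The scoring rule can be illustrated with a short calculation; in the sketch below (the function names and threshold constant are ours, not part of the MMAT itself), a study answering four criteria with Yes and one with Can’t tell scores 4.5 and would therefore be included:

```python
# Illustrative sketch only: the MMAT V.2018 scoring rule described above.
# Function names and the threshold constant are ours, not part of the tool itself.
ANSWER_POINTS = {"yes": 1.0, "no": 0.0, "can't tell": 0.5}
HIGH_QUALITY_THRESHOLD = 4.0  # at least 4 of 5 points required for inclusion


def mmat_score(answers: list[str]) -> float:
    """Sum the points awarded across the five design-specific MMAT criteria."""
    if len(answers) != 5:
        raise ValueError("The MMAT applies five criteria per study design.")
    return sum(ANSWER_POINTS[answer.lower()] for answer in answers)


def is_high_quality(answers: list[str]) -> bool:
    """A study is judged high quality if it scores at least 4/5."""
    return mmat_score(answers) >= HIGH_QUALITY_THRESHOLD


# Example: four 'Yes' answers and one 'Can't tell' give 4.5/5, so the study is included.
assert is_high_quality(["Yes", "Yes", "Yes", "Yes", "Can't tell"])
```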

Abstraction of included studies

We will extract the general characteristics, methodological information, and outcome information of the high-quality papers included in our review. Following the abstraction procedure described in the two previous reviews, our data extraction process aims to generate basic descriptive information from each paper. Data from the included studies will be abstracted into one of the two coding sheets employed in a previous review (Reeves et al., Citation2016) [see data abstraction sheets – Appendix 3 and 4, online supplements]. We will use the abstracted data as the basis for analysis.

The Quantitative Data Abstraction sheet will be used where studies have used quantitative methods of data collection [see Appendix 3, online supplement]. The Qualitative Data Abstraction sheet will be used where studies have used qualitative methods of data collection [see Appendix 4, online supplement]. For mixed-method studies, we will use the Quantitative Abstraction sheet to extract the quantitative components, and the Qualitative Abstraction sheet to extract the qualitative components. These coding sheets will help to ensure consistency across the qualitative and quantitative data collection methods and reflect the unique features of their approaches. At least two members of the review team will independently code a 20% sample of the full-texts into the appropriate abstraction sheets to ensure consistency and reliability between the reviewers. Discrepancies and disputes will be resolved through discussion. In the absence of consensus, disagreements will be reviewed and resolved by SK or AX.
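
One way the 20% double-coding sample might be drawn is sketched below; the protocol does not specify the sampling method, so a simple random sample with a fixed seed is assumed, and the record identifiers are hypothetical:

```python
# Illustrative sketch only: drawing the 20% double-coding sample described above.
# The protocol does not specify the sampling method; a simple random sample with
# a fixed seed is assumed here, and the record identifiers are hypothetical.
import random


def draw_double_coding_sample(record_ids: list[str], fraction: float = 0.2, seed: int = 2022) -> list[str]:
    """Randomly select a fraction of included full texts for independent double coding."""
    rng = random.Random(seed)
    sample_size = max(1, round(len(record_ids) * fraction))
    return rng.sample(record_ids, sample_size)


# Example with hypothetical identifiers for 10 included studies.
included = [f"study-{i:02d}" for i in range(1, 11)]
print(draw_double_coding_sample(included))  # two studies are selected for double coding
```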

Analysis and synthesis

Experience from the previous reviews (Hammick et al., Citation2007; Reeves et al., Citation2016) indicates that very few of the variables used in the papers will be ratio data; some will be interval data, and others will be categorical data. This means that a standard multivariate analysis will not be possible; we therefore expect to employ non-parametric methods for the analysis. Due to the heterogeneity of IPE interventions identified in the previous reviews (different curriculum content, duration of courses, participating professional groups) and study designs (quasi-experimental, exploratory, action-orientated), we anticipate that a pooled estimate of the impact of IPE through a meta-analysis will not be possible. The nature of education research in this field also makes a meta-analysis unlikely. Therefore, the studies identified from the update search will be added to the existing 46 studies to form a single narrative synthesis of all included studies.
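
Because the protocol specifies non-parametric methods without prescribing a particular test, the sketch below shows only one possible form such an analysis could take, using a chi-square test of independence on purely hypothetical counts:

```python
# Illustrative sketch only: the protocol specifies non-parametric methods without
# prescribing a particular test. Here a chi-square test of independence is applied
# to purely hypothetical counts relating intervention level to the highest
# Kirkpatrick outcome level reported.
from scipy.stats import chi2_contingency

# Rows: hypothetical intervention groupings; columns: highest outcome level reported.
contingency_table = [
    [12, 5, 2],  # e.g., pre-qualification IPE
    [8, 9, 4],   # e.g., post-qualification / CPD-level IPE
]

chi2, p_value, dof, _expected = chi2_contingency(contingency_table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
```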

The previous reviews employed Biggs’ presage-process-product (3P) model of learning and teaching to help understand IPE research in relation to contextual factors, educational processes, and associated outcomes (Biggs, Citation1993). We will similarly employ the 3P model as an analytical framework to synthesize the abstracted data from all the included studies. At least two members of the review team will independently distill issues from the papers that can be mapped onto the 3P model. This work will involve populating the presage, process, and product sections with extracted points. A draft narrative will be produced based on this work. We will then discuss and refine the synthesized narrative of the included studies, linking IPE presage with IPE processes and products. For further details on the use of the 3P model in our previous review, see Reeves et al. (Citation2016).
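
A minimal structure for holding this mapping is sketched below; the field names and example content are ours and are not part of the published protocol:

```python
# Illustrative sketch only: a minimal structure for mapping points extracted from
# each included study onto Biggs' presage-process-product (3P) framework. Field
# names and the example content are ours, not part of the published protocol.
from dataclasses import dataclass, field


@dataclass
class ThreePMapping:
    study_id: str
    presage: list[str] = field(default_factory=list)  # contextual factors (e.g., organizational setting)
    process: list[str] = field(default_factory=list)  # educational processes (e.g., teaching and learning approach)
    product: list[str] = field(default_factory=list)  # outcomes (e.g., Kirkpatrick level reached)


# Example with hypothetical content for one included study.
example = ThreePMapping(
    study_id="study-01",
    presage=["acute hospital setting", "mandatory attendance"],
    process=["simulation-based team training"],
    product=["Level 2b: acquisition of knowledge/skills"],
)
```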

Discussion

A key contribution of updating the previous reviews will be to synthesize the best currently available evidence to inform future IPE developments, delivery, and evaluation across education and clinical settings. We also expect that this update will inform other IPE stakeholders, including managers, policy makers, and practitioners, of the effects of various IPE initiatives on longer-term outcomes, including service delivery and patient care.

We anticipate three key limitations to our review: (a) the inability to complete a meta-analysis due to the known heterogeneity of IPE interventions; (b) the previous reviews’ exclusion of articles not published in English or French, which may have omitted some high-quality studies published in other languages; our intention is to be more inclusive in this review and in future updates; and (c) the possibility of positive publication bias in the IPE literature, as well as the dominance of Western research in biomedical databases. These risks are recognized in the health sciences literature (Ayorinde et al., Citation2020; Joober et al., Citation2012), including the knowledge translation field (Kitto, Sargeant et al., Citation2012); however, they have not been appropriately acknowledged in the IPE field. To ameliorate these risks, we have put together a diverse and highly experienced project team from multiple institutions. Other strategies to improve the rigor of this review and ameliorate potential risks include adherence to a reporting guideline, comprehensive searching of the literature, and the planned use of multiple review team members at each stage of the review.

Conclusion

In the proposed update review, we aim to provide up-to-date evidence of the effectiveness of IPE interventions on collaborative competence and the delivery of health and social care processes and outcomes. In updating the findings from two previous reviews, our review will help inform curriculum developers and educators about the utility of different IPE interventions delivered in various contexts.

Contributions

SK, AWF and AX designed and drafted this short report protocol manuscript. ND, AM, JR, IB, SF, and HB contributed to the conceptualization and design of the previous review. KF developed the search strategies for this short report protocol manuscript. ND, AM, JR, and IB reviewed and provided critical comments on the short report. AWF, SK and AX revised the short report based on the critical comments. All authors read and approved this final manuscript.


Disclosure statement

The authors declare that they have no conflicts of interest. The authors alone are responsible for the content and writing of this short report.

Supplementary material

Supplemental data for this article can be accessed online at https://doi.org/10.1080/13561820.2022.2097651.

Additional information

Funding

The author(s) reported there is no funding associated with the work featured in this article.

References

  • Altmetric. (n.d.). A BEME systematic review of the effects of interprofessional education: BEME Guide No. 39. Overview of attention for article published in Medical Teacher; 2016. https://tandf.altmetric.com/details/7169241
  • Ayorinde, A. A., Williams, I., Mannion, R., Song, F., Skrybant, M., Lilford, R. J., & Chen, Y.-F. (2020). Publication and related biases in health services research: A systematic review of empirical evidence. BMC Medical Research Methodology, 20(1), 137. https://doi.org/10.1186/s12874-020-01010-1
  • Barr, H., Koppel, I., Reeves, S., Hammick, M., & Freeth, D. S. (2005). Effective interprofessional education: Argument, assumption and evidence (Promoting partnership for health). Wiley-Blackwell.
  • Biggs, J. B. (1993). From theory to practice: A cognitive systems approach. Higher Education Research & Development, 12(1), 73–85. https://doi.org/10.1080/0729436930120107
  • Bulcke, B., Van den Vyt, A., Vanheule, S., Hoste, E., Decruyenaere, J., & Benoit, D. (2016). The perceived quality of interprofessional teamwork in an intensive care unit: A single centre intervention study. Journal of Interprofessional Care, 30(3), 301–308. https://doi.org/10.3109/13561820.2016.1146876
  • CAIPE. (2002). The Centre for the Advancement of Interprofessional Education. https://www.caipe.org/
  • Cardarelli, R., Elder, W., Weatherford, S., Roper, K. L., King, D., Workman, C., Stewart, K., Kim, C., & Betz, W. (2018). An examination of the perceived impact of a continuing interprofessional education experience on opiate prescribing practices. Journal of Interprofessional Care, 32(5), 556–565. https://doi.org/10.1080/13561820.2018.1452725
  • Collins, A., Broeseker, A., Cunningham, J., Cortes, C., Beall, J., Bigham, A., & Chang, J. (2017). A longitudinal online interprofessional education experience involving family nurse practitioner students and pharmacy students. Journal of Interprofessional Care, 31(2), 218–225. https://doi.org/10.1080/13561820.2016.1255600
  • Djukic, M., Adams, J., Fulmer, T., Szyld, D., Lee, S., Oh, S.-Y., & Triola, M. (2015). E-Learning with virtual teammates: A novel approach to interprofessional education. Journal of Interprofessional Care, 29(5), 476–482. https://doi.org/10.3109/13561820.2015.1030068
  • Hammick, M., Freeth, D., Koppel, I., Reeves, S., & Barr, H. (2007). A best evidence systematic review of interprofessional education: BEME Guide no. 9. Medical Teacher, 29(8), 735–751. https://doi.org/10.1080/01421590701682576
  • Hong, Q. N., Pluye, P., Fàbregues, S., Bartlett, G., Boardman, F., Cargo, M., Dagenais, P., Gagnon, M. P., Griffiths, F., Nicolau, B., O’Cathain, A., Rousseau, M. C., & Vedel, I. (2019). Improving the content validity of the mixed methods appraisal tool: A modified e-Delphi study. Journal of Clinical Epidemiology, 111, 49–59. https://doi.org/10.1016/j.jclinepi.2019.03.008
  • Hong, Q. N., Pluye, P., Fàbregues, S., Bartlett, G., Boardman, F., Cargo, M., Dagenais, P., Gagnon, M.-P., Griffiths, F., Nicolau, B., Rousseau, M.-C., & Vedel, I. (2018). Mixed Methods Appraisal Tool (MMAT) version 2018: User guide. http://mixedmethodsappraisaltoolpublic.pbworks.com/
  • Interprofessional Practice and Education Center. (2021). https://ipe.iu.edu/scholarship/presentations/
  • Joober, R., Schmitz, N., Annable, L., & Boksa, P. (2012). Publication bias: What are the challenges and can they be overcome? Journal of Psychiatry & Neuroscience: JPN, 37(3), 149–152. https://doi.org/10.1503/jpn.120065
  • Kitto, S., Chesters, J., Thistlethwaite, J., & Reeves, S. (2011). Sociology of interprofessional health care practice: Critical reflections and concrete solutions. Nova Science Publishers.
  • Kitto, S., Sargeant, J., Reeves, S., & Silver, I. (2012). Towards a sociology of knowledge translation: The importance of being dis-interested in knowledge translation. Advances in Health Sciences Education, 17(2), 289–299. https://doi.org/10.1007/s10459-011-9303-6
  • Moher, D., Shamseer, L., Clarke, M., Ghersi, D., Liberati, A., Petticrew, M., Shekelle, P., Stewart, L. A., & the PRISMA-P Group. (2015). Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Systematic Reviews, 4(1), 1. https://doi.org/10.1186/2046-4053-4-1
  • Naumann, F. L., Nash, R., Schumacher, U., Taylor, J., & Cottrell, N. (2020). Interprofessional education clinical placement program: A qualitative case study approach. Journal of Interprofessional Care, 0(1), 1–8. https://doi.org/10.1080/13561820.2020.1832448
  • Patton, M. Q. (2014). Qualitative research and evaluation methods: Theory and practice (4th ed.). Sage Publications.
  • Pluye, P. (2015). Mixed kinds of evidence: Synthesis designs and critical appraisal for systematic mixed studies reviews including qualitative, quantitative and mixed methods studies. Evidence-Based Medicine, 20(2), 79. https://doi.org/10.1136/EBMED-2014-110158
  • Reeves, S., Fletcher, S., Barr, H., Birch, I., Boet, S., Davies, N., McFadyen, A., Rivera, J., & Kitto, S. (2016). A BEME systematic review of the effects of interprofessional education: BEME Guide No. 39. Medical Teacher, 38(7), 656–668. https://doi.org/10.3109/0142159X.2016.1173663
  • Reeves, S., Lewin, S., Espin, S., & Zwarenstein, M. (2011). Interprofessional teamwork for health and social care. John Wiley & Sons.
  • Reeves, S., Perrier, L., Goldman, J., Freeth, D., & Zwarenstein, M. (2013). Interprofessional education: Effects on professional practice and healthcare outcomes. The Cochrane Database of Systematic Reviews, 2013(3). CD002213. https://doi.org/10.1002/14651858.CD002213.pub3