Research Article

Personalised versus non-individualised case-based CME: A randomised pilot study

Article: 2153438 | Received 23 Aug 2022, Accepted 26 Nov 2022, Published online: 29 Nov 2022

ABSTRACT

The PinPoint Case Platform (PPCP) offers independent online case-based CME. To align with personal learning needs, a needs-assessment functionality (“QuickScan”) was developed, directing users to follow personalised case journeys. A randomised study was conducted comparing its effectiveness, time efficiency and user experience with a format of non-individualised case-based learning. Forty-two residents in urology from five European countries were randomly assigned to follow non-individualised case-based learning (control group) or a needs assessment plus personalised case journeys on different topics in prostate cancer. After performing a pre- and post-assessment, both groups showed a similar increase in test scores (Mann-Whitney U = 247; p = .113), but the time needed to complete the learning exercise was significantly lower in the group with the personalised approach (median: 45 vs 90 minutes; Mann-Whitney U = 97.5; p = .0141). The quality of the two learning methods was similarly well received by both groups. In conclusion, learners who followed personalised case journeys learned similarly effectively but more time-efficiently than non-individualised case-based learners. Future studies should determine whether these findings can be extrapolated to board-certified physicians following CME activities.

This article is part of the following collections:
Special Collection 2022: Innovation and Impact in CME/CPD

Introduction

Continuing medical education (CME) should help healthcare professionals (HCPs) apply new scientific evidence in clinical practice, with the goal of improving clinical decision making and, eventually, patient health outcomes. The steadily increasing volume of literature to assimilate puts the time available for CME under pressure. Because traditional CME formats (e.g. lectures and self-study of scientific literature) often fail to change the clinical decision performance of physicians, many CME providers are directing their efforts towards the development of more interactive and competency-based educational strategies [Citation1–3]. Especially during the COVID-19 pandemic and with growing environmental awareness, digitisation of CME has become increasingly relevant [Citation4,Citation5]. Compared with traditional learning formats, online CME is more easily accessible, time efficient, self-paced and cost saving [Citation6]. Beyond these advantages, several studies have demonstrated that online CME can produce significant improvements in knowledge and clinical practice [Citation7–10]. However, digitisation of educational materials as such is not the key to success. Online educational materials should be short, highly relevant and patient-focussed [Citation11–13]. These aspects may all be achieved by offering online CME in a case-based format with educational content that is ideally tailored to the specific needs of the individual learner. Although needs assessments are becoming well integrated into CME programmes, they mostly address educational needs at an aggregate level. Greater performance gains may be achieved when online CME addresses individual gaps in knowledge, skills and practice performance [Citation11,Citation12,Citation14].

Based on the above-described insights and premises, the Mirrors of Medicine™ PinPoint Case Platform (PPCP) (https://ppcp.mirrorsmed.org/) was developed and launched in 2018 (Table 1). It is the first online case-based learning platform to receive accreditation from the European Union of Medical Specialists – European Accreditation Council for Continuing Medical Education (UEMS-EACCME). The PPCP provides independent online CME with free access for students, residents and healthcare professionals. Content development is supported by multiple unrestricted educational grants. Recently, a new feature was added to the platform allowing users to follow personalised case journeys (PCJs) directed by the outcomes of individual needs assessments (QuickScans). To understand the potential benefits of adding this feature to the PPCP, we conducted a randomised study comparing the effectiveness, time efficiency and user satisfaction of non-individualised case-based learning with learning through PCJs addressing individual learning needs identified by QuickScans.

Table 1. Characteristics and functionalities of the PinPoint Case Platform (PPCP).

Materials and Methods

Study Population

The study population consisted of residents in urology (all study years) from five European countries invited by their supervisors who contribute as experts to the PPCP. The choice for residents was made to facilitate recruitment of a homogeneous study population. Participants needed to be unfamiliar with the platform and willing not to use the platform outside the context of the study. In addition, they were instructed not to use information from any other educational sources during the conduct of the study and participated voluntarily to the study after providing written informed consent.

Study Design

Participants were block randomised to follow non-individualised case-based learning (control group) or a personalised approach consisting of a QuickScan plus PCJs (QS-PCJ group) (Figure 1). Before learning on the PPCP, participants in the control group completed an electronic survey consisting of 27 assessment questions derived from the relevance ranking lists generated during the development of the QuickScans. Participants in the QS-PCJ group answered the same 27 assessment questions, but on the PPCP in the form of a QuickScan, which provided targeted feedback by showing the correct answer with supporting evidence. After completion of the pre-test, the control group was invited to study five topics on the PPCP, while the QS-PCJ group was requested to complete the PCJs selected on the basis of the individual learning needs identified through the QuickScan. Ten days after the start, both groups received an electronic survey including the same 27 pre-assessment questions, but in a different order. Participants were also requested to report the time spent studying the five topics (control group) or completing both the QuickScan and PCJs (QS-PCJ group). To evaluate user experience, participants in both groups were asked to rate different aspects of the PPCP on a 9-point Likert scale.
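The block randomisation described above can be sketched as follows. This is a hypothetical illustration, not the procedure actually used in the study; the block size, seed and participant identifiers are assumptions introduced for the example.

```python
import random

def block_randomise(participant_ids, block_size=4, seed=2022):
    """Assign participants to 'control' or 'QS-PCJ' in balanced blocks.

    Within each complete block, exactly half of the slots go to each arm,
    keeping the two groups balanced throughout enrolment. The block size
    and seed here are illustrative assumptions.
    """
    assert block_size % 2 == 0, "block size must be even for two arms"
    rng = random.Random(seed)
    allocation = {}
    for start in range(0, len(participant_ids), block_size):
        block = participant_ids[start:start + block_size]
        arms = ["control", "QS-PCJ"] * (block_size // 2)
        rng.shuffle(arms)  # random order of arms within this block
        for pid, arm in zip(block, arms):
            allocation[pid] = arm
    return allocation

# 42 residents, as in the study (IDs are hypothetical)
allocation = block_randomise([f"R{i:02d}" for i in range(1, 43)])
```

Because 42 is not a multiple of the assumed block size, the final, partial block can leave the overall split slightly away from 21/21; in practice the block size would be chosen, or the list stratified, to control this.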

Figure 1. Overview of the study design. The same 27 pre-assessment questions were included in the electronic survey and QuickScan but, in contrast to the electronic survey, the QuickScan provided targeted feedback (correct answer with supporting evidence).


Statistical Analysis

Descriptive variables are summarised as medians with interquartile ranges. Test scores on the pre- and post-assessment were calculated as percentages of the maximum score (27). Given the nature of the data, we did not assume a normal distribution, and non-parametric statistical tests (Wilcoxon signed-rank test and Mann-Whitney U test) were used. All statistical analyses were performed using the IBM SPSS Statistics 28 software package (IBM Corp.).

Results

Study Population

The study population consisted of 42 junior and senior residents in urology from different centres across five European countries. Of the 42 enrolled residents, 39 completed both the pre- and post-assessment. Participants with incomplete data (n = 3) were excluded from the analysis. Basic characteristics of the population by study arm are shown in Table 2. Where the low numbers permitted statistical analysis, no significant differences between the two study groups were found.

Table 2. Description of the study population.

Learning Process

Participants in both groups completed the 27 questions of the pre-assessment or QuickScan. Following the pre-assessment, all residents in the control group went through the indicated five topics, comprising a total of 25 cases. In the QS-PCJ group, the number of PCJs to follow depended on the outcome of the QuickScan and varied between one and five. Three-quarters had to follow at least four PCJs, and all participants fully completed their individually assigned case journeys.

Learning Effects

After completing the non-individualised case-based learning activity on the PPCP, median test scores in the control group significantly improved from 66.7% to 81.5% (Z = 163; p = .006) (Figure 2(A)). A significant increase was also observed in the QS-PCJ group, with test scores improving from 74.1% to 92.6% (Z = 185; p < .0001) upon completion of the individually assigned case journeys (Figure 2(B)). Although the change in the percentage of correct answers was slightly higher in the QS-PCJ group, the difference did not reach statistical significance (Mann-Whitney U = 247; p = .113) (Figure 2(C)).

Figure 2. Comparison of learning effects between the control and QS-PCJ group. (A) Pre- and post-test scores of the control group. (B) Pre- and post-test scores of the QS-PCJ group. (C) Change in the percentage of correct answers in both groups. The graphs represent the median values with interquartile ranges. QS-PCJ: QuickScan plus Personalised Case Journey.


Time Efficiency

The median time reported by participants in the control group to study all five topics on the PPCP was 90 minutes. Even with one outlier in the control group, the self-reported learning time was significantly lower in the QS-PCJ group, which reported a median of 45 minutes to complete both the QuickScan and PCJs (Mann-Whitney U = 97.5; p = .0141) (Figure 3).

Figure 3. Self-reported time spent with learning on the PPCP. The graph represents the median values with interquartile ranges. QS-PCJ: QuickScan plus Personalised Case Journey.


Learner Experience and Assessment of Quality Aspects

Both groups showed a high and similar appreciation of studying on the PPCP, scoring a median of 8 on a 9-point scale for five different aspects (Table 3). Several other aspects of the quality of the platform received similarly high scores, without significant differences between the groups (Supplementary file 1). Additionally, participants in both groups were equally likely to recommend the PPCP to a colleague.

Table 3. Perception of studying on the PPCP.

Discussion

At the end of 2018, the PPCP was launched via a weekly case programme, offering interactive online case-based CME through easy-to-digest, self-directed and competency-based learning content that can be continuously updated to reflect advancements in clinical practice. Three years after its launch, the PPCP has around 3,600 active users, mainly in the field of uro-oncology [Citation16]. Recently, a feature allowing personalised learning on the PPCP was developed, offering users the possibility to follow PCJs based on the outcomes of individual needs assessments (QuickScans). This feature is built on the premise that CME will accomplish the greatest performance gains when tailored to the needs of the individual learner. When reviewing learning needs, a distinction should be made between perceived and unknown learning needs [Citation17]. Perceived learning needs, which are usually self-recognised, may not necessarily reflect a person’s “actual” learning needs [Citation17]. Hence, it is not surprising that there is only a low correlation between physicians’ self-assessments and their performance on objective knowledge tests [Citation1]. An effective needs assessment should, therefore, aim to identify a learner’s unknown needs [Citation1,Citation17]. On the PPCP, this has recently been achieved by implementing QuickScans that aim to reveal true performance gaps that may have been unknown to the learner. The adaptive PCJs were designed to respond to these individual performance and/or knowledge gaps, targeting each learner’s true unperceived learning needs rather than the educational needs of a group of learners as a whole. By customising the learning pathways, the PCJs allow each learner to focus on individual gaps in a certain learning domain. This approach represents a simplified version of an adaptive e-learning strategy known as “curriculum sequencing” [Citation18,Citation19].
In addition, the QS-PCJs on the PPCP are adapted to the speciality of the learner within a disease area. To develop a more sophisticated, algorithmic approach within the framework of adaptive e-learning, however, more data need to be collected to allow further tailoring of the PCJs to the knowledge level of the individual learner. Besides being adaptive, the PCJs integrate the concept of learning from mistakes by repeating incorrectly answered cases or questions until they are answered correctly. This concept is believed to further increase learning gains by emphasising discrepancies between the learner’s choice and the opinion of peers, and by providing evidence to support the correct answer, ultimately inducing changes in individual performance [Citation20–22].
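The combination of needs-assessment-driven journey selection and the repeat-until-correct loop described above can be sketched roughly as follows. The topic names, pass threshold and data layout are hypothetical assumptions for illustration; this is not the platform's actual implementation.

```python
from collections import deque

PASS_THRESHOLD = 0.8  # assumed cut-off below which a topic's PCJ is assigned

def select_journeys(quickscan_scores):
    """Curriculum sequencing: assign a case journey only for topics in
    which the QuickScan revealed a performance gap."""
    return [topic for topic, score in quickscan_scores.items()
            if score < PASS_THRESHOLD]

def run_journey(cases, answer_case):
    """Present each case; an incorrectly answered case is re-queued
    (after feedback is shown) until it is answered correctly."""
    queue = deque(cases)
    attempts = 0
    while queue:
        case = queue.popleft()
        attempts += 1
        if not answer_case(case):  # wrong: show evidence, retry later
            queue.append(case)
    return attempts

# Hypothetical QuickScan outcome for one learner (fractions correct)
scores = {"localised disease": 0.9, "mHSPC": 0.5, "mCRPC": 0.6}
print(select_journeys(scores))  # journeys only for the two weak topics
```

Re-queuing rather than immediately repeating a missed case spaces the retry out across the journey, which loosely mirrors the spaced-education effect cited above [Citation22].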

In this randomised pilot study, we compared the short-term learning effects of non-individualised case-based learning (control) with a personalised approach consisting of the QuickScan and related PCJs (QS-PCJ). The results showed that both learning methods significantly improved learning outcomes and are thus effective in inducing short-term learning effects and, potentially, changes in clinical performance. Although the improvement was higher in the group following PCJs, the difference in test scores between the two groups did not reach statistical significance, most likely due to the small group sizes. Despite this, participants following PCJs achieved similar short-term learning effects in significantly less time. It should be noted that the learning time reported by the QS-PCJ group included both the time spent on the QuickScan and the PCJs, while the control group only reported the time needed to study the five topics on the PPCP. This reflects the actual learning situation on the PPCP, where users can either study topics based on their own interests or take a QuickScan to focus on their individual learning needs. In addition, the QuickScans on the PPCP are shorter (maximum 10 questions) than the 27-question assessment used in this study; the actual time efficiency of the personalised learning approach is therefore probably underestimated here. Regardless of the learning format followed, all aspects related to the perception of studying and the quality of the content were very well received.

Although randomised, the results reported here are merely descriptive; a follow-up randomised controlled trial should therefore be designed to confirm the findings. In the current study, the limited number of participants most likely contributed to the relatively high degree of variation in the test scores. In addition, the time spent on the platform was self-reported by the participants, which could have biased the time-efficiency results to some extent. For the purpose of this study, only residents were included because this facilitated recruitment of a homogeneous study population. Although senior residents had slightly higher scores on the pre-assessment than residents in the first years of their training, their distribution was equal between groups and did, therefore, not affect the study outcomes. However, as the PPCP is focussed on continuing medical education, residents are not fully representative of the target audience. In addition, they were recruited by their supervisors, which may have affected their incentive to participate and their performance on the learning activities. Therefore, additional studies should determine whether the findings are also valid for board-certified physicians and other healthcare professionals.

Conclusions

To the best of our knowledge, the PPCP (https://ppcp.mirrorsmed.org/) is the first platform offering both non-individualised and personalised case-based learning. Both learning methods were shown to be effective in inducing short-term learning effects; however, following a tailored programme was significantly more time efficient without compromising the short-term learning outcomes. Because both learning methods were equally effective and similarly well received by the participants, the PPCP offers users the possibility to choose either non-individualised or personalised case-based learning depending on personal preferences and interests, with time saving being the principal advantage of the tailored approach. Future trials will be initiated to further understand the impact of both learning methods on inducing measurable short- and long-term competence and performance gains, with a focus on board-certified physicians and other healthcare professionals.

List Of Abbreviations

CME: Continuing Medical Education; HCP: Healthcare Professional; PPCP: PinPoint Case Platform; QS-PCJ: QuickScan plus Personalised Case Journey(s); UEMS-EACCME: European Union of Medical Specialists - European Accreditation Council for Continuing Medical Education

Availability Of Data And Materials

The data that support the findings of this study are available from Ismar Healthcare ([email protected]) but restrictions apply to the availability of these data, which were used under license for the current study, and so are not publicly available. Data are, however, available from the authors upon reasonable request and with permission of Ismar Healthcare.

Ethics Approval And Consent To Participate

Because the study does not constitute an experiment on human beings and is not related to any medical intervention, formal institutional review was not considered necessary. In addition, participants were recruited via invitation by their supervisors at various universities in different countries, but participation was in a personal capacity and not institution-related. Nevertheless, the study complies with Good Research Practices, following the principles outlined in the Declaration of Helsinki, and written informed consent was obtained from all participants before the start of this non-medical study. All data were handled, processed and stored in line with the GDPR, ensuring that only personal data relevant to the study (country, year of residency) were collected.


Acknowledgments

The authors gratefully acknowledge the efforts of all experts, residents and Ismar Healthcare staff members who contributed to the development and maintenance of the PinPoint Case Platform.

Disclosure Statement

AB has received meeting support and honoraria from Sanofi, Astellas, Janssen, Bayer and AAA Pharmaceutical. NRMM, JS, JT, FVdA and TZ declare no competing interests related to the present study. NH and JY are employees of Ismar Healthcare. LS, LVR and HS are partners in Ismar Healthcare and e-HIMS. MJS has received honoraria from Astellas for lectures/chairmanship. BFT has received grants/research supports or honoraria from Amgen, Astellas, Janssen, Ferring, Sanofi, Bayer and Myovant. MCM has received speaker and/or consultancy honoraria from Apogepha, Astellas, Dr. Willmar Schwabe, GSK and Sanofi-Aventis.

Supplemental data

Supplemental data for this article can be accessed online at https://doi.org/10.1080/21614083.2022.2153438.

Additional information

Funding

The research and development of the individualised learning platform was funded by a grant received by the Flanders Agency for Innovation and Entrepreneurship (VLAIO) (grant agreement HBC.2019.2822).

References

  • VanNieuwenborg L, Goossens M, De Lepeleire J, et al. Continuing medical education for general practitioners: a practice format. Postgrad Med J. 2016;92(1086):217–7.
  • Cervero RM, Gaines JK. The impact of CME on physician performance and patient health outcomes: an updated synthesis of systematic reviews. J Contin Educ Health Prof. 2015;35(2):131–138.
  • Bloom BS. Effects of continuing medical education on improving physician clinical care and patient health: a review of systematic reviews. Int J Technol Assess Health Care. 2005;21(3):380–385.
  • Shah S. The technological impact of COVID-19 on the future of education and health care delivery. Pain Physician. 2020;23(4S):S367–S80.
  • Gravas S, Ahmad M, Hernandez-Porras A, et al. Impact of COVID-19 on medical education: introducing homo digitalis. World J Urol. 2021;39(6):1997–2003.
  • Setia S, Tay JC, Chia YC, et al. Massive open online courses (MOOCs) for continuing medical education - why and how? Adv Med Educ Pract. 2019;10:805–812.
  • Casebeer L, Brown J, Roepke N, et al. Evidence-based choices of physicians: a comparative analysis of physicians participating in Internet CME and non-participants. BMC Med Educ. 2010;10(1):42.
  • Emami Z, Kouhkan A, Khajavi A, et al. Knowledge of physicians regarding the management of type two diabetes in a primary care setting: the impact of online continuous medical education. BMC Med Educ. 2020;20(1):374.
  • Fordis M, King JE, Ballantyne CM, et al. Comparison of the instructional efficacy of internet-based CME with live interactive CME workshops: a randomized controlled trial. JAMA. 2005;294(9):1043–1051.
  • Williams JG. Are online learning modules an effective way to deliver hand trauma management continuing medical education to emergency physicians? Plast Surg (Oakv). 2014;22(2):75–78.
  • Cook DA, Blachman MJ, Price DW, et al. Educational technologies for physician continuous professional development: a national survey. Acad Med. 2018;93(1):104–112.
  • Lee BC, Ruiz-Cordell KD, Haimowitz SM, et al. Personalized, assessment-based, and tiered medical education curriculum integrating treatment guidelines for atrial fibrillation. Clin Cardiol. 2017;40(7):455–460.
  • Praharaj SK, Ameen S. The relevance of telemedicine in continuing medical education. Indian J Psychol Med. 2020;42(5 Suppl):97S–102S.
  • Norman GR, Shannon SI, Marrin ML. The need for needs assessment in continuing medical education. BMJ. 2004;328(7446):999–1001.
  • Brook RH, Chassin MR, Fink A, et al. A method for the detailed assessment of the appropriateness of medical technologies. Int J Technol Assess Health Care. 1986;2(1):53–63.
  • PinPoint Case Platform Annual Report 2021, ISSECAM & E-HIMS, Lier, Belgium.
  • Thampy H. How to … identify learning needs. Educ Prim Care. 2013;24(2):138–140.
  • Sengupta S. A review of the adaptive features of e-learning. Int J Learn Teach Educ Res. 2018;4(4):277–284.
  • Magoulas GD, Papanikolaou Y, Grigoriadou M. Adaptive web-based learning: accommodating individual differences through system’s adaptation. Br J Educ Technol. 2003;34(4):511–527.
  • Fischer MA, Mazor KM, Baril J, et al. Learning from mistakes. Factors that influence how students and residents learn from medical errors. J Gen Intern Med. 2006;21(5):419–423.
  • Koo A, Smith JT. Does learning from mistakes have to be painful? Analysis of 5 years’ experience from the Leeds radiology educational cases meetings identifies common repetitive reporting errors and suggests acknowledging and celebrating excellence (ACE) as a more positive way of teaching the same lessons. Insights Imaging. 2019;10(1):68.
  • Kerfoot BP, Lawler EV, Sokolovskaya G, et al. Durable improvements in prostate cancer screening from online spaced education: a randomized controlled trial. Am J Prev Med. 2010;39(5):472–478.