Brief Report

Benefits of Providing Feedback and Utilisation Metrics to Specialists on Their Participation in eConsult

Article: 2116193 | Received 20 Apr 2022, Accepted 18 Aug 2022, Published online: 02 Sep 2022

ABSTRACT

Our study evaluates the impact of feedback sent to specialists participating in eConsult services. Specialists from two eConsult services in Ontario, Canada, received feedback on their use of eConsult via bi-annual specialist reports. An 11-item survey was developed to evaluate the impact, content and distribution process of these specialist reports. We distributed 742 specialist reports in March 2021 and surveyed the specialists in July 2021. Our findings show that specialists largely felt that the feedback validated their efforts (83%) and that receiving the report made them more likely to continue participating in the eConsult service (59%). Most did not feel judged (74%) or distressed (79%) by the reports, and 72% said that reporting the median self-reported billing time did not affect their own billing times. Overall, eConsult services can capture, aggregate and report data that specialists value and that is useful for continuing professional development. The benefits and lack of risk in implementing this type of feedback should encourage other services to consider similar processes.

Introduction

There is an increasing expectation that specialists will participate in practice audits and peer review as part of continuing professional development (CPD) and maintenance of certification (MOC) [Citation1]. Unfortunately, many physicians are dissatisfied with current MOC activities and concerned about their relevance [Citation2]. CPD needs to be grounded in and guided by daily practice and by feedback data on performance [Citation3]. Normative feedback about performance relative to peers may be particularly important in new health care delivery models, where standards are not clearly established and skills are developed through experience rather than prior training.

eConsult services are technology-enhanced systems that allow providers, such as primary care providers (PCPs) or specialists, to seek clinical advice from a specialist consultant [Citation4]. eConsult services provide a unique opportunity for direct communication between providers, thus fostering education, professionalism and collegiality [Citation5,Citation6]. They also enable case-based data collection that can be aggregated and may include feedback from requesting providers in the form of surveys or free-text comments. These data can be shared with providers as part of quality improvement (QI) and engagement activities.

Based in Ontario, Canada, the Ontario eConsult service and the Champlain BASE eConsult service (www.econsultontario.ca) are part of the Ontario eConsult programme. Both include a mandatory survey in their workflow that requesting providers complete before closing cases. To maintain a timely and effective eConsult service, the eConsult Centre of Excellence (COE) distributes bi-annual reports to all qualifying specialists who participate in the eConsult services.

Our study evaluates the impact and usefulness of the bi-annual specialist reports distributed in March 2021 by collecting feedback from participating eConsult specialists on the reports' impact, content, format and distribution process. Here, we explore satisfaction with, and the effectiveness of, individualised bi-annual reports as a vehicle for delivering feedback from PCPs to specialist consultants on electronic consultations.

Methods

Settings and Participants

This study is set in Ontario, which is divided into 5 distinct and diverse healthcare regions. The federal government and the Ontario Ministry of Health finance public health services, community and hospital care, and most primary care and specialist funding [Citation7]. Specialists participating in the Ontario and Champlain BASE eConsult services are compensated at an hourly rate pro-rated by self-reported billing time. In December 2020, 941 specialists and 3506 requesting providers were active (i.e. had participated in 3 or more eConsults within six months) across the two services. From July 2020 to December 2020, an average of 5979 eConsults were completed monthly across both services.

Report Preparation and Distribution

The eConsult COE creates and distributes specialist reports to specialists who provided 5 or more eConsults through the two eConsult services within the designated 6-month period (January to June or July to December). Reports are distributed in the third month following the reporting period (i.e. September and March). Reports include metrics and direct feedback from requesting physicians and nurse practitioners on completed eConsults, and compare each specialist's metrics with those of their speciality and of the entire eConsult service (Figure 1).

Figure 1. An example eConsult Specialist Report.

To create the reports, we analyse raw data from each service to generate metrics for the service as a whole, for each speciality grouping and for each individual specialist (Figure 2). A cross-sectional descriptive analysis is completed for the 6-month period of interest for qualifying specialists. The number of cases is counted for each user, speciality and service, along with the average time billed and the percentage of eConsults responded to within 7 days; results from the close-out survey are also calculated. Leveraging Microsoft® Power Automate and other connected Microsoft® 365 software, the metrics are input into a report template, exported as a PDF and emailed to the appropriate provider. Testing and quality checks are performed before and during the distribution process to ensure that accurate reports are sent to their corresponding providers. This requires the support of three staff, each contributing ~20–30 hours over 3–4 weeks.
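For illustration only, the sketch below shows how the per-specialist metrics described above might be derived with Python and pandas. The production pipeline is built in Microsoft® Power Automate, so the column names, input layout and function here are assumptions rather than the programme's actual implementation.

import pandas as pd

def specialist_metrics(cases: pd.DataFrame) -> pd.DataFrame:
    # Input: one row per completed eConsult, with assumed columns
    # specialist_id, speciality, billed_minutes and response_days.
    cases = cases.assign(within_7_days=cases["response_days"] <= 7)
    metrics = (
        cases.groupby(["specialist_id", "speciality"])
        .agg(
            n_econsults=("specialist_id", "size"),
            avg_billed_minutes=("billed_minutes", "mean"),
            pct_within_7_days=("within_7_days", "mean"),
        )
        .reset_index()
    )
    # Express responsiveness as a percentage for the report template.
    metrics["pct_within_7_days"] = 100 * metrics["pct_within_7_days"]
    return metrics

Running the same aggregation grouped by speciality alone, or with no grouping at all, would yield the speciality-level and service-level comparison values shown alongside each specialist's own numbers in Figure 1.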

Figure 2. Process for Preparation and Distribution of eConsult Specialist Reports.

Survey Development and Distribution

An 11-item survey (Table 1) was created by consensus among the eConsult programme clinical leads and management team, and included items on specialists' use of the feedback report, what they found most useful, the perceived value of the report and its influence on their behaviour. On July 7th, 2021, the survey was sent via an online survey platform to all specialists who had received an individualised eConsult Specialist Report in March 2021 (n = 742); it remained open for 2 weeks, with a reminder email sent to recipients after one week. Surveys were anonymous and did not collect identifying data. We summarised results using descriptive analysis and identified major themes in the free-text responses.

Table 1. Survey Questions and Response Options.

Results

Specialist Report Preparation & Distribution

Between March 15th and 31st, 2021, we prepared and sent eConsult specialist reports to all specialists who had provided 5 or more eConsults between July and December 2020 through the Ontario eConsult Service (568 distinct reports sent) and/or the Champlain BASE™ eConsult Service (214 distinct reports sent). The median number of eConsults provided by specialists per speciality over the 6-month reporting period was 21 (range 5–620).

Survey Results

A total of 244 (33%) recipients completed the survey: 158 (65%) participated in the Ontario eConsult Service, 49 (20%) in the Champlain BASE™ service, 23 (9%) in both services, and 14 (6%) did not identify their service. Psychiatry, Neurology, Cardiology, Paediatrics, Endocrinology, Infectious Diseases and Haematology were the most strongly represented specialities among respondents (combined 45%). Of all respondents, 123 (50%) stated that they reviewed the report in detail, 78 (32%) said they glanced at it, and only 5 (2%) did not use the reports. Eighty-eight respondents (36%) submitted their review of the report for MOC credits with their regulatory college. Most (n = 177, 73%) found the sections of the report containing direct feedback from requesting PCPs on the impact on patient care to be the most useful component. The reporting of utilisation metrics (i.e. number of eConsults provided, responsiveness and billing time) and the ability to compare results with peers were identified as most useful by 142 (58%) and 162 (66%) respondents, respectively.

Specialists found the reports’ format acceptable (n = 215, 88%) and found them easy to understand (n = 211, 87%). The majority supported the continued distribution of the reports in general (n = 211, 87%), with 172 (71%) satisfied with the bi-annual distribution timeline.

Receiving the report made most specialists feel that they were providing the advice required (n = 199, 82%) and more likely to continue participating in the eConsult service (n = 143, 59%). This was highlighted in the free-text comments, with one specialist stating, “my participation is predicated on the fact that I feel I am making a positive impact and that I am providing quality service”. Only 15 (6%) specialists found the information in the reports surprising, while 202 (83%) found that the reports validated the effort put into answering eConsults. One specialist highlighted that “it is definitely motivating to know that [their] responses have met the needs of the referring physicians”.

Reporting specialists' median self-reported billing time and comparing it to peers in their speciality and across the service did not affect billing times for most specialists (n = 176, 72%). One specialist indicated that they “ … like to validate that [their] mean billing time is similar to others”.

The reports did not make specialists feel judged (n = 180, 74%) or cause them distress (n = 194, 79%). One specialist indicated that they “ … enjoy reading critical reports”, while another stated that “[specialists] should be judged by users, that’s what evaluation means. [They are] ok with that”. Most specialists (n = 183, 76%) were not concerned that the reports would be used against them by a regulatory body, hospital administration, or others.

Evaluation of Specialist Report Preparation & Distribution Process

Based on the support for continuation and the positive feedback received through this survey, and informed by ad hoc feedback from specialists, iterative process improvements were implemented to ease logistical challenges, ensure sustainability and support continuous quality improvement. These included (i) further automation of document creation and distribution using the Microsoft® 365 software suite, (ii) additional staff resourcing and time allocation, (iii) updated and more detailed process documentation, and (iv) improved quality assurance processes to ensure the accuracy of reports (Figure 3).

Figure 3. Distribution Workflow MS Power Automate.
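As a rough illustration of the distribution step in this workflow, the sketch below emails a pre-rendered PDF report to its matching specialist, with a simple safeguard reflecting the quality checks described earlier. The real workflow is built in Microsoft® Power Automate; the Python libraries, addresses and file-naming convention used here are hypothetical.

import smtplib
from email.message import EmailMessage
from pathlib import Path

def send_report(pdf_path: Path, specialist_id: str, recipient: str,
                smtp_host: str = "localhost") -> None:
    # Guard against mismatched report/recipient pairs; assumes (hypothetically)
    # that each PDF file name embeds the specialist's identifier.
    if specialist_id not in pdf_path.stem:
        raise ValueError(f"{pdf_path.name} does not match specialist {specialist_id}")

    # Build the message and attach the pre-rendered PDF report.
    msg = EmailMessage()
    msg["Subject"] = "Your bi-annual eConsult Specialist Report"
    msg["From"] = "reports@example.org"  # placeholder sender address
    msg["To"] = recipient
    msg.set_content("Please find your eConsult specialist report attached.")
    msg.add_attachment(pdf_path.read_bytes(), maintype="application",
                       subtype="pdf", filename=pdf_path.name)

    # Hand the message to an SMTP relay (an assumption for this sketch).
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(msg)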

Discussion

Receiving feedback and peer comparisons can be an effective form of CPD. Our results show that providing specialists with reports on their performance in eConsult is an acceptable method of delivering feedback. Specialists indicated that they use the reports for MOC and value receiving feedback from requesting providers and comparisons with their peers, particularly those in the same speciality. Our study can inform other eConsult services worldwide considering this method of augmenting specialist CPD activities. There was no evidence that providing specialists with feedback led to an increase in perceived threat, negative feelings or disengagement. There was strong consensus that the reports should continue to be provided, with few recommendations for changes.

Despite eConsult services being well developed and available in many jurisdictions, we are not aware of other systematically generated evaluations of specialist reports. Although there is increasing awareness of the importance of developing skills in virtual care, including eConsult services, participation in eConsult is new for most physicians and will not have been part of their training before entering practice. Traditionally, audit and feedback processes compare an individual's performance to an established professional standard; because accepted standards are not defined for eConsult services, peer comparisons are used as a surrogate marker.

Although feedback from PCPs is an important motivator for specialists participating in eConsult services, feedback has been associated with negative emotions and a risk of disengagement in other settings [Citation8,Citation9]. The data provided must be meaningful and credible for feedback to be engaging [Citation10]. For example, one study of PCPs who completed a QI module and received pre- and post-performance feedback noted the importance of accurate data, greater detail in the content of feedback, and the ability to customise peer comparison groups so that performance could be compared to peers with similar patient populations or practice characteristics [Citation11].

A recent study showed that feedback comparing physicians to their top-performing peers, based on other specialists' ratings, improves performance [Citation12]. This clustered randomised trial included 80 speciality clusters and 214 specialist consultants, and its outcome measures were 1) elicitation of information from primary care practitioners; 2) adherence to institutional clinical guidelines; 3) agreement with peers' medical decision-making; 4) educational value; and 5) relationship building. Rating colleagues' responses and receiving individualised feedback produced significant improvements in 3 of the 5 consultation performance dimensions: medical decision-making, educational value and relationship building. This approach required a new workflow and manual rating by other specialists (not PCPs).

Such studies show that reporting feedback to specialist consultants is critical to improving consult advice and, in turn, to more streamlined and efficient patient care. A unique feature of the eConsult services studied here is the requirement for specialists to self-report their billing time. Although survey responses indicate that peer comparison of billing time did not alter individual providers' behaviour, we were pleased to see that the reports caused few negative emotions.

Evaluating existing processes and making iterative improvements based on direct provider feedback is paramount to continuous quality improvement. This study allowed process improvements to be implemented; this can help sustain the programme management team’s ability to provide these highly valued specialist reports to the eConsult specialists.

Our study is limited by its response rate of 33% and by its setting in a single, albeit large, geographical area served by two provincial services. Our services may also be unique in the types of data they collect, so other services may not be able to include all of the same information in their reports. Finally, we do not have data on whether the reports change specialist behaviour, only specialists' perceptions; this is an area for future research.

Conclusion

eConsult services can capture, aggregate and report data that specialists value and that is useful for CPD. The benefits and lack of risk in implementing this type of feedback should encourage other services to consider a similar process.

Disclosure statement

Dr. Liddy and Dr. Keely are co-founders of the Champlain BASE™ eConsult Service, but have no commercial interest in the service and do not retain any proprietary rights. As Co-Executive Directors of the Ontario eConsult Centre of Excellence, they receive salary support from the Ontario Ministry of Health. Dr. Keely answers eConsults (less than one per month) as a specialist through the service, for which she is reimbursed.

Additional information

Funding

Funding for this project was provided by the Ontario Ministry of Health. The opinions, results and conclusions reported in this paper are those of the authors and are independent from the funding sources. No endorsement by the Ontario MOHLTC is intended or should be inferred. The views expressed herein do not necessarily reflect the views of the Province.

References

  • Cordovani L, Wong A, Monteiro S. Maintenance of certification: how do we teach practicing physicians? Can Med Ed J [Internet]. 2019 Dec 5 [cited 2021 Dec 13]. Available from: https://journalhosting.ucalgary.ca/index.php/cmej/article/view/53065
  • Cook DA, Blachman MJ, West CP, et al. Physician attitudes about maintenance of certification. Mayo Clin Proc. 2016 Oct;91(10):1336–7.
  • Sargeant J, Bruce D, Campbell CM. Practicing physicians’ needs for assessment and feedback as part of professional development. J Contin Educ Health Prof. 2013;33(Suppl. 1):S54–62.
  • North F, Uthke LD, Tulledge-Scheitel SM. Integration of e-consultations into the outpatient care process at a tertiary medical centre. J Telemed Telecare. 2014 Jun;20(4):221–229.
  • Keely E, Williams R, Epstein G, et al. Specialist perspectives on Ontario provincial electronic consultation services. Telemed E-Health. 2019 Jan;25(1):3–10.
  • Liddy C, Abu-Hijleh T, Joschko J, et al. eConsults and learning between primary care providers and specialists. Fam Med. 2019 Jul 5;51(7):567–573.
  • Canada H. Canada health act [Internet]. 2004 [cited 2022 Jan 7]. Available from: https://www.canada.ca/en/health-canada/services/health-care-system/canada-health-care-system-medicare/canada-health-act.html
  • Keely E, Drosinis P, Afkham A, et al. Perspectives of Champlain BASE specialist physicians: their motivation, experiences and recommendations for providing eConsultations to primary care providers. Stud Health Technol Inform. 2015;209:38–45.
  • Payne VL, Hysong SJ. Model depicting aspects of audit and feedback that impact physicians’ acceptance of clinical performance feedback. BMC Health Serv Res. 2016 Dec;16(1):260.
  • Cooke LJ, Duncan D, Rivera L, et al. The Calgary Audit and Feedback Framework: a practical, evidence-informed approach for the design and implementation of socially constructed learning interventions using audit and group feedback. Implementation Sci. 2018 Dec;13(1):136.
  • Eden AR, Hansen E, Hagen MD, et al. Physician perceptions of performance feedback in a quality improvement activity. Am J Med Qual. 2018 May;33(3):283–290.
  • Meeker D, Friedberg MW, Knight TK, et al. Effect of peer benchmarking on specialist electronic consult performance in a Los Angeles safety-net: a cluster randomized trial. J Gen Intern Med. 2022;37(6):1400–1407. Available from: https://link.springer.com/10.1007/s11606-021-07002-1