Patient Safety

A resident-led initiative to improve patient safety event reporting in an internal medicine residency program

Pages 111-116 | Received 25 Nov 2019, Accepted 21 Jan 2020, Published online: 21 May 2020

ABSTRACT

Background

Despite the Clinical Learning Environment Review's recommendation that they be used, patient safety event reporting systems are underutilized by residents.

Objective

We aimed to identify perceived barriers to event reporting amongst internal medicine residents and implement a targeted quality improvement initiative to address the identified barriers and increase overall resident event report rates.

Methods

A total of 94 Internal Medicine (IM) residents participated in the educational intervention in 2018. We measured residents' perceptions of barriers to event reporting and used the questionnaire results to create a skill-based educational workshop. Using the plan-do-study-act model, we tested the effectiveness of a structured educational intervention on IM residents' pre- and post-intervention event report rates and compared these rates to those of Non-Internal Medicine (Non-IM) residents. Additionally, we assessed pre- and post-intervention knowledge, skills, and attitudes in event reporting.

Results

All 94 (100%) IM residents participated in the intervention. Post-intervention, IM residents had a significantly higher median percentage of patient safety event reporting compared to pre-intervention (23.6% compared to 5.88%, p-value = 0.0030) and compared to Non-IM residents (23.6% compared to 5.31%, p-value = 0.0002). For knowledge, residents performed better on the post-test than on the pre-test (90% compared to 30%, p-value = 0.0001). All (100%) of the critical action items were completed, and 90% of participants reported that their perception of the event reporting process improved.

Conclusions

By elucidating common reasons why residents are not reporting patient safety events, a specific intervention can be created to target the identified impediments and improve resident event reporting.

Abbreviations

IM: Internal Medicine; Non-IM: Non-Internal Medicine; IOM: Institute of Medicine; ACGME CLER: Accreditation Council for Graduate Medical Education Clinical Learning Environment Review; GME: Graduate Medical Education; IRB: Institutional Review Board; PDSA: Plan, Do, Study, Act

1. Introduction and problem description

In the USA, iatrogenic injuries affect approximately 18% of hospitalized patients and cost the healthcare system in excess of 100 billion US dollars a year [Citation1,Citation2]. Medical errors may cause harm to patients by preventing or delaying appropriate treatment, or by leading to unnecessary and harmful treatment [Citation3,Citation4]. A recent report revealed that approximately 44% of adverse events are preventable, and our healthcare system has been subject to public scrutiny to improve patient safety conditions [Citation5,Citation6]. The Institute of Medicine's (IOM) landmark report on medical errors propelled patient safety as a primary focus for hospital-based care [Citation7]. Preventable harm due to medical errors is now considered a national public health crisis [Citation8]. Since the IOM's report, the World Health Organization and other leading health-care organizations have followed suit and published reports highlighting the concerning rates of patient safety events [Citation9]. To this end, we have seen a paradigm shift in healthcare policy and research efforts focusing on patient safety as a systemic problem. A number of organizations have emerged with a common mission to improve health-care quality and patient safety and to create a culture of increased attentiveness to patient safety [Citation5]. However, health-care systems remain rife with medical errors, and objective measures fail to demonstrate marked improvements in inpatient safety [Citation10].

Quantifying the magnitude of medical errors is a first step toward improving patient safety [Citation8]. Reporting of health-care errors is an essential component of mitigating patient safety events and is required to initiate meaningful conversation on safety issues. Much of our understanding of patient safety has been predicated on event reporting systems and the subsequent root cause analyses that ensue [Citation11]. A structured reporting system has been used as a vehicle to implement change in multiple institutions [Citation12]. However, underreporting has undermined the value of incident reporting systems [Citation13]. A number of studies reveal that event reporting rates among physicians are markedly low, and implementation of systematic event reporting systems has failed to improve reporting rates [Citation14–Citation16]. Graduate medical education has historically placed little focus on improving health-care quality and safety monitoring [Citation17]. However, the Accreditation Council for Graduate Medical Education Clinical Learning Environment Review (ACGME CLER) has sought to improve the quality, safety, and professionalism of the physician workforce by identifying training in event reporting as a key element of physician competency [Citation7].

Resident physicians are at the frontlines of patient care and are ideally positioned to recognize and report flaws in our systems of care. Multiple studies have explored the barriers to physician event reporting. Many have attributed poor physician involvement in event reporting to a culture of fear of blame, lawsuits, and reprisal or punishment [Citation18]. However, the current data on barriers to physician event reporting may differ dramatically from the barriers faced by the new generation of physicians-in-training, and more research is clearly needed [Citation19]. We attempted to advance the dialogue on event reporting by identifying barriers faced by internal medicine residents within our institution. The identified barriers were then used as a premise to implement systemic change within our residency program. The specific objectives of this study were to: (1) establish a baseline understanding of resident physician event report rates; (2) identify perceived barriers to event reporting specifically among IM resident physicians; (3) deconstruct and address each identified barrier through replicable and sustainable resident-driven initiatives targeted to promote event reporting; and (4) measure post-intervention event reporting rates in IM residents and compare them to rates among Non-IM residents across the institution.

2. Methods

2.1. Baseline measurement

This quality improvement project took place from 2017 to 2018 at Stony Brook University Hospital, an academic medical center that serves as the tertiary care center of Suffolk County, New York. At Stony Brook, residents completed only 15 out of 521 (2.8%) event reports in 2015. The intervention took place in the Internal Medicine Residency program with 94 participants.

2.2. Pre-intervention context

A voluntary, anonymous, six-question online survey rated on a 5-point Likert scale was administered to 94 IM residents (post-graduate years one through three) in June 2017. The survey was developed by the authors and designed to assess resident physicians' perceptions of specific barriers to event reporting at Stony Brook University Hospital. Each question was chosen after careful review and deliberation of well-publicized work in this area [Citation20]. The sixth question allowed residents to select their top three reasons for not reporting an error or to free-text their reason for not contributing to event reporting systems (Table 1). The results of the questionnaire were used to create a structured educational intervention to specifically address the reported perceived barriers.

Table 1. Proportion (%) of resident trainees’ perceived barriers to event reporting (N = 30).
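As an illustration only, the sketch below shows how the barrier proportions summarized in Table 1 could be tallied from a raw survey export; the column name, delimiter, and toy responses are assumptions, not the authors' actual data pipeline.

```python
import pandas as pd

# Toy responses standing in for the de-identified survey export (question 6 let
# residents select up to three reasons); the real data are not public.
responses = pd.DataFrame({
    "barriers": [
        "don't want to tell on someone else; lack of feedback on the issue",
        "don't know what types of events to enter",
        "don't want to tell on someone else; not an anonymous process",
    ]
})

# One row per selected barrier, then the share of respondents selecting each one.
selected = responses["barriers"].str.split(";").explode().str.strip()
proportions = (selected.value_counts() / len(responses) * 100).round(2)
print(proportions)
```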

2.3. Interventions

We used the Plan-Do-Study-Act (PDSA) model of quality improvement to conduct our interventions and test our changes. After each cycle, the interventions were modified, and we acted on what was learned in the prior cycle.

2.3.1. PDSA cycle 1

Aims for PDSA cycle 1: an educational intervention to (a) address the perceived punitive process and (b) address knowledge gaps related to the event reporting process (barriers # 1–7).

A total of 94 IM residents participated in a mandatory two-hour workshop on event reporting in January 2018. The first hour entailed an educational didactic with a video presentation, filmed by the house staff, depicting the most common barriers related to an event report. This was followed by a hands-on, facilitated skills workshop in which residents analyzed a case-based example of an actual medical error and entered an event report of the error using Stony Brook's web-based portal, SB safe. Faculty and chief resident facilitators directly observed the SB safe event report entry, provided direct feedback, and completed an event report checklist of critical action tasks. Learning outcomes were evaluated using the Knowledge-Skills-Attitudes model to assess the effectiveness of the workshop. All IM residents completed a pre- and post-workshop 7-question knowledge assessment test (supplement 1) and a post-workshop 4-point Likert scale 3-question survey to assess perception (supplement 2), and data from direct observation and a checklist of task completion (supplement 3) were collected to identify omissions during event report entry.

2.3.2. PDSA cycle 2

Aims for PDSA cycle 2: To demonstrate positive patient impact of reported errors by IM residents and reinforce a non-punitive process (specifically addressing barriers # 8 and # 9 related to perceived lack of benefit in event reporting).

We tracked and collected descriptions and outcomes of event reports submitted by the internal medicine residents for 6 months following the intervention. Repeat didactics on the importance of event reporting, along with great catches reported by residents, were presented during afternoon conference in July 2018 (6 months after the initial educational intervention). A PowerPoint didactic was created to demonstrate each reported event and the corresponding action plans or outcome of the event report entry.

2.3.3. PDSA cycle 3

Aim for PDSA cycle 3: To specifically address barrier # 5 related to lack of feedback on reported error entries.

In November 2018, 10 months after the initial intervention, a specific feedback strategy was created that entailed email feedback and a thank-you note sent to each IM resident after the completion of an event report. Additionally, an event report certificate was created and completed by the Program Directors for each medicine resident in New Innovations, our residency electronic evaluation tracking system. We partnered with our Graduate Medical Education (GME) office to obtain monthly data on reported events by resident trainees, by specialty. The Associate Program Director for Quality and Safety conducted the email feedback and completion of the event report certificate for all IM residents.
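A minimal sketch of how this feedback step could be scripted is shown below, assuming a hypothetical monthly GME export with a resident name and report identifier; the email wording, file names, and fields are illustrative only, and the certificates themselves were completed manually in New Innovations (no integration with that system is implied).

```python
import csv
from datetime import date

def draft_feedback(resident_name: str, report_id: str) -> str:
    # Illustrative thank-you note; the actual email text used in the study is not published.
    return (
        f"Dear Dr. {resident_name},\n\n"
        f"Thank you for submitting event report {report_id}. Your report has been "
        "reviewed and forwarded to the appropriate safety committee.\n\n"
        "Associate Program Director for Quality and Safety"
    )

# Toy rows standing in for the monthly GME export of IM resident event reports.
monthly_reports = [
    {"resident": "Smith", "report_id": "SB-0001", "program": "Internal Medicine"},
    {"resident": "Jones", "report_id": "SB-0002", "program": "Internal Medicine"},
]

# Draft one feedback email per report and log the entries that still need a
# certificate completed manually in the residency tracking system.
with open(f"certificate_log_{date.today():%Y_%m}.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=["resident", "report_id", "program"])
    writer.writeheader()
    for row in monthly_reports:
        print(draft_feedback(row["resident"], row["report_id"]))
        writer.writerow(row)
```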

2.4. Measurement of outcomes

This project was reviewed by the Institutional Review Board (IRB) and approved as a Quality Improvement Initiative. Pre-intervention data on resident event report rates were collected from January 2017 to December 2017. The intervention took place during a series of five, 2-hour educational workshops in January 2018. Post-intervention event report rates were tracked from February 2018 to January 2019.

The effectiveness of the overall study was measured using quantitative data on the number of event reports submitted per month. The reporting system could identify whether the reporter was a resident physician or other clinical faculty/staff (unless the report was submitted anonymously) and could also identify the reporter's residency program. The intervention was conducted with 94 IM residents as part of their mandatory patient safety didactics. Non-IM residents refers to all other residents at the institution belonging to other departments (surgery, obstetrics and gynecology, urology, neurology, pediatrics and emergency medicine, psychiatry); this group served as a control for comparison with IM residents. Data were entered into a password-protected, encrypted spreadsheet, and reporting rates were monitored monthly to help guide subsequent tests of change. We used pre-post intervention retrospective analyses to test a structured educational intervention and its effectiveness on resident event reporting rates. Additionally, learning outcomes were assessed using a pre-post workshop knowledge assessment test, a checklist of observed critical actions during electronic event report entry, and a post-workshop attitudinal survey.
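To make the monthly rate measure concrete, the sketch below (with illustrative counts, not the study data) shows how report rates per group could be computed for run-chart tracking, assuming a de-identified log with a submission date and reporter group.

```python
import pandas as pd

# Denominators reported in the Statistical analysis section.
N_IM, N_NON_IM = 94, 696

# Toy de-identified report log; each row is one submitted event report.
log = pd.DataFrame({
    "date": pd.to_datetime(["2018-02-03", "2018-02-17", "2018-03-05", "2018-03-20"]),
    "group": ["IM resident", "IM resident", "Non-IM resident", "IM resident"],
})

# Count reports per calendar month and group, then express as a percentage of
# the number of residents in each group (the rate plotted on the run charts).
monthly = (
    log.groupby([log["date"].dt.to_period("M"), "group"])
       .size()
       .unstack(fill_value=0)
)
monthly["IM rate (%)"] = monthly.get("IM resident", 0) / N_IM * 100
monthly["Non-IM rate (%)"] = monthly.get("Non-IM resident", 0) / N_NON_IM * 100
print(monthly)
```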

3. Results

3.1. Statistical analysis

The statistical analysis was conducted using SAS v9.4. All continuous variables were not normally distributed; therefore, Wilcoxon rank sum or Wilcoxon signed rank tests were used when making comparisons between groups. To compare the safety workshop quiz results, a Wilcoxon signed rank test was used because the data are paired and not normally distributed. A Wilcoxon rank sum test was used for all other comparisons because the data are independent and not normally distributed. A p-value below 0.05 was considered statistically significant. IM resident event report rates are the number of IM resident entries divided by the total number of IM residents (N = 94), expressed as a percentage. Non-IM resident event report rates are the number of Non-IM resident entries divided by the total number of Non-IM residents in the institution (N = 696), expressed as a percentage. Data for event reporting measures were collected and plotted on run charts, which allowed the data to be represented in a time-ordered sequence. Scores from the post-intervention knowledge assessment test in PDSA cycle #1 were reported as average test scores based on the number of correct answers divided by the total number of test questions. Data from the post-workshop attitude survey were collected and plotted on bar graphs to show the percentage of residents who perceived each level of change for each of three statements. Checklists were reviewed for omission of critical action items and totaled to calculate an average completion percentage.
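The analyses were run in SAS v9.4; purely as an illustration of the two nonparametric tests described above, a minimal Python sketch (with made-up numbers, not the study data) might look like the following.

```python
from scipy.stats import wilcoxon, ranksums

# Paired pre/post workshop quiz scores (%) for the same residents -> signed rank test.
pre_quiz  = [30, 40, 20, 35, 30, 25, 45, 30]
post_quiz = [90, 85, 95, 80, 90, 100, 85, 90]
stat, p_paired = wilcoxon(pre_quiz, post_quiz)
print(f"Signed rank test (pre vs. post quiz): p = {p_paired:.4f}")

# Independent monthly report rates (%) for IM vs. Non-IM residents -> rank sum test.
im_rates     = [23.6, 20.1, 25.3, 18.9, 27.4, 22.0]
non_im_rates = [5.3, 4.8, 6.1, 5.0, 5.6, 4.9]
stat, p_indep = ranksums(im_rates, non_im_rates)
print(f"Rank sum test (IM vs. Non-IM rates): p = {p_indep:.4f}")
```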

3.2. Event report rates

During the pre-intervention period, IM residents did not have a significantly different percentage of patient safety event reporting compared with Non-IM residents at Stony Brook Hospital (5.88% compared to 5.12%, p-value = 0.0812) (Table 2). One year following the intervention, the median percentage of event reporting for IM residents had increased significantly from 5.88% to 23.6% (p-value = 0.0030). When compared to Non-IM residents at Stony Brook, IM residents had a significantly higher median percentage of safety event reporting (23.6% compared to 5.31%, p-value = 0.0002). Although there was a significant increase in event report rates within the IM cohort, the overall event report rates for all residents at Stony Brook Hospital did not change significantly over time (5.12% compared to 5.31%, p-value = 0.5679). Non-IM event reports were further sub-divided by specialty area. Overall, pediatric residents submitted the majority of event reports (Table 3).

Table 2. Event reporting rates by IM and non-IM residents.

Table 3. Event reporting rates by non-IM residents.

3.3. Knowledge and skills evaluation results

A total of 94 out of 94 (100%) residents participated in the educational intervention. IM residents performed better on the post-test compared to the pre-test (average score 90% compared to 30%, p-value <0.0001) (Table 4). All (100%) of the critical action items were completed upon direct observation of a computerized event report entry in SB safe following the workshop.

Table 4. Knowledge assessment test median scores pre- and post-workshop.

3.4. Survey results

3.4.1. Pre-intervention survey

A total of 30 of 94 IM residents completed the questionnaire. Thirty percent of respondents stated they had never submitted an event report. Seventy percent of respondents reported moderate to high confidence in their ability to identify a reportable event; yet only 13.33% of respondents felt highly confident and 36.67% felt moderately confident in the process of event reporting. When asked if there had been an error that should have been reported but was not, 67% of residents reported occasionally to frequently encountering such an event. Additionally, 63.33% of residents reported being 'not at all likely' to report a near miss if they encountered one. The most commonly selected reasons for not reporting an error were that residents 'don't want to tell on someone else' (44.83%), 'don't know what types of events to enter' (34.48%), 'lack of feedback on the issue' (27.59%), and disbelief that reporting is an 'anonymous process' (31.03%) (Table 1).

3.4.2. Post-workshop attitudinal survey results

A total of 63 of 94 IM residents completed the post-workshop attitudinal survey. Over 90% of participants reported that their perception and appreciation of the event report process improved. Ninety-one percent of residents reported being more likely to enter an SB safe report as a result of the workshop (not graphed). Additionally, 92% of residents agreed that their comfort level with event reporting improved after the workshop.

4. Discussion

We present a quality initiative that identified barriers to event reporting, with subsequent interventions that resulted in a significant increase in total event report entries by our IM residents. While several educational activities geared toward improving safety event report rates have been previously described in the literature [Citation21], our study offers a model that specifically addresses resident-perceived barriers with a targeted intervention for resident trainees. Resident trainees are frontline providers and are well suited not only to identify safety events but also to report them and contribute to the overall safety mission of an institution. Residency programs are charged with ensuring that trainees acquire the skills to recognize patient safety errors. Our study showed significant improvement in IM resident knowledge and attitudes about event reporting following the educational intervention. Given that knowledge deficits about reporting events have been shown to be a leading factor in underreporting [Citation16], we feel that improving knowledge and attitudes through our educational intervention contributed to a positive patient safety culture change and increased event reporting by our residents. To our knowledge, this is the first study on event reporting that entails the development of a video-based didactic, filmed by house staff, to specifically address resident-perceived barriers to event reporting. Our results suggest that this may be a promising strategy to engage residents in event reporting and improve overall event report rates.

We believe that the safety event report initiative is highly replicable at other institutions. The success of our program relied on four important factors: (1) a resident-driven approach to emphasize peer-to-peer demonstration of the barriers; (2) involvement of the chief residents to further highlight peer discussion around event reporting of patient safety errors; (3) buy-in and support from the GME office in providing data on resident event report rates; and (4) support from the Program Directors in endorsing the initiative and encouraging residents to report events. We believe these strategies were effective at integrating our residents into our institutional safety reporting system and improving overall IM resident event reporting rates.

Although barriers to event reporting have been well described in the literature [Citation20,Citation21], the incidence and type of barriers may vary across institutions based on safety culture, policies, and event reporting systems. We believe the success of our program was reliant on the initial assessment of our own institution’s perceived barriers specifically faced by IM resident trainees. We suggest other programs survey resident trainees for their own institutional barriers prior to creating an effective and targeted intervention.

A major challenge we encountered was sustainability in improving event report rates. We noted an initial surge of event reports immediately after the PDSA cycle 1 interventions and smaller surges after PDSA cycles 2 and 3. We also experienced evaporation effects at various points throughout the study. July 2018 demarcates the transition of graduating and incoming resident physicians, and it is unsurprising that the effects of PDSA cycle 1 were diminished at that point. The subsequent rises and falls after each PDSA cycle highlight the need for ongoing monitoring and constant reinforcement of the initiatives to avoid the evaporation effect of quality interventions. Similarly, the continued overall low trend of event reporting by resident physicians compared to all hospital reports highlights the need for assessment of perceived barriers, which may vary between departments based on their respective safety cultures. Multiple longitudinal educational interventions will likely be needed to integrate patient safety and discussion of medical errors into the everyday responsibilities of residents. As emphasized by the 2008 study by Kaldjian et al., in order to improve physician reporting of medical errors, it is important to demonstrate that reporting leads to results [Citation16]. Thus, we feel that our interventions, which focused on closing the loop with reporters, providing positive feedback, and peer-to-peer sharing of great catches, may be critical to creating a Just Culture and maintaining the motivation needed for sustainability.

5. Limitations

There are several limitations to this study beyond the inherent limitations of a pre- and post-observation design. First, it was a single-center study at a large academic residency program, limiting its generalizability. The data collected in this study were derived from a single institution and reflect only our computerized event reporting system (SB safe). Reliable information on specific barriers to event reporting is institution-specific; therefore, it is difficult to recommend a standardized approach to improving safety event reporting. A dedicated faculty member as well as our chief residents were key players in enabling our interventions. Additionally, we relied heavily on our GME department to provide monitoring and tracking of resident reports by sub-specialty area. We acknowledge that our intervention required significant resources, which may be difficult for other programs to obtain.

Although assessment of perceived barriers to safety event reporting was a crucial aspect of our study, we received a low response rate, with only 30 of the 94 residents in the IM residency program responding. Therefore, it is possible that we missed other significant barriers that could have been addressed.

We created a checklist of critical task completion and conducted direct observation of mock event reporting by the residents. Our checklist is specific to our institutional web-based event report system and was not a validated tool. Additionally, we did not directly observe resident event report entries in the actual clinical environment. We also did not set specific requirements from the residency program in terms of the number of event reports. We tracked event reports from our SB safe system and conducted certification of completion in New Innovations, our residency program tracking system. Further studies may explore whether direct observation and feedback during spontaneous event reports in clinical patient encounters impact the number of event reports by resident physicians and milestone-related competencies such as systems-based practice.

Another limitation of the study is the lack of outcomes data on actual process improvements from the reported safety events. Since our focus was on the rates of safety events reported by residents, we did not measure patient outcomes as a result of our interventions. Anecdotally, however, many of the great catches by residents highlighted during afternoon conference during PDSA cycle #2 did bring about system-wide changes at our institution that have the potential to prevent future medical errors and improve patient outcomes. We were also unable to track anonymous reports that may have been filed by residents. The goal of increasing event reporting is to improve patient care. We believe that future research may examine the impact of resident event reports on patient care and process outcomes.

6. Conclusions

In conclusion, the barriers to event reporting amongst IM resident physicians are multifaceted. There are elements of unfamiliarity with event reporting systems, distrust of the system, and lack of awareness of the benefits of event reporting. Having elucidated the common reasons why residents were not reporting safety errors, we successfully implemented a replicable and structured learning program to minimize the identified impediments. Deconstructing each respective barrier through an educational intervention can be successful in increasing event reporting rates. Our next steps include expansion of the interventions across other departments to impact overall change in event reports for all residents within our institution.

Acknowledgments

The authors wish to thank the resident members of the Stony Brook University Internal Medicine Program, with special mention to the Internal Medicine Patient Safety Quality Council. We wish to thank Dr. Matthew Berger for his skillful filming and editing of the video and those who participated in acting. We sincerely thank the Residency Program Director, Dr. Susan Lane, for her support in endorsing the initiative. We acknowledge and thank our GME department (Laura Kantor and Dr. William Wertheim) for providing us with the monthly data.

Disclosure statement

No potential conflict of interest was reported by the authors.

Supplementary material

The supplemental data for this article can be accessed online.

References

  • Incidence of adverse events and negligence in hospitalized patients. N Engl J Med. 1991;325(3):210.
  • Bates DW, Spell N, Cullen DJ, et al. The costs of adverse drug events in hospitalized patients. Adverse drug events prevention study group. JAMA. 1997;277(4):307–311.
  • Campione JR, Mardon RE, McDonald KM. Patient safety culture, health information technology implementation, and medical office problems that could lead to diagnostic error. J Patient Saf. 2018.
  • Gandhi TK, Kachalia A, Thomas EJ, et al. Missed and delayed diagnoses in the ambulatory setting: a study of closed malpractice claims. Ann Intern Med. 2006;145(7):488–496.
  • Perez B, Knych SA, Weaver SJ, et al. Understanding the barriers to physician error reporting and disclosure: a systemic approach to a systemic problem. J Patient Saf. 2014;10(1):45–51.
  • Levinson D. Adverse events in hospitals: national incidence among medicare beneficiaries. Washington, DC: US Department of Health and Human Services, Office of the Inspector General; 2010.
  • Institute of Medicine, Committee on Quality of Health Care in America. Kohn LT, Corrigan JM, Donaldson MS, editors. To err is human: building a safer health system. Washington (DC): National Academies Press (US); 2000.
  • Lind DP, Andresen DR, Williams A. Medical errors in Iowa: prevalence and patients’ perspectives. J Patient Saf. 2018;1. DOI:10.1097/PTS.0000000000000523
  • World Health Organization. The world health report 2000: health systems: improving performance. Geneva: World Health Organization; 2000.
  • Pronovost PJ, Cleeman JI, Wright D, et al. Fifteen years after to err is human: a success story to learn from. BMJ Qual Saf. 2016;25(6):396–399.
  • Shojania KG, Wald H, Gross R. Understanding medical error and improving patient safety in the inpatient setting. Med Clin North Am. 2002;86(4):847–867.
  • Reznek MA, Barton BA. Improved incident reporting following the implementation of a standardized emergency department peer review process. Int J Qual Health Care. 2014;26(3):278–286.
  • Cullen DJ, Bates DW, Small SD, et al. The incident reporting system does not detect adverse drug events: a problem for quality improvement. Jt Comm J Qual Improv. 1995;21(10):541–548.
  • Rowin EJ, Lucier D, Pauker SG, et al. Does error and adverse event reporting by physicians and nurses differ? Jt Comm J Qual Patient Saf. 2008;34(9):537–545.
  • Milch CE, Salem DN, Pauker SG, et al. Voluntary electronic reporting of medical errors and adverse events. An analysis of 92,547 reports from 26 acute care hospitals. J Gen Intern Med. 2006;21(2):165–170.
  • Kaldjian LC, Jones EW, Wu BJ, et al. Reporting medical errors to improve patient safety: a survey of physicians in teaching hospitals. Arch Intern Med. 2008;168(1):40–46.
  • Fox MD, Bump GM, Butler GA, et al. Making residents part of the safety culture: improving error reporting and reducing harms. J Patient Saf. 2017;1. DOI:10.1097/PTS.0000000000000344
  • Wolf ZRH, Hughes RG. Advances in patient safety error reporting and disclosure. In: Hughes RG, editor. Patient safety and quality: an evidence-based handbook for nurses. Rockville (MD): Agency for Healthcare Research and Quality (US); 2008.
  • Leape LL, Berwick DM. Five years after to err is human: what have we learned? JAMA. 2005;293(19):2384–2390.
  • Poorolajal J, Rezaie S, Aghighi N. Barriers to medical error reporting. Int J Prev Med. 2015;6(1):97.
  • Lawton R, Parker D. Barriers to incident reporting in a healthcare system. Qual Saf Health Care. 2002;11(1):15–18.