
Educational interventions to improve the meaningful use of Electronic Health Records: A review of the literature: BEME Guide No. 29

Pages e1551-e1560 | Published online: 12 Jul 2013

Abstract

Background: Electronic health records (EHRs) are increasingly available and were expected to reduce healthcare costs and medical errors. This promise has not been realized because healthcare professionals are unable to use EHRs in a manner that contributes to significant improvements in care, i.e. meaningfully. Policymakers now acknowledge that training healthcare professionals in meaningful use is essential for successful EHR implementation. To help educators and policymakers design evidence-based educational interventions (i.e. interventions that involve educational activities but no practical lessons) and training (i.e. interventions that involve practical components), we summarized all available evidence regarding the efficacy of different educational interventions to improve the meaningful use of EHRs.

Methods: We used a predefined search filter to search eight databases for studies that considered an educational intervention to promote meaningful use of EHRs by healthcare professionals.

Results: Seven of the 4507 reviewed articles met the inclusion and exclusion criteria.

Conclusions: These studies suggest that a combination of classroom training, computer-based training and feedback is most effective in improving meaningful use. In addition, the training should be tailored to the needs of the trainees, who should be able to practice in their own time. However, the evidence is very limited and we recommend that governments, hospitals and other policymakers invest more in the development of evidence-based educational interventions to improve the meaningful use of EHRs.

Introduction

Rising healthcare costs, inefficient delivery of care and unsatisfactory quality of patient care are problems faced by many governments of industrialized countries (Emanuel et al. 2012; Hass et al. 2012; Keehan et al. 2012; Vavken et al. 2012). The widespread implementation of Electronic Health Records (EHRs) is widely regarded as an essential component of government policies to address these problems (Watson 2012).

An EHR is a repository of patient data in an electronic form, stored and transmitted securely, and accessible by multiple authorized users (I.S.O. 2005). EHRs are used in primary, secondary and tertiary care and their main purpose is to support continuing, efficient and integrated healthcare. Chaudhry et al. (2006) performed a systematic review to assess the effects of the implementation of EHRs. They concluded that EHRs have multiple benefits over paper records. Three major benefits on quality were increased adherence to guideline-based care, enhanced surveillance and monitoring, and decreased medication errors (Chaudhry et al. 2006). In addition, the storage of patient data in a computer-processable form provides opportunities for researchers, policymakers, health service managers and medical educators. This has led policymakers to acknowledge that the establishment of a nationwide EHR network will increase the cost-effectiveness of healthcare systems. Governments of a variety of countries have developed formal national EHR adoption programs to increase the availability of EHRs (Committee & Force 2008; Jha et al. 2008).

The United Kingdom (UK) was one of the first nations to invest in health information technology, formulating the National Programme for Information Technology in 2003. This program is intended to enable the formation of a new EHR network that supports the storage of clinical information, the electronic transfer of prescriptions, outpatient scheduling and the use of patient data to provide anonymized business reports and statistics for research and public health purposes. Countries such as South Africa, Sweden, Germany, France and the Netherlands also provide funding to support committees that develop policies for the implementation of a national EHR system. Other countries, such as Israel and Japan, have not instituted a national EHR adoption program but have high EHR adoption rates because of competition between hospitals. The United States of America (USA), Singapore and India have also relied on the private sector and competition to fuel EHR adoption (Gray et al. 2011). These worldwide developments stimulate a global healthcare IT market that is expected to grow from $99.6 billion in 2010 to $162.2 billion in 2015 (Marketsandmarkets.com 2011).

These tremendous government and hospital efforts will inevitably result in high adoption rates of EHRs (Blumenthal & Tavenner 2010). However, now that the first cost-effectiveness results from countries and hospitals that have successfully implemented EHRs are being published, it is becoming clear that increased adoption of EHRs does not necessarily result in a reduction of healthcare costs or an increase in the quality of care. Indeed, the introduction of EHRs has been accompanied by an increase in medical errors and mortality in some settings (Koppel et al. 2005; Han et al. 2005; Sittig et al. 2006). There is widespread consensus among EHR experts that the inability of healthcare professionals to use the available EHRs in ways that contribute to healthcare improvements is an important factor precluding the realization of the full potential of EHRs.

Policymakers in the USA have introduced the term meaningful use of EHRs to distinguish the use of EHRs as simple replacements for paper records from use of the full functionality of the EHR software (Classen & Bates 2011). Meaningful use is defined as EHR use that contributes to achieving significant improvements in the quality of care (Classen & Bates 2011). Governments around the world have translated this or similar definitions of meaningful use into pay-for-performance programs in which clinicians are financially rewarded for the meaningful use of EHRs (Gray et al. 2011).

The criteria for meaningful use developed in the USA are the most extensively formulated and have also been used to evaluate meaningful use in other countries. Under this program, physicians must meet a number of criteria to qualify for financial bonuses. All physicians must meet a set of core objectives comprising tasks considered essential to improving healthcare quality: the entry of basic data into the EHR, the request of clinical orders through the EHR, the use of clinical decision support and the use of computerized physician order entry. In addition, a clinician has to meet at least five of the following additional criteria: the use of drug formulary checks, the incorporation of clinical laboratory test results into EHRs as structured data, the use of the EHR for quality improvement by generating patient lists ordered by specific conditions, the use of EHR technology to identify patients who need specific educational resources, the use of EHRs to perform medication reconciliation between care settings, the use of the EHR to provide summary-of-care records for patients who are transitioned to another healthcare setting, the submission of immunization data to information systems and the submission of electronic syndromic surveillance data to public health agencies (Blumenthal & Tavenner 2010). The goal of policymakers to achieve meaningful use of EHRs by all healthcare providers according to these criteria is ambitious, considering the changes in workflow and work processes that accompany the implementation, including meaningful use, of sophisticated EHRs.

It is important that governments, hospitals and private companies provide education to stimulate meaningful use among healthcare professionals. Healthcare educators have the important task of developing and evaluating the effectiveness of educational interventions to improve meaningful use. However, the design of evidence-based interventions is hindered by the scarcity of primary studies and meta-analyses evaluating the effect of educational interventions on meaningful use.

Here, we report a systematic review of the literature on educational interventions and training to improve the meaningful use of EHRs. Our aim was to provide evidence to guide healthcare educators in the design of evidence-based educational interventions to improve the meaningful use of EHRs.

Methods

Objective

To identify, summarize and synthesize all existing evidence regarding the efficacy of educational interventions (i.e. interventions that involve educational activities but no practical lessons) and training (i.e. interventions that involve practical components) that have been used to improve the meaningful use of EHRs. We aimed to determine which educational interventions, or which aspects of training, are effective in improving the meaningful use of EHRs by healthcare professionals.

Types of studies

The inclusion and exclusion criteria for this review are summarized in Tables 1 and 2. We considered any article concerning empirical research into the effectiveness of interventions to promote the meaningful use of EHRs by healthcare professionals. This included, but was not limited to, studies that collected qualitative data, used comparative and non-comparative research designs, and included randomized clinical trials, non-randomized trials, time series, surveys, focus groups and observational studies.

Table 1  Inclusion criteria

Table 2  Exclusion criteria

Types of participants

Participants in selected studies had to be physicians, nurses, residents or other healthcare professionals providing clinical care to patients, for example specialist nurses, nurse practitioners, registered nurses, physicians, medical residents and paramedics. Studies that involved healthcare professionals not providing direct clinical care, such as administrative employees, medical students and medical researchers, were excluded.

Types of interventions

We included any type of intervention that aimed to improve the meaningful use of EHRs by healthcare professionals by means of an educational intervention or training. Interventions without a specific educational component, such as pay for performance and change management interventions and those that targeted other types of health information technology such as computerized physician order entry systems, were excluded.

Types of outcome measures

We used Kirkpatrick's hierarchical evaluation model to categorize the results of the included studies. Using this model, we classified the reported effects of the interventions into four levels of increasing complexity. Studies that report the subjective opinions of trainees about the intervention were categorized as level 1. Studies that describe an intervention that induces changes in the attitude of trainees towards the EHR were classified as level 2a. Interventions that increase knowledge or skills regarding the EHR were categorized as level 2b. Studies that report on interventions that increase the meaningful use of EHRs were classified as Kirkpatrick level 3. Studies that measured the effect of the intervention on organizational practice were categorized as level 4a, and studies classified as Kirkpatrick level 4b measured the benefits to the patient of the intervention and the resulting increase in meaningful use of EHRs.

Search methods for identification of studies

We searched for articles published from 2000 to 2012. This time frame was chosen because the information technology field develops rapidly and EHRs described in older studies are not representative of today's sophisticated EHRs. Our first search strategy combined the search term “Electronic Health Record” and all its synonyms with the Boolean operator OR. The second search strategy combined the term “Educational intervention” and its synonyms, including the term training, with the Boolean operator OR. Finally, we combined the two searches with the Boolean operator AND. This was translated for all databases using the appropriate vocabulary. We conducted searches in Medline, PsycINFO, ISI Web of Knowledge, EMBASE, Ovid, ERIC and CINAHL. In addition, we hand-searched four medical education journals: Medical Education, Medical Teacher, Teaching & Learning in Medicine, and Medical Care. See Appendix 1 for details.
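
To illustrate the structure of such a query (a simplified sketch only; the synonym lists shown here are illustrative, and the exact filter and database-specific vocabulary are given in Appendix 1), a Medline-style search could take the form: (“electronic health record*” OR “electronic medical record*” OR “electronic patient record*” OR EHR OR EMR) AND (training OR education* OR “educational intervention” OR teaching OR curriculum), limited to publication years 2000–2012.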

Inclusion of studies

To identify potentially relevant studies, two reviewers (B.K. and J.G.) independently screened all identified studies on title and abstract. Next, B.K. and J.G. identified eligible papers by applying the inclusion and exclusion criteria to this selection. Full-text copies were retrieved if necessary and a third reviewer (F.S.) was consulted to resolve discrepancies. The reference lists of these articles were screened for articles missed in our initial search; this yielded no additional papers.

Quality assessment and data extraction

An adjusted version of the BEME systematic review coding sheet was used by two teams of two reviewers (J.G.-Z.C. and F.S.-C.K.) to independently scrutinize the characteristics and methodological quality of the eight eligible studies. The reviewers used a five-point scale to rate the evaluation methods, the strength of the findings, the appropriateness of the study design and their overall impression of the quality of the article. Discrepancies in coding were resolved by discussion; a third reviewer was involved if necessary. The studies included in the final analysis were further classified according to the type of intervention, the Kirkpatrick level of evaluation and the research design.

Results

The process of reference selection is depicted in Figure 1. Our broad and sensitive search of the literature identified a total of 4507 articles, of which 97 were potentially eligible for inclusion in this review. Of these 97 potentially relevant articles, 89 did not use educational interventions or did not specifically aim to promote meaningful use and were therefore excluded. We finally selected a total of eight studies that met the inclusion and exclusion criteria (Lusignan et al. 2002; Kirshner et al. 2004; Porcheret et al. 2004; Badger et al. 2005; McCain et al. 2008; Kushniruk et al. 2009; Lemmetty et al. 2009; Stromberg et al. 2011).

Figure 1. Overview of the selection process of studies included in this review.

Methodological quality of included studies

The methodological quality of the eight included articles was assessed in detail using an adjusted version of the BEME coding sheet (Table 3; the references are given in Appendix 1). The overall methodological quality of the studies was poor compared with clinical studies in other fields of medicine. None of the studies used randomization or blinding to prevent bias. One of the eight studies, Porcheret et al. (2004), provided follow-up results. Only the studies of Porcheret et al. (2004) and Kirshner et al. (2004) used appropriate study designs. Two of the eight studies, Porcheret et al. (2004) and Kushniruk et al. (2009), provided adequate data analysis. The study of Badger et al. (2005) did not specify a study design or a method for analyzing the results. Therefore, the results of this study could not be interpreted in a meaningful way and the study was excluded from further analysis.

Table 3  Methodological quality of studies

Table 4  Summary of included studies

Types of interventions

We classified the studies according to the type of intervention, the Kirkpatrick level of evaluation and the research design (Table 4). Four studies described classroom-based training, one study described personal guidance and two studies described an educational intervention using feedback. The classroom-based training of Kushniruk et al. and both feedback interventions were categorized as Kirkpatrick level 3; the other interventions were categorized as Kirkpatrick level 1. There were no comparative studies and the majority of authors described their results in qualitative terms. Therefore, we chose to discuss the seven included studies as mini case reports in a narrative about the reactions and learning of healthcare professionals in response to educational interventions (i.e. interventions that involve educational activities but no practical lessons) and training (i.e. interventions that involve practical components).

Educational intervention 1: Classroom-based training

Four studies used classroom training (CRT) to improve the use of EHRs by healthcare professionals. Lemmetty et al. investigated the satisfaction of participants with CRT. McCain et al. describe an educational intervention that combines CRT with computer-based training (CBT), so-called blended learning. Stromberg et al. describe the development and evaluation of a training course to improve data entry by nurses. Kushniruk et al. report on the effect of CRT on the capability of physicians to use EHRs.

Kirkpatrick level 1

Lemmetty et al. (2009) report a retrospective study describing the EHR competence of 290 healthcare professionals in a Finnish hospital district after an educational intervention. All participants received an educational intervention to optimize their knowledge and use of a newly implemented EHR. The intervention consisted of classroom teaching by an experienced teacher, but no further details were provided. After completion of the course, participants were sent a questionnaire covering themes such as computer skills, EHR use, previous training and satisfaction with the educational intervention. Forty-eight percent returned the questionnaire. Most respondents considered their computer skills average after the intervention. The majority of the respondents were satisfied with the teaching methods. Healthcare providers with 15 years or more of working experience were less confident in their computer skills than those with less experience, and 37% of the respondents reported that they needed homework, additional training in EHR use or personal counseling to become confident EHR users.

Kirkpatrick level 1

In the study described by McCain et al. (2008), 63 physicians and nurses were trained to use a new EHR. The training consisted of three classroom sessions in which a teacher demonstrated several functionalities of the EHR. After the demonstration, all participants had to practice using several EHR functionalities to reinforce the covered material. During the practice sessions the teacher was available to provide personal guidance to the trainees. Physicians took an average of eight hours of training, while nurses completed an average of 12 hours. Evaluation after the course identified several weaknesses of the format. First, the course was designed for participants with average computer skills: participants with advanced skills reported that the pace of the course was too slow, whereas trainees with minimal skills complained that the course material was too intense. Furthermore, the number of hours spent in class was perceived as overwhelming. In addition, physicians reported that they preferred training after office hours.

To address these problems, the educational committee developed a CBT that trainees could complete in their own time and at their own pace. In addition, trainees were provided with the contact information of an EHR specialist who could answer specific questions regarding the training and EHR functionalities. After completion of the assignments, a teacher gave a final demonstration of the correct way to perform the tasks. Evaluation of the new training revealed that trainees were enthusiastic about the opportunity to complete the CBT at their own pace. Two-thirds of the participants preferred the new CBT with personal guidance over CRT alone.

Kirkpatrick level 1

Stromberg et al. (2011) describe the development and evaluation of an educational intervention to improve data entry in EHRs by nurses. They initially tried to organize the class material to satisfy participants of all levels. However, when this proved to be difficult and disruptive, they redesigned the whole curriculum. Four of their main goals in redesigning the educational intervention were to create discipline-specific sessions, reduce the total amount of time spent in the classroom on any given day, present material in smaller packages and expand the amount of time devoted to each topic. In the redesigned intervention, nurses received 23 course hours spread over four separate days. After learning the basics of the system, each participant was provided with a different simulated patient and a clinical scenario calling for patient information such as assessment findings, some patient history, laboratory values, diagnosis and other facts critical to the care plan. Using these data, the participants and the instructor worked together to create a patient-specific care plan. A total of 125 nurses were trained using this intervention. Stromberg et al. did not perform any formal analysis or statistical review. However, they report anecdotal evidence from managers, who uniformly stated that the nurses were better prepared to use the system after the educational intervention and that the user-related problems encountered were both fewer and less often related to issues covered during the course.

Kirkpatrick level 3

Kushniruk et al. (2009) studied the relationship between an educational intervention and the capability to use EHRs. Five physicians at a medium-sized hospital received one classroom session that covered logging in to the system, documenting and reviewing office visit data, placing orders and documenting a complex visit. After the course, the participants had to complete two written assignments. These scenarios required the physician to perform specific EHR tasks such as documenting patient history, entering medication, writing orders, checking alerts and adding notes and letters. During the assignments, all computer screens were captured with special software. In addition, the clinicians were asked to think aloud, which was recorded and later transcribed. After completion of the assignments, the researchers conducted a semi-structured interview in which they asked about the perceived usability of the EHR and the problems the physicians had encountered while using the EHR in daily practice before the training. All subjects were able to complete the two scenarios and showed an adequate ability to use the system and to carry out the tasks. Comments of the participants were generally in favor of the system, but they reported that the intervention could be improved with a module on how to use the EHR without interfering with the doctor–patient relationship.

On the basis of their data, Kushniruk et al. (2009) give recommendations for advanced training in the use of EHRs. They suggest educating physicians in using the EHR in various clinical contexts, for different types of workflow, and in various situations, such as those of high urgency or complexity.

Educational intervention 2: Individual counseling

We identified one study (Kirshner et al. 2004) that reported on individual counseling (i.e. personal guidance and instruction) of clinicians as a method to improve the use of clinical information systems (CIS).

Kirkpatrick level 1

Kirshner et al. (2004) developed a one-on-one educational intervention to examine the effect of individual counseling on the meaningful use of EHRs by physicians practicing in a large health management organization. Participants received a single three- to four-hour educational intervention in their own office. The session included a core competency evaluation plus tailored instruction on how to use four specific CIS functionalities. A paper-based survey was conducted to evaluate the course. One hundred and twenty-nine participants, of whom 53% were primary care physicians, returned the questionnaire. Respondents reported that the session improved their proficiency in using all four CIS functions, especially the EHR application. One-on-one counseling was perceived as more effective than CRT or CBT. Participants were satisfied with the training, as reflected by a mean score of 4.1 on a five-point scale. They suggested that the training could be improved with follow-up sessions and the provision of written material.

Educational intervention 3: Feedback

Two studies examined the effect of feedback. From an educational perspective, feedback is specific information about the comparison between a trainee's observed performance and a standard, given with the intent to improve the trainee's performance (van de Ridder et al. 2008). Both studies were performed in large primary care practice research networks based in the UK.

Kirkpatrick level 3

Lusignan et al. (2002) used a retrospective cohort study to examine the effectiveness of feedback on the quality of data in EHRs. To investigate this relationship, they used data from a large national electronic database which draws information from over 500 representative General Practitioners (GPs) across the UK. GPs who provide data for this database received feedback regarding the quality of the data they provided and were awarded a £400 (€470) bonus if they met certain quality standards. The researchers hypothesized that the quality of the data physicians provide increases over time as a result of the feedback given. To investigate this relationship, they grouped physicians according to the year in which they first submitted data and calculated the mean scores for each of the 10 quality markers that were fed back to the physicians. Regression analysis was used to determine whether length of time in the scheme predicted data quality. The authors found that four markers improved over time and six did not. This study suggests that there might be a relationship between feedback and the quality of database entries. However, owing to the retrospective approach and the inconclusive results, it is not possible to draw firm conclusions about the ability of feedback to improve the meaningful use of EHRs.

Kirkpatrick level 3

Porcheret et al. (2004) used a different approach to assess the effect of feedback on the quality of coded clinical data in the EHRs of general practices. Seven general practices involved in a UK-based primary care research network participated in this study. At baseline, the data quality of three markers was determined: (1) the proportion of recorded consultations coded with a Read code problem title; (2) the fraction of patients assigned a Read code among patients who received a drug that is only used for a small number of conditions; and (3) the prevalence of 12 selected conditions compared with an externally validated reference source. The results of the baseline assessment were fed back to the practices in a feedback session and suggestions were given on how data quality could be improved. During this plenary feedback session, the investigators and the group of GPs reached agreement on training needs. On the basis of these agreements, specific two-hour training sessions were designed and provided. After the initial feedback and training, three follow-up measurements were performed and fed back to the practices. The results show that during follow-up all practices improved or maintained their initial level of data quality. There was no correlation between the baseline level of data quality and the level of improvement in the practice.

Discussion

The purpose of this review was to determine the efficacy of different educational interventions to improve meaningful use by healthcare professionals. Our results suggest that multifaceted interventions (combinations of CRT, CBT, individual counseling and feedback) are most effective in improving meaningful use by healthcare professionals. We could not determine the efficacy of the individual components of such interventions because none of the studies provided a careful evaluation of the efficacy of their intervention. Nonetheless, our results could assist healthcare educators in designing evidence-based interventions to improve the meaningful use of EHRs by healthcare professionals.

None of the included studies used a comparative research design, none used standardized evaluation tools and the majority of authors described their results in qualitative terms.

The majority of the excluded articles discussed the implementation of an EHR without mentioning an intervention to stimulate meaningful use, or they described a pay-for-performance intervention to improve meaningful use. Together, this could be an indication that educational intervention and training of healthcare professionals are largely neglected when hospitals implement new EHRs.

This is surprising because the results of our review suggest that physicians and other healthcare workers require substantially more training than is provided in even the most extensive courses. In the study of Lemmetty et al., 37% of the respondents reported that they needed additional homework, additional training or personal counseling to become confident EHR users. Participants in the study of Kushniruk et al. were satisfied with the training but specifically asked for additional training in which they would learn to use the EHR without disrupting the doctor–patient relationship. In the study of Kirshner et al., physicians also asked for follow-up training and written material. This shows that there is a great willingness among healthcare professionals to use the EHR in an effective and meaningful manner.

Best evidence educational interventions to improve EHR use

On the basis of our results and educational theories, we identified a number of key issues that healthcare providers have to consider when designing an educational intervention to improve the meaningful use of EHRs.

Our results suggest that educational and training interventions need to be flexible with regard to the pace, time and place at which the material is completed. Lemmetty et al. showed that there is large variation in computer skills among healthcare providers, and a large proportion of their respondents requested additional study material that could be completed independently. Participants in their study with 15 years or more of working experience were less confident in their computer skills than those with less experience. McCain et al. identified the same problems when they designed an educational intervention: some of the participants perceived the intervention as too intense, while others found it too slow. This problem was successfully solved by offering a CBT that could be completed in participants’ own time and at their own pace.

The study of McCain et al. also suggests that physicians prefer training after office hours and that they value the availability of a help desk to which they can refer specific questions. The results of Lusignan et al. and Porcheret et al. suggest that follow-up sessions should be tailored to individual needs by providing feedback on the quality of EHR use after the initial teaching course. This is consistent with the main principles of adult learning theory (Cross 1981). These principles are: (1) adult learning programs should capitalize on the experience of participants; (2) learning programs should adapt to aging limitations; (3) adults should be challenged to move to increasingly advanced stages of personal development; and (4) adults should have as much choice as possible in the availability and organization of learning programs.

Although we were not able to determine the efficacy of the various educational interventions described in this review, educational theories indicate that some interventions are potentially more effective than others. First, educating physicians through classroom lectures alone is not likely to be very effective. An educational intervention for meaningful use of EHRs aims to prepare physicians for the use of the EHR in complex, real clinical practice. Learning to deal with this real clinical practice takes place when physicians engage with an uncertain and unfamiliar context; it cannot be taught or passively assimilated (Fraser & Greenhalgh 2001). Second, feedback has a higher potential to be an effective educational intervention, especially when combined with other interventions, because the behavior of learners evolves in response to feedback about the impact of their own actions (Fraser & Greenhalgh 2001). Last, learning is more effective when learners have the opportunity to deliberately practice with real-world cases. An educational intervention containing the opportunity to practice EHR skills on various cases should therefore be considered as “training,” as opposed to all other educational interventions that do not contain this practical element.

Limitations

We made extensive efforts to design a broad and sensitive search filter, considered several thousand references and identified only seven studies that addressed the training and education of healthcare professionals in the meaningful use of EHRs. A major limitation of this study is that we cannot be certain that we did not miss relevant articles. The main reason for this is that the literature on educational interventions to promote the meaningful use of EHRs is extremely heterogeneous. There is no standardized language in education research and different authors use different terms to describe the same concepts. This limits the efficient retrieval of studies that contain important evidence. This is a major limitation not only for researchers but also for healthcare educators who aim to design effective, evidence-based educational interventions on the basis of the literature.

The synthesis of the results of the studies included in this review was limited by the heterogeneity of the interventions, settings and participants. Moreover, not all studies provided detailed protocols of the educational intervention to improve EHR use. In addition, all studies used different EHRs, none of the studies assessed the highest Kirkpatrick level (performance) and none of the studies used a comparative design. Furthermore, there are no standardized tools to assess improvements in the meaningful use of EHRs by healthcare professionals. This limits all future efforts to synthesize evidence on the effectiveness of interventions. Healthcare educators have a responsibility to professionalize educational research, not only to establish evidence-based interventions but also to facilitate communication with policymakers, which will improve the recognition of educational research as a trustworthy and useful science.

Author's conclusions

The lack of effective educational interventions impedes EHR potential

The lack of scientifically evaluated educational interventions to improve the use of EHRs is consistent with the reported gap in knowledge on education requirements for the effective use of other healthcare information technologies (McKibbon et al. 2011). There is widespread consensus that the inability of healthcare providers to use EHRs in a meaningful way is one of the main factors underlying the failure of EHRs to reduce the costs of healthcare (Watson 2012). Our results show that there is a major gap in knowledge on how to effectively equip physicians, nurses and other healthcare professionals to use EHRs in a meaningful way. It is therefore surprising that the USA, the UK and many other developed countries invest billions to stimulate the widespread availability of EHRs but fail to invest in educational research to develop effective training programs that improve the meaningful use of EHRs. Governments, hospitals and other policymakers cannot expect a return on their tremendous investments if end-users are unable to use the full functionality of EHRs.

The lack of effective educational interventions poses a threat to patient safety

The widespread adoption of EHRs is supposed to improve the quality of care by improving communication between healthcare providers. However, studies have shown that the inability of healthcare providers to use complicated EHRs results in higher mortality, incorrectly entered documentation, order details that are prone to misinterpretation and patient care plans that are poorly managed and clinically insufficient (Han et al. 2005; Koppel et al. 2005; Smith 2005; Sittig et al. 2006). This illustrates that the implementation of an EHR does not automatically improve the quality of care. On the contrary, suboptimal and incorrect use of health ICT even poses a real safety hazard to patients. Educational interventions could effectively address these problems and should not be neglected when planning the implementation of a new EHR.

Future directions

Based on our findings, we recommend the design of multifaceted interventions that provide a combination of classroom teaching, CBT and personal guidance or feedback. These interventions should take place in a research setting in which both the immediate and long-term effects on the use of EHRs are measured. More studies of high methodological quality with standardized outcome measures are needed to support general recommendations on how to optimize meaningful use by healthcare professionals. The educational research field therefore has to start using standardized educational protocols and assessment tools, allowing for reliable comparisons both within and between studies. We encourage researchers in the field of medical education to act as health advocates by undertaking methodologically sound research on well-designed interventions to improve the meaningful use of EHRs. Governments and other policymakers should invest significantly more in the funding of this research. We believe that this will pay off in terms of improved quality of care and patient safety, as well as with respect to limiting the ever-rising costs of healthcare. Finally, we urge regulatory bodies to ensure that staff are competent to use EHRs and that education programs designed to ensure this are evidence based and evaluated with regard to their efficacy and safety on suitable, patient-related outcome measures.

Author contributions

All authors contributed to the design and execution of the review and to the original manuscript.

Declaration of interest: None of the authors has a conflict of interest to declare.

References

  • Blumenthal D, Tavenner M. The “meaningful use” regulation for electronic health records. N Engl J Med 2010; 363: 501–504
  • Chaudhry B, Wang J, Wu S, Maglione M, Mojica W, Roth E, Morton SC, Shekelle PG. Systematic review: Impact of health information technology on quality, efficiency, and costs of medical care. Ann Intern Med 2006; 144: 742–752
  • Classen DC, Bates DW. Finding the meaning in meaningful use. N Engl J Med 2011; 365: 855–858
  • Committee HESS, Force ATGET. 2008. Electronic health records: A global perspective. Healthcare Information and Management Systems Society
  • Cross KP. Adults as learners. Jossey-Bass, San Francisco, CA 1981
  • Emanuel E, Tanden N, Altman S, Armstrong S, Berwick D, de Brantes F, Calsyn M, Chernew M, Colmers J, Cutler D, et al. A systemic approach to containing health care spending. N Engl J Med 2012; 367: 949–954
  • Fraser SW, Greenhalgh T. Coping with complexity: Educating for capability. Br Med J 2001; 323: 799–803
  • Gray BH, Bowden T, Johansen I, Koch S. Electronic health records: An international perspective on “meaningful use”. Issue Brief (Commonw Fund) 2011; 28: 1–18
  • Han YY, Carcillo JA, Venkataraman ST, Clark RS, Watson RS, Nguyen TC, Bayir H, Orr RA. Unexpected increased mortality after implementation of a commercially sold computerized physician order entry system. Pediatrics 2005; 116: 1506–1512
  • Hass B, Pooley J, Feuring M, Suvarna V, Harrington AE. Health technology assessment and its role in the future development of the Indian healthcare sector. Perspect Clin Res 2012; 3: 66–72
  • Jha AK, Doolan D, Grandt D, Scott T, Bates DW. The use of health information technology in seven nations. Int J Med Inform 2008; 77: 848–854
  • Keehan SP, Cuckler GA, Sisko AM, Madison AJ, Smith SD, Lizonitz JM, Poisal JA, Wolfe CJ. National health expenditure projections: Modest annual growth until coverage expands and economic growth accelerates. Health Aff (Millwood) 2012; 31: 1600–1612
  • Koppel R, Metlay JP, Cohen A, Abaluck B, Localio AR, Kimmel SE, Strom BL. Role of computerized physician order entry systems in facilitating medication errors. J Am Med Assoc 2005; 293: 1197–1203
  • Marketsandmarkets.com. 2011. World healthcare IT market: Trends & forecast (2010–2015)
  • McKibbon KA, Lokker C, Handler SM, Dolovich LR, Holbrook AM, O’Reilly D, Tamblyn R, Hemens BJ, Basu R, Troyan S, Roshanov PS, Archer NP, Raina P. 2011. Enabling medication management through health information technology. Evidence Report/Technology Assessment No. 201
  • Sittig DF, Ash JS, Zhang J, Osheroff JA, Shabot MM. Lessons from “unexpected increased mortality after implementation of a commercially sold computerized physician order entry system”. Pediatrics 2006; 118: 797–801
  • Smith JA, Jr. Role of computerized physician order entry systems in facilitating medication errors. J Urol 2005; 174: 1400–1401
  • Van de Ridder JM, Stokking KM, McGaghie WC, ten Cate OT. What is feedback in clinical education? Med Educ 2008; 42: 189–197
  • Vavken P, Pagenstert G, Grimm C, Dorotka R. Does increased health care spending afford better health care outcomes? Evidence from Austrian health care expenditure since the implementation of DRGs. Swiss Med Wkly 2012; 142: w13589
  • Watson T, 2012. 2012 global medical trends [online]. Available: http://www.towerswatson.com/assets/pdf/7394/towerswatson-globalmedtrendssvyrpt-na-2012-23911.pdf

Appendix 1. References of reviewed papers

  • Badger SL, Bosch RG, Toteja P. Rapid implementation of an electronic health record in an academic setting. J Healthc Inf Manag 2005; 19: 34–40
  • De Lusignan S, Stephens PN, Adal N, Majeed A. Does feedback improve the quality of computerized medical records in primary care? J Am Med Inform Assoc 2002; 9: 395–401
  • Kirshner M, Salomon H, Chin H. An evaluation of one-on-one advanced proficiency training in clinicians' use of computer information systems. Int J Med Inform 2004; 73: 341–348
  • Kushniruk AW, Myers K, Borycki EM, Kannry J. Exploring the relationship between training and usability: A study of the impact of usability testing on improving training and system deployment. Stud Health Technol Inform 2009; 143: 277–283
  • Lemmetty K, Hayrinen K, Sundgren S. The impacts of informatics competencies and user training on patient information system implementation. Stud Health Technol Inform 2009; 146: 646–651
  • McCain CL. The right mix to support electronic medical record training: Classroom computer-based training and blended learning. J Nurses Staff Dev 2008; 24(4): 151–154
  • Porcheret M, Hughes R, Evans D, Jordan K, Whitehurst T, Ogden H. Data quality of general practice electronic health records: The impact of a program of assessments, feedback, and training. J Am Med Inform Assoc 2004; 11: 78–86
  • Stromberg SC. A training model for orienting newly hired nurses to an organization's electronic health record. Comput Inform Nurs 2011; 29: 321–325
