Letter to the Editor

Enhancing the impact of BEME systematic reviews on educational practice


Dear Sir

As medical educators, we believe that the synthesis of our field’s rapidly expanding, kaleidoscopic streams of research is as vital as the production of novel primary research. Clearly, imparting the skills for such synthesis to the next generation of medical educators is vital in achieving this goal, hence our intense interest in the timely BEME review by Ahmadi et al. (2015). The goals of this review are appropriate and consistent with contemporaneous systematic reviews in the field, focussing on the effectiveness of Evidence-Based Medicine (EBM) teaching. Indeed, the team go further by breaking down the elements of EBM education in an attempt to ascertain their relative importance. The team exemplify how far the BEME collaboration has supported the evolution of health education evidence synthesis, particularly in the area of methodology. Ahmadi et al. (2015) use robust, transparent methods that demonstrate to the reader that reviewer bias has been minimised and that the evidence base presented is reliable and complete. This is at the heart of the BEME vision: to systematically review evidence in a manner that supports the shift to evidence-based medical education.

Despite these significant strengths, when considering the final synthesised piece, we were left wondering whether one element is missing. The authors do not describe the various teaching strategies used in the primary studies (e.g. learning outcomes, pedagogy, resources required or lesson plans). This is possibly a shift in perspective that authors may feel is outside the scope of secondary evidence synthesis. Authors may argue that the aim of their reviews is to focus on “justification” of education, namely whether interventions work. Indeed, authors may go on to argue that, given the nature of reporting in much primary medical education research, it is highly unlikely that such an attempt would have yielded much in the way of extracted data.

From our perspective, there are two reasons why such an addition may enhance a BEME review. Firstly, when synthesising any grouping of primary research, heterogeneity must always be considered. In the context of reviews of educational interventions, it is important to consider “educational heterogeneity”, and we would propose that it is extremely difficult to do so without details of the primary education that was delivered. Of course, such data may not be presented, and this lack of data presents a risk of bias that should be considered within the discourse of the review, but this possibility does not negate the potential benefits of attempting to extract such data.

Secondly, when considering impact, a greater question outside the context of this review is raised: what do readers want medical education reviews to offer them? (Gordon et al. 2014). In medical education systematic reviews focussed on a type of teaching or learning, concentrating on whether teaching works (justification) while completely ignoring what works, for whom and in what circumstances (description of factors associated with positive or negative outcomes) (Cook et al. 2008) risks under-utilising the significant investment of time and resources needed to complete such reviews. When extracting data, adding such outcomes, as well as considering associated factors, can add a meaningful dimension to the findings, despite the challenges that confounding variables may present.

As authors of primary studies are often contacted to clarify methodological data, asking them also to give details of their educational interventions could easily be incorporated. A recent study of non-pharmacological interventional studies in major medical journals found that, while reporting of interventions was poor, the response from authors after contact was positive in two-thirds of cases (Hoffmann et al. 2013). This is not a proposal as to how such data should be utilised, as authors could report anything from a simple summary of the types of teaching used to a qualitative synthesis of the primary education to clarify appropriate theoretical elements pertinent to the setting (Cook et al. 2008). Rather, we seek to highlight an issue that we often consider when reading health education systematic reviews.

The addition of descriptive elements of teaching and learning can only enhance the final product of a systematic review, both by strengthening the rigour of the methodology and by offering greater utility to those delivering and innovating in medical education. Presentation of the content of the interventions under scrutiny can support dissemination and replication, as well as allowing a deeper level of synthesis that considers issues such as context and learner characteristics. While such activity will add to an already demanding process, we would propose that the potential gains merit consideration of such work in future reviews. While the focus of many BEME reviews appears still to be on “whether” an intervention in a particular teaching area is effective, we believe that without the addition of these “educational” elements, such reviews risk being of limited value to other educators and researchers, and will have less impact on practice.

Declaration of interest: Morris Gordon has received various travel grants and honoraria to support the dissemination of various works from companies including Danone, Abbott, Vifor, Ferring, Nutricia, Warner Chilcott, Cassen Fleet and Norgine. At no point have these companies had any involvement in this work. Others have nothing to declare.

References

  • Ahmadi SF, Baradaran HR, Ahmadi E. 2015. Effectiveness of teaching evidence-based medicine to undergraduate medical students: A BEME systematic review. Med Teach 37(1):21–30.
  • Cook DA, Bordage G, Schmidt H. 2008. Description, justification, and clarification: A framework for classifying the purposes of research in medical education. Med Educ 42:128–133.
  • Gordon M, Carneiro AV, Patricio M, Gibbs T. 2014. Missed opportunities in health care evidence synthesis. Med Educ 48:644–645.
  • Hoffmann TC, Erueti C, Glasziou PP. 2013. Poor description of non-pharmacological interventions: Analysis of consecutive sample of randomised trials. BMJ 347:f3755.
