Research Article

Mixing it but not mixed-up: Mixed methods research in medical education (a critical narrative review)

Pages e92-e104 | Published online: 28 Jan 2011

Abstract

Background: Some important research questions in medical education and health services research need ‘mixed methods research’ (particularly synthesizing quantitative and qualitative findings). The approach is not new, but should be more explicitly reported.

Aim: The broad search question, addressing a disjointed literature, was: What is mixed methods research, and how should it relate to medical education research? The focus was on explicit acknowledgement of ‘mixing’.

Methods: Literature searching focused on Web of Knowledge supplemented by other databases across disciplines.

Findings: Five main messages emerged:

  • Thinking quantitative and qualitative, not quantitative versus qualitative

  • Appreciating that mixed methods research blends different knowledge claims, enquiry strategies, and methods

  • Using a ‘horses for courses’ [whatever works] approach to the question, and clarifying the mix

  • Appreciating how medical education research competes with the ‘evidence-based’ movement, health services research, and the ‘RCT’

  • Being more explicit about the role of mixed methods in medical education research, and the required expertise

Conclusion: Mixed methods research is valuable, yet the literature relevant to medical education is fragmented and poorly indexed. The required time, effort, expertise, and techniques deserve better recognition. More write-ups should explicitly discuss the ‘mixing’ (particularly of findings), rather than report separate components.

Introduction

As in health services, some medical education research questions (e.g. about new or complex initiatives and interactions (Schifferdecker & Reed Citation2009)) need ‘mixed methods research’:

[combining] quantitative and qualitative research techniques, methods, approaches, concepts or language into a single study (Johnson & Onwuegbuzie Citation2004, p. 17)

[the] investigator collects and analyzes data, integrates the findings, and draws inferences using both qualitative and quantitative approaches or methods in a single study or program of inquiry (Tashakkori & Creswell Citation2007, p. 4)

Despite some harmonization, ‘quantitative–qualitative’ turf wars flare up, each side smuggling in its own assumptions like a Trojan horse. Problems include quantitative institutional cultures; clashing cultures between medicine (e.g. ‘evidence-based medicine’) and education; and, within medical education, disputes about notions of quality, evidence, population perspective, health policy, and heuristics (Sales & Schlaff Citation2010).

Medical education research should aim to improve students’ learning and their ultimate health impact, while reconciling eclectic views on how knowledge is created, discovered, learned, valued, justified, and verified, and challenging concrete ideas of ‘science’. Defining ‘science’ is contentious (Chalmers Citation1982; Regehr Citation2010), as is classifying medicine as science, art, or both. Science spans sundry social realities between objective and subjective assumptions about ontology (the nature of existence), epistemology (ideas about knowledge), human nature, and research methods (Burrell & Morgan Citation1979; Cohen & Manion Citation1994; Wilson Citation2000) (Box 1).

Box 1. Four dimensions distinguishing assumptions underlying ‘objective’ and ‘subjective’ approaches to social science

Wilson (Citation2000) noted that medical schools promote objectivist assumptions (i.e. ‘enculturation’ of: patient–disease separation; simplistic cause-and-effect models; and keeping doctor–patient distance), with doctors viewing medicine mostly as a (‘biomedical’) ‘science’. To him, though, clinical practice needed both universal and existential approaches, not just the ‘detached observer’ (gaining knowledge like a natural scientist). Cribb and Bignold (Citation1999) noted that ‘it would be dangerously cavalier’ (p. 207) to dismiss this ‘detachment’ survival mechanism, but medical schools needed more reflexivity. Such debates in anthropology, physics and politics have allowed cultural relativity and more subjectivity (Wilson Citation2000).

The challenge ‘to look critically in our researches at the uniquely human elements in medical education’ (Mawardi Citation1967, p. 280) is longstanding. Defying quantitative tendencies to measure human behaviour by physical science goals and standards (used for test-tube chemicals or planets) (Buchanan Citation1992), Wilson (Citation2000) urged a social constructivist approach of qualitative–quantitative equivalence. Constructivist theories (constructing understanding from our current knowledge and experiences) are contested, however, whether about learning (Colliver Citation1999; Giordan et al. Citation1999) or research (Colliver Citation1996a; Colliver Citation1999; Colliver Citation2002a; Jervis & Jervis Citation2005).

Beyond medical education, qualitative research is relatively popular (Roche Citation1991; Bergsjø Citation1999) in general practice (Boulton et al. Citation1996; Hoddinott & Pill Citation1997) and health services research (Pope & Mays Citation1995; Shortell Citation1999; Hoff & Witt Citation2000; Bradley et al. Citation2007; O’Cathain et al. Citation2007a; Lingard et al. Citation2008; O’Cathain Citation2009; O’Cathain et al. Citation2009). Qualitative research still struggles, however, for funding and mainstream medical acceptance (Cribb & Bignold Citation1999; Morse Citation2006; Dixon-Woods et al. Citation2007; Goguen et al. Citation2008; Sandelowski Citation2008; Pope & Mays Citation2009), undermined in ‘evidence’ hierarchies (Thorne Citation2009) and by the hidden curriculum (Goguen et al. Citation2008). Challenges include the ‘evidence-based’ movement (Green & Britten Citation1998) (albeit evidence-based nursing appears more accommodating (Meadows-Oliver Citation2009; Thorne Citation2009; Broeder & Donze Citation2010)), outcomes research (Curry et al. Citation2009), and agreed standards, whether technical (Pope & Mays Citation2009) or ethical (e.g. potential distress, exploitation, distortion, and participant identification) (Richards & Schwartz Citation2002). Only 2% of original articles in seven journals (general medical, general practice and public health) from 1991–1995 reported qualitative research (its very use sometimes meriting publication (Woolf Citation2006)), and only 17% of these used mixed research (Boulton et al. Citation1996). Mixed methods rose from 17% to 30% of commissioned health services research (Department of Health) between the early 1990s and 2002–2004 (O’Cathain et al. Citation2007a), and the number of Medline articles mentioning qualitative research has increased progressively over the last two decades, up by 26% between 2001–2005 and 2006–2010 (first quarter) (Ring et al. Citation2010).

‘Mixed methods’ suffer conflicting guidance on qualitative research standards (Mays & Pope Citation1995; Boulton et al. Citation1996; Hoddinott & Pill Citation1997; Chapple & Rogers Citation1998; Popay et al. Citation1998; Pope & Mays Citation1999; Stacy & Spencer Citation2000; Côté & Turgeon Citation2005). Stacy and Spencer (Citation2000) understandably favoured being theoretically explicit (e.g. Abassi & Smith on behalf of British Medical Journal Education Group for Guidelines on Evaluation 1999) over elusive researcher ‘independence’ (e.g. Harden et al. Citation1999, p. 557), but medical education research reports rarely explore paradigmatic (theoretical perspective) assumptions (Mylopoulos & Regehr Citation2007; Bunniss & Kelly Citation2010) or substantive theory (Rees & Monrouxe Citation2010). The broad search question, addressing a disjointed literature, was thus: What is mixed methods research, and how should it relate to medical education research? The focus was on explicit acknowledgement of ‘mixing’.

Searching literature databases (Haig & Dozier Citation2003a, 2003b) mostly used free-text in Web of Science (Science Citation Index Expanded, 1945–; Social Sciences Citation Index, 1956–; and Arts and Humanities Citation Index, 1975–) to August 2010 (Box 2), for: mixed method* and ‘(medical) education* (research)’; and medical education and ((qualitative and quantitative and mixed) or evidence-based or research paradigm) (* = ‘wildcard’). English language titles (±abstracts) yielded articles reviewing mixed methods theory or practice in (‘evidence-based’) health professional education (mostly undergraduate medical). Five key messages were apparent:

  • Thinking quantitative and qualitative, not quantitative versus qualitative

  • Appreciating that mixed methods research blends different knowledge claims, enquiry strategies, and methods

  • Using a ‘horses for courses’ [whatever works] approach to the question, and clarifying the mix

  • Appreciating how medical education research competes with the ‘evidence-based’ movement, health services research, and the ‘RCT’

  • Being more explicit about the role of mixed methods in medical education research, and the required expertise

Box 2. Literature search strategy: ‘What is mixed methods research, and how does it relate to medical education research?’

Thinking quantitative and qualitative, not quantitative versus qualitative

The zealous ‘quantitative versus qualitative’ research debate appears inescapable, yet somewhat futile (Niglas Citation1999), steeped in military rhetoric (‘a phony war’ (Bergsjø Citation1999, p. 561) and ‘more a battlefield than a field of inquiry’ (Norman Citation1998, p. 77), prompting ‘Why can’t we all just get along?’ (Onwuegbuzie Citation2000, p. 11)). In education, Onwuegbuzie (Citation2000) traced this debate to the late 1800s, when logical positivism underpinned ‘science’, seeking ‘verifiability’ via systematic ‘hard’ data collection of objective evidence (with probabilistic and inferential analysis) to explain, predict, and control phenomena. Psychosocial researchers soon backed interpretivism, such as hermeneutics (Onwuegbuzie Citation2000), seeking meaning from participants (rather than external ‘truth’). This contrasted with a refinement of positivism within quantitative research: researching in everyday settings with minimal interference (treating knowledge as a matter of interpretation – interpretivism), yet still using scientific method (naturalism) (Pope & Mays Citation1999). The main schism remained (Onwuegbuzie Citation2000) (Box 3).

Box 3. Four paradigms for considering knowledge claims and theory, the six key strands of the pragmatism continuum, and the ‘schism’ it spans

Purists claim paradigm superiority, like a moral crusade (Onwuegbuzie Citation2000), pledging allegiance (Johnson & Onwuegbuzie Citation2004). The debate:

… has tended to obfuscate rather than to clarify, to stereotype rather than to enlighten, and to divide rather than to unite educational researchers. (Onwuegbuzie Citation2000, p. 10)

Paradigms become barriers (Niglas Citation1999), yet both approaches have strengths and weaknesses, and underpin social research. Johnson and Onwuegbuzie (Citation2004) and Punch (Citation1998) noted quantitative–qualitative similarities in data type and handling. From nursing, Goodwin and Goodwin (Citation1984) dismissed myths about certain methods being exclusive for certain paradigms (despite others defining qualitative research thus (Boulton et al. Citation1996)) and about qualitative research being always or exclusively unobtrusive, naturalistic, and subjective. Like Goodwin and Goodwin (Citation1984), Morse (Citation1999a) dismissed the supposed irrelevance of validity and reliability in qualitative research (and also debunked the non-generalizability myth (Morse Citation1999b)). Newman (Citation2000) dismissed another myth:

There is a frequently held misconception that quantitative research uses numbers and qualitative research is narrative. This is a misleading simplification…it is not the technique that makes something quantitative or qualitative, but it is the intent of its uses. Is it testing hypotheses or is it helping to develop hypotheses or describe the data (Newman Citation2000, pp. 4–5)

For Punch (Citation1998) it was ‘not inevitable, or essential, that we organize our empirical data as numbers’ (p. 58), but both approaches could induce or test theory.

The pragmatism paradigm (in the sense of a ‘world-view’ or approach) (Bergman Citation2010) of ‘mixed methods’ reduces tough choices between methods, logic, or epistemology (Tashakkori & Teddlie Citation1998), reaching Onwuegbuzie's (Citation2000) ‘epistemological ecumenism’ (p. 11). In psychosocial sciences, Tashakkori and Teddlie (Citation1998) cast ‘pragmatism’ as pacifist in ‘the paradigm wars’ (usually positivism versus constructivism), contrasting its American roots (e.g. Dewey (Hallet Citation1997), Rorty (Citation2000)) with European caution over debunking metaphysical truth (to ‘what works’). An alternative (post-positivist) paradigm, ‘critical realism’ (Colliver Citation1996b; McEvoy & Richards Citation2006) accepts an external reality with multiple (albeit fallible) outlooks to seek sense (via observation, measurement, perception, or cognition) (Trochim Citation2006), but strongly rejects relativism, while pragmatism seems agnostic on it (Proctor Citation2004).

In summary, mixing uses the pragmatism paradigm, an inclusive ‘what works’ approach to ‘truth’. This reconciles assumptions about social reality, avoiding futile, ‘either–or’, qualitative–quantitative polemics. The key ideas show agnosticism rather than bland compromise.

Appreciating that mixed methods research blends different knowledge claims, enquiry strategies, and methods

From educational psychology, Creswell's (Citation2003) three questions (from Crotty's (Citation1998) model) distinguished qualitative, quantitative, and mixed methods research.

What knowledge claims and theory (the paradigm)?

Creswell (Citation2003) includes the pragmatism paradigm (besides postpositivism, constructivism, and advocacy/participatory (Box 3)), i.e.:

  • No specific schemes of philosophy and reality; free choice of methods; indifference to ‘qualitative or quantitative’; and belief in ‘qualitative and quantitative’, truth being what works at the time

  • Focusing on the purpose of research; appreciating social, historical, and political context; unperturbed by disputed reality or disputed laws of nature.

To Cherryholmes (Citation1992, p. 16), it was hopeless for ‘pragmatists’ to search for reality: ‘Even if we came upon a True account of what is ‘real’, we would be at a loss to recognize it as True’, whereas ‘scientific realists’ romantically seek an objective, independent reality. Onwuegbuzie (Citation2000) disputed purists’ self-defeating assumptions:

Qualitative: ‘All truth is relative’ would be true only relatively. Accepting ‘There are multiple realities’ should allow the ‘quantitative’ version of reality. (Universal assertions that there are no universals are indeed ironic (Norman Citation1998; Colliver Citation2002a).)

Quantitative: ‘The verifiability principle’ (assertions are only meaningful if verifiable) is not empirical or logical.

What enquiry strategy (associated traditions of enquiry)?

Classifications of qualitative, quantitative, and mixed methods research differ considerably. Social research literature tends to see ‘experiments’ or ‘surveys’ (Creswell Citation2003) rather than, for example, the differentiated clinicoepidemiological hierarchy of quantitative study design (i.e. case report, case series (clinical or population), cross-sectional study, case-control study, cohort study, and randomized controlled trial (Bhopal Citation2008)). Quantitative research tends not to follow ‘traditions’ explicitly per se.

Qualitative research strategies explore, describe, or generate theory, especially for uncertain and ‘immature’ concepts (Morse Citation1991); sensitive and socially dependent concepts (Roche Citation1991); and complex human intentions and motivations (Harris Citation2003). This generally ‘case-orientated’ (not ‘variable-orientated’) (Punch Citation1998) research favours open-ended questions, unstructured approaches, and highlighting differences rather than averaging responses (Roche Citation1991). Classifications abound – for example, Creswell (Citation1998) outlined five main traditions (from about 20): biography/narrative, phenomenology (underused in medical education), grounded theory, ethnography, and case studies, while Grbich (Citation1999) noted field-, action-, or library-based approaches in health research. Classifications of mixed methods research, avoiding ‘mixed up methods’ (Tashakkori & Teddlie Citation1998), include the simplified Teddlie and Tashakkori (Citation2006) typology of ‘mixed’ versus ‘quasi-mixed’ (the latter being without substantial integration of findings and inferences).

What data collection and analysis methods?

Competing advice about ‘doing’ qualitative research (Chapple & Rogers Citation1998) is off-putting, as are much criticized recipe-like checklists (Barbour Citation2001), and ‘feigning “immaculate perception”’ (Wolcott Citation1992, p. 43) (e.g. claiming exclusivity for trained social scientists). ‘Pragmatism’ is not precious about approach. Educational researchers have promoted qualitative–quantitative ‘mixing’ (Punch Citation1998; Tashakkori & Teddlie Citation1998; Creswell Citation2003; Elliott Citation2004; Johnson & Onwuegbuzie Citation2004; Raudenbush Citation2005; Demerath Citation2006; Miller & Fredericks Citation2006), e.g. ‘ … we should be shamelessly eclectic in our use of methods’ (Rossman & Wilson Citation1991, p. 17). Such mixing occupies a continuum (Newman Citation2000) though, as do methods, logic, epistemology, axiology, ontology, and causation in pragmatism (Tashakkori & Teddlie Citation1998) (Box 3).

In summary, mixing methods (a ‘movement’ in only its third decade (Creswell & Garrett Citation2008)) blends eclectic views of knowledge, traditions of enquiry, methods, and results; stays practice-orientated; and uses ‘what works’, not elitist stances. Classifications do not necessarily clarify ‘the mix’ though.

Using a ‘horses for courses’ [whatever works] approach to the question, and clarifying the mix

Rossman and Wilson (Citation1985) noted three stances about qualitative–quantitative mixing:

purism: they cannot be combined and one is favoured

situationalism: both are valuable, maybe in one study, but only in their place

pragmatism: both are valuable, especially combined, whether in study design, data collection, or analysis.

Their rationale for ‘pragmatic’ combination could be: corroboration (true triangulation), elaboration (‘If we think of social phenomena as gems, elaboration designs are intended to illuminate different facets of the phenomenon of interest’ (Rossman & Wilson Citation1991, p. 2)), initiation, and/or development (Box 4), and Onwuegbuzie et al. (Citation2010) highlighted instrument development and construct validation.

Box 4. The rationale for mixing quantitative and qualitative research

On the mixing continuum, Punch (Citation1998) meant: adding, interweaving, integrating, or linking methods, data, and/or findings with increasing complexity. Creswell (Citation2003) meant: implementing ‘mixing’ simultaneously, sequentially, and/or transformatively; showing priority to qualitative, quantitative, or both equally; integrating data collection, analysis, and/or interpretation; and being theoretically explicit or implicit. Visually representing key ‘mixing’ decisions might help (Ivankova et al. Citation2006).

Some published examples (O’Sullivan Citation2010) do not report or critique details of mixing data collection via a single questionnaire, for example, yet this can give meaningful mixing (e.g. Johnson and Onwuegbuzie's (Citation2004) ‘within-stage mixed model design’):

For example, in data collection, this ‘mixing’ might involve combining open-ended questions on a survey with closed-ended questions on the survey. Mixing at the stage of data analysis and interpretation might involve transforming qualitative themes or codes into quantitative numbers and comparing that information with quantitative results in an ‘interpretation’ study. (Creswell Citation2003, p. 212).

Combining approaches can mean simply incorporating open-ended questions in a fixed-choice self-completion questionnaire, or systematically collecting quantitative information (such as age or length of an experience) during interviews or focus groups. (Barbour Citation1999, p. 40)

Tashakkori and Teddlie (Citation1998) related pragmatism to the ‘scientific method’ research cycle of inferences, but with induction and deduction more apparent (and possibly simultaneous).

The research question predominates in mixed methods in mainstream education research and, for example, health services research (Barbour Citation1999), nursing research since the 1990s (Morse Citation1991; Carr Citation1994; Sandelowski Citation2000), and, more recently, health sciences research (Andrew & Halcomb Citation2009) and engineering education (Leydens et al. Citation2004; Borrego et al. Citation2009). This ‘horses for courses’ eclecticism is increasingly acknowledged in medical education research (Abassi & Smith on behalf of British Medical Journal Education Group for Guidelines on Evaluation 1999; Bligh & Anderson Citation2000), but needs better quality research questions (Schuwirth & van der Vleuten Citation2004; Shea et al. Citation2004). If science essentially seeks better story-telling about how the world works, then: ‘while the particular rules of the story may differ, just as the rules of a sonnet differ from a limerick, good stories are independent of the form’ (Norman Citation1998, p. 79).

Tashakkori and Teddlie (Citation1998) highlighted the required ‘dictatorship of the research question (not the paradigm or method)’:

pragmatists consider the research question to be more important than either the method they use or the worldview that is supposed to underlie the method…For most researchers committed to the thorough study of a research problem, method is secondary to the research question itself, and the underlying worldview hardly enters the picture, except in the most abstract sense. (Tashakkori & Teddlie Citation1998, p. 21)

Oppenheim advised likewise:

… It would be more helpful to suggest that choosing the best design or the best method is a matter of appropriateness. No single approach is always or necessarily superior; it all depends on what we need to find out and on the type of question to which we seek an answer. (Oppenheim Citation1992, p. 12)

Rigid ‘quantitative versus qualitative’ stances then appear facile. Punch (Citation1998) considered that approach should also reflect context, current literature, feasibility, potential cost-benefit, and personal expertise/experience, while Creswell (Citation2003) also highlighted the researcher's experience and potential audience.

Challenges remain though. Buchanan (Citation1992) argued that quantitative research dominated social science because of: ‘scientific method’ being successful in understanding the natural world; comforting ‘certainties’ in ‘hard’ science; government and funding support; wanting the ‘perfect’ experiment; rejecting subjectivity; and unfamiliarity with qualitative research (goals, standards, and assumptions). He favoured qualitative research in any mixing, yet was sceptical about mixing because:

  • Quantitative research disregards outliers, whereas qualitative research highlights the singular response (the exception), because that responder might be more perceptive or articulate in raising this ‘important’ issue.

  • Qualitative research aims for logical, not probabilistic, generalization (also Popay et al. Citation1998). Logical inference is, however, problematic when quantifying qualitative data by dichotomizing complex answers about opinions, which do not necessarily relate one-to-one, linearly, with behaviour.

  • Themes/‘ideal types’ (abstract descriptions of typical constructs) emerge piecemeal but convincingly across an interview, yet defy ‘scoring’.

  • Quantitative research does not help with all-pervasive (universal) themes.

Howe (Citation2004) promoted ‘mixed methods interpretivism’ over two educational research developments he perceived would undermine qualitative research by focusing on ‘cause’, ‘effectiveness’ and randomization:

‘neoclassical experimentalism’: … even more restrictive designs

‘mixed methods experimentalism’: … tokenistic addition of qualitative components

In summary, the mix in mixed methods research can vary by type, extent, and intention, so researchers should be clear what best answers their research question. Factors such as context, evidence-base, and feasibility (including cost-benefit and personal expertise) also affect what/how to mix. Sometimes this involves true ‘triangulation’ (a term best used for exploring the same issue in different ways). One instrument might suffice, e.g. a questionnaire, but mixing should be meaningful, not tokenistic.

Appreciating how medical education research competes with the ‘evidence-based’ movement, health services research, and the ‘RCT’

Medical education research and theory are critiqued harshly (Box 5) through a lens of ‘evidence-based medicine’ (Leung & Johnston Citation2006). They have sought ‘evidence-based’ credibility (Wartman Citation1994; Albanese et al. Citation1998), struggling since their structured foundations in the late 1950s/early 1960s (McGuire Citation1996) (following informal studies of medical students’ personality and intelligence in the early 20th century (Levine et al. Citation1974)). Some detractors (Wartman Citation1994) expected too much applied output from the ‘formative’ stages of this research (Norman & Schmidt Citation1999), which apparently trails behind medical domains of similar vintage, such as clinical epidemiology. Problems include:

  • Inadequate questions, designs, and samples (Dauphinee Citation1996; McGuire Citation1996) being neither truly basic nor clinical science (Friedman Citation1996)

  • Attracting ‘me-too research’ (Norman Citation2006), ‘saying little more than that the students liked the innovation’ (Abassi & Smith on behalf of British Medical Journal Education Group for Guidelines on Evaluation 1999, p. 1265)

  • Historical underfunding (Wartman Citation1994; Albanese et al. Citation1998), weak institutional support (Lovejoy & Armstrong Citation1996; Albanese et al. Citation1998), vaguer productivity measures than for health care (research), and the effect of some health care systems (e.g. United States managed care undermining innovation, funding, and research for medical education) (Albanese et al. Citation1998).

Box 5. Contrasting contemporary views of medical education research

Norman and Schmidt (Citation1999) urged more basic theory-building cognitive psychology research, and McGuire (Citation1996) considered that research should: redefine medical education goals against a robust concept of the competent doctor; help design relevant curricula (in an evidence-based way (Taylor Citation2010)); and evaluate cost-effectiveness. Despite cognitive science giving coherent insights (Reese Citation1998; Mayer Citation2010), Colliver criticized it (and its embodiment in problem-based learning (PBL) (Dolmans & Schmidt Citation1996)) (Colliver Citation2000), rebuffing excuses for its infancy (Norman & Schmidt Citation1999). For Colliver (Citation2002b, p. 1220), ‘educational innovations and practice claims are at best conjecture, not evidence-based science’, and the major reviews of PBL effectiveness (Albanese & Mitchell Citation1993; Berkson Citation1993; Vernon & Blake Citation1993), for example, were unpersuasive, irrespective of scant formal evidence for ‘conventional’ education. For Colliver (Citation2002b), educational theory was metaphor, ‘not rigorous, tested, confirmed scientific theory’ (p. 1217).

The relative weakness of such research extends beyond deficient expertise. Murray (Citation2002) listed barriers such as: complex educational interventions; problems randomly allocating interventions; elusive outcomes and measurement tools; underfunding; and especially clinicians’ disinclination. Petersen (Citation1999) also urged medical educationalists to use accessible terminology and improve research designs, while awaiting graduates of innovative curricula to improve attitudes. Van Der Vleuten et al. (Citation2000) noted how university staff perceived research or professional practice markedly differently from education, where, ‘ … any challenge to one's convictions is an actual challenge to one's professional integrity’ (p. 246). They argued that using tradition and intuition had led to such flawed notions as: ‘teaching is learning’, ‘the more we teach the more students learn’, ‘competence consists of distinct competences’ (refuted by the non-existence of content-independent ‘problem-solving’ skills), and ‘the curriculum [rather than assessment] dictates learning’. Petersen noted:

… the same professional standards are not so commonly applied. All doctors have been successful medical students, and it seems easy to assume that this alone qualifies them to educate others. Few surgeons would claim that surviving a surgical procedure qualifies a patient to perform it on another (Petersen Citation1999, p. 1223)

Medical education research has become more robust (Baernstein et al. Citation2007) (and, arguably, sufficiently robust evidence exists to reassure and act (Norman Citation2000)), but how it should develop (Shea Citation2001; Prideaux Citation2002a), focus, and learn from other fields seems uncertain. Directions of influence with other fields are debatable. Cochrane Collaboration ‘evidence-based medicine’ has influenced the international Best Evidence Medical Education (BEME) collaboration in synthesizing evidence (Wolf et al. Citation2001) – and evaluating against ‘QUESTS’ dimensions (quality, utility, extent, strength, target and the setting) places the best available evidence on an ‘evidence-based↔opinion-based’ continuum (Harden et al. Citation1999). Wolf et al. (Citation2001) noted that the Cochrane Collaboration revisited a term – ‘meta-analysis’ – from an American Educational Research Association presidential address in the mid-1970s. Mainstream education research thus remains influential (Lloyd Citation1991), with Harris (Citation2003) attributing this and social science influences to researchers’ close links with educational psychology and the ‘biomedical’ research culture of medical schools – but health services research is influential too.

Prystowsky and Bordage (Citation2001) applied a health care outcomes research framework for their content analysis of medical education research, and found product cost and quality of medical education to be under-researched. Shea (Citation2001) disagreed with their applying health services research examples, because:

  • The ‘primary customer’ is the learner not the patient.

  • Dilution makes it nearly impossible to show learners’ outcomes affecting patients’ outcomes. (McGuire (Citation1996) lamented the ‘inexcusable shortage of outcomes research’ (p. S125), wanting impact measured in health care currency, but this still seems overambitious.)

  • Medical education often changes before strong study designs are possible (Taylor Citation2010).

  • Cost analysis may often be supplementary, which the authors presumably omitted in studying one main focus per article.

Conversely, Murray (Citation2002) valued health services research lessons, where evaluating similarly complex interventions required combined qualitative and quantitative approaches, but omitted the thorny technical barrier of ‘mixing’, and its perceived ‘sudden faddishness’ (Morse Citation2005, p. 583).

‘Evidence-based education’ (Davies Citation1999) remains aspirational. Wolf (Citation2000) considered that evidence-based medical education would probably develop similarly to evidence-based medicine, where the ‘critical appraisal’ step (then ‘finding best evidence’) had made the most progress (asking relevant answerable questions; finding best evidence efficiently; critically appraising it; using expertise to adapt and apply evidence; evaluating impact). Better research questions are needed (Schuwirth & van der Vleuten Citation2004; Shea et al. Citation2004), maybe to altruistic goals (Sestini Citation2010). The first of Wolf's (Citation2000) 10 lessons from evidence-based medicine – synthesizing evidence is usually more complex and complicated than anticipated – applies particularly when synthesizing non-RCT evidence (from qualitative (Thorne Citation2009; Broeder & Donze Citation2010) and/or quantitative approaches). Mining gems from meaningful but messy medical education research about ‘colourful’ phenomena needs a robust yet inclusive research constitution, and a wider world-view than just the RCT or any other potentially monochromatic mindset. Better research will require ‘elaboration of the messy parts of our efforts to intervene’ (Regehr Citation2010, p. 38) in education.

RCTs fit poorly into social research (Cook & Shadish 1994), although mixing with qualitative research helps (Bradley et al. 2005; Moffatt et al. 2006), and a sea-change to medical educational epidemiology (Carney et al. 2004) or single-case experimental designs (Bryson-Brockmann & Roll 1996), for example, would not answer many quantitatively orientated questions. Exploring BEME assumptions, Norman (2000) agreed that education research eludes universal standards, but not because it is ‘soft’, noting that many clinical questions also defy the ‘universal approach’ of RCTs (and thus, noted Gillett (2004), ‘[their] current fundamentalism’ (p. 730) and ‘positivist conceptions of argument and investigation … of evidence-based medicine’ (p. 732)). In selected clinical treatment areas, Benson and Hartz (2000) found that evidence from observational studies was reasonably solid compared with RCTs. Cohort studies are quite robust (Goldacre 2001), and Concato et al. (2000) found that allegations of inflated ‘treatment effects’ were unfounded when contemporary controls were used. The ‘perfect study’ might well be useless (Lloyd 1991).

Norman (2000) noted that BEME aptly included different epistemologies but, by seeking uniform quality assurance, assumed unidimensionality. Moreover, he noted:

  • Unlike drug doses, educational interventions are rarely standard, reducing transferability from RCTs.

  • Mostly, one ‘world-view’ judges the strength and breadth of evidence, yet one good study can be enough, small p-values do not necessarily mean large effect sizes, and many worthy research questions do not translate into effect sizes anyway (and are not clearcut quantitative↔qualitative issues).

  • BEME may well underestimate the generalizability of medical education evidence, as well-established examples exist.

Everyday education, though, defies randomization, blinding, and controlled interventions, and lacks good outcome measures (Norman & Schmidt 2000; Prideaux 2002b); indeed, ‘trials of curriculum level interventions … are … a waste of time and resources’ (Norman & Schmidt 2000; disputed by Colliver 2004). The PBL evidence-base, for example, suffers from: confounders; small, context-bound, single-site studies; varied PBL definitions in disparate settings; and unsuitable conditions for randomized controlled trials (Finucane et al. 1998).

In summary, like health services research, medical education research labours under expectations imported from evidence-based medicine and the ‘RCT’ mirage, despite different ‘customers’, dilution effects, timescales of change, and costs. Complex, context-specific questions and settings bedevil medical education research, and require expertise (such as in ‘messy’ mixed methods approaches) that clinical, academic, and institutional cultures undervalue and underfund. Medical education research could usefully focus more on: theory-building cognitive psychology (Norman 2004); illuminating the competent doctor, curriculum design (overcoming a ‘know-do’ gap (Levinson 2010)), outcomes and cost; challenging long-held assumptions that favour teaching over learning; improving both primary research and research synthesis (while accepting ‘imperfect’ designs); and embracing eclecticism of epistemology and enquiry.

Being more explicit about the role of mixed methods in medical education research, and the required expertise

Further quandaries for mixed methods medical education research specifically include the time, effort, and expertise required (Morse 1991; Creswell 2003). As in health services research (O’Cathain 2009), the approach itself is insufficiently discussed, e.g. appropriate sampling (Teddlie & Yu 2007) and practicalities (Schifferdecker & Reed 2009). Besides the preferences of researchers, disciplines, funders, and publishers, barriers to mixing (Bryman 2007) include insufficient practical guidance (Schifferdecker & Reed 2009), insufficiently detailed write-ups of what was integrated and how (Bryman 2007; O’Cathain et al. 2007b, 2008, 2009, 2010; Alise & Teddlie 2010), and dissemination meeting inertia (Wilson 2010).

Teddlie and Tashakkori (2009) cast their prototypical mixed methods researcher, ‘Professor Eclectica’, in public health. Prideaux (2002a, p. 502) highlighted the ‘sophistication in thinking and understanding’ required to research medical education across research traditions. Assimilating diverse types of evidence (Leung 2002), and the identity crisis of spanning disciplines (and being disowned in-between), are barriers. Prideaux reinforced the ‘ … “virtue” in embracing “eclecticism” … ’ (2002a, p. 502), whether by one researcher spanning research traditions or many researchers collaborating from different backgrounds. The ‘lone researcher’ needs diverse skills and different logical principles (Mason 1994) to undertake time-consuming work and negotiate purism. Ironically, some researchers might shun mixing assumptions across the main approaches, yet happily mix methods, with very different assumptions, across traditions within qualitative research (Barbour 1998).

Specific, labelled examples of mixed methods research in undergraduate medical education should be more prominent, but include, for example, exploring medical students’ early patient contact (Howe et al. 2007) and learning in the operating theatre (Lyon 2003) (and academic surgeons as educators in theatre and clinic (Cox & Swanson 2002)). Others have explored small-group work dynamics (MacPherson et al. 2001), programme evaluation (Gerrity & Mahaffy 1998), and interprofessional learning (Bradley et al. 2009). Frye et al. (1993) commended a mixed approach to exploring medical students’ complex learning environment in a problem-based parallel track while rotating through clerkships, and Maudsley et al. used a questionnaire-based mixed methods approach to explore students’ perceptions of a good doctor (2007) and of learning in a problem-based curriculum (2008).

Barbour (1999) and Creswell (2003) recognized the mixed methods potential of the questionnaire, but it remains rather misused and maligned, and lists of ‘bona fide’ qualitative research data collection methods (Boulton et al. 1996) usually omit it. Bergsjø (1999) recognized its role in qualitative research, albeit as ‘the most programmatic approach’ (p. 560). Frye et al. (1993, p. 46) reported mixing ‘five qualitative data collection methods’, including 5-min questionnaires comprising open-ended questions only. They considered the questionnaire unobtrusive and efficient at giving complementary insights: ‘No single method captures the “big picture”’ (p. 59).

Compared with the revered semistructured interview (DiCicco-Bloom & Crabtree 2006), however, questionnaire weaknesses are often highlighted, whether from social desirability bias, acquiescence bias, rigidity, dogmatism or authoritarianism (Oppenheim 1992). Poor questionnaire design (or application) adds to a ‘bit of a study’ culture in medical education research.

In summary, mixed methods research is valuable in medical education, but the required time, effort, expertise (in ‘messiness’), and mixing techniques are seldom explicitly discussed. A carefully designed ‘questionnaire’ has much potential, but sloppy use of ‘a bit of a questionnaire’ for ‘a bit of a study’ undermines medical education research.

Comment

Although, as Eva (2008) noted, interpreting and integrating a highly varied education literature will tend towards quirkiness, this critical narrative review highlights some recurring messages about mixed methods research relevant to medical education, emerging over the last two decades. Medical education research struggles self-consciously for credibility against:

  • RCT-driven evidence-based medicine and health services research;

  • a jumble of philosophies, concepts, assumptions and criticisms; and

  • inadequate study designs pursuing unrealistic questions and expectations.

Grounded in a pragmatist world-view, mixed methods research is question-driven and conciliatory, and underpins much robust research in education generally. The ‘mixing’ varies by type, extent and intention (blending eclectic views of knowledge, traditions of enquiry, methods and findings). Mixing requires expertise and resilience, amid expectations and cultures that clash with those of other medical research.

Current literature about mixed methods focuses on its theory, techniques, nature, use and politics (Creswell 2009), but in medical education this literature is fragmented and poorly indexed. More write-ups should explicitly discuss the ‘mixing’ (Alise & Teddlie 2010; O’Cathain et al. 2010) (particularly of findings), rather than reporting qualitative–quantitative components separately. Mastering the mixing ‘trade’ involves harnessing the cognitive dissonance (Norman 1998; Colliver 2002a) and complementary strengths (Johnson & Onwuegbuzie 2004; Niaz 2008), while challenging the ‘incompatibility thesis’ dogma (forbidding the mixture of paradigms and methods; Howe 1988) and ‘Jack of all trades, master of none’ criticisms. As Norman (2008) noted, the ‘RCT’ and the ‘systematic review with meta-analysis’ (Norman & Eva 2008) are usually problematic here, and ‘The time is long overdue to abandon the worship of the false God of the RCT’ (p. 388). Mixing methods is not new, and seems increasingly relevant to medical education (Durning et al. 2010) – being robust and explicit about its theory and practice is overdue.

Ethical approval

Not applicable.

Funding: Nil

Acknowledgements

Thanks to my thesis supervisors, Dr Lyn Williams and Dr David Taylor, all that time ago, who commented constructively on my original review that I subsequently developed into this article.

Declaration of interest: The author reports no conflicts of interest. The author alone is responsible for the content and writing of this article.

References

  • Abbasi K, Smith R, editors on behalf of British Medical Journal Education Group for Guidelines on Evaluation: Dillner L, Hutchinson L, Kerr J, Leinster S, Matheson K, Peterson S, Rake M, Richards T, Smith R, Wood D, et al. 1999. Guidelines for evaluating papers on educational interventions. BMJ 318(7193):1265–1267
  • Albanese M, Horowitz S, Moss R, Farrell P. An institutionally funded program for educational research and development grants: It makes dollars and sense. Acad Med 1998; 73(7)756–761
  • Albanese MA, Mitchell S. Problem-based learning: A review of literature on its outcomes and implementation issues. Acad Med 1993; 68(1)52–81
  • Alise MA, Teddlie CA. Continuation of the paradigm wars? Prevalence rates of methodological approaches across the social/behavioral sciences. J Mixed Methods Res 2010; 4(2)103–126
  • Andrew S, Halcomb EJ. Mixed methods research for nursing and the health sciences. Wiley-Blackwell, Oxford 2009
  • Baernstein A, Liss HK, Carney PA, Elmore JG. Trends in study methods used in undergraduate medical education research, 1969-2007. JAMA 2007; 298(9)1038–1045
  • Barbour RS. Mixing qualitative methods: Quality assurance or qualitative quagmire?. Qual Health Res 1998; 8(3)352–361
  • Barbour RS. The case for combining qualitative and quantitative approaches in health services research. J Health Serv Res Policy 1999; 4(1)39–43
  • Barbour RS. Checklists for improving rigour in qualitative research: A case of the tail wagging the dog?. BMJ 2001; 322(7294)1115–1117
  • Benson K, Hartz AJ. A comparison of observational studies and randomized, controlled trials. N Engl J Med 2000; 342(25)1878–1886
  • Bergman MM. On concepts and paradigms in mixed methods research. J Mixed Methods Res 2010; 4: 171–175
  • Bergsjø P. Qualitative and quantitative research – Is there a gap, or only verbal disagreement?. Acta Obstet Gynecol Scand 1999; 78(7)559–562
  • Berkson L. Problem-based learning: Have the expectations been met?. Acad Med 1993; 68(10 Suppl)S79–S88
  • Bhopal R. Concepts of epidemiology: Integrating the ideas, theories, principles and methods of epidemiology, 2nd. Oxford University Press, Oxford 2008
  • Bligh J, Anderson MB. Medical teachers and evidence. Med Educ 2000; 34(3)162–163
  • Borrego M, Douglas EP, Amelink CT. Quantitative, qualitative, and mixed research methods in engineering education. J Eng Educ 2009; 98(1)53–66
  • Boulton M, Fitzpatrick R, Swinburn C. Qualitative research in health care: II. A structured review and evaluation of studies. J Eval Clin Pract 1996; 2(3)171–179
  • Bradley P, Cooper S, Duncan F. A mixed-methods study of interprofessional learning of resuscitation skills. Med Educ 2009; 43(9)912–922
  • Bradley EH, Curry LA, Devers KJ. Qualitative data analysis for health services research: Developing taxonomy, themes, and theory. Health Serv Res 2007; 42(4)1758–1772
  • Bradley P, Oterholt C, Nordheim L, Bjorndal A. Medical students’ and tutors’ experiences of directed and self-directed learning programs in evidence-based medicine – A qualitative evaluation accompanying a randomized controlled trial. Eval Rev 2005; 29(2)149–177
  • Broeder JL, Donze A. The role of qualitative research in evidence-based practice. Neonatal Netw 2010; 29(3)197–202
  • Bryman A. Barriers to integrating quantitative and qualitative research. J Mixed Methods Res 2007; 1(1)8–27
  • Bryson-Brockmann W, Roll D. Single-case experimental designs in medical education: An innovative research method. Acad Med 1996; 71(1)78–85
  • Buchanan DR. An uneasy alliance: Combining qualitative and quantitative research methods. Health Educ Q 1992; 19(1)117–135
  • Bunniss S, Kelly DR. Research paradigms in medical education research. Med Educ 2010; 44(4)358–366
  • Burrell G, Morgan G. Sociological paradigms and organisational analysis: Elements of the sociology of corporate life. Heinemann Educational Books, London 1979
  • Carney PA, Nierenberg DW, Pipas CF, Brooks WB, Stukel TA, Keller AM. Educational epidemiology: Applying population-based design and analytic approaches to study medical education. JAMA 2004; 292(9)1044–1050
  • Carr LT. The strengths and weaknesses of quantitative and qualitative research: What method for nursing?. J Adv Nurs 1994; 20(4)716–721
  • Chalmers AF. What is this thing called science?, 2nd. Open University Press, Buckingham 1982
  • Chapple A, Rogers A. Explicit guidelines for qualitative research: A step in the right direction, a defence of the ‘soft’ option, or a form of sociological imperialism?. Fam Pract 1998; 15(6)556–561
  • Cherryholmes CH. Notes on pragmatism and scientific realism. Educ Res 1992; 21(6)13–17
  • DiCicco-Bloom B, Crabtree BF. The qualitative research interview. Med Educ 2006; 40(4)314–321
  • Cohen L, Manion L. Research methods in education, 4th. Routledge, London 1994
  • Colliver JA. Constructivism and educational research: Hazards for research in medical education. Prof Educ Res Q 1996a; 17(4)2–5
  • Colliver JA. Science in the postmodern era: Postpositivism and research in medical education. Teach Learn Med 1996b; 8(1)10–18
  • Colliver JA. Constructivism with a dose of pragmatism: A cure for what ails educational research. Adv Health Sci Educ 1999; 4(2)187–190
  • Colliver JA. Effectiveness of problem-based learning curricula: Research and theory. Acad Med 2000; 75(3)259–266
  • Colliver JA. Constructivism: The view of knowledge that ended philosophy or a theory of learning and instruction?. Teach Learn Med 2002a; 14(1)49–51
  • Colliver JA. Educational theory and medical education practice: A cautionary note for medical school faculty [Review]. Acad Med 2002b; 77(12 Pt1)1217–1220
  • Colliver JA. Full-curriculum interventions and small-scale studies of transfer: Implications for psychology-type theory. Med Educ 2004; 38(12)1212–1218
  • Concato J, Shah N, Horwitz RI. Randomized, controlled trials, observational studies, and the hierarchy of research designs. N Engl J Med 2000; 342(25)1887–1892
  • Cook TD, Shadish WR. Social experiments: Some developments over the past fifteen years. Annu Rev Psychol 1994; 45: 545–580
  • Côté L, Turgeon J. Appraising qualitative research articles in medicine and medical education. Med Teach 2005; 27(1)71–75
  • Cox SS, Swanson MS. Identification of teaching excellence in operating room and clinic settings. Am J Surg 2002; 183(3)251–255
  • Creswell JW. Qualitative inquiry and research design: Choosing among five traditions. Sage, Thousand Oaks, CA 1998
  • Creswell JW. Research design: Qualitative, quantitative, and mixed methods approaches, 2nd. Sage Publications, London 2003
  • Creswell JW. Mapping the field of mixed methods research. J Mixed Methods Res 2009; 3(2)95–108
  • Creswell JW, Garrett AL. The “movement” of mixed methods research and the role of educators. S Afr J Educ 2008; 28(3)321–333
  • Cribb A, Bignold S. Towards the reflexive medical school: The hidden curriculum and medical education research. Stud High Educ 1999; 24(2)195–209
  • Crotty M. The foundations of social research: Meaning and perspective in the research process. Sage, London 1998
  • Curry LA, Nembhard IM, Bradley EH. Qualitative and mixed methods provide unique contributions to outcomes research. Circulation 2009; 119(10)1442–1452
  • Dauphinee WD. Contributions and challenges of medical education research – Response. Acad Med 1996; 71(10)S127–S128
  • Davies P. What is evidence-based education?. Br J Educ Stud 1999; 47(2)108–121
  • Demerath P. The science of context: Modes of response for qualitative researchers in education. Int J Qual Stud Educ 2006; 19(1)97–113
  • Denzin NK. The research act: A theoretical introduction to sociological methods, 2nd. McGraw-Hill, New York 1978
  • Dixon-Woods M, Sutton A, Shaw R, Miller T, Smith J, Young B, Bonas S, Booth A, Jones D. Appraising qualitative research for inclusion in systematic reviews: A quantitative and qualitative comparison of three methods. J Health Serv Res Policy 2007; 12(1)42–47, [Correction 2008, 13:56]
  • Dolmans D, Schmidt H. The advantages of problem-based curricula. Postgrad Med J 1996; 72(851)535–538
  • Durning SJ, Elnicki ME, Gruppen L, Torre D, Hemmer PA. AMEE spotlight: AMEE 2009 spotlight on educational research. Med Teach 2010; 32(4)340–342
  • Elliott J. Multimethod approaches in educational research. Int J Disabil Dev Educ 2004; 51(2)135–149
  • Eva KW. On the limits of systematicity. Med Educ 2008; 42(9)852–853
  • Finucane PM, Johnson SM, Prideaux DJ. Problem-based learning: Its rationale and efficacy. Med J Aust 1998; 168(9)445–448
  • Friedman CP. Contributions and challenges of medical education research – Response. Acad Med 1996; 71(10)S129
  • Frye AW, Hoban JD, Richards BF. Capturing the complexity of clinical learning environments with multiple qualitative methods. Eval Health Prof 1993; 16(1)44–60
  • Gerrity MS, Mahaffy J. Evaluating change in medical school curricula: How did we know where we were going?. Acad Med 1998; 73(9)S55–S59
  • Gillett G. Clinical medicine and the quest for certainty. Soc Sci Med 2004; 58(4)727–738
  • Giordan A, Jacquemet S, Golay A. A new approach for patient education: Beyond constructivism. Patient Educ Couns 1999; 38(1)61–67
  • Goguen J, Knight M, Tiberius R. Is it science? A study of the attitudes of medical trainees and physicians toward qualitative and quantitative research. Adv Health Sci Educ 2008; 13(5)659–674
  • Goldacre M. The role of cohort studies in medical research. Pharmacoepidemiol Drug Saf 2001; 10(1)5–11
  • Goodwin LD, Goodwin WL. Qualitative vs. quantitative research or qualitative and quantitative research?. Nurs Res 1984; 33(6)378–380
  • Grbich C. Qualitative research in health. Sage, London 1999
  • Green J, Britten N. Qualitative research and evidence based medicine. BMJ 1998; 316(7139)1230–1232
  • Greene JC, Caracelli VJ, Graham WF. Toward a conceptual framework for mixed-method evaluation designs. Educ Eval Policy Anal 1989; 11(3)255–274
  • Haig A, Dozier M. BEME Guide No 3: Systematic searching for evidence in medical education – Part 1: Sources of information. Med Teach 2003a; 25(4)352–363
  • Haig A, Dozier M. BEME Guide No. 3: Systematic searching for evidence in medical education – Part 2: Constructing searches. Med Teach 2003b; 25(5)463–484
  • Hallet CE. Pragmatism and Project 2000: The relevance of Dewey's theory of experimentalism to nursing education. J Adv Nurs 1997; 26(6)1229–1234
  • Harden RM, Grant J, Buckley G, Hart IR. Best evidence medical education (BEME) Guide No. 1: Best evidence medical education. Med Teach 1999; 21(6)553–562
  • Harris I. What does “The discovery of grounded theory” have to say to medical education?. Adv Health Sci Educ 2003; 8(1)49–61
  • Hoddinott P, Pill R. A review of recently published qualitative research in general practice. More methodological questions than answers? Fam Pract 1997; 14(4)313–319
  • Hoff TJ, Witt LC. Exploring the use of qualitative methods in published health services and management research. Med Care Res Rev 2000; 57(2)139–160
  • Howe KR. Against the quantitative–qualitative incompatibility thesis or dogmas die hard. Educ Res 1988; 17(8)10–16
  • Howe KR. A critique of experimentalism. Qual Inq 2004; 10(1)42–61
  • Howe A, Dagley V, Hopayian K, Lillicrap M. Patient contact in the first year of basic medical training – Feasible, educational, acceptable?. Med Teach 2007; 29(2–3)237–245
  • Ivankova NV, Creswell JW, Stick SL. Using mixed-methods sequential explanatory design: From theory to practice. Field Methods 2006; 18(1)3–20
  • Jervis LM, Jervis L, 2005. What is the constructivism in constructive alignment? Biosci Educ Electron J 6 (November):1–14
  • Johnson RB, Onwuegbuzie AJ. Mixed methods research: A research paradigm whose time has come. Educ Res 2004; 33(7)14–26
  • Kadushin C, Hecht S, Sasson T, Saxe L. Triangulation and mixed methods designs: Practicing what we preach in the evaluation of an Israel experience educational program. Field Methods 2008; 20(1)46–65
  • Leung WC. Why is evidence from ethnographic and discourse research needed in medical education: The case of problem-based learning. Med Teach 2002; 24(2)169–172
  • Leung GM, Johnston JM. Evidence-based medical education – Quo vadis?. J Eval Clin Pract 2006; 12(3)353–364
  • Levine DM, Barsky AJ, Fox RC, Freidin RB, Williams SR, Wysong JA. Trends in medical education research: Past, present, and future. J Med Educ 1974; 49(2)129–136
  • Levinson AJ. Where is evidence-based instructional design in medical education curriculum development?. Med Educ 2010; 44(6)536–537
  • Leydens JA, Moskal BM, Pavelich M. Qualitative methods used in the assessment of engineering education. J Eng Educ 2004; 93(1)65–72
  • Lingard L, Albert M, Levinson W. Grounded theory, mixed methods, and action research. [Qualitative research] BMJ 2008; 337: 459–461
  • Lloyd DA. Can medical education be researched?. Med Teach 1991; 13(2)145–148
  • Lovejoy FH, Armstrong E. Medical education research retreat. Acad Med 1996; 71(1)3–4
  • Lyon PMA. Making the most of learning in the operating theatre: Student strategies and curricular initiatives. Med Educ 2003; 37(8)680–688
  • MacPherson R, Jones A, Whitehouse CR, O’Neill PA. Small group learning in the final year of a medical degree: A quantitative and qualitative evaluation. Med Teach 2001; 23(5)494–502
  • Mason J. Linking qualitative and quantitative data analysis. Analyzing Qualitative Data, A Bryman, RG Burgess. Routledge, London 1994; 69–110, Chapter 5
  • Maudsley G, Williams EMI, Taylor DCM. Junior medical students’ notions of a ‘good doctor’ and related expectations: A mixed methods study. Med Educ 2007; 41(5)476–486
  • Maudsley G, Williams EMI, Taylor DCM. Problem-based learning at the receiving end: A ‘mixed methods’ study of junior medical students’ perspectives. Adv Health Sci Educ 2008; 13(4)435–451
  • Mawardi BH. Human element in medical education and medical education research [editorial]. J Med Educ 1967; 42(3)279–280
  • Mayer RE. Applying the science of learning to medical education. Med Educ 2010; 44(6)543–549
  • Mays N, Pope C. Rigour and qualitative research. BMJ 1995; 311(6997)109–112
  • McEvoy P, Richards D. A critical realist rationale for using a combination of quantitative and qualitative methods. J Res Nurs 2006; 11(1)66–78
  • McGuire CH. Contributions and challenges of medical education research. Acad Med 1996; 71(10 Suppl)S121–S126
  • Meadows-Oliver M. Does qualitative research have a place in evidence-based nursing practice?. J Pediatr Health Care 2009; 23(5)352–354
  • Miller SI, Fredericks M. Mixed-methods and evaluation research: Trends and issues. Qual Health Res 2006; 16(4)567–579
  • Moffatt S, White M, Mackintosh J, Howel D. Using quantitative and qualitative data in health services research – What happens when mixed method findings conflict?. BMC Health Serv Res 2006; 6: 10
  • Morse JM. Approaches to qualitative-quantitative methodological triangulation. Nurs Res 1991; 40(2)120–123
  • Morse JM. Editorial: Myth #93: Reliability and validity are not relevant to qualitative inquiry. Qual Health Res 1999a; 9(6)717–718
  • Morse JM. Editorial: Qualitative generalizability. Qual Health Res 1999b; 9(1)5–6
  • Morse JM. Editorial: Evolving trends in qualitative research: Advances in mixed-method design. Qual Health Res 2005; 15(5)583–585
  • Morse JM. The politics of evidence: Keynote address: First Congress of Qualitative Inquiry. Qual Health Res 2006; 16(3)395–404
  • Murray E. Challenges in educational research. Med Educ 2002; 36(2)110–112
  • Mylopoulos M, Regehr G. Cognitive metaphors of expertise and knowledge: Prospects and limitations for medical education. Med Educ 2007; 41(12)1159–1165
  • Newman I, 2000. A conceptualization of mixed methods: A need for inductive/deductive approach to conducting research. Annual meeting of the American Educational Research Association. 2000 April 24–25. New Orleans, LA, 14pp
  • Niaz M. A rationale for mixed methods (integrative) research programmes in education. J Philos Educ 2008; 42(2)287–305
  • Niglas K, 1999. Quantitative and qualitative inquiry in educational research: Is there a paradigmatic difference between them? Education-line Paper presented at the European Conference on Educational Research. 1999 September 22–25. Lahti, Finland. [Published 1999; Accessed 7.11.10] http://www.leeds.ac.uk/educol/documents/00001487.htm
  • Norman GR. Editorial: On science, stories, quality and quantity. Adv Health Sci Educ 1998; 3(2)77–80
  • Norman GR. Reflections on BEME. Med Teach 2000; 22(2)141–144
  • Norman G. Editorial: Theory testing research versus theory-based research. Adv Health Sci Educ 2004; 9(3)175–178
  • Norman G. Editorial: The joy of science. Adv Health Sci Educ 2006; 11(1)1–4
  • Norman G. The end of educational science?. Adv Health Sci Educ 2008; 13(4)385–389
  • Norman GR, Eva KW, 2008. Quantitative methods. ASME Monograph, Association for the Study of Medical Education
  • Norman GR, Schmidt HS. Of what practical use is a baby? Perspectives on educational research as a scientific enterprise. Prof Educ Res Q 1999; 20(3)1–5
  • Norman GR, Schmidt HG. Effectiveness of problem-based learning curricula: Theory, practice and paper darts. Med Educ 2000; 34(9)721–728
  • O’Cathain A. Mixed methods research in the health sciences: A quiet revolution. J Mixed Methods Res 2009; 3(1)3–6
  • O’Cathain A, Murphy E, Nicholl J. Why, and how, mixed methods research is undertaken in health services research in England: A mixed methods study. BMC Health Serv Res 2007a; 7: 85
  • O’Cathain A, Murphy E, Nicholl J. Integration and publications as indicators of “yield” from mixed methods studies. J Mixed Methods Res 2007b; 1(2)147–163
  • O’Cathain A, Murphy E, Nicholl J. Research methods & reporting: Three techniques for integrating data in mixed methods studies. Br Med J 2010; 341: 4587
  • O’Cathain A, Nicholl J, Murphy E. Structural issues affecting mixed methods studies in health research: A qualitative study. BMC Med Res Methodol 2009; 9: 82
  • Onwuegbuzie AJ, 2000. Positivists, post-positivists, post-structuralists, and post-modernists: Why can’t we all get along? Towards a framework for unifying research paradigms. Annual Meeting of the Association for the Advancement of Educational Research, Ponte Vedra, FL, 2000 November 18. 20p
  • Onwuegbuzie AJ, Bustamante RM, Nelson JA. Mixed research as a tool for developing quantitative instruments. J Mixed Methods Res 2010; 4(1)56–78
  • Oppenheim AN. Questionnaire design, interviewing and attitude measurement, 2nd. Continuum, London 1992
  • O’Sullivan EM. A national study on the attitudes of Irish dental faculty members to faculty development. Eur J Dent Educ 2010; 14(1)43–49
  • Petersen S. Time for evidence based medical education. BMJ 1999; 318(7193)1223–1224
  • Popay J, Rogers A, Williams G. Rationale and standards for the systematic review of qualitative literature in health services research. Qual Health Res 1998; 8(3)341–351
  • Pope C, Mays N. Reaching the parts other methods cannot reach: An introduction to qualitative methods in health and health services research. BMJ 1995; 311(6996)42–45
  • Pope C, Mays N., editors. 1999. Qualitative research in health care. 2nd ed. London: BMJ Books
  • Pope C, Mays N. Critical reflections on the rise of qualitative research. [Research methods & reporting] BMJ 2009; 339: 737–739
  • Prideaux D. Medical education research: Is there virtue in eclecticism?. Med Educ 2002a; 36(6)502–503
  • Prideaux D. Researching the outcomes of educational interventions: A matter of design – RCTs have important limitations in evaluating educational interventions. BMJ 2002b; 324(7330): 126–127
  • Proctor JD. The social construction of nature: Relativist accusations, pragmatist and critical realist responses. Ann Assoc Am Geogr 2004; 88(3): 352–376
  • Prystowsky JB, Bordage G. An outcomes research perspective on medical education: The predominance of trainee assessment and satisfaction. Med Educ 2001; 35(4): 331–336
  • Punch KF. Introduction to social research: Qualitative and quantitative approaches. Sage, London 1998
  • Raudenbush SW. Learning from attempts to improve schooling: The contribution of methodological diversity. Educ Res 2005; 34(5): 25–31
  • Rees CE, Monrouxe LV. Theory in medical education research: How do we get there? Med Educ 2010; 44(4): 334–339
  • Reese AC. Implications of results from cognitive science research for medical education. Med Educ Online 1998; 3(1): 1–9
  • Regehr G. It's NOT rocket science: Rethinking our metaphors for research in health professions education. Med Educ 2010; 44(1): 31–39
  • Richards HM, Schwartz LJ. Ethics of qualitative research: Are there special issues for health services research? Fam Pract 2002; 19(2): 135–139
  • Ring L, Gross CR, McColl E. Putting the text back into context: Toward increased use of mixed methods for quality of life research. Qual Life Res 2010; 19(5): 613–615
  • Roche AM. Making better use of qualitative research: Illustrations from medical education research. Health Educ J 1991; 50(3): 131–137
  • Rorty R. Pragmatism. Int J Psychoanal 2000; 81: 819–823
  • Rossman GB, Wilson BL. Numbers and words: Combining quantitative and qualitative methods in a single large-scale evaluation study. Eval Rev 1985; 9(5): 627–643
  • Rossman GB, Wilson BL. Numbers and words revisited: Being “shamelessly eclectic”. Paper presented at the Annual Meeting of the American Educational Research Association; 1991 April 3–7; Chicago, IL. 20 p
  • Sales CS, Schlaff AL. Reforming medical education: A review and synthesis of five critiques of medical practice. Soc Sci Med 2010; 70(11): 1665–1668
  • Sandelowski M. Combining qualitative and quantitative sampling, data collection, and analysis techniques in mixed-method studies. Res Nurs Health 2000; 23(3): 246–255
  • Sandelowski M. Justifying qualitative research. [Editorial] Res Nurs Health 2008; 31(3): 193–195
  • Schifferdecker KE, Reed VA. Using mixed methods research in medical education: Basic guidelines for researchers. Med Educ 2009; 43: 637–644
  • Schuwirth LWT, van der Vleuten CPM. Changing education, changing assessment, changing research? Med Educ 2004; 38(8): 805–812
  • Sestini P. Epistemology and ethics of evidence-based medicine: Putting goal-setting in the right place. J Eval Clin Pract 2010; 16(2): 301–305
  • Shea JA. Mind the gap: Some reasons why medical education research is different from health services research. Med Educ 2001; 35(4): 319–320
  • Shea JA, Arnold L, Mann KV. A RIME perspective on the quality and relevance of current and future medical education research. Acad Med 2004; 79(10): 931–938
  • Shortell SM. The emergence of qualitative methods in health services research. Health Serv Res 1999; 34(5 Pt 2): 1083–1090
  • Stacy R, Spencer J. Assessing the evidence in qualitative medical education research. Med Educ 2000; 34(7): 498–500
  • Tashakkori A, Creswell JW. The new era of mixed methods. J Mixed Methods Res 2007; 1(1): 3–7
  • Tashakkori A, Teddlie C. Mixed methodology: Combining qualitative and quantitative approaches. Sage, London 1998
  • Taylor CR. Perspective: A tale of two curricula: A case for evidence-based education? Acad Med 2010; 85(3): 507–511
  • Teddlie C, Tashakkori A. A general typology of research designs featuring mixed methods. Res Sch 2006; 13(1): 12–28
  • Teddlie C, Tashakkori A. Foundations of mixed methods research: Integrating quantitative and qualitative approaches in the social and behavioral sciences. Sage, London 2009
  • Teddlie C, Yu F. Mixed methods sampling: A typology with examples. J Mixed Methods Res 2007; 1(1): 77–100
  • Thorne S. The role of qualitative research within an evidence-based context: Can metasynthesis be the answer? Int J Nurs Stud 2009; 46(4): 569–575
  • Trochim WMK. Positivism & post-positivism. Research methods knowledge base. Web Center for Social Research Methods website. [Published 2006; Accessed 7.11.10]. Available from http://www.socialresearchmethods.net/kb/positvsm.php
  • Van der Vleuten CPM, Dolmans DHJM, Scherpbier AJJA. The need for evidence in education. Med Teach 2000; 22(3): 246–250
  • Vernon DT, Blake RL. Does problem-based learning work? A meta-analysis of evaluative research. Acad Med 1993; 68(7): 550–563
  • Wartman SA. Research in medical education: The challenge for the next decade. Acad Med 1994; 69(8): 608–614
  • Wilson HJ. The myth of objectivity: Is medicine moving towards a social constructivist medical paradigm? [Review] Fam Pract 2000; 17(2): 203–209
  • Wilson I. Qualitative research in medical education. [letter] Med Educ 2010; 44(9): 941
  • Wolcott HF. Posturing in qualitative inquiry. In: LeCompte MD, Millroy WL, Preissle J, eds. The handbook of qualitative research in education. Academic Press, New York 1992; 3–52
  • Wolf FM. Lessons to be learned from evidence-based medicine: Practice and promise of evidence-based medicine and evidence-based education. Med Teach 2000; 22(3): 251–259
  • Wolf FM, Shea JA, Albanese MA. Toward setting a research agenda for systematic reviews of evidence of the effects of medical education. Teach Learn Med 2001; 13(1): 54–60
  • Woolf K. Surely this can’t be proper research? Experiences of a novice qualitative researcher. Clin Teach 2006; 3(1): 19–22
