
Mixing and matching: using qualitative methods to improve quantitative impact evaluations (IEs) and systematic reviews (SRs) of development outcomes

Emmanuel Jimenez, Hugh Waddington, Neeta Goel, Audrey Prost, Andrew Pullin, Howard White, Shaon Lahiri & Anmol Narain
Pages 400-421 | Received 17 Jul 2018, Accepted 04 Oct 2018, Published online: 02 Nov 2018
 

ABSTRACT

Recent evaluations have begun to use qualitative data in ways that improve the quality and relevance of studies, through the inferences drawn from them and their applicability to policy makers and programme implementers. This paper reviews this work and identifies good practices for integrating qualitative methods into quantitative impact evaluations (IEs) and systematic reviews (SRs). Drawing upon previous approaches and recent literature on the characteristics of such practices, we developed two tools to assess the methodological rigour and mixed methods integration of 40 IEs and 7 SRs. We find that successful mixed methods quantitative impact evaluations: (1) provide a clear rationale for the integration of methods; (2) deploy multidisciplinary teams; (3) provide adequate documentation; and (4) acknowledge limitations to the generalisability of qualitative and quantitative findings. Successful integration tended to improve mixed methods impact evaluations by collecting better data to inform the study design and findings, which helped contextualise quantitative results. Our main observation on the systematic reviews is that mixed methods reviews which bring together literatures answering different questions can go beyond the ‘sum of their parts’ to provide holistic answers about development effectiveness. These findings inform several recommendations to improve the conduct and reporting of mixed methods impact evaluations and systematic reviews.

Acknowledgments

This paper draws on a CEDIL paper which is available online. We would like to acknowledge the helpful comments we received from our CEDIL and DFID reviewers, as well as the anonymous reviewer for the journal, without implicating them in any way in any errors in the final product.

Disclosure statement

No potential conflict of interest was reported by the authors.

Notes

1. We recognise that the primary focus on attributable evidence is a limitation of the scope of the paper; it is not meant to suggest that quantitative analysis ranks ahead of qualitative techniques.

2. This definition is consistent with that of Creswell (2014), who defines ‘qualitative research’ as ‘a means for exploring and understanding the meaning individuals or groups ascribe to a social or human problem. The process of research involves emerging questions and procedures; collecting data in the participants’ setting; analysing the data inductively; building from particulars to general themes; and making interpretations of the meanings of the data.’

3. Bias is commonly understood to be a concept drawn from the quantitative research paradigm and incompatible with the philosophical underpinnings of qualitative enquiry (Creswell 2014; Thorne, Stephens, and Truant 2016; Davies and Dodd 2002). Instead, qualitative researchers agree that concepts such as rigour and trustworthiness are more applicable to the subjective nature of qualitative research. Our tool incorporates these concepts based on the ideas proposed by Creswell (2014), Greene, Caracelli, and Graham (1989), Miles and Huberman (1994), Pluye et al. (2011) and Langer (2017), among others.

4. Section B5 of the MMIE tool covers the description of the context and conditions under which phenomena of interest occur, and the scope and limitations of data presented to enable generalisation to other settings. The term ‘thick descriptions’ is typically used in ethnographies, and we erred on the side of caution by not privileging one method over the other in the scoring criteria.

6. The World Bank’s harmonised list of fragile situations for 2018 is available at: http://pubdocs.worldbank.org/en/189701503418416651/FY18FCSLIST-Final-July-2017.pdf.

7. By analytical framework, we refer to whether or not the study reported its themes, coding and analysis procedures.

8. For an example of such divergences, refer to Section IV, part B.

9. Integration indicators cover six domains, which include the provision of logic or programme models explored through mixed methods, the use of mixed methods to inform components of study design, and to inform the interpretation of findings, as well as limitations to the integration of methods. For more information, refer to Section C of the tool in the appendix, and part II, Section C.

11. Please refer to footnote no. 5.

Additional information

Notes on contributors

Emmanuel Jimenez

Emmanuel (Manny) Jimenez is Executive Director of the International Initiative on Impact Evaluation (3ie), a non-profit organization which provides grants for the rigorous assessment of the effectiveness of development projects and programs and supports the use of such evidence in decision-making. He came to 3ie early in 2015 after many years at the World Bank Group, where he provided technical expertise and strategic leadership in a number of research and operational positions, including as director of the bank’s operational program in human development in its Asia regions from 2000 to 2012 and as director of public sector evaluations from 2012 to 2014. Before joining the bank, Dr Jimenez was on the economics faculty at the University of Western Ontario in London, Canada. He received his Ph.D. from Brown University.

Hugh Waddington

Hugh Waddington is Senior Evaluation Specialist in 3ie's Synthesis and Reviews Office. He has a background in research and policy, having worked previously in the Government of Rwanda, the UK National Audit Office and the World Bank, and before that with Save the Children UK and the Department for International Development. He is managing editor of the Journal of Development Effectiveness and co-chair of the International Development Coordinating Group (IDCG) of the Campbell Collaboration.

Neeta Goel

Neeta Goel is a Senior Evaluation Specialist in 3ie's Evaluation Office. She is responsible for the review and management of 3ie-funded research and impact evaluation grants. She has over seventeen years of experience in the international development sector. Her work includes the design, implementation and evaluation of NGO interventions focusing on disadvantaged children and communities. Prior to 3ie, Neeta worked in several national and international NGOs. In her most recent assignment, she served as the Program Director for Children International, managing programmes in ten countries across Africa, Asia and Latin America. Neeta holds a Ph.D. in Childhood Studies from Rutgers University and is a member of the American Evaluation Association.

Audrey Prost

Audrey Prost is a social anthropologist based at LSHTM, where she serves as the Head of the Doctoral College. Her research focuses on designing and evaluating community interventions to improve maternal, child and adolescent health in India. She is particularly interested in participatory interventions and community engagement methods.

Andrew Pullin

Andrew Pullin is the Director of the Centre for Evidence-Based Conservation at Bangor University. He is interested in the concept of evidence synthesis and evidence-based practice in environmental management. In 2007 he co-founded the Collaboration for Environmental Evidence which promotes the conduct and dissemination of systematic reviews of evidence on environmental impacts of human actions and effectiveness of environmental management and policy interventions worldwide.

Howard White

Howard White is CEO of the Campbell Collaboration. He has published widely on the effectiveness of aid and anti-poverty programmes, and on evaluation and systematic review methods. He is former editor of the Journal of Development Effectiveness and Journal of Development Studies.

Shaon Lahiri

Shaon Lahiri is a PhD candidate in the Social and Behavioral Sciences at the George Washington University. Prior to this, he was a research associate in 3ie’s Evaluation Office, where he worked on grants related to water, sanitation and hygiene.

Anmol Narain

Anmol Narain is a research assistant in 3ie's Evaluation Office. She provides research, planning and project management support for 3ie's thematic grant programme on promoting latrine use in rural India, and other grants related to water, sanitation and hygiene.
