Original Articles

Timely evaluation in international development

Pages 482-508 | Received 26 Jul 2018, Accepted 30 Oct 2018, Published online: 27 Nov 2018

References

  • Andrews, M. 2015. “Explaining Positive Deviance in Public Sector Reforms in Development.” World Development 74: 197–208. doi:10.1016/j.worlddev.2015.04.017.
  • Baker, U., S. Peterson, T. Marchant, G. Mbaruku, S. Temu, F. Manzi, and C. Hanson. 2015. “Identifying Implementation Bottlenecks for Maternal and Newborn Health Interventions in Rural Districts of the United Republic of Tanzania.” Bulletin of the World Health Organization 93: 380–389. doi:10.2471/BLT.14.141879.
  • Bamberger, M. 2012. “Introduction to Mixed Methods in Impact Evaluation.” Impact Evaluation Notes 3: 1–38.
  • Barder, O., and B. Ramalingam. 2012. Complexity, Adaptation, and Results. Center for Global Development. https://www.cgdev.org/blog/complexity-adaptation-and-results.
  • Barnett, C., and T. Munslow. 2014. “Process Tracing: The Potential and Pitfalls for Impact Evaluation in International Development.” Summary of a workshop held on 7 May 2014 (IDS Evidence Report 102). Institute of Development Studies.
  • Barr, J. 2015. Monitoring and Evaluating Flexible and Adaptive Programming. Itad. http://www.itad.com/monitoring-and-evaluating-flexible-and-adaptive-programming/.
  • Beach, D., and R. B. Pedersen. 2013. Process-Tracing Methods: Foundations and Guidelines. Ann Arbor: University of Michigan Press.
  • Beebe, J. 2001. Rapid Assessment Process: An Introduction. Walnut Creek, CA: AltaMira Press.
  • Befani, B. 2013. “Between Complexity and Generalization: Addressing Evaluation Challenges with QCA.” Evaluation 19: 269–283. doi:10.1177/1474022213493839.
  • Befani, B., and J. Mayne. 2014. “Process Tracing and Contribution Analysis: A Combined Approach to Generative Causal Inference for Impact Evaluation.” IDS Bulletin 45: 17–36. doi:10.1111/1759-5436.12110.
  • Benneyan, J. C., R. C. Lloyd, and P. E. Plsek. 2003. “Statistical Process Control as a Tool for Research and Healthcare Improvement.” BMJ Quality & Safety 12: 458–464. doi:10.1136/qhc.12.6.458.
  • Bertrand, M., D. Karlan, S. Mullainathan, E. Shafir, and J. Zinman. 2010. “What’s Advertising Content Worth? Evidence from a Consumer Credit Marketing Field Experiment.” The Quarterly Journal of Economics 125: 263–306. doi:10.1162/qjec.2010.125.1.263.
  • Bhatt, D. L., and C. Mehta. 2016. “Adaptive Designs for Clinical Trials.” The New England Journal of Medicine 375: 65–74. doi:10.1056/NEJMra1510061.
  • Biglan, A., D. Ary, and A. C. Wagenaar. 2000. “The Value of Interrupted Time-Series Experiments for Community Intervention Research.” Prevention Science 1: 31–49. doi:10.1023/A:1010024016308.
  • Bothwell, L. E., J. Avorn, N. F. Khan, and A. S. Kesselheim. 2018. “Adaptive Design Clinical Trials: A Review of the Literature and ClinicalTrials.gov.” BMJ Open 8: e018320. doi:10.1136/bmjopen-2017-018320.
  • Burke, L. E., S. Shiffman, E. Music, M. A. Styn, A. Kriska, A. Smailagic, D. Siewiorek, et al. 2017. “Ecological Momentary Assessment in Behavioral Research: Addressing Technological and Human Participant Challenges.” Journal of Medical Internet Research 19: e77. doi:10.2196/jmir.7138.
  • Busza, J., S. Teferra, S. Omer, and C. Zimmerman. 2017. “Learning from Returnee Ethiopian Migrant Domestic Workers: A Qualitative Assessment to Reduce the Risk of Human Trafficking.” Globalization and Health 13. doi:10.1186/s12992-017-0293-x.
  • Butler, L. M. 1995. The Sondeo: A Rapid Reconnaissance Approach for Situational Assessment. WREP0127, Washington State University Extension Service.
  • Cellamare, M., S. Ventz, E. Baudin, C. D. Mitnick, and L. Trippa. 2017. “A Bayesian Response-Adaptive Trial in Tuberculosis: The endTB Trial.” Clinical Trials 14: 17–28. doi:10.1177/1740774516665090.
  • Choko, A. T., K. Fielding, N. Stallard, H. Maheswaran, A. Lepine, N. Desmond, M. K. Kumwenda, and E. L. Corbett. 2017. “Investigating Interventions to Increase Uptake of HIV Testing and Linkage into Care or Prevention for Male Partners of Pregnant Women in Antenatal Clinics in Blantyre, Malawi: Study Protocol for a Cluster Randomised Trial.” Trials 18. doi:10.1186/s13063-017-2093-2.
  • Connors, S. C., S. Nyaude, A. Challender, E. Aagaard, C. Velez, and J. Hakim. 2017. “Evaluating the Impact of the Medical Education Partnership Initiative at the University of Zimbabwe College of Health Sciences Using the Most Significant Change Technique.” Academic Medicine 92: 1264–1268. doi:10.1097/ACM.0000000000001519.
  • Copestake, J. 2014. “Credible Impact Evaluation in Complex Contexts: Confirmatory and Exploratory Approaches.” Evaluation 20: 412–427. doi:10.1177/1356389014550559.
  • Copestake, J., C. Allan, B. W. Van Belay, M. Goshu, T. Mvula, P. Remnant, F. Thomas, and E. Zerahun. 2018a. “Managing Relationships in Qualitative Impact Evaluation of International Development: QuIP Choreography as a Case Study.” Evaluation 24: 169–184. doi:10.1177/1356389018763243.
  • Copestake, J., and F. Remnant. 2015. “Assessing Rural Transformations: Piloting a Qualitative Impact Protocol in Malawi and Ethiopia.” In Mixed Methods Research in Poverty and Vulnerability, 119–148. London: Palgrave Macmillan. doi:10.1057/9781137452511_6.
  • Copestake, J., M. Morsink, and F. Remnant, eds. 2018b. Attributing Development Impact: The QuIP Case Book. Rugby: Practical Action.
  • Cori, A., C. A. Donnelly, I. Dorigatti, N. M. Ferguson, C. Fraser, T. Garske, T. Jombart, et al. 2017. “Key Data for Outbreak Evaluation: Building on the Ebola Experience.” Philosophical Transactions of the Royal Society B: Biological Sciences 372. doi:10.1098/rstb.2016.0371.
  • Davies, R. 1998. “An Evolutionary Approach to Facilitating Organisational Learning: An Experiment by the Christian Commission for Development in Bangladesh.” Impact Assessment and Project Appraisal 16 (3): 243–250. doi:10.1080/14615517.1998.10590213.
  • Davies, R. 2014. Thinking about Set Relationships within Monitoring Data. Rick on the Road. http://mandenews.blogspot.co.uk/2014/01/thinking-about-set-relationships-within.html.
  • Davies, R. 2016a. “Qualitative Comparative Analysis.” Better Evaluation. Accessed 1 March 2018. http://www.betterevaluation.org/en/evaluation-options/qualitative_comparative_analysis
  • Davies, R. 2016b. Evaluating the Impact of Flexible Development Interventions Using a ‘Loose’ Theory of Change: Reflections on the Australia-Mekong NGO Engagement Platform (A Method Lab Publication). London: Overseas Development Institute.
  • Davies, R., and J. Dart. 2005. The “Most Significant Change” (MSC) Technique: A Guide to Its Use. https://www.kepa.fi/tiedostot/most-significant-change-guide.pdf.
  • Davies, R., J. Laidlaw, and P. Rogers. 2016. “Process Tracing.” Better Evaluation. https://www.betterevaluation.org/en/evaluation-options/processtracing.
  • DDD. 2014. “Doing Development Differently.” Doing Development Differently. Accessed 17 February 2018. http://doingdevelopmentdifferently.com/the-ddd-manifesto/
  • Dellicour, S., J. Hill, J. Bruce, P. Ouma, D. Marwanga, P. Otieno, M. Desai, M. J. Hamel, S. Kariuki, and J. Webster. 2016. “Effectiveness of the Delivery of Interventions to Prevent Malaria in Pregnancy in Kenya.” Malaria Journal 15: 221. doi:10.1186/s12936-016-1261-2.
  • DFID. 2012. Results in Fragile and Conflict-Affected States and Situations: How to Note. Department for International Development. https://www.gov.uk/government/publications/results-in-fragile-and-conflict-affected-states-and-situations.
  • Dibner-Dunlap, A., and Y. Rathore. 2016. “Beyond RCTs: How Rapid-Fire Testing Can Build Better Financial Products.” Innovations for Poverty Action. Accessed 2 January 2018. https://www.poverty-action.org/blog/beyond-rcts-how-rapid-fire-testing-can-build-better-financial-products
  • Earl, S., F. Carden, and T. Smutylo. 2001. Outcome Mapping: Building Learning and Reflection into Development Programs. Ottawa, Canada: International Development Research Centre.
  • Eirich, F., and A. Morrison. n.d. Guide 6: Contribution Analysis, Social Science Methods Series. Scottish Government. https://www.gov.scot/resource/doc/175356/0116687.pdf.
  • Fereday, S. 2015. A Guide to Quality Improvement Methods. Healthcare Quality Improvement Partnership. https://www.hqip.org.uk/resource/guide-to-quality-improvement-methods/#.W-1nuTj7SUk.
  • Gamble, J. 2006. A Developmental Evaluation Primer. Canada: J.W. McConnell Family Foundation.
  • Ganann, R., D. Ciliska, and H. Thomas. 2010. “Expediting Systematic Reviews: Methods and Implications of Rapid Reviews.” Implementation Science 5: 56. doi:10.1186/1748-5908-5-56.
  • Garnett, G. P., T. B. Hallett, A. Takaruza, J. Hargreaves, R. Rhead, M. Warren, C. Nyamukapa, and S. Gregson. 2016. “Providing a Conceptual Framework for HIV Prevention Cascades and Assessing Feasibility of Empirical Measurement with Data from East Zimbabwe: A Case Study.” Lancet HIV 3: e297–e306. doi:10.1016/S2352-3018(16)30039-X.
  • Goertz, G., and J. Mahoney. 2012. A Tale of Two Cultures: Qualitative and Quantitative Research in the Social Sciences. Princeton: Princeton University Press.
  • Grant, M. J., and A. Booth. 2009. “A Typology of Reviews: An Analysis of 14 Review Types and Associated Methodologies.” Health Information & Libraries Journal 26: 91–108. doi:10.1111/j.1471-1842.2009.00848.x.
  • Green, D. 2015. Doing Development Differently: A Great Discussion on Adaptive Management (No, Really). From Poverty to Power. https://oxfamblogs.org/fp2p/doing-development-differently-a-great-discussion-on-adaptive-management-no-really/.
  • Harris, K. J., N. W. Jerome, and S. B. Fawcett. 1997. “Rapid Assessment Procedures: A Review and Critique.” Human Organization 56: 375–378. doi:10.17730/humo.56.3.w525025611458003.
  • HEARD Project. 2018. “Rapid Review Vs. Systematic Review: What Are the Differences?” USAID. Accessed 26 October 2018. https://www.heardproject.org/news/rapid-review-vs-systematic-review-what-are-the-differences/
  • Hildebrand, P. E. 1981. “Combining Disciplines in Rapid Appraisal: The Sondeo Approach.” Agricultural Administration 8: 423–432. doi:10.1016/0309-586X(81)90037-6.
  • Ho, L. S., G. Labrecque, I. Batonon, V. Salsi, and R. Ratnayake. 2015. “Effects of a Community Scorecard on Improving the Local Health System in Eastern Democratic Republic of Congo: Qualitative Evidence Using the Most Significant Change Technique.” Conflict and Health 9: 27. doi:10.1186/s13031-015-0055-4.
  • Hubbard, B. 2010. Root Cause Analysis (Overview). Lean Learning Revolution!. https://bobsleanlearning.wordpress.com/2010/09/16/root-cause-analysis-overview/.
  • IPA. 2016. Introduction to Rapid-Fire Operational Testing for Social Programs (Goldilocks Deep Dive). New Haven: Innovations for Poverty Action.
  • Jones, H., and S. Hearn. 2009. Outcome Mapping: A Realistic Alternative for Planning, Monitoring and Evaluation (Working and Discussion Paper). London, UK: Overseas Development Institute.
  • Jordan, E., M. E. Gross, A. N. Javernick-Will, and M. J. Garvin. 2011. “Use and Misuse of Qualitative Comparative Analysis.” Construction Management and Economics 29: 1159–1173. doi:10.1080/01446193.2011.640339.
  • Kairalla, J. A., C. S. Coffey, M. A. Thomann, and K. E. Muller. 2012. “Adaptive Trial Designs: A Review of Barriers and Opportunities.” Trials 13: 145. doi:10.1186/1745-6215-13-145.
  • Kane, H., M. A. Lewis, P. A. Williams, and L. C. Kahwati. 2014. “Using Qualitative Comparative Analysis to Understand and Quantify Translation and Implementation.” Translational Behavioral Medicine 4: 201–208. doi:10.1007/s13142-014-0251-6.
  • Karlan, D. 2017. Nimble RCTs. A Powerful Methodology in the Program Design Toolbox. Innovations for Poverty Action, Yale University. http://pubdocs.worldbank.org/en/626921495727495321/Nimble-RCTs-WorldBankMay2017-v4.pdf.
  • Karlan, D., M. McConnell, S. Mullainathan, and J. Zinman. 2016. “Getting to the Top of Mind: How Reminders Increase Saving.” Management Science 62: 3393–3411. doi:10.1287/mnsc.2015.2296.
  • Kontopantelis, E., T. Doran, D. A. Springate, I. Buchan, and D. Reeves. 2015. “Regression Based Quasi-Experimental Approach When Randomisation Is Not an Option: Interrupted Time Series Analysis.” BMJ 350: h2750. doi:10.1136/bmj.h2750.
  • Korn, E. L., and B. Freidlin. 2017. “Adaptive Clinical Trials: Advantages and Disadvantages of Various Adaptive Design Elements.” Journal of the National Cancer Institute 109. doi:10.1093/jnci/djx013.
  • Lacouture, A., E. Breton, A. Guichard, and V. Ridde. 2015. “The Concept of Mechanism from a Realist Approach: A Scoping Review to Facilitate Its Operationalization in Public Health Program Evaluation.” Implementation Science 10: 153. doi:10.1186/s13012-015-0345-7.
  • Ladner, D. 2015. Strategy Testing: An Innovative Approach to Monitoring Highly Flexible Aid Programs (Case Study No. 3), Working Politically in Practice. San Francisco: Asia Foundation.
  • Lang, T. 2011. “Adaptive Trial Design: Could We Use This Approach to Improve Clinical Trials in the Field of Global Health?” The American Journal of Tropical Medicine and Hygiene 85: 967–970. doi:10.4269/ajtmh.2011.11-0151.
  • Limato, R., R. Ahmed, A. Magdalena, S. Nasir, and F. Kotvojs. 2018. “Use of Most Significant Change (MSC) Technique to Evaluate Health Promotion Training of Maternal Community Health Workers in Cianjur District, Indonesia.” Evaluation and Program Planning 66: 102–110. doi:10.1016/j.evalprogplan.2017.10.011.
  • Lopez Bernal, J., S. Cummins, and A. Gasparrini. 2017. “Interrupted Time Series Regression for the Evaluation of Public Health Interventions: A Tutorial.” International Journal of Epidemiology 46: 348–355. doi:10.1093/ije/dyw098.
  • Lopez Bernal, J., S. Cummins, and A. Gasparrini. 2018. “The Use of Controls in Interrupted Time Series Studies of Public Health Interventions.” International Journal of Epidemiology. doi:10.1093/ije/dyy135.
  • Mahajan, R., and K. Gupta. 2010. “Adaptive Design Clinical Trials: Methodology, Challenges and Prospect.” Indian Journal of Pharmacology 42: 201–207. doi:10.4103/0253-7613.68417.
  • Manderson, L., and P. Aaby. 1992. “Can Rapid Anthropological Procedures Be Applied to Tropical Diseases?” Health Policy and Planning 7: 46–55. doi:10.1093/heapol/7.1.46.
  • Manzano, A. 2016. “The Craft of Interviewing in Realist Evaluation.” Evaluation 22: 342–360. doi:10.1177/1356389016638615.
  • Mayne, J. 2008. “Contribution Analysis.” Better Evaluation. Accessed 26 October 2018. https://www.betterevaluation.org/en/plan/approach/contribution_analysis
  • Moore, G. F., S. Audrey, M. Barker, L. Bond, C. Bonell, W. Hardeman, L. Moore, A. O’Cathain, T. Tinati, and D. Wight. 2015. “Process Evaluation of Complex Interventions: Medical Research Council Guidance.” BMJ 350: h1258. doi:10.1136/bmj.h1258.
  • O’Connell, T., and A. Sharkey. 2013. Reaching Universal Health Coverage: Using a Modified Tanahashi Model Sub-Nationally to Attain Equitable and Effective Coverage. New York: UNICEF.
  • O’Donnell, M. 2016. Adaptive Management: What It Means for Civil Society Organisations. London: Bond.
  • ODI. 2009. “Strategy Development: Outcome Mapping.” In Tools for Knowledge and Learning: A Guide for Development and Humanitarian Organisations. London: Overseas Development Institute.
  • Optipedia. n.d. “A/B Testing.” Optimizely. Accessed 1 February 2018. https://www.optimizely.com/optimization-glossary/ab-testing/
  • Patton, M. Q. 2008. Utilization-Focused Evaluation. Thousand Oaks, CA: Sage Publications.
  • Patton, M. Q. 2013. “Utilization-Focused Evaluation (U-FE) Checklist.” Accessed 12 December 2017. https://wmich.edu/sites/default/files/attachments/u350/2014/UFE_checklist_2013.pdf
  • Pawson, R. 2013. The Science of Evaluation: A Realist Manifesto. Thousand Oaks, CA: Sage.
  • Pawson, R., and N. Tilley. 2004. Realist Evaluation. London: Sage.
  • Peerally, M. F., S. Carr, J. Waring, and M. Dixon-Woods. 2017. “The Problem with Root Cause Analysis.” BMJ Quality & Safety 26: 417–422. doi:10.1136/bmjqs-2016-005511.
  • Peters, A. 2018. “At These Camps, Refugees Can Give Real-Time Customer Feedback.” Fast Company. Accessed 24 July 2018. https://www.fastcompany.com/40575160/at-these-camps-refugees-can-give-real-time-customer-feedback
  • Portela, M. C., P. J. Pronovost, T. Woodcock, P. Carter, and M. Dixon-Woods. 2015. “How to Study Improvement Interventions: A Brief Overview of Possible Study Types.” BMJ Quality & Safety 24: 325–336. doi:10.1136/bmjqs-2014-003620.
  • Positive Deviance Initiative. 2017. “What Is Positive Deviance?” Positive Deviance Initiative. Accessed 20 March 2018. https://positivedeviance.org/
  • Research to Action. 2012. “Outcome Mapping: A Basic Introduction.” Research to Action. Accessed 7 December 2017. http://www.researchtoaction.org/2012/01/outcome-mapping-a-basic-introduction/
  • Rio, D., J. Hedges, S. Woodhead, and E. Rogers. 2015. What Is the Bottleneck Analysis Approach for the Management of Severe Acute Malnutrition?. UNICEF and Action Against Hunger. http://www.coverage-monitoring.org/wp-content/uploads/2015/12/BAA-12-08-2015.pdf.
  • Schünemann, H. J., and L. Moja. 2015. “Reviews: Rapid! Rapid! Rapid! …and Systematic.” Systematic Reviews 4 (1): 4. doi:10.1186/2046-4053-4-4.
  • Shiffman, S., A. A. Stone, and M. R. Hufford. 2008. “Ecological Momentary Assessment.” Annual Review of Clinical Psychology 4: 1–32. doi:10.1146/annurev.clinpsy.3.022806.091415.
  • Smutylo, T. 2005. “Outcome Mapping: A Method for Tracking Behavioural Changes in Development Programs.” ILAC Brief 7. https://www.outcomemapping.ca/resource/outcome-mapping-a-method-for-tracking-behavioural-changes-in-development-programs.
  • Srivastava, K. 2014. “The ‘Adjacent Possible’ of Big Data: What Evolution Teaches about Insights Generation.” WIRED. Accessed 21 January 2018. https://www.wired.com/insights/2014/12/the-adjacent-possible-of-big-data/
  • Stetler, C. B., M. W. Legro, C. M. Wallace, C. Bowman, M. Guihan, H. Hagedorn, B. Kimmel, N. D. Sharp, and J. L. Smith. 2006. “The Role of Formative Evaluation in Implementation Research and the QUERI Experience.” Journal of General Internal Medicine 21: S1–S8. doi:10.1111/j.1525-1497.2006.00355.x.
  • Talcott, F., and V. Scholz. 2015. Methodology Guide to Process Tracing for Christian Aid. Oxford: International Non-Governmental Training and Research Centre.
  • Tanahashi, T. 1978. “Health Service Coverage and Its Evaluation.” Bulletin of the World Health Organization 56: 295–303.
  • Theiss-Nyland, K., D. Koné, C. Karema, W. Ejersa, J. Webster, and J. Lines. 2017. “The Relative Roles of ANC and EPI in the Continuous Distribution of LLINs: A Qualitative Study in Four Countries.” Health Policy and Planning 32: 467–475. doi:10.1093/heapol/czw158.
  • Thorlund, K., J. Haggstrom, J. J. Park, and E. J. Mills. 2018. “Key Design Considerations for Adaptive Clinical Trials: A Primer for Clinicians.” BMJ 360: k698. doi:10.1136/bmj.k698.
  • Ton, G. 2012. “The Mixing of Methods: A Three-Step Process for Improving Rigour in Impact Evaluations.” Evaluation 18: 5–25. doi:10.1177/1356389011431506.
  • Tricco, A. C., E. Langlois, and S. E. Straus. 2017. Rapid Reviews to Strengthen Health Policy and Systems: A Practical Guide. Geneva: World Health Organization, Alliance for Health Policy and Systems Research.
  • Tricco, A. C., J. Antony, W. Zarin, L. Strifler, M. Ghassemi, J. Ivory, L. Perrier, B. Hutton, D. Moher, and S. E. Straus. 2015. “A Scoping Review of Rapid Review Methods.” BMC Medicine 13: 224. doi:10.1186/s12916-015-0465-6.
  • UN Global Pulse. 2012. Big Data for Development: Challenges and Opportunities. New York, NY: United Nations.
  • Valters, C., C. Cummings, and H. Nixon. 2016. Putting Learning at the Centre. Adaptive Development Programming in Practice. London: Overseas Development Institute.
  • Villar, S. S., J. Bowden, and J. Wason. 2017. “Response-Adaptive Designs for Binary Responses: How to Offer Patient Benefit while Being Robust to Time Trends?” Pharmaceutical Statistics. doi:10.1002/pst.1845.
  • Vlassoff, C., and M. Tanner. 1992. “The Relevance of Rapid Assessment to Health Research and Interventions.” Health Policy and Planning 7: 1–9. doi:10.1093/heapol/7.1.1.
  • Walji, A., and C. Vein. 2013. “Learning from Data-Driven Delivery.” World Bank. Accessed 11 October 2017. http://blogs.worldbank.org/voices/learning-data-driven-delivery
  • Webster, J., K. Kayentao, J. Bruce, S. I. Diawara, A. Abathina, A. A. Haiballa, O. K. Doumbo, and J. Hill. 2013. “Prevention of Malaria in Pregnancy with Intermittent Preventive Treatment and Insecticide Treated Nets in Mali: A Quantitative Health Systems Effectiveness Analysis.” PLoS ONE 8: e67520. doi:10.1371/journal.pone.0067520.
  • Wechsberg, W. M., J. W. Ndirangu, I. S. Speizer, W. A. Zule, W. Gumula, C. Peasant, F. A. Browne, and L. Dunlap. 2017. “An Implementation Science Protocol of the Women’s Health CoOp in Healthcare Settings in Cape Town, South Africa: A Stepped-Wedge Design.” BMC Women’s Health 17. doi:10.1186/s12905-017-0433-8.
  • White, H. 2013. “Using the Causal Chain to Make Sense of the Numbers.” International Initiative for Impact Evaluation. Accessed 17 October 2018. http://www.3ieimpact.org/en/announcements/2013/02/12/using-causal-chain-make-sense-numbers/
  • White, H., and D. Phillips. 2012. “Addressing Attribution of Cause and Effect in Small n Impact Evaluations: Towards an Integrated Framework.” Working Paper 15. International Initiative for Impact Evaluation.
  • Wilson-Grau, R. 2015. “Outcome Harvesting.” Better Evaluation. Accessed 4 January 2018. http://www.betterevaluation.org/en/plan/approach/outcome_harvesting
  • Wilson-Grau, R., and H. Britt. 2012. Outcome Harvesting Brief. Cairo, Egypt: Ford Foundation.
  • Wohlin, C. 2014. “Guidelines for Snowballing in Systematic Literature Studies and a Replication in Software Engineering.” In Proceedings of the 18th International Conference on Evaluation and Assessment in Software Engineering (EASE ’14), Article 38, 1–10. New York: ACM. doi:10.1145/2601248.2601268.
  • Woolcock, M. 2009. “Toward a Plurality of Methods in Project Evaluation: A Contextualised Approach to Understanding Impact Trajectories and Efficacy.” Journal of Development Effectiveness 1: 1–14. doi:10.1080/19439340902727719.
