Research Article

Evaluating outcome-based payment programmes: challenges for evidence-based policy

Pages 61-77 | Received 20 Oct 2017, Accepted 21 Jan 2019, Published online: 07 Feb 2019

References

  • Albertson, K., K. Bailey, C. Fox, J. LaBarbera, C. O’Leary, and G. Painter. 2018. Payment by Results and Social Impact Bonds: Outcome-Based Payment Systems in the UK and US. Bristol: Policy Press.
  • Anders, J., and R. Dorsett. 2017. HMP Peterborough Social Impact Bond - Cohort 2 and Final Cohort Impact Evaluation. London: NIESR.
  • ATQ Consultants and Ecorys. 2015. Ways to Wellness Social Impact Bond: The UK’s First Health SIB: A Deep Dive Report. London: Commissioning Better Outcomes Evaluation.
  • ATQ Consultants and Ecorys. 2016. Reconnections Social Impact Bond: Reducing Loneliness in Worcestershire. London: Commissioning Better Outcomes Evaluation.
  • Baron, R. M., and D. A. Kenny. 1986. “The Moderator-Mediator Variable Distinction in Social Psychological Research: Conceptual, Strategic and Statistical Considerations.” Journal of Personality and Social Psychology 51 (6): 1173–1182. doi:10.1037/0022-3514.51.6.1173.
  • Befani, B., and G. Stedman-Bryce. 2016. “Process Tracing and Bayesian Updating for Impact Evaluation.” Evaluation. doi:10.1177/1356389016654584.
  • Bewley, H., A. George, C. Rienzo, and J. Portes. 2016. National Evaluation of the Troubled Families Programme: National Impact Study Report. London: DCLG.
  • Biesta, G. 2010. “Pragmatism and the Philosophical Foundations of Mixed Methods Research.” In SAGE Handbook of Mixed Methods in Social & Behavioral Research, edited by A. Tashakkori and C. Teddlie. Thousand Oaks, California: SAGE Publications. doi:10.4135/9781506335193.
  • Blamey, A., and M. Mackenzie. 2007. “Theories of Change and Realistic Evaluation: Peas in a Pod or Apples and Oranges?” Evaluation 13: 439–455. doi:10.1177/1356389007082129.
  • Bonell, C., A. Fletcher, M. Morton, T. Lorenc, and L. Moore. 2012. “Realist Randomised Controlled Trials: A New Approach to Evaluating Complex Public Health Interventions.” Social Science & Medicine 75: 2299–2306. doi:10.1016/j.socscimed.2012.08.032.
  • Bonell, C., G. Moore, E. Warren, and L. Moore. 2018. “Are Randomised Controlled Trials Positivist? Reviewing the Social Science and Philosophy Literature to Assess Positivist Tendencies of Trials of Social Interventions in Public Health and Health Services.” Trials 19 (1): 238. doi:10.1186/s13063-018-2589-4.
  • Boutron, I., D. G. Altman, D. Moher, K. F. Schulz, P. Ravaud, and for the CONSORT NPT Group. 2017. “CONSORT Statement for Randomized Trials of Nonpharmacologic Treatments: A 2017 Update and a CONSORT Extension for Nonpharmacologic Trial Abstracts.” Annals of Internal Medicine 167 (1): 40–47. doi:10.7326/M17-0046.
  • Byrne, D. 2009. “Case-Based Methods: Why We Need Them; What They Are; How to Do Them.” In The SAGE Handbook of Case-Based Methods, edited by D. Byrne and C. C. Ragin, 1–9. London: Sage.
  • Cabinet Office. 2011. Open Public Services White Paper. London: Cabinet Office.
  • Cartwright, N., and J. Hardie. 2012. Evidence-Based Policy: A Practical Guide to Doing It Better. Oxford: Oxford University Press.
  • Collier, D., H. E. Brady, and J. Seawright. 2010. “Sources of Leverage in Causal Inference: Toward an Alternative View of Methodology.” In Rethinking Social Inquiry: Diverse Tools, Shared Standards, edited by H. E. Brady and D. Collier, 161–199. 2nd ed. Lanham, MD: Rowman and Littlefield.
  • Cook, T., W. Shadish, and V. Wong. 2008. “Three Conditions under Which Experiments and Observational Studies Produce Comparable Causal Estimates: New Findings from Within-Study Comparisons.” Journal of Policy Analysis and Management 27 (4): 724–750. doi:10.1002/pam.v27:4.
  • DCLG. 2014. Supporting People Payment by Results Pilots: Final Evaluation. London: DCLG.
  • DCLG. 2015. Qualitative Evaluation of the London Homelessness Social Impact Bond: Second Interim Report. London: DCLG.
  • Deaton, A., and N. Cartwright. 2017. “Understanding and Misunderstanding Randomized Controlled Trials.” Social Science & Medicine. doi:10.1016/j.socscimed.2017.12.005.
  • Disley, E., J. Rubin, E. Scraggs, N. Burrowes, D. Culley, and RAND Europe. 2011. Lessons Learned from the Planning and Early Implementation of the Social Impact Bond at HMP Peterborough. London: MoJ.
  • Dowling, E. 2017. “In the Wake of Austerity: Social Impact Bonds and the Financialisation of the Welfare State in Britain.” New Political Economy 22 (3): 294–310. doi:10.1080/13563467.2017.1232709.
  • Dowling, E., and D. Harvie. 2014. “Harnessing the Social: State, Crisis and (Big) Society.” Sociology 48 (5): 869–886. doi:10.1177/0038038514539060.
  • Edmiston, D., and A. Nicholls. 2017. “Social Impact Bonds: The Role of Private Capital in Outcome-Based Commissioning.” Journal of Social Policy 47 (1): 57–76.
  • Foster, R., L. Small, S. Foster, O. Skrine, G. Hunter, and P. Turnbull. 2013. Evaluation of the Employment and Reoffending Pilot: Lessons Learnt from the Planning and Early Implementation Phase. London: Ministry of Justice.
  • Fraser, A., S. Tan, M. Lagarde, and N. Mays. 2016. “Narratives of Promise, Narratives of Caution: A Review of the Literature on Social Impact Bonds.” Social Policy & Administration 52 (1): 4–28. doi:10.1111/spol.12260.
  • Gerring, J. 2010. “Causal Mechanisms: Yes, But….” Comparative Political Studies 43 (11): 1499–1526. doi:10.1177/0010414010376911.
  • GIIN. 2017. Annual Impact Investor Survey. New York: GIIN.
  • Gosling, H. 2016. “Payment by Results: Challenges and Conflicts for the Therapeutic Community.” Criminology & Criminal Justice 16 (5): 519–533. doi:10.1177/1748895816641997.
  • Government Social Research Unit. 2007. Background Paper 2 – What Do We Already Know? Harnessing Existing Research. London: Cabinet Office.
  • Green, D. P., S. E. Ha, and J. G. Bullock. 2010. “Enough Already about “Black Box” Experiments: Studying Mediation Is More Difficult than Most Scholars Suppose.” The Annals of the American Academy of Political and Social Science 628: 200–208. doi:10.1177/0002716209351526.
  • Greenberg, D. H., and M. Shroder. 2004. The Digest of Social Experiments. Washington, DC: Urban Institute.
  • Griffiths, R., A. Thomas, and A. Pemberton. 2016. Qualitative Evaluation of the DWP Innovation Fund: Final Report. London: Department for Work and Pensions.
  • Halligan, J., C. Sarrico, and M. L. Rhodes. 2012. “On the Road to Performance Governance in the Public Domain?” International Journal of Productivity and Performance Management 61 (3): 224–234. doi:10.1108/17410401211205623.
  • Hanley, P., B. Chambers, and J. Haslam. 2016. “Reassessing RCTs as the ‘Gold Standard’: Synergy Not Separatism in Evaluation Designs.” International Journal of Research & Method in Education 39: 287–298. doi:10.1080/1743727X.2016.1138457.
  • HM Government. 2010. The Coalition: Our Programme for Government. London: Cabinet Office.
  • Hood, C. 1991. “A Public Management for All Seasons?” Public Administration 69 (1): 3–19. doi:10.1111/padm.1991.69.issue-1.
  • Imai, K., L. Keele, D. Tingley, and T. Yamamoto. 2011. “Unpacking the Black Box of Causality: Learning about Causal Mechanisms from Experimental and Observational Studies.” American Political Science Review 105 (4): 765–789. doi:10.1017/S0003055411000414.
  • Johnson, R. B., A. J. Onwuegbuzie, and L. A. Turner. 2007. “Toward a Definition of Mixed Methods Research.” Journal of Mixed Methods Research 1 (2): 112–133. doi:10.1177/1558689806298224.
  • Jolliffe, D., and C. Hedderman. 2014. Peterborough Social Impact Bond: Final Report on Cohort 1 Analysis. London: Ministry of Justice.
  • Lane, P., R. Foster, L. Gardiner, L. Lanceley, and A. Purvis. 2013. Work Programme Evaluation: Procurement, Supply Chains and Implementation of the Commissioning Model. London: DWP.
  • Lascoumes, P., and P. Le Galès. 2007. “Introduction: Understanding Public Policy through Its Instruments—From the Nature of Instruments to the Sociology of Public Policy Instrumentation.” Governance 20 (1): 1–21. doi:10.1111/j.1468-0491.2007.00342.x.
  • Linder, S., and B. Peters. 1990. “The Design of Instruments for Public Policy.” In Policy Theory and Policy Evaluation, edited by S. Nagel, 103–119. Westport, CT: Greenwood Press.
  • Lowe, T. 2017. “Debate: Complexity and the Performance of Social Interventions.” Public Money & Management 37 (2): 79–80. doi:10.1080/09540962.2016.1266141.
  • Lowe, T., and R. Wilson. 2015. “Playing the Game of Outcomes-Based Performance Management: Is Gamesmanship Inevitable? Evidence from Theory and Practice.” Social Policy & Administration. doi:10.1111/spol.12205.
  • Maxwell, J. A. 2012. “The Importance of Qualitative Research for Causal Explanation in Education.” Qualitative Inquiry 18 (8): 655–661. doi:10.1177/1077800412452856.
  • Mayne, J. 2012. “Contribution Analysis: Coming of Age?” Evaluation 18 (3): 270–280. doi:10.1177/1356389012451663.
  • Ministry of Justice. 2014. Peterborough Social Impact Bond and HMP Doncaster: Payment by Results Pilots – Final Re-Conviction Results for Cohort 1. London: Ministry of Justice.
  • Ministry of Justice. 2015. HMP Doncaster: Payment by Results Pilot – Final Re-Conviction Results for Cohort 2. London: Ministry of Justice.
  • Morris, S. P., T. Edovald, C. Lloyd, and Z. Kiss. 2016. “The Importance of Specifying and Studying Causal Mechanisms in School-Based Randomised Controlled Trials: Lessons from Two Studies of Cross-Age Peer Tutoring.” Educational Research and Evaluation 22 (7–8): 422–439. doi:10.1080/13803611.2016.1259113.
  • Mulgan, G., N. Reeder, M. Aylott, and L. Bosher. 2010. Social Impact Investment: The Opportunity and Challenge of Social Impact Bonds. London: Young Foundation.
  • Nafilyan, V., and S. Speckesser. 2014. The Youth Contract Provision for 16- and 17-Year-Olds Not in Education, Employment or Training Evaluation: Econometric Estimates of Programme Impacts and Net Social Benefits. London: DfE.
  • National Audit Office. 2015. Outcome-Based Payment Schemes: Government’s Use of Payment by Results. London: NAO.
  • Newton, B., S. Speckesser, V. Nafilyan, S. Maguire, D. Devins, and T. Bickerstaffe. 2014. The Youth Contract for 16–17 Year Olds Not in Education, Employment or Training Evaluation. Research Report. London: DfE.
  • O’Flynn, P., and C. Barnett. 2017. Evaluation and Impact Investing: A Review of Methodologies to Assess Social Impact. Brighton: Institute of Development Studies.
  • Oakley, A., V. Strange, C. Bonell, E. Allen, and J. Stephenson. 2006. “Process Evaluation in Randomised Controlled Trials of Complex Interventions.” British Medical Journal 332: 413–415. doi:10.1136/bmj.332.7538.413.
  • OECD. 2015. Social Impact Investment: Building the Evidence Base. Paris: OECD.
  • Paluck, E. L. 2010. “The Promising Integration of Qualitative Methods and Field Experiments.” The Annals of the American Academy of Political and Social Science 628: 59–71. doi:10.1177/0002716209351510.
  • Pawson, R., T. Greenhalgh, G. Harvey, and K. Walshe. 2005. “Realist Review – A New Method of Systematic Review Designed for Complex Policy Interventions.” Journal of Health Services Research & Policy 10 (1_suppl): 21–34. doi:10.1258/1355819054308530.
  • Pawson, R., and N. Tilley. 1997. Realistic Evaluation. London: Sage Publications.
  • Pearce, S., D. Murray, and M. Lane. 2015. HMP Doncaster Payment by Results Pilot: Final Process Evaluation Report. London: MoJ.
  • Porter, S., T. McConnell, and J. Reid. 2017. “The Possibility of Critical Realist Randomised Controlled Trials.” Trials 18 (1): 133. doi:10.1186/s13063-017-1855-1.
  • Rawhouser, H., M. Cummings, and S. L. Newbert. 2017. “Social Impact Measurement: Current Approaches and Future Directions for Social Entrepreneurship Research.” Entrepreneurship Theory and Practice. doi:10.1177/1042258717727718.
  • Shadish, W. R., T. D. Cook, and D. T. Campbell. 2002. Experimental and Quasi-Experimental Designs for Generalized Causal Inference. New York, NY: Houghton Mifflin and Company.
  • Sherman, L. W., D. Gottfredson, D. MacKenzie, J. Eck, P. Reuter, and S. Bushway. 1998. Preventing Crime: What Works, What Doesn’t, What’s Promising. National Institute of Justice Research in Brief. Washington, DC: National Institute of Justice.
  • Sherman, L. W., and H. Strang. 2004. “Experimental Ethnography: The Marriage of Qualitative and Quantitative Research.” The Annals of the American Academy of Political and Social Science 595: 204–222. http://www.jstor.org/stable/4127621
  • Sidebottom, A., and N. Tilley. 2012. “Further Improving Reporting in Crime and Justice: An Addendum to Perry, Weisburd and Hewitt (2010).” Journal of Experimental Criminology 8 (1): 49–69. doi:10.1007/s11292-011-9128-6.
  • Social Finance. 2009. Social Impact Bonds Rethinking Finance for Social Outcomes. London: Social Finance.
  • Social Finance. 2016. Balancing Evidence and Risk. London: Social Finance.
  • Spencer, L., J. Ritchie, J. Lewis, and L. Dillon. 2003. Quality in Qualitative Evaluation: A Framework for Assessing Research Evidence. London: Cabinet Office.
  • Stern, E., N. Stame, J. Mayne, K. Forss, R. Davies, and B. Befani. 2012. Broadening the Range of Designs and Methods for Impact Evaluations: Report of a Study Commissioned by the Department for International Development. London: Department for International Development.
  • Tan, S., A. Fraser, C. Giacomantonio, K. Kruithof, M. Sim, M. Lagarde, E. Disley, J. Rubin, and N. Mays. 2015. An Evaluation of Social Impact Bonds in Health and Social Care: Interim Report. London: PIRU.
  • Warner, M. 2013. “Private Finance for Public Goods: Social Impact Bonds.” Journal of Economic Policy Reform 16: 303–319. doi:10.1080/17487870.2013.835727.
  • Webster, R. 2016. “Payment by Results: Lessons from the Literature.” Accessed 01 March 2016. www.russellwebster.com
  • White, H., and D. Phillips. 2012. “Addressing Attribution of Cause and Effect in Small N Impact Evaluations: Towards an Integrated Framework.” Working Paper 15, International Initiative for Impact Evaluation. Accessed 23 February 2016. www.3ieimpact.org/media/filer_public/2012/06/29/working_paper_15.pdf
  • Wong, K., D. Ellingworth, and L. Meadows. 2015. Local Justice Reinvestment Pilot: Final Process Evaluation Report. London: MoJ.
