The Evidence of Effectiveness: Beyond the Methodological Standards

Pages 155-177 | Received 28 Oct 2019, Accepted 04 Feb 2020, Published online: 24 Feb 2020
