Editorial

From the Editor—Do We Have a Replication Crisis in Social Work Research?

The so-called “replication crisis” is an important issue facing primarily the life and social sciences. In a nutshell, scholars have found that the results of many research studies are difficult or even impossible to reproduce. This failure to replicate occurs not only when independent researchers seek to reproduce the original effects, but also when the original researchers do (Ioannidis, 2012). Because replication is a hallmark of the scientific method, the inability to replicate original results presents a crisis for scientific progress.

The replication crisis hits close to home with evidence that psychology, and in particular social psychology, has had difficulty reproducing classic studies. The crisis even extends to medical science, perhaps exacerbating what appears to be the increasing popularity of anti-science sentiment. In recent attempts to replicate classic results, the focus has been not only on determining the statistical reliability of results, but also on identifying the reasons for any unreliability found.

Begley and Ioannidis (2015) summarize the present crisis as follows: there is a desperation to publish (or perish) that may contribute to studies being produced at a heretofore unprecedented rate; failure to adhere to sound science and a rush to print may increase the likelihood that many of these studies will not be reproducible; and this is a multifaceted, multi-stakeholder problem that will resist simple solutions. The burgeoning glut of scientific papers overwhelms journal editors and the peer-review process, which would normally be able to flag most papers that have obvious methodological limitations but are not artfully crafted scientific fraud.

So far, there have been no wide-scale accusations of replication failure in social work research, but that does not mean that social work scholars are immune. Many editors report difficulty finding enough reviewers to keep manuscripts moving through review, although some editors (J. Jenson, personal communication, January 19, 2019) find ways to stay on top of the flow by appointing numerous associate editors to assist with enlisting competent reviewers. With at least some journals finding it difficult to recruit competent peer reviewers, surely some bad science is escaping notice.

What are some mechanisms for trying to control the quality of social work research? Among the suggestions are the following:

  1. Pre-registration of research studies (Harrison & Mayo-Wilson, 2014). Researchers would be required to submit a description of study methods and analyses prior to data collection, circumventing the reporting of only selected, statistically significant results. Journals should require that any clinical studies reported have been previously registered.

  2. Addressing misinterpretation of statistical significance (Ziliak & McCloskey, 2008). The .05 level of statistical significance means only that, if there were no real effect, results at least as extreme would occur by chance less than 5% of the time, a wholly artificial criterion. We could make that level more stringent (say, .05%), rely on Bayesian statistics (Colquhoun, 2015), or encourage larger sample sizes (Maxwell, Lau, & Howard, 2015).

  3. Sharing raw data in online repositories, such as the Open Science Framework (http://OSF.io), to encourage public evaluation and improve the integrity and replicability of research. Many journals, particularly in medicine, now require that such online storage be in place prior to accepting papers for publication. Taylor & Francis, the publisher of JSWE and several other social work journals, encourages authors to “deposit data in a suitable repository, cite it, and include a data availability statement explaining where others can access the data” (https://authorservices.taylorandfrancis.com/understanding-our-data-sharing-policies/).

  4. Adherence to empirically derived reporting guidelines for research. The EQUATOR network (http://www.equator-network.org/) provides standards for reporting most varieties of health research, ranging from randomized trials to qualitative research to economic evaluations. This international organization seeks to improve the replicability of health research by promoting transparent and accurate reporting and wider use of robust reporting guidelines.
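The false-discovery reasoning behind the second suggestion above (Colquhoun's argument about misinterpreting p < .05) can be made concrete with a small simulation. The figures used here, a 10% prior probability that a tested hypothesis is true and 80% statistical power, are illustrative assumptions, not numbers from this editorial:

```python
import random

random.seed(1)

# Illustrative sketch: even at p < .05, a large share of "significant"
# findings can be false positives when most tested hypotheses are false.
n_tests = 100_000
prior_true = 0.10   # assumed fraction of hypotheses that are actually true
power = 0.80        # assumed chance a real effect reaches significance
alpha = 0.05        # conventional significance threshold

true_hits = false_hits = 0
for _ in range(n_tests):
    effect_is_real = random.random() < prior_true
    if effect_is_real:
        if random.random() < power:   # real effect correctly detected
            true_hits += 1
    else:
        if random.random() < alpha:   # null hypothesis true, but p < .05 anyway
            false_hits += 1

fdr = false_hits / (true_hits + false_hits)
print(f"Share of 'significant' results that are false positives: {fdr:.2f}")
```

Under these assumptions, roughly a third of statistically significant results are false discoveries, which is why a significant p value alone is weak evidence that an effect will replicate.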

While many social work scholars are probably not as aware of the replication crisis as other scientists, I have no doubt that a rigorous investigation of the reproducibility of social work research would find similar problems. I think it is important that we continue to educate ourselves about ways to reduce the risk of reproducibility failure: incorporating training in new forms of scientific rigor into social work doctoral education, seeking new competencies for existing social work researchers, increasing training in peer-review practices, and encouraging our journals to implement more rigorous publication standards. Because social work research has critical implications for the people who use social work services, we need to be accountable not only to ourselves, but also to the people we serve.

In this issue

As always, this issue of JSWE offers a collection of conceptual, empirical, and teaching notes. Less usual, however, are invited articles, and the first article of JSWE’s Volume 55 is an invited article from Bruce Thyer. Thyer’s invited article (“Predatory Doctoral Programs: Warnings for Social Workers”) bears on the issue of quality assurance in doctoral education in social work — a topic about which we share a passion. We all want our doctoral programs to produce faculty and scholars who do high quality research, report it transparently, and who communicate findings so clearly that the findings will be embraced by the profession. A second conceptual article in this issue, by Laging and Heidenreich (“Towards a Conceptual Framework of Service User Involvement in Social Work Education: Empowerment and Educational Perspectives”), develops a conceptual framework of service user involvement in social work education to guide further research and practice. Broadly international in perspective, this article is thought-provoking, particularly in its documentation of different available conceptualizations of service user involvement and differences between an empowerment and an educational perspective.

The next seven articles in this issue present the results of studies using a variety of qualitative methods. Cheung, Zhou, Narendorf, and Mauldin (“Curriculum Mapping in a Social Work Program With the 2015 Educational Policy and Accreditation Standards”) report on a mapping process for an MSW program, using the process for its clinical specialization as an exemplar. Curriculum mapping is defined as a five-stage process which can assist in assessing the teaching contents of program components and the authors provide helpful suggestions for using such an assessment process prior to reaffirmation studies. Street, MacGregor, and Cornelius-White (“A Stakeholder Analysis of Admission in a Baccalaureate Social Work Program”) conducted interviews with a variety of BSW program stakeholders not generally involved in the admissions process, such as field instructors, social service employers, and adjunct faculty members, and found four expectations for the admission process, including gatekeeping, a self-reflection process for students, an indicator of social work program educational quality, and professional socialization for students. Held and colleagues (“Training the Future Workforce: Social Workers in Integrated Health Care Settings”) interviewed social workers employed in integrated health care settings and found themes related to existing social work strengths, further training needed for this work, and fundamental skills for team-based collaboration. Kenney and Young (“Using Experiential Learning to Help Students Understand the Impact of Food Insecurity”) conducted a content analysis of student responses to an experiential exercise intended to assist MSW students to better understand food insecurity and the SNAP program. A number of themes emerged from the analysis demonstrating that students increased their knowledge of the program and gained a better understanding of how food insecurity personally affects individuals and families. 
Sanchez, Norka, Corbin, and Peters (“Use of Experiential Learning, Reflective Writing, and Metacognition to Develop Cultural Humility Among Undergraduate Students”) also used content analysis of self-reflective comments written in the margins of student essays to evaluate their reactions to experiential learning that challenged them to learn about people with different social identities. Students used these margin notes to examine their own emotions, to seek understanding of self and others, and to recognize their privilege. Giertsen (“Heteronormativity Prevails: A Study of Sexuality in Norwegian Social Work Bachelor Programs”) used content analysis of baccalaureate social work curricula in Norway to examine how sexuality is addressed. Giertsen found only six articles across curricula addressing sexuality, representing only 0.08% of content. These articles were judged to be heteronormative in perspective, and the author presents suggestions for reorienting curricula. Finally, Griffiths, Royse, Murphy, and Starks (“Self-Care Practice in Social Work Education: A Systematic Review of Interventions”) provide a narrative synthesis of interventions used to improve student self-care practice in social work education. The four included studies all presented mindfulness-based interventions to enable social workers to sustain their well-being and model self-care for their clients.

There are two articles using mixed methods in this issue. Clarkson-Hendrix and Carroll-Barbuto (“Curriculum Implications for Collaborative Practice From Veterans Health Care Sector Social Workers Serving OIF/OEF Veterans”) use a combination of surveys and interviews to examine the perceptions of social workers involved in health care provision for Operation Iraqi Freedom/Enduring Freedom veterans, with the purpose of developing curriculum implications for social work education. Recommendations emerging from this study center on infusing content on interprofessional collaboration into practice courses, content on intervening with colleagues to promote optimal processes and outcomes, and development of positive interprofessional climates. Lu, D’Angelo, and Willett (“Learning Through Teaching: A Study of Social Work PhD Students in Their Roles as Educators and Learners of Research”) used a combination of qualitative interviews and quantitative analysis of course evaluations to examine the experience of PhD students teaching an introductory research methods course for MSW students. They found that providing PhD students the opportunity to develop their role as educators benefited both students and instructors.

Wrapping up the empirical articles, there are four quantitative papers presented in this issue. O’Neill, Yoder Slater, and Batt (“Social Work Student Self-Care and Academic Stress”) surveyed 90 BASW and MSW students using the Academic Stress Scale to investigate the relationship between self-care and academic stress and found that students using self-care strategies and students further along in their programs had lower levels of academic stress. Morton, Wells, and Cox (“The Implicit Curriculum: Student Engagement and the Role of Social Media”) used web-based surveys to explore perceptions of the implicit curriculum in one MSW program. They found that students who perceived social media as more useful also reported higher levels of engagement and participation in program governance, and they suggest that competent use of social media may serve as a tool to increase student engagement. Davis (“Historical Knowledge of Oppression and Racial Attitudes of Social Work Students”) surveyed 305 first-year MSW students at five CSWE-accredited social work programs to examine the relationship between the historical knowledge of oppression that students possess at the beginning of their MSW education and endorsement of a color-blind ideology. The findings suggest that students with more knowledge of oppression and younger students reported fewer color-blind beliefs. Lastly, Cummings, Chaffin, and Milam (“Comparison of an Online and Traditional MSW Program: A 5-Year Study”) examined educational outcomes for 883 online and face-to-face MSW students from one program. Findings showed significantly higher knowledge scores for face-to-face students and significantly higher skills scores for online students, but effect sizes were small, suggesting relatively minor differences in educational outcomes.

Finally, we present three teaching notes in this issue of JSWE. Roberson (“Closing the Health Gap and Social Work Education: A Grand Challenge”) presents an assignment for incorporating one of the 12 Grand Challenges for Social Work, Closing the Health Gap, into already existing classes, and mapping the assignment to social work competencies. Putney and colleagues (“Implementation of Online Client Simulation to Train and Assess Screening and Brief Intervention Skills”) focus on the use of online simulation to teach screening and brief intervention skills for empirically supported treatments for preventing and reducing substance use. They present preliminary analysis of changes in MSW students’ skills. Finally, Dessel and colleagues (“Challenges in the Classroom on LGBTQ Topics and Christianity in Social Work”) discuss facilitating the development of culturally sensitive skills for working with LGBTQ populations. Because faculty may not be comfortable with the issues of power, privilege, social values, and beliefs about gender and sexuality this content engenders in the classroom, the authors provide guidance and suggest pedagogical techniques for developing faculty and student competence and awareness when working with LGBTQ populations.

References

  • Begley, C. G., & Ioannidis, J. P. (2015). Reproducibility in science: Improving the standard for basic and preclinical research. Circulation Research, 116, 116–126. doi:10.1161/CIRCRESAHA.114.303819
  • Colquhoun, D. (2015). An investigation of the false discovery rate and the misinterpretation of p-values. Royal Society Open Science, 1, 140216. doi:10.1098/rsos.140216
  • Harrison, B., & Mayo-Wilson, E. (2014). Trial registration: Understanding and preventing bias in social work research. Research on Social Work Practice, 24, 372–376. doi:10.1177/1049731513512374
  • Ioannidis, J. P. A. (2012). Why science is not necessarily self-correcting. Perspectives on Psychological Science, 7, 528–530. doi:10.1177/1745691612464056
  • Maxwell, S. E., Lau, M. Y., & Howard, G. S. (2015). Is psychology suffering from a replication crisis? What does “failure to replicate” really mean? American Psychologist, 70, 487–498. doi:10.1037/a0039400
  • Ziliak, S. T., & McCloskey, D. N. (2008). The cult of statistical significance: How the standard error costs us jobs, justice, and lives. Ann Arbor: University of Michigan Press.
