Guest Editorial

Evolving reporting guidelines for social work research

As social work research evolves in both quality and quantity, it is necessary that certain conventions be adopted for the preparation of journal manuscripts and for the complete and accurate reporting of empirical studies. Such conventions help the reader gain an accurate understanding of a given piece of research, allow the legitimacy of the authors’ conclusions to be judged, and promote other scholars’ ability to replicate one’s research project (Thyer 2008). To this end, scholarly journals adopt an official publication style. For example, Nordic Social Work Research makes use of the Chicago style for formatting manuscripts and for referencing and citing works. Other English-language journals sometimes make use of the Publication Manual of the American Psychological Association (APA 2010). The focus of this editorial is on some of the evolving reporting standards which have been adopted in other fields, and which have the potential to inform and enhance the design and conduct of social work research.

Journal Article Reporting Standards

The APA manual, apart from formatting guidelines, now includes a set of Journal Article Reporting Standards (JARS) which it recommends that authors follow when preparing empirical research articles (APA 2010, 245–253). There are three separate sets of reporting guidelines contained within the JARS. The first consists of information recommended for inclusion in all manuscripts that report new data collection, regardless of research design. There are several dozen recommendations in this set, beginning with what information an article’s title should contain (e.g. ‘Identify variables and theoretical issues under consideration’), what information an abstract should contain (‘Problem under investigation, Participants … Study method … Sample size, outcome measures … research design, findings … conclusions and the implications’), and similar details for subsequent portions of a research paper (Introduction, Method, Research Design, Results, Discussion). While many of these points will be obvious to most well-trained social work researchers, just as airline pilots make use of a checklist before taking off, the JARS can be helpful to even the most experienced author when initially preparing a manuscript or when reviewing one’s paper prior to submission.

Apart from the general JARS guidelines, there are separate additional standards for studies reporting the results of a manipulation or intervention involving non-random assignment (e.g. quasi-experimental studies; see Thyer 2012). These relate to providing details of the intervention itself, the nature of any control or comparison groups, who provided the interventions, the flow of participants (numbers initially assigned to each arm of the study, drop-outs, those available at follow-up, etc.), evidence of treatment adherence and fidelity, length of follow-up and issues of external validity. Further guidelines are provided for reporting on true experimental studies, wherein clients were randomly assigned to different conditions. One should describe how random assignment was carried out (coin toss, random number table, computer programme), whether or not the assessors were blind to the arm of the study to which each client was assigned, how such masking (if any) was evaluated, and details of the statistical analyses conducted. It is also recommended that all quasi-experimental and experimental studies include a participant flow chart or diagram. A final set of guidelines is provided for reporting the results of a meta-analysis. Whether or not a given journal has formally adopted the APA manual, the JARS is an extremely useful checklist for the authors of all empirical research papers to consult. Doing so will increase the transparency of our research, help avoid inadvertent omissions of important information, reduce the frequent tendency to make exaggerated claims and promote a higher standard of research. Free tutorials on APA style can be found at http://www.apastyle.org/learn/index.aspx.

The APA manual also now suggests (APA 2010, 34) that one always report the appropriate effect size (ES) associated with any statistically significant difference reported in one’s results (e.g. Cohen’s d, odds ratio, number needed to treat, relative risk, etc.), along with the appropriate confidence interval for each ES. Effect sizes are extremely useful in avoiding excessive claims. If the confidence interval associated with a given ES embraces zero, or comes close to it, the possibility exists that the effect is due to chance and has little practical value. It is also recommended that exact p values be reported, rather than simply a threshold such as p < .05. The instructions for authors of the journal Research on Social Work Practice, which I edit, specifically direct authors to adhere to the relevant JARS, and members of the RSWP Editorial Board are asked to use the JARS in preparing their critical analyses of articles submitted for review.
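
To make this concrete, the short sketch below, which is not drawn from the JARS or the APA manual and uses entirely hypothetical numbers, computes Cohen’s d for two independent groups together with a common large-sample approximation to its 95% confidence interval. In practice these values would typically be obtained from one’s statistical software.

```python
import math

def cohens_d_with_ci(mean1, sd1, n1, mean2, sd2, n2, z=1.96):
    """Cohen's d for two independent groups with an approximate 95% CI.

    Illustrative sketch only: uses the pooled standard deviation and a
    common large-sample normal approximation to the standard error of d.
    """
    # Pooled standard deviation across the two groups
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / pooled_sd
    # Approximate standard error of d (large-sample formula)
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - z * se, d + z * se)

# Hypothetical example: treatment vs. control group outcome scores
d, (lo, hi) = cohens_d_with_ci(mean1=52.0, sd1=10.0, n1=40,
                               mean2=47.0, sd2=11.0, n2=40)
print(f"d = {d:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

In this hypothetical example the lower confidence bound falls close to zero, which, following the logic above, would call for a suitably cautious interpretation of the effect’s practical importance.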

CONSORT

Another checklist which is being widely adopted is the Consolidated Standards of Reporting Trials (CONSORT, see http://www.consort-statement.org/), a set of detailed guidelines for reporting the design, conduct and results of randomized controlled trials (RCTs). Hundreds of journals in the health and biomedical areas now endorse CONSORT. While it is often contended that true experiments such as RCTs are impractical or inapplicable for evaluating the outcomes of social work programmes and policies, a recent bibliography has located over 500 true experiments, including RCTs, published by social workers, with the earliest appearing in 1949 (Thyer and Massie 2014). True experiments are actually a widely used research method in social work, but this is something of an invisible literature because the studies are usually reported in non-social work journals. There is a great deal of overlap between the JARS and CONSORT guidelines, but on balance CONSORT is a more sophisticated approach to reporting RCTs. Work is now underway to adapt CONSORT to the specifics of social and psychological intervention trials (Mayo-Wilson et al. 2013). It is hoped that the availability of a CONSORT standard appropriate for the reporting of complex psychosocial interventions will enhance the quality of research reporting on social work RCTs, as it has in medicine (see Plint et al. 2006).

Clinical trial registries

A clinical trial registry is a site wherein researchers can report their research protocols in advance of a study actually being conducted. Doing so encourages transparency and honesty, and reduces the file-drawer problem, wherein studies with negative results are less likely to be published, or are deliberately suppressed. It is regrettably not uncommon for a clinical trial to initially include multiple outcome measures, but for the authors to subsequently publish the results of only those measures which yielded positive results. Or, less egregiously but still problematically, to publish several outcome papers, each reporting only a single outcome measure (or a few), giving the appearance that more than one main study was conducted. One of the major clinical trials registries in the USA is ClinicalTrials.gov, with over 200,000 study protocols included. CONSORT states that prior registration in a clinical trials registry is required of all RCTs. On 2 April 2014, the European Parliament voted in favour of requiring, effective 2016, all drug trials conducted in Europe to be entered into a clinical trials registry (see http://www.cochrane.org/news/news-events/current-news/europe-votes-clinical-trial-transparency). Relatedly, in February 2014, the National Institute of Mental Health in the USA announced that it would require all new grantees undertaking a clinical trial to register their study in advance (http://www.nimh.nih.gov/about/director/2014/a-new-approach-to-clinical-trials.shtml), and a number of journals now decline to publish clinical trials unless they have been prospectively entered into a clinical trials registry. There is no reason why well-designed RCTs of psychosocial interventions should not be similarly registered in an appropriate system, and as Harrison and Mayo-Wilson (forthcoming) contend, adopting such a standard within social work would help reduce reporting bias in social work research. Some authorities contend that NOT reporting the results of a completed trial is a form of research misconduct (see http://www.alltrials.net/2013/gmc-research-misconduct/).

Summary

The cumulative effect of initiatives such as the JARS, CONSORT and clinical trial registries will be to improve the quality of published research. To the extent that social work researchers wish to conduct studies consistent with the highest standards of psychosocial and health care research, our discipline will embrace these new standards and contribute to their development, so that they are compatible with the complex problems and interventions that are the focus of our discipline. Social work journals such as Nordic Social Work Research and Research on Social Work Practice can help this process by disseminating information on these new standards, by adopting them, over time, in their information for contributors, and by directing authors to consult them when revising papers for resubmission. It would also be desirable for social work education programmes to include such content in their research training modules and continuing education offerings. I am grateful to the Editor of Nordic Social Work Research, Professor Tarja Pösö, for inviting this Guest Editorial, and I hope it stimulates some thoughtful discussion among the readers of this journal.

Bruce A. Thyer
College of Social Work, Florida State University, Tallahassee, FL, USA

References

  • APA (American Psychological Association). 2010. Publication Manual of the American Psychological Association. 6th ed. Washington, DC: American Psychological Association.
  • Harrison, B., and E. Mayo-Wilson. Forthcoming. “Trial Registration: Understanding and Preventing Reporting Bias in Social Work Research.” Research on Social Work Practice.
  • Mayo-Wilson, E., P. Montgomery, S. Hopewell, G. Macdonald, D. Moher, and S. Grant. 2013. “Developing a Reporting Guideline for Social and Psychological Intervention Trials.” The British Journal of Psychiatry 203: 250–254. doi:10.1192/bjp.bp.112.123745.
  • Plint, A. C., D. Moher, A. Morrison, K. Schulz, D. G. Altman, C. Hill, and I. Gaboury. 2006. “Does the CONSORT Checklist Improve the Quality of Reports of Randomised Controlled Trials? A Systematic Review.” Medical Journal of Australia 185: 263–267.
  • Thyer, B. A. 2008. Preparing Research Articles. New York: Oxford University Press. doi:10.1093/acprof:oso/9780195323375.001.0001.
  • Thyer, B. A. 2012. Quasi-Experimental Research Designs. New York: Oxford University Press. doi:10.1093/acprof:oso/9780195387384.001.0001.
  • Thyer, B. A., and K. M. Massie. 2014. A Bibliography of Randomized Controlled Experiments in Social Work (1949–2013): Solvitur Ambulando. Unpublished manuscript.
