From the Editor

The Future of Counseling Outcome Research and Evaluation


As we head into the next decade of Counseling Outcome Research and Evaluation (CORE), it is easy to see how the contributions of our authors, Editorial Board Members, and past Editors have carried CORE from a new publication to one emerging as a valued resource for its readership and the broader professional community. During the last 10 years, we have increased the number and type of submissions (Cade, Gibson, Swan, & Nelson, 2018), engaged in critical conversations (Hays, 2010; Lenz, 2018), and promoted author visibility through growing abstracting and indexing profiles. Through these foundations, expansion of our editorial team, and enhancement of author experiences, I believe that CORE is poised for continued impact. As we look ahead to the next decade of CORE, one priority for the editorial team is our commitment to the prudent and even-handed representation of research and evaluation findings within published articles.

Considerations for Reporting in CORE

With the launch of the Publication Manual of the American Psychological Association, Seventh Edition (APA, 2020), all submissions to CORE will be reviewed and edited against the related Journal Article Reporting Standards for quantitative, qualitative, and mixed methods studies. However, some aspects of quantitative submissions warrant discussion here.

Diversity, Culture, and Intersectionality

Counselors and those represented among their empirical pursuits are characterized by distinctive intersections of diversity and culture (Chan, Henesy, & Erby, 2019). The CORE editorial team aims to deliver content representing the breadth of interests held by our authorship base, as well as related participant populations. It has been my observation that the complexities inherent within intersections of diversity and cultural categories are often obscured through an emphasis on collecting and reporting the Big 3: age, biological sex, and ethnic or racial identities. Although these characteristics provide important boundaries to the generalizability of findings, authors are encouraged to use affirming and inclusive narrative descriptions and related visual depictions of participant characteristics that honor the complexity of their sample, support more accurate depictions of data generalization boundaries, and are suitable for inclusion in systematic reviews and meta-analyses.

Sample Size, Statistical Power, and Precision

The sample size of any research or evaluation design influences a delicate balance for investigators and defines their ability to identify the genuine effects associated with their inquiries. When designs have too few participants, the underpowered analyses pose a risk of missing effects that exist (Type II error); by contrast, when designs have an abundance of participants, the overpowered analyses may emphasize the presence of effects that are functionally marginal (Aberson, 2019; Balkin & Sheperis, 2011). Moving forward, authors will be required to report and interpret the results of prospective or post hoc power analyses and their interplay with the ability to make statistically based inferences using their data.
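To illustrate the prospective approach, the familiar normal-approximation sample-size calculation for a two-group comparison can be sketched as below. The effect size, alpha, and power values are assumptions chosen for the example, not CORE requirements, and authors would typically use dedicated power-analysis software such as that described by Aberson (2019).

```python
from math import ceil
from statistics import NormalDist

def required_n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-tailed, two-group comparison,
    via the normal approximation: n = 2 * ((z_alpha + z_beta) / d)^2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value, two-tailed
    z_beta = NormalDist().inv_cdf(power)           # value tied to desired power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A medium effect (d = 0.50) at alpha = .05 and 80% power:
print(required_n_per_group(0.50))  # 63 participants per group
```

Running the same calculation with a small effect (d = 0.20) shows why underpowered designs are so common: the required sample grows to several hundred participants per group.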

Statistical and Practical Significance

In September of 2019, I met with a group of CORE stakeholders, including contributing authors, prospective authors, Editorial Board Members, a previous Editor, and members of the Association for Assessment and Research in Counseling Executive Council, to address priorities for reporting statistical and practical significance. Based on the emergent conversations and perspectives, CORE will now require that authors report means, standard deviations, and test statistics associated with statistical significance, as well as effect sizes that support inferences of practical significance. These requirements are based on the assumptions that p-values are a helpful tool, yet they are also sample-bound, intended for a unique purpose, context-specific, and apt to be misinterpreted by users and consumers (see Betensky, 2019; Valentine, Aloe, & Lau, 2015). The editorial team believes that careful, contextual reporting of p-values along with thoughtful interpretation of effect sizes using strategies such as those described by Watson, Lenz, Schmit, and Schmit (2016) will increase the transparency and usability of the findings published in CORE.
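Because the new requirement asks for means, standard deviations, and effect sizes together, it is worth noting that a standardized mean difference such as Cohen's d can be computed directly from those same summary statistics. A minimal sketch, using hypothetical values rather than data from any published study:

```python
from math import sqrt

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d for two independent groups, using the pooled
    standard deviation of the two groups."""
    pooled_sd = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical summary statistics for treatment vs. comparison groups:
d = cohens_d(m1=24.5, s1=5.2, n1=30, m2=21.0, s2=4.8, n2=30)
print(round(d, 2))  # a medium-to-large standardized mean difference
```

Reporting d (or a comparable index) alongside the p-value lets readers judge whether a statistically significant difference is also practically meaningful, which is the distinction the stakeholder group emphasized.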

Clinical Significance and Related Plausible Implications

Given that CORE is intended to support both the dissemination and use of research and evaluation practices for work with individuals across the lifespan, clinical significance and implications are of particular importance. Moving beyond the presence, magnitude, and precision of an effect, clinical significance represents the real-world changes in everyday life that participants may experience as a result of an intervention or program. On one hand, clinical significance may be best approximated using measures that feature reliable change indices or normative sample comparisons which facilitate contrasts of data over time and between groups (Beutler & Moleiro, 2001; Cribbie & Arpin-Cribbie, 2009). On the other hand, in the absence of such measures, prudent synthesis of expertise and previous evidentiary support from a strong body of literature may elucidate plausible experiences associated with the effects detected among participants. It is worth noting that while the latter is regarded as conjectural without the inclusion of additional supportive evidence, such expositions may provide reasonable foundations for exploring the implications of observed effects.
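The reliable change index mentioned above can be illustrated with the widely used Jacobson-Truax formulation: the pre-post difference divided by the standard error of the difference score. The scores, standard deviation, and reliability coefficient below are hypothetical values for a fictional symptom measure, not figures from any cited study.

```python
from math import sqrt

def reliable_change_index(pre, post, sd_pre, reliability):
    """Jacobson-Truax reliable change index: the pre-post difference
    divided by the standard error of the difference score."""
    se_measurement = sd_pre * sqrt(1 - reliability)  # standard error of measurement
    s_diff = sqrt(2) * se_measurement                # SE of the difference score
    return (post - pre) / s_diff

# Hypothetical client scores on a symptom measure (lower = improvement):
rci = reliable_change_index(pre=40, post=30, sd_pre=10, reliability=0.90)
print(abs(rci) > 1.96)  # True: the change exceeds measurement error at p < .05
```

An index beyond +/-1.96 suggests the observed change is unlikely to reflect measurement error alone, which is why such indices can anchor claims of clinical significance when normative comparison data are available.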

Content and Format of Manuscripts

CORE will continue to publish articles related to treatment efficacy, program impacts, clinical diagnostic practices, research and evaluation designs, and outcome measure reviews. The editorial team welcomes well-designed studies using qualitative, quantitative, and mixed methodologies, including those featuring alternatives to classical designs such as community-based and participatory procedures. We have also updated our author guidelines to provide additional submission categories and guidance for titling, abstract content, and keywords. The latter changes were made to increase the probability of article access, use, citation, and impact through CORE's outward-facing presence, the portion of its content that sits in front of consumer subscription paywalls.

Taken together, the CORE editorial team is energized and honored to continue providing service to this publication which we regard dearly. As we go forward into the next decade, we hope that the responsibilities and challenges ahead will continue to deepen the impacts CORE has among readers and the broader professional community of counseling outcome researchers and program evaluators. To this end, I hope you will consider joining us on this journey as an author or review board member.

References

  • Aberson, C. L. (2019). Applied power analysis for the behavioral sciences (2nd ed.). New York, NY: Routledge.
  • American Psychological Association. (2020). Publication manual of the American Psychological Association (7th ed.). Washington, DC: Author.
  • Balkin, R. S., & Sheperis, C. J. (2011). Evaluating and reporting statistical power in counseling research. Journal of Counseling & Development, 89, 268–272. doi:10.1002/j.1556-6678.2011.tb00088.x
  • Betensky, R. A. (2019). The p-value requires context, not a threshold. The American Statistician, 73(sup1), 115–117. doi:10.1080/00031305.2018.1529624
  • Beutler, L. E., & Moleiro, C. (2001). Clinical versus reliable and significant change. Clinical Psychology: Science and Practice, 8, 441–445. doi:10.1093/clipsy.8.4.441
  • Cade, R., Gibson, S., Swan, K., & Nelson, K. (2018). A content analysis of Counseling Outcome Research and Evaluation (CORE) from 2010 to 2017. Counseling Outcome Research and Evaluation, 9(1), 5–15.
  • Chan, C. D., Henesy, R. K., & Erby, A. N. (2019). Toward praxis, promise, and futures of intersectionality in multimethod counseling research. Counseling Outcome Research and Evaluation, 10(1), 12–18.
  • Cribbie, R. A., & Arpin-Cribbie, C. A. (2009). Evaluating clinical significance through equivalence testing: Extending the normative comparisons approach. Psychotherapy Research, 19(6), 677–686.
  • Hays, D. G. (2010). Introduction to counseling outcome research and evaluation. Counseling Outcome Research and Evaluation, 1(1), 1–7.
  • Lenz, A. S. (2018). Reconsidering the value assigned to counseling research paradigms and outcomes. Counseling Outcome Research and Evaluation, 9(1), 1–4.
  • Valentine, J. C., Aloe, A. M., & Lau, T. S. (2015). Life after NHST: How to describe your data without “p-ing” everywhere. Basic and Applied Social Psychology, 37(5), 260–273.
  • Watson, J. C., Lenz, A. S., Schmit, M., & Schmit, E. (2016). Estimating and reporting practical significance in counseling research. Counseling Outcome Research and Evaluation, 7, 111–123. doi:10.1177/2150137816660584
