Corrigendum

Correction to: An Investigation of Interrater Reliability for the Rorschach Performance Assessment System (R–PAS) in a Nonpatient U.S. Sample

This article refers to:
An Investigation of Interrater Reliability for the Rorschach Performance Assessment System (R–PAS) in a Nonpatient U.S. Sample

Volume 98, Number 4, 2016, pp. 382–390. doi:10.1080/00223891.2015.1118380

In our original article, we computed and reported interrater reliability using intraclass correlations (ICCs). We subsequently discovered that we should have reported single measure ICCs instead of the average measure ICCs that we reported. Because average measure ICCs are always higher than single measure ICCs, the interrater reliability values we reported are inflated. In addition, agreement on dichotomous decisions such as those in our study is typically evaluated with the kappa coefficient. Although kappa returns results virtually identical to single measure ICCs, we decided to report the statistic that is more familiar to readers for interrater reliability of dichotomous decisions. We thank Joni L. Mihura for her very valuable assistance in correcting these results.
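As a rough illustration of the two points above (this is not the original analysis, and the rater codes are invented), the sketch below computes Cohen's kappa for two raters' dichotomous codes and uses the Spearman-Brown relationship to show how an average measure ICC exceeds the corresponding single measure ICC, which is why the originally reported values were inflated.

```python
# Minimal sketch (illustrative only; the rater codes below are made up, not study data).
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters assigning dichotomous (0/1) codes."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    p_observed = np.mean(r1 == r2)                    # observed agreement
    p1, p2 = r1.mean(), r2.mean()                     # each rater's rate of coding "1"
    p_chance = p1 * p2 + (1 - p1) * (1 - p2)          # agreement expected by chance
    return (p_observed - p_chance) / (1 - p_chance)

def average_measure_icc(single_icc, k):
    """Spearman-Brown step-up: average measure ICC for k raters from the single measure ICC."""
    return k * single_icc / (1 + (k - 1) * single_icc)

# Hypothetical dichotomous codes (e.g., presence/absence of a code) from two raters.
rater_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
rater_b = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]

print(f"kappa = {cohens_kappa(rater_a, rater_b):.2f}")            # 0.80

# A single measure ICC of .60 corresponds to an average measure ICC of .75 for two raters,
# illustrating how reporting average measure values overstates rater-level agreement.
print(f"average measure ICC = {average_measure_icc(0.60, k=2):.2f}")
```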

The data presented here correct the interrater agreement for response-level data through the analysis of the kappa statistic (see Table 1). Cicchetti (1994) established useful guidelines for interpreting concordance rates among coders using the kappa statistic. His recommendations have become widely accepted and include the following categories: (a) poor agreement is < .40, (b) fair agreement is .41 to .59, (c) good agreement is .60 to .74, and (d) excellent agreement is .75 to 1.00 (Cicchetti, 1994). Table 2 presents a summary of the kappa values organized by Cicchetti's (1994) recommended categories. Overall, the majority of calculated kappas fell within the excellent range (n = 26), and of these there was a large representation from Location, Content Codes, and Determinants. However, a number of kappas fell within the fair range (n = 17), including a few within the Cognitive Codes (n = 4), some within the Determinants (n = 5), and a greater portion within the Thematic Codes (n = 5). Among the good kappa values (n = 14), the largest number were observed for Determinants (n = 4), some within Cognitive Codes (n = 3), and both White Space category codes (n = 2). The remaining good kappa values were distributed across other coding categories. All of the poor kappa values (n = 5) fell within one of two coding categories: Cognitive Codes (n = 3) and Determinants (n = 2).

Table 1. Rorschach Performance Assessment System interrater agreement at the response level.

Table 2. Summary of kappa interrater reliability results for 50 Rorschach Performance Assessment System protocol variables.
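For readers who want to bin kappa values into Cicchetti's (1994) categories programmatically, a minimal sketch follows. The function name and example values are hypothetical, and the handling of boundary values (e.g., exactly .40) is an assumption, since the guideline leaves a small gap between the poor and fair ranges.

```python
# Minimal sketch (function name and example kappas are illustrative, not study data).
from collections import Counter

def cicchetti_category(kappa: float) -> str:
    """Map a kappa value onto Cicchetti's (1994) agreement categories.
    Boundary values are assigned to the higher category (an assumption)."""
    if kappa < 0.40:
        return "poor"
    if kappa < 0.60:
        return "fair"
    if kappa < 0.75:
        return "good"
    return "excellent"

# Example: tally hypothetical kappa values the way Table 2 summarizes the protocol variables.
example_kappas = [0.82, 0.55, 0.91, 0.38, 0.66]
print(Counter(cicchetti_category(k) for k in example_kappas))
# Counter({'excellent': 2, 'fair': 1, 'poor': 1, 'good': 1})
```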

Overall, more than half of the calculated kappa values fell within either the good or excellent ranges (n = 40, 66.1%). The proportion of fair kappa values was 27.4%, and the proportion of poor kappa values was 8.1%. The kappa statistic presented here is the appropriate index of response-level interrater reliability when two examiner groups assign dichotomous codes and the degree of agreement between them is compared.

References

  • Cicchetti, D. V. (1994). Guidelines, criteria, and rules of thumb for evaluating normed and standardized assessment instruments in psychology. Psychological Assessment, 6, 284–290.
  • Kivisalu, T. M., Lewey, J. H., Shaffer, T. W., & Canfield, M. L. (2016). An investigation of interrater reliability for the Rorschach Performance Assessment System (R–PAS) in a nonpatient U.S. sample. Journal of Personality Assessment, 98, 382–390. doi:10.1080/00223891.2015.1118380
