Abstract
When determining interrater reliability for scoring the Rorschach Comprehensive System (Exner, 1993), researchers often report coding agreement for response segments (e.g., Location, Developmental Quality, Determinants). Currently, however, it is difficult to calculate kappa coefficients for these segments because generating the chance agreement rates required for the kappa computation is tedious. This study facilitated kappa calculations for response segments by developing and validating formulas to estimate chance agreement. Formulas were developed for 11 segments using 400 samples, cross-validated on 100 samples, and applied to the data from 5 reliability studies. On cross-validation, the validity of the prediction formulas ranged from .93 to 1.0 (M = .98). In the 5 reliability studies, the average difference between estimated and actual chance agreement rates was .00048, and the average difference between estimated and actual kappa values was .00011 (maximum = .0052). Thus, the regression formulas quite accurately predicted chance agreement rates and kappa coefficients for response segments.
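The kappa computation the abstract refers to can be illustrated with a minimal sketch. This is not the study's regression approach; it shows the standard Cohen's kappa formula, kappa = (p_o - p_c) / (1 - p_c), and how chance agreement p_c is conventionally derived from each rater's marginal category proportions (the quantity the study's formulas estimate). All function names and the example proportions are hypothetical.

```python
def chance_agreement(props_a, props_b):
    """Conventional chance agreement p_c for two raters:
    the sum over categories of the product of each rater's
    marginal proportion for that category."""
    return sum(pa * pb for pa, pb in zip(props_a, props_b))

def cohens_kappa(observed_agreement, chance_agreement_rate):
    """Cohen's kappa: kappa = (p_o - p_c) / (1 - p_c)."""
    return (observed_agreement - chance_agreement_rate) / (1 - chance_agreement_rate)

# Hypothetical three-category example: each rater's category proportions.
p_c = chance_agreement([0.6, 0.3, 0.1], [0.5, 0.4, 0.1])  # 0.43
kappa = cohens_kappa(0.85, p_c)
```

With an observed agreement of .85 and a chance agreement of .43, kappa is (.85 - .43) / (1 - .43), roughly .74; the study's contribution is supplying the p_c term by formula rather than by tabulating marginals for each response segment.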