Original Articles

Trifactor Models for Multiple-Ratings Data


Abstract

In this study, we extend and assess the trifactor model for multiple-ratings data, in which two different raters give independent scores to the same responses (e.g., to GRE essays or to a subset of PISA constructed-response items). The trifactor model is extended to accommodate a cross-classified data structure (e.g., items crossed with raters) instead of a strictly hierarchical structure. We present a set of simulations designed to reflect the incompleteness and imbalance of real-world assessments. Using two sets of simulations, we investigate the effects of the rate of missingness in the data and of ignoring differences among raters. The use of the trifactor model is also illustrated with an empirical analysis of data from a well-known international large-scale assessment.
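As a hypothetical illustration of the cross-classified structure described in the abstract, the sketch below simulates continuous scores from a linear factor structure with a general factor plus item-specific and rater-specific factors. All names, loadings, and variance values are our own illustrative assumptions, not parameters from the article:

```python
import numpy as np

rng = np.random.default_rng(0)
n_persons, n_items, n_raters = 2000, 6, 4

# Latent factors: one general, one per item, one per rater.
# Variances are illustrative choices, not values from the article.
theta = rng.normal(0, 1.0, n_persons)            # general proficiency
u = rng.normal(0, 0.5, (n_persons, n_items))     # item-specific factors
v = rng.normal(0, 0.5, (n_persons, n_raters))    # rater-specific factors

# Cross-classified scores: every (item, rater) combination is observed,
# so items and raters are crossed rather than nested.
y = np.empty((n_persons, n_items, n_raters))
for i in range(n_items):
    for r in range(n_raters):
        noise = rng.normal(0, 0.3, n_persons)
        y[:, i, r] = theta + u[:, i] + v[:, r] + noise

# Two raters scoring the same item share theta AND the item factor,
# so their scores correlate more strongly than scores on different items.
same_item = np.corrcoef(y[:, 0, 0], y[:, 0, 1])[0, 1]
diff_item = np.corrcoef(y[:, 0, 0], y[:, 1, 1])[0, 1]
print(same_item, diff_item)
```

The gap between the two correlations is what the item-specific factors capture; a purely hierarchical model would force a single nesting and could not represent both sources of dependence at once.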

Article information

Conflict of interest disclosures: Each author signed a form for disclosure of potential conflicts of interest. No authors reported any financial or other conflicts of interest in relation to the work described.

Ethical principles: The authors affirm having followed professional ethical guidelines in preparing this work. These guidelines include obtaining informed consent from human participants, maintaining ethical treatment and respect for the rights of human or animal participants, and ensuring the privacy of participants and their data, such as ensuring that individual participants cannot be identified in reported results or from publicly available original or archival data.

Funding: This work received no external funding.

Role of the funders/sponsors: None of the funders or sponsors of this research had any role in the design and conduct of the study; collection, management, analysis, and interpretation of data; preparation, review, or approval of the manuscript; or decision to submit the manuscript for publication.

Acknowledgments: The majority of the work was done when the first author, Hyo Jeong Shin, was at the University of California at Berkeley. The authors would like to thank Kentaro Yamamoto, James Carlson, Jodi Casabianca, and Ikkyu Choi for their comments on prior versions of this manuscript and Emily Lubaway and Larry Hanover for their editing help. The ideas and opinions expressed herein are those of the authors alone, and endorsement by the authors’ institutions is not intended and should not be inferred.

Notes

1 Generalizability theory (G-theory) can be viewed as a similar approach (to that of IRT) in that it partitions the total variance into separate and uncorrelated parts (Shavelson, Baxter, & Gao, Citation1993) and treats the estimated variance components as latent factors. G-theory separates the total variability in the ratings into variance components from different sources, such as (a) systematic variability between individual test takers, (b) variability between raters (interrater inconsistencies), (c) variability within raters across rating occasions (intrarater inconsistencies), and (d) variability between the writing tasks (e.g., Sudweeks, Reeve, & Bradshaw, Citation2004). In this paper, we focus on measurement models based on the IRT framework and connect the results to the factor-analytic and generalizability-theory approaches.
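The variance partitioning that this note attributes to G-theory can be sketched for the simplest one-facet design, persons fully crossed with raters. The sketch below uses the standard ANOVA expected-mean-squares estimators; the sample sizes and true variance components are illustrative assumptions, not values from the article:

```python
import numpy as np

rng = np.random.default_rng(1)
n_p, n_r = 200, 8  # persons x raters, fully crossed one-facet G-study

# True standard deviations of the components (illustrative values)
sigma_p, sigma_r, sigma_e = 1.0, 0.3, 0.5

person = rng.normal(0, sigma_p, n_p)            # test-taker effects
rater = rng.normal(0, sigma_r, n_r)             # rater severity effects
y = person[:, None] + rater[None, :] + rng.normal(0, sigma_e, (n_p, n_r))

# ANOVA mean squares for the p x r design
grand = y.mean()
ms_p = n_r * np.sum((y.mean(axis=1) - grand) ** 2) / (n_p - 1)
ms_r = n_p * np.sum((y.mean(axis=0) - grand) ** 2) / (n_r - 1)
resid = (y - y.mean(axis=1, keepdims=True)
           - y.mean(axis=0, keepdims=True) + grand)
ms_e = np.sum(resid ** 2) / ((n_p - 1) * (n_r - 1))

# Solve the expected-mean-squares equations for the variance components:
# E[ms_p] = var_e + n_r * var_p,  E[ms_r] = var_e + n_p * var_r
var_e = ms_e
var_p = (ms_p - ms_e) / n_r
var_r = (ms_r - ms_e) / n_p
print(var_p, var_r, var_e)
```

With these true values, the estimates should land near 1.0, 0.09, and 0.25, respectively; the rater component is the least precisely estimated because only eight raters contribute to it.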

2 To our knowledge, nothing has been published yet regarding application of the HRM to dichotomous data.

3 The Rasch-based model was chosen to estimate factor variances for all raters and items.

4 Results of fitting the correct model, HEHE, are the same as the “full linkage design” columns in Table 2.
