Abstract
After a test is developed, most content validation analyses shift from ascertaining domain definition to studying domain representation and relevance, because the domain is assumed to be set once a test exists. We present an approach for examining alternative domain structures based on extant test items. In our example, based on the 59 items from the 2004 Arizona state high school mathematics test, we asked 34 subject matter experts (SMEs) to judge the similarity of items according to any characteristics they deemed important. After the SMEs sorted the items into categories, the similarity data were scaled using multidimensional scaling. Scale values were then cluster analyzed to examine the domain structure derived from the SME group responses. The SME similarity data yielded a cluster configuration that shared some commonality with the state model stipulated in the test specifications but differed in critical ways. Confirmatory factor analysis of student item responses indicated that, although both models fit the data, the SME domain configuration produced better fit indices, suggesting that analyzing domain definition after test construction can enhance the understanding of construct meaning.
Notes
1. The Arizona Department of Education began to conduct more formal, external alignment reviews of AIMS using the Webb (2007) procedure in 2005, so the 2004 mathematics test analyzed in this study was not subjected to such review.