Abstract
Two ways of measuring change are presented and compared: a conventional “change score”, defined as the difference between scores obtained before and after an interim period, and a process-oriented approach focusing on detailed analysis of conceptually defined response patterns. The validity of the two approaches was investigated. Vocabulary knowledge was assessed by means of equivalent multiple-choice tests administered before and after an intervention, and four characteristic response patterns were observed: words consistently not understood, words inconsistently understood, learned words, and words consistently understood. The results showed that including the category “words consistently not understood” yielded a “truer” gain score than the conventional change score: it captured more variance from age and cognitive constraints and appeared educationally more reliable from an assessment-for-teaching perspective.
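To make the two measures concrete, the following is a minimal sketch (not taken from the paper) of how pre-test/post-test item responses could be scored under each approach. The function names and the mapping of “inconsistently understood” to a correct-then-incorrect pattern are illustrative assumptions, not the authors’ definitions.

```python
def classify(pre: bool, post: bool) -> str:
    """Map one item's pre/post correctness to a response pattern.
    The labels follow the four categories in the abstract; the exact
    mapping for 'inconsistently understood' is an assumption here."""
    if pre and post:
        return "consistently understood"
    if not pre and post:
        return "learned"
    if pre and not post:
        return "inconsistently understood"
    return "consistently not understood"

def change_score(pre_items, post_items):
    """Conventional change score: post-test total minus pre-test total."""
    return sum(post_items) - sum(pre_items)

def pattern_counts(pre_items, post_items):
    """Process-oriented view: count items in each response pattern."""
    counts = {}
    for p, q in zip(pre_items, post_items):
        label = classify(bool(p), bool(q))
        counts[label] = counts.get(label, 0) + 1
    return counts

# Hypothetical responses: 1 = item answered correctly.
pre  = [1, 0, 0, 1, 0]
post = [1, 1, 0, 0, 1]

print(change_score(pre, post))   # → 1
print(pattern_counts(pre, post))
```

Note how the single change score (here, 1) conflates two learned words with one word answered correctly only on the pre-test, whereas the pattern counts keep those events, and the “consistently not understood” items, separate.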