Abstract
This paper presents a rubric for the evaluation of interactive multimedia (IMM) by non-education specialists, from which an evaluation instrument was developed. The instrument was used to investigate how instructing students in general design principles, evaluation techniques, and the principles of Human Information Processing and Learning Theory affects their subjective evaluations of IMM. The rubric suggests that design and content issues can be assessed by the support offered to user performance in terms of speed, memory, effort, and comfort. Results indicate that instruction does affect the subjective evaluations given by students, and that the instrument was able to capture this effect.