Abstract
This article presents a case study describing practical and scientific strategies applied to assess human-computer interaction, with a focus on software content development (i.e., content validation and interface comparisons), evaluation (i.e., different types of criteria, effect sizes, effect ratios, and targeted goals met), and user acceptability. A parenting intervention program entitled Parenting Adolescents Wisely (PAW) was delivered to 42 parents in community settings via two formats: noninteractive videotape and interactive multimedia. Based on a content validation model developed in this study, both formats incorporated critical skills identified in past empirical studies. Applying both formats to at-risk families yielded improvements on three types of evaluative criteria: reaction, learning, and behavior. Improvements in children's problem behaviors were clinically significant for 33% to 48% of the children whose parents used the program. Finally, based on effect ratios, the PAW program showed a substantial cost benefit compared with other parenting interventions.