Abstract
School inspections are expected to have an impact on data use and school improvement. Schools are expected to generate data (e.g., self-evaluation reports and student achievement results) as part of the inspection process. This process, in turn, also generates data (e.g., inspection reports) for school improvement. The high-stakes context in which both types of data are generated, however, has been known to lead to strategic responses by schools. In this study, we analyzed whether schools cheat on tests and reshape their test pool in response to the Dutch (risk-based) school inspections. We found that 5.5% of the schools do not comply with the guidelines for administering the test, and that one third of the schools exclude one or more students from the test. These responses, however, do not appear to be related to specific measures in the Dutch school inspections or to schools' prior performance on these measures.
Notes
1. Schools are allowed to choose a test to evaluate student achievement. The Inspectorate has developed guidelines on how to use the results of each of the available tests to evaluate the cognitive outcomes of schools. Most schools (85%), however, use the Cito test, administered by the testing company Cito. In this section, we will therefore only report on potential strategic behaviour of schools using the Cito test. The Inspectorate also uses Dutch language and arithmetic tests in Grades 3, 4, and 6 that are part of the Cito pupil monitoring system to evaluate the progress of students during their school career (as a subindicator of the first quality area on cognitive outcomes). As the Cito test in Grade 8 is the primary indicator used to evaluate a school as failing or proficient in its cognitive outcomes, we focus here on this test.