ABSTRACT
Media accounts suggest that accountability pressure increases teacher stress and drives teachers out of the profession, particularly from disadvantaged schools that serve larger proportions of poor and minority students. However, no prior work has systematically examined national trends in teacher turnover in response to No Child Left Behind (NCLB) school accountability. Drawing on nationally representative samples of schools and teachers from the Schools and Staffing Surveys and Teacher Follow-Up Surveys from 1993–2009, this study applied difference-in-differences approaches to estimate the effects of NCLB on teacher turnover. We found a weak increase in the average rate of teachers transferring involuntarily to other schools following school-initiated separations, particularly in disadvantaged schools that served larger proportions of poor and minority students. We also observed that NCLB reduced teachers' involuntary attrition rates. Importantly, the policy's effect on voluntary transfers between schools and on voluntarily leaving the teaching profession is indistinguishable from zero.
Acknowledgement
Min Sun’s contribution to this paper is based upon work supported by the National Science Foundation under Grant No. DRL-1506494. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
Disclosure statement
No potential conflict of interest was reported by the author(s).
Supplemental Material
Supplemental data for this article can be accessed here.
Notes
1. Florida had graded every school in the state on a scale from “A” to “F” since the summer of 1999, based on proficiency rates in reading, writing, and mathematics. In 2002, the state dramatically changed its grading system, both recalibrating the acceptable student proficiency levels for school accountability purposes and introducing student-level changes as an important determinant of school grades. Using student-level microdata to calculate the school grades that would have occurred absent this change, Feng et al. (2010) demonstrated that over half of all schools in the state experienced an accountability “shock” due to this grading change, with some schools receiving a higher grade than they would otherwise have received (positive shock) and other schools receiving a lower grade than they would otherwise have received (negative shock).
2. Texas rated schools on four categories: exemplary, recognized, academically acceptable, and academically unacceptable. The last category is roughly equivalent to not meeting AYP.
3. SASS includes multiple measures of working conditions (e.g., teacher classroom autonomy, teachers’ earnings, professional development, hours per week spent on school-related activities, and teacher support from school administrators and colleagues). Although the literature identifies these measures as related to teachers’ turnover decisions, we decided not to include them as covariates because they themselves can be influenced by the NCLB policy; including them could lead to misidentifying the policy treatment effect. Similarly, we excluded other teacher professional characteristics that could be influenced by the policy treatment, such as whether teachers were state certified.
4. We also ran our model as a two-level multinomial logistic regression with teachers nested within states. It is reassuring that the estimates from these two-level models, as shown in Online Appendix Table OA-1, are very consistent with the DID estimates. We decided to primarily report the DID estimates in the main text for at least two reasons: (a) the intra-class correlations of the two-level random effects model are very low (0.018 for voluntary mobility, 0.03 for voluntary attrition, and close to zero for involuntary mobility and attrition); and (b) the random effects model rests on strong assumptions that may be violated in this study. For example, it assumes that there is no correlation between the error terms and the treatment effect. Because we have a relatively limited understanding of why some states adopted NCLB-like policies prior to 2002, it is highly possible that important covariates are omitted, inducing correlations between the error terms and the estimates of NCLB effects. Using state fixed effects can alleviate some of these concerns by controlling for state-level unobservables that affected states’ policy choices prior to NCLB. As Clarke, Crawford, Steele, and Vignoles (2015) suggest, fixed-effects models are preferable when the primary interest is policy-relevant inference about individual outcomes and when the data are too limited to adjust for selection bias; the fixed-effects approach then generates more robust estimates than random effects. Moreover, the random effects model assumes that the random cluster effects follow a normal distribution. Given that we have only 50 states, this is another strong assumption that may not hold in our data.
5. To further confirm the above findings, we separated schools into three groups using flexible thresholds that reflect the increase in the average percentages of FRPL and minority students over the years: advantaged schools (the top third of the distribution in a given year), the most disadvantaged schools (the bottom third), and the middle group. The inferences are generally consistent with those of our main analyses (results are available upon request from the authors).
6. Results are available upon request from the authors.
Additional information
Notes on contributors
Min Sun
Min Sun is an Assistant Professor in Education Policy in the College of Education at the University of Washington. Her research focuses on educator quality, school accountability, and school improvement. She can be reached via email at: [email protected] or via mail at: 2012 Skagit Lane, M205 Miller Hall (Box 353600), Seattle, WA 98195.
Andrew Saultz
Andrew Saultz is an Assistant Professor of Educational Policy in the College of Education, Health & Society at Miami University. His research focuses on educational federalism, school accountability, and parent engagement. He can be reached via email at: [email protected] or via mail at: 306 McGuffey Hall, Oxford, OH 45056.
Yincheng Ye
Yincheng Ye is currently a Research Associate in Quantitative Methods in the School of Education at Michigan State University. Her research focuses on effective mathematics teaching, school working conditions, and engineering students’ involvement in co-curricular activities. She can be reached via email at: [email protected] or via mail at: Room 240D Erickson Hall, 620 Farm Lane, East Lansing, MI 48824.