ABSTRACT
This article provides evidence and generates insights about the power of financial rewards to motivate school administrators and the design features that influence their motivational potency. The multi-year mixed-methods study is grounded in expectancy and goal setting theories that suggest (a) awards must be salient and sizable enough to appeal to educators; (b) the award structure must evidence credible connections between work, performance, and reward; and (c) goals, measures, and awards must be perceived as fair. Since this framework helps explain administrators’ mixed reactions to their awards, it could help guide future efforts to develop and evaluate administrator incentive programs.
Acknowledgements
The authors wish to acknowledge the members of the research team for their assistance with data collection for this study. These individuals include Lauren K. B. Matlach, Laurie Blaisdell, Amanda Bowsher, Karmin Cortes, Brad Coverdale, Allison De La Torre, Marisa Goldstein, Laura Hyde, Claire Jacobson, Apitchaya Pimpawathin, and Jessica Sutter.
Funding
This research was supported through a partnership between the Prince George’s County Public Schools and the Department of Education Policy Studies at the University of Maryland, College Park (UMCP). The authors are grateful for support from both organizations.
Notes
1. Research has documented features of administrator incentive programs (see, for example, Goff et al., 2014) and has examined how administrators think teachers will respond to financial incentives (see, for example, Max et al., 2014). Scholars have also wrestled with measurement issues embedded in efforts to link financial rewards to administrator performance (see, for example, Branch, Hanushek, & Rivkin, 2009). Systematic studies of administrator reactions to financial incentive programs, however, are rare (see, for example, Hamilton et al., 2012; Goff et al., 2014).
2. This section draws heavily on our discussion of the underlying theories and empirical evidence regarding teacher incentive programs found in Rice et al. (2015).
3. For a more detailed discussion of the interrelated nature of these concepts, see Rice et al., 2015.
4. During the pilot year, only building principals and assistant principals were eligible to participate in FIRST; during the second year of implementation, additional building administrators (e.g., special education program coordinators and academic deans) were allowed to enroll. During the pilot year, only one eligible administrator opted out of the program in order to remain a member of the program’s Administrator Advisory Committee; during the second year of implementation, all eligible administrators participated; during the third year of implementation, one administrator who had been enrolled in FIRST opted out of the program.
5. The school-wide growth over time model was divided into making adequate yearly progress (AYP) (30%) and meeting district-developed growth targets (70%). Schools that made AYP received $750 (30% of the $2,500 maximum). The payout for meeting growth targets depended on the percent of targets met (0–24% = $0; 25–49% = $900; 50–74% = $1,300; 75–100% = $1,750). The district set targets for each school. Because changes in proficiency rates were based on cross-sectional data and did not track the performance of the same students over time, year-to-year changes in proficiency rates may or may not reflect “growth.” In some cases, dramatic changes in school population made the use of changes in proficiency rates as an accountability measure even more problematic. In one of our case-study sites, approximately 80% of the student body changed during the course of the study.
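The tiered arithmetic in note 5 can be summarized as a short function. This is only an illustrative sketch of the payout rules as stated in the note (the function name and signature are our own, not the district's), under the assumption that the AYP and growth-target components are simply summed:

```python
def growth_payout(made_ayp: bool, pct_targets_met: float) -> int:
    """Sketch of the school-wide growth payout described in note 5.

    The $2,500 maximum splits into an AYP component (30% = $750)
    and a growth-target component (70%), paid in tiers by the
    percentage of district-set growth targets the school met.
    """
    payout = 750 if made_ayp else 0  # 30% component for making AYP
    # 70% component, tiered by percent of growth targets met
    if pct_targets_met >= 75:
        payout += 1750
    elif pct_targets_met >= 50:
        payout += 1300
    elif pct_targets_met >= 25:
        payout += 900
    # 0-24% of targets met adds nothing
    return payout
```

For example, a school that made AYP and met all of its growth targets would receive the full $2,500, while a school that missed AYP but met 30% of its targets would receive $900.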
6. Only teachers who taught in tested subjects and grades and had pre- and post-test data for a classroom of students were eligible to receive a payout based on performance of students in the class; only teachers who were certified in and taught a hard-to-staff subject were eligible to receive a payout for this component.
7. During the first year of participation, teachers attended a series of professional development sessions based on the Danielson Framework for Teaching. During their second year of participation, teachers could develop their own professional development plans tailored to their interests and priorities.
8. Averages for assistant principals (not displayed) follow a very similar pattern.
9. Awards were taxed at a rate of approximately 40%. Most FIRST participants did not understand that their payouts would be classified as emoluments, a designation that made the bonus recipients solely responsible for all federal, state, and local taxes on this portion of their income.
10. One district official explained that because evaluating teachers was part of administrators’ formal job descriptions, the district could not pay administrators additional stipends for completing responsibilities already included in those descriptions.