Guest Editorial

Competencies and outcomes in public affairs education

ABSTRACT

To herald JPAE’s 25th volume, it is appropriate to look back at a bit of history and look forward to future research. Just as governments have faced demands to demonstrate results, graduate programs in public policy and administration began to face pressure to pay less attention to inputs and do more to demonstrate the results those inputs were producing. Beginning in the 1990s, NASPAA revised its accreditation standards to emphasize a mission-based approach with considerable attention to the competencies students develop while pursuing their degrees. That development stimulated an outpouring of research on competency assessment, but it has prompted only limited attention to what that focus has meant for our graduates and for the programs they analyze, the services they deliver, or the agencies they manage. The logical next step, research into the effects on student careers and public services, could help identify what works, and how well, in public affairs education.

On the occasion of JPAE’s 15th volume in 2010, I wrote a brief essay identifying two topics that I believed important for scholars of public affairs education to address: one is the impact and place of public affairs education; the other is the transformation of public affairs degree programs through the expansion of doctoral education and the proliferation of specialized degree offerings. Asked to prepare a brief piece to herald JPAE’s 25th volume, I decided to ask where we stand now. The proliferation of degrees and the expansion of doctoral education have received limited attention, but much more research has been done on the impact and place of public affairs education, so I will focus my effort on that topic.

Raadschelders, Whetsell, Dimand, and Kieninger (2018) have charted the history and development of JPAE, explaining how it emerged in response to the growth of public affairs education. They provide some interesting data about the content of manuscripts published in the journal over its first 24 years, finding that 60% of articles were about curriculum, 24% about pedagogy, and 9% about educational philosophy, while 6% fell into a miscellaneous category. Their analysis gives no indication of how often the journal’s authors have turned their attention to analyzing the impact of public affairs education – the effects it has on graduates and the public service. That would include such questions as whether our programs add value for our graduates, what the nature and magnitude of that added value is, and what impact our programs have, through our graduates, on the quality of public service. While they do not indicate whether any articles address the expansion of doctoral education, they do find that over the years there have been 12 articles on the PhD curriculum.

To be sure, there is ample attention to things we believe to be critical to the quality of education our students receive – the structure and content of the curriculum; coverage of important topics like management, budgeting and finance, and diversity and inclusion; and teaching approaches and materials for use in class, such as case studies. There is also a subset of articles, not immediately obvious in the categories reported by Raadschelders and his coauthors, that provides some insight into the impact of public affairs education by reporting on efforts to assess the competencies of program graduates.

Attention to outcomes in public affairs education dates back to before I published an article (Jennings, 1989) in the Public Administration Review (PAR) that explored accountability, program quality, and outcome assessment in graduate education in public affairs and administration. That article was a by-product of my work on the NASPAA Curriculum Committee, a group that did the groundwork for a subsequent NASPAA committee that recommended that NASPAA move to mission-based accreditation. That change in the early 1990s was a precursor to the shift to a focus on competencies in the Standards adopted by NASPAA in 2009. In the PAR article, I examined the context of governance and its implications for accountability, program quality, and outcome assessment in public affairs education. Our programs were operating in a political environment in which demands were being made on public agencies to measure performance and demonstrate outcomes. That shift of attention from the inputs of public programs to their results was also being pursued in higher education, where state governments and regional and specialized accrediting bodies were demanding that universities and programs demonstrate their value by measuring relevant outcomes of education, including the competencies acquired by students.

The results of public affairs education can be thought of in a variety of ways. They include student learning outcomes – knowledge, skills, and values. Beyond this, of course, the results should be improved careers for students and enhanced performance of public programs. Newcomer and Allen (2010) provide a nice model of what this might encompass and the pathways by which public affairs education might produce these results. As they present it, the desired consequences of our programs include short-term, intermediate, and long-term results. The output is students leaving with acquired knowledge and skills. Short-term outcomes include placement in relevant jobs and students using their knowledge, skills, and competencies in employment. Intermediate outcomes include improved leadership in government and non-governmental organizations and alumni producing successful changes in work processes. In the longer term, this produces more ethical and transparent government, more effective and efficient government, improved production by NGOs, and increased prestige for public service education.

Attention to outcomes and competencies began to build with NASPAA’s move toward mission-based accreditation (e.g., National Association of Schools of Public Affairs and Administration, 1992; Williams, 2002), but the adoption of the universal competencies in 2009 stimulated a flood of research in JPAE reporting on competency assessment. From 2011 through 2018, 21 articles examined various aspects of the results of public affairs education. Fourteen of those articles reported on the development and use of competency assessments by accredited MPA programs, offering guidance on different approaches to assessment and insights into lessons learned in developing and implementing competency assessment models. Indeed, in a special issue of JPAE in 2014, eight articles explored different aspects of competency assessment (Diaz, 2014; Levesque-Bristol & Richards, 2014; Mayhew, Swartz, & Taylor, 2014; Meek & Godwin, 2014; Piskulich & Peat, 2014; Powell, Saint-Germain, & Sundstrom, 2014; Rissi & Gelmon, 2014; Rubaii & Calarusse, 2014).

Other research on outcomes complemented the attention to competency assessment. For example, Sprague and Cameron Percy (2014) analyzed the impact of practicum experiences on students. Van der Wal (2017) analyzed the pre-entry and post-entry values, motivations, sector perceptions, and career preferences of MPA students in Asia. Carpenter (2011) asked how we could measure the community impact of nonprofit graduate students’ service-learning projects.

This latter article addresses the “so what” question – what effects do our programs have on public service? There has not been much research of that type in the field of public affairs education, but there is some. For example, Yeager, Hildreth, Miller, and Rabin (2007) produced a noteworthy early effort to assess the impact of an MPA degree on career outcomes. They used results from a survey of Government Finance Officers Association members to compare MPA and MBA graduates and to compare graduates of accredited MPA programs with graduates of unaccredited programs, and they reported a variety of significant differences.

While we do not have many studies of the impact of public affairs education, scholars in other fields have produced sophisticated analyses of the impact of professional education on a profession’s accomplishments. There are a variety of studies of this type in the field of public education. For example, Berger and Toma (1994) studied the effect of state education laws on educational outcomes and found that a master’s degree certification requirement resulted in lower educational outcomes. As another example, Harris and Sass (2011) analyzed the effect of teachers’ experience, undergraduate degrees in education, and graduate education on student test scores. They found that undergraduate degrees in education have no effect, whereas advanced degrees produce mixed effects, leading to higher scores in middle-school math, lower scores in middle-school reading, and no effect on either reading or math at the elementary-school level. As a final example, scholars have been analyzing the differences in outcomes produced by teacher preparation programs (TPPs). This is very similar to asking whether differences in MPA or MPP programs lead to different outcomes for public services. An article by von Hippel, Bellows, Osborne, Lincove, and Mills (2016) illustrates this. Using a very large data set covering Texas students, teachers, and schools, the authors found the differences in value-added outcomes between teacher preparation programs to be largely non-existent. As they put it,

…, it is rarely possible to tell which TPPs, if any, are better or worse than average. The potential benefits of TPP accountability may be too small to balance the risk that a proliferation of noisy TPP estimates will encourage arbitrary and ineffective policy actions. (von Hippel et al., 2016, p. 31)

Studying differences in outcomes for students and public services as a result of graduate programs in public affairs is necessarily a challenging, complex, and messy affair. Measuring both programs and outcomes is a difficult undertaking. Measures are unlikely to capture all the dimensions we would like them to capture. Just think of elementary and secondary education, where some outcomes, such as reading and math, are relatively easy to measure with test scores, but those scores do not capture important outcomes like critical thinking skills, appreciation for the arts, or the development of citizenship values and skills. Beyond this, many factors affect the knowledge and careers of our graduates and the outcomes of public services, and these need to be taken into account in detecting the consequences of our programs. Nonetheless, careful analyses would allow professional education in public policy and administration to demonstrate its value to students, graduates, and policymakers, while helping to identify what does not work and ways to improve the results of our efforts. This is nothing more than what policymakers, scholars, and analysts have demanded of public services more generally.

Additional information

Notes on contributors

Edward T. Jennings

Edward T. Jennings, Jr. is the Provost’s Distinguished Service Professor of Public Policy and Administration at the University of Kentucky. He received his BA and MA from the University of New Orleans and his PhD from Washington University in St. Louis. He served as the editor-in-chief of the Journal of Public Affairs Education from 2001 to 2005.

References

  • Berger, M. C., & Toma, E. F. (1994). Variation in state education policies and effects on student performance. Journal of Policy Analysis and Management, 13(3), 477–491. doi:10.2307/3325387
  • Carpenter, H. (2011). How we could measure community impact of nonprofit graduate students’ service-learning projects: Lessons from the literature. Journal of Public Affairs Education, 17(1), 115–132. doi:10.1080/15236803.2011.12001630
  • Diaz, R. (2014). Assessing professional competencies: The “Painstaking” implementation phase. Journal of Public Affairs Education, 20(3), 353–368. doi:10.1080/15236803.2014.12001793
  • Harris, D. N., & Sass, T. R. (2011). Teacher training, teacher quality, and student achievement. Journal of Public Economics, 95(7–8), 798–812. doi:10.1016/j.jpubeco.2010.11.009
  • Jennings, E. T., Jr. (1989). Accountability, program quality, outcome assessment, and graduate education for public affairs and administration. Public Administration Review, 49(5), 438–446. doi:10.2307/976388
  • Levesque-Bristol, C., & Richards, K. A. R. (2014). Evaluating civic learning in service-learning programs: Creation and validation of the public affairs scale–short survey (PAS-SS). Journal of Public Affairs Education, 20(3), 413–428. doi:10.1080/15236803.2014.12001796
  • Mayhew, F., Swartz, N., & Taylor, J. A. (2014). Implementing a multi-method competency model: Experiences of the MPA program at James Madison University. Journal of Public Affairs Education, 20(3), 321–334. doi:10.1080/15236803.2014.12001791
  • Meek, J. W., & Godwin, M. L. (2014). Iterative learning: Programmatic lessons from a course embedded approach to program mission assessment. Journal of Public Affairs Education, 20(3), 305–320. doi:10.1080/15236803.2014.12001790
  • National Association of Schools of Public Affairs and Administration. (1992). Symposium on outcomes assessment. Washington, DC: NASPAA.
  • Newcomer, K., & Allen, H. (2010). Public service education: Adding value in the public interest. Journal of Public Affairs Education, 16(2), 207–229. doi:10.1080/15236803.2010.12001594
  • Piskulich, C. M., & Peat, B. (2014). Assessment of universal competencies under the 2009 standards. Journal of Public Affairs Education, 20(3), 281–284. doi:10.1080/15236803.2014.12001788
  • Powell, D. C., Saint-Germain, M., & Sundstrom, L.-M. (2014). Using a capstone case study to assess student learning on NASPAA competencies. Journal of Public Affairs Education, 20(2), 151–162. doi:10.1080/15236803.2014.12001779
  • Raadschelders, J., Whetsell, T., Dimand, A.-M., & Kieninger, K. (2018). Journal of Public Affairs Education at 25: Topics, trends, and authors. Journal of Public Affairs Education. Advance online publication, November 16, 2018. doi:10.1080/15236803.2018.1546506
  • Rissi, J. J., & Gelmon, S. B. (2014). Development, implementation, and assessment of a competency model for a graduate public affairs program in health administration. Journal of Public Affairs Education, 20(3), 335–352. doi:10.1080/15236803.2014.12001792
  • Rubaii, N., & Calarusse, C. (2014). Preparing public service professionals for a diverse and changing workforce and citizenry: Evaluating the progress of NASPAA programs in competency assessment. Journal of Public Affairs Education, 20(3), 285–304. doi:10.1080/15236803.2014.12001789
  • Sprague, M., & Cameron Percy, R. (2014). The immediate and long-term impact of practicum experiences on students. Journal of Public Affairs Education, 20(1), 91–111. doi:10.1080/15236803.2014.12001773
  • van der Wal, Z. (2017). Ethos reinforced, government endorsed? Comparing pre-entry and post-entry values, motivations, sector perceptions, and career preferences of MPA students in Asia. Journal of Public Affairs Education, 23(4), 935–958. doi:10.1080/15236803.2017.12002298
  • von Hippel, P., Bellows, L., Osborne, C., Lincove, J. A., & Mills, N. (2016). Teacher quality differences between teacher preparation programs: How big? How reliable? Which programs are different? Economics of Education Review, 53, 31–45. doi:10.1016/j.econedurev.2016.05.002
  • Williams, D. (2002, January). Seeking the holy grail: Assessing outcomes of MPA programs. Journal of Public Affairs Education, 8(1), 45–56. doi:10.1080/15236803.2002.12023532
  • Yeager, S. J., Hildreth, W. B., Miller, G. J., & Rabin, J. (2007). What difference does having an MPA make? Journal of Public Affairs Education, 13(2), 147–167. doi:10.1080/15236803.2007.12001474
