Political Science Instruction

Failure to Launch: False Starts in Designing the Political Science Capstone as a True Ending to the Major

Pages 79-100 | Received 20 Jun 2018, Accepted 23 Feb 2019, Published online: 29 Apr 2019
 

Abstract

This paper explores the design and initial implementation of a political science capstone organized around the theme of critical thinking. The course made the case to students that critical thinking is important and that political science provides uniquely valuable training for it by teaching methods and mindsets that formalize critical thinking. The capstone invited students to complete authentic performance tasks—a Literature Review, an Experiment Replication, and a Book Review—that would permit them to demonstrate and reflect on the knowledge and skills they had gained throughout the curriculum. Moreover, the Literature Review and Experiment Replication asked students to investigate the phenomenon of politically motivated reasoning—a salient example of a failure of critical thinking. The author intended to use the student work as evidence to assess how well the overall curriculum achieved the department's learning objectives. The initial run of this critical thinking capstone failed to produce such evidence, but largely because of failures in the design and implementation of the capstone course itself—failures of expectations, failures of design, and failures of execution. This paper explores and explains the reasons for the failure of the initial run of the critical thinking capstone and suggests ways that future iterations of the course can be altered so that students are more likely to achieve the course's learning objectives and the course is more likely to generate evidence that can be used for effective departmental assessment.

Notes on contributors

Aaron M. Houck is an assistant professor in the Political Science Department at Queens University of Charlotte. He has a BA from Davidson College, a JD from Harvard Law School, and a PhD from Duke University. He teaches his department’s capstone course for political science and international studies majors. His other teaching focuses on U.S. politics, the U.S. legal system, constitutional law, Southern politics, and urban politics. His research examines judicial politics, political behavior, and the scholarship of teaching and learning.

Notes

1 See Sum and Light (2010) for a discussion of a one-credit capstone course design.

2 For an interesting contrast, see the account of Truman State University’s experience designing and implementing a structured, sequential political science curriculum described in Breuning, Parker, and Ishiyama (2001). Moreover, Ishiyama (2005b) finds that graduates of political science departments with more structured curricula score higher on standardized political science subject matter tests.

3 And we regularly waive the requirement that a student complete even those three courses in the face of AP examination results, dual-enrollment credits, or even scheduling conflicts.

4 Further complicating things, of course, is the likelihood of disagreement among political scientists about the best way to achieve even a particular set of educational objectives.

5 The educational focus of the experiment replication I intended was the scientific method and research design, not the nature of the evidence collected or the mode of analysis. So, its lessons, I hoped, would inform all research methods—quantitative and qualitative.

6 Please note that this observation does not amount to an endorsement of the use of student course evaluations in decisions concerning the evaluation of instructors. See, e.g., Hessler et al. (2019); Mengel, Sauermann, and Zölitz (2019); Boring (2017); and Boring, Ottoboni, and Stark (2016), for details on the ubiquity of irrelevant (and invidious) factors’ influence on student evaluations of their instructors. (Thanks to Fabrizio Gilardi for sharing this list of sources—via a 19 December 2018 Twitter thread (@fgilardi), of course!)

7 Of eight total responses, just two students “strongly agreed,” one “agreed,” one “neither [agreed nor disagreed],” three “disagreed,” and one “strongly disagreed.”

8 None of the five groups was able to correctly conduct a difference-of-means test comparing its treatment group with its control group. Some groups were unable even to differentiate between their treatment and control groups.
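
For readers unfamiliar with the task, a difference-of-means test simply compares the average outcome in the treatment group with that in the control group and asks whether the gap is larger than chance alone would suggest. A minimal sketch in Python, using scipy and entirely hypothetical outcome values (not data from the course), might look like this:

    # Minimal illustration of a difference-of-means (two-sample t) test.
    # The outcome values below are hypothetical and purely for demonstration.
    from scipy import stats

    treatment = [6.1, 5.8, 7.2, 6.5, 6.9, 5.4]  # outcomes for the treatment group
    control   = [5.0, 5.3, 4.8, 5.6, 5.1, 4.9]  # outcomes for the control group

    # Raw difference between the two group means
    diff = sum(treatment) / len(treatment) - sum(control) / len(control)

    # Welch's t-test, which does not assume equal variances across groups
    t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)

    print(f"difference of means: {diff:.2f}")
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")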

9 In defense of this first run of the capstone, I did receive e-mails and comments from students months after the course had ended in which they expressed that, after the fact, they “got it” and “appreciated” the course.

10 For instance, the students did seem interested in reflecting on their own failures of critical thinking and instances of motivated reasoning. My favorite example came from one student’s final course reflection, in which she explained how motivated reasoning leads her to ignore and doubt evidence that eating raw cookie dough is potentially hazardous.

Additional information

Funding

This article was developed under a grant from the U.S. Department of Education. However, the contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government.
