SCHOLARSHIP OF TEACHING AND LEARNING

Reflections on Critical Thinking: Lessons from a Quasi-Experimental Study

Pages 151-166 | Published online: 09 Jan 2018
 

ABSTRACT

In a pre/post quasi-experimental study assessing the impact of a specific curriculum on critical thinking, the authors employed a critical thinking curriculum in two sections of a U.S. foreign policy class. The interactive and scaffolded curriculum yielded statistically significant increases in critical thinking for students scoring below average on the pretest. Within a discussion of the overall need to strengthen critical thinking in higher education, the authors demonstrate that the study’s findings support the developmental process of acquiring critical thinking and illustrate that early gains in critical thinking can be achieved within one semester. Additionally, the results point to the need for longer-term approaches to assess larger increases for those scoring above average.

Acknowledgments

This research was made possible in part through an institutional Digital Champions Fellowship grant (2015). The authors would like to thank GSU’s Center for Excellence in Teaching and Learning for their financial and technical support. Specific thanks go to Heidi Beezley, Zoe Salloom, and Dr. George Pullman. Many thanks also go to political science graduate research assistants, Larry B. Stewart, Jr. and Alexandra C. Pauley, for their coding work and to Dr. Peter Lindsay (Department of Political Science) for his helpful comments. We would also like to thank the anonymous reviewers from the Journal of Political Science Education for their extensive comments on earlier drafts.

Notes

Past iterations of this course had found that students were often stumped when trying to identify an author’s argument because the amount of new information in any given reading was overwhelming, making it difficult for them to distinguish illustrative examples from the actual premises leading to an argument. Even in basic summaries, students often concentrated on simply listing various points brought forward by an author rather than distilling an argument. This also translated into students’ difficulties in constructing effective arguments in their own writing: most could formulate an opinion but found it difficult to connect evidence to a logical argument.

While students should also learn to be critical of how the discipline is taught, we first want them to understand what authors in the discipline say. For this, students need skills for reading and, more importantly, understanding arguments. Students need to learn how arguments are constructed and supported, and how different arguments relate to each other.

Indeed, while the assignments in this course center on domain knowledge specific to foreign policy, the breakdown of an article (regardless of topic/discipline) into its various components mirrors what many authors say about critical thinking: the ability to take on different perspectives and evaluate information so as to better inform decision-making.

IRB procedures are institutional ethics protections that ensure research involving human subjects follows specific guidelines to protect the confidentiality and safety of participants. Consent procedures asked the students’ permission to have their assignments become part of the study. Georgia State University IRB study title: Assessing the Efficacy of Digital Pedagogy in Fostering Critical Thinking; IRB Number H16008; Reference Number 334577.

For the efficacy of rubrics to ascertain levels of critical thinking, see Cargas, Williams, and Rosenberg (Citation2017).

This practice encourages students to engage in the process of revision, something that is considered to be an important aspect of developing critical thinking (Barta-Smith and Di Marco Citation2009; Bean Citation2011).

Appendix A displays the demographics for each class. Comparing the two groups shows that there is not a statistically significant difference between them.

Appendix B shows the results of testing differences in critical thinking scores by the entire group and by class. These results demonstrate that neither class was driving the results of the study.

Age also reaches statistical significance, with a coefficient of −1.128: older students show a smaller difference between pretest and posttest scores. None of the other independent variables reaches statistical significance.

Ennis (Citation1993) discusses eight traps in total. First, research findings are too quickly seen as validating instructional effort, without considering that other influences might have had a bearing on the outcomes. Second, research that lacks a control group can fall into a similar trap by attributing any positive effect to instructional effort. Third, pre-post testing presents problems (discussed above). Fourth, critical thinking tests too often are not comprehensive and thus do not tap effectively into the different aspects of critical thinking. Fifth, background differences between test-maker and test-taker can affect results. Sixth, results are expected too soon (discussed above). Seventh, teaching to a test occurs because too much depends on the results. Eighth, scarce resources lead to compromises that undermine the validity of the testing (p. 181).

The six stages are: (1) unreflective thinker; (2) challenged thinker; (3) beginning thinker; (4) practicing thinker; (5) advanced thinker; and (6) master thinker.

This question is less left-field than it might at first appear. This research followed an earlier study that utilized the same critical thinking assessment. While in both studies below-average scorers increased their critical thinking, the earlier study also found increases for those scoring above average. Both studies utilized the same critical thinking curriculum with one large exception: the current study built in a group project, whereas the previous study did not. It might be interesting to investigate whether there is something about group projects that particularly affects those scoring above average and inhibits them from further developing their skills.

Graduate research assistants not otherwise connected to the study were trained to code the pretests and posttests for critical thinking items according to this rubric. For each item coders could award up to 10 points (0–3 = low proficiency; 4–6 = medium proficiency; 7–10 = high proficiency) resulting in a 0- to 60-point critical thinking score. The higher the score, the higher the critical thinking abilities.
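The scoring scheme above can be sketched in code. This is an illustrative sketch only: the function names are hypothetical, the actual coding was done by hand by trained coders against the rubric, and the six-item count is inferred from the 0- to 60-point total (six items at up to 10 points each).

```python
def proficiency_band(points: int) -> str:
    """Map one item's 0-10 score to its proficiency band."""
    if not 0 <= points <= 10:
        raise ValueError("each item is scored 0-10")
    if points <= 3:
        return "low"       # 0-3 = low proficiency
    if points <= 6:
        return "medium"    # 4-6 = medium proficiency
    return "high"          # 7-10 = high proficiency

def total_score(item_points: list) -> int:
    """Sum the per-item scores into the 0- to 60-point critical thinking score."""
    if len(item_points) != 6:
        raise ValueError("expected six rubric items")
    for p in item_points:
        proficiency_band(p)  # validates that each item is in the 0-10 range
    return sum(item_points)
```

For example, item scores of 7, 6, 3, 10, 0, and 5 would yield a total critical thinking score of 31 out of 60.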

Coding: Age is the numerical value of age. GPA is the numerical value of their GPA. White is coded as White (1) and all other races (0). Female is coded as 1, and all others 0. Work is coded as 1 if the student works and 0 if the student does not work. The survey also asked whether students learned better with concrete facts or abstract concepts. Facts is coded as 1 when students prefer to learn with concrete facts and 0 when they said they learned better with abstract concepts.
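The dummy coding described above can be sketched as follows. The raw field names and response labels here are assumptions for illustration; only the 1/0 coding rules are taken from the text.

```python
def code_respondent(raw: dict) -> dict:
    """Apply the study's variable coding to one survey response (hypothetical field names)."""
    return {
        "age": int(raw["age"]),                        # numerical value of age
        "gpa": float(raw["gpa"]),                      # numerical value of GPA
        "white": 1 if raw["race"] == "White" else 0,   # White = 1, all other races = 0
        "female": 1 if raw["gender"] == "Female" else 0,
        "work": 1 if raw["works"] else 0,              # works = 1, does not work = 0
        "facts": 1 if raw["learning_preference"] == "concrete facts" else 0,
    }
```

A respondent who reported preferring abstract concepts would thus receive a 0 on the facts variable.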

Notes on contributors

Jeannie Grussendorf

Jeannie Grussendorf is a senior lecturer in the Political Science department at Georgia State University, where she teaches a variety of international relations courses (Global Issues, U.S. Foreign Policy, Introductory International Relations, and Politics of Peace). Her research focuses on the scholarship of teaching and learning, examining the effect of different pedagogical approaches on critical thinking.

Natalie C. Rogol

Natalie C. Rogol is a PhD candidate at Georgia State University. She studies the executive-judicial relationship, particularly executive strategy and judicial decision making.
