
Evaluating Geoscience Students' Spatial Thinking Skills in a Multi-Institutional Classroom Study

Pages 146-154 | Received 05 Apr 2013, Accepted 26 Nov 2013, Published online: 09 Jul 2018
 

ABSTRACT

Spatial thinking skills are critical to success in many subdisciplines of the geosciences. We tested students' spatial skills in geoscience courses at three institutions (a public research university, a comprehensive university, and a liberal arts college, all in the Midwest) over a two-year period. We administered standard psychometric tests of spatial skills to students in introductory geology, mineralogy, sedimentology and stratigraphy, hydrogeology, structural geology, and tectonics courses. In addition, in some courses we administered a related spatial skills test with geoscience content. In both introductory and upper-level undergraduate geology courses, students' skills vary enormously as measured by several spatial thinking instruments. Additionally, students' spatial skills generally improve only slightly during one academic term, in both introductory and advanced geoscience classes. More unexpectedly, although there was a tendency for high-performing students to be adept at multiple spatial skills, many individual students showed strong performance on tests of one spatial skill (e.g., rotation) but not on others (e.g., penetrative thinking). This result supports the contention that spatial problem solving requires a suite of spatial skills and that no single test is a good predictor of “spatial thinking.”

Acknowledgments

We are grateful to Bryn Benford, Clint Cowan, Cam Davidson, Laurel Goodwin, Tom Hickson, Jim Ludois, Stephen Meyers, Paul Riley, Josh Roberts, Mary Savina, and Sarah Titus for allowing us to test their students; all of the students who agreed to participate in this study; two anonymous reviewers for thoughtful and helpful comments; and the National Science Foundation for funding this work (SBE #0541957).

FIGURE 1: Examples of the types of questions from the spatial skills tests used in this study. (a) The Purdue Visualization of Rotations Test. Subjects are asked to identify what the object at the top right would look like if rotated in the same fashion as the first object. (b) Question in the style of the ETS Hidden Figures test. (The actual test questions are copyrighted; this example was drafted by the first author.) Participants are asked to identify which of the five shapes, A through E, can be found in the figure on the left. (c) The Planes of Reference test. Subjects are asked to identify the correct shape of the intersection of the plane and the object. (d) Our Geologic Block Cross-sectioning Test. Subjects are asked to identify the correct cross-section.

FIGURE 2: Examples of the distributions of student scores on the spatial skills tests used in this study. All data are from classes at the research university. The x-axis shows the number of questions answered correctly, and the y-axis shows the number of students in each class receiving that score. The left-to-right shift in the distribution of scores from pretest to post-test indicates the improvement in that particular spatial thinking skill for that set of students. The extremely wide range of spatial skill levels in each class creates a large overlap of pre- and post-test scores. Although these distributions are from classes at the research university, they are typical of the introductory and upper-level classes in our study. (a) Purdue Visualization of Rotations Test (PVRT), introductory geology class. (b) Educational Testing Service (ETS) Hidden Figures test, introductory geology class. (c) PVRT, structural geology class. (d) ETS Hidden Figures test, structural geology class.

FIGURE 3: Post-test vs. pretest scores on the PVRT for all upper-level geology students participating in our study. The size of the point on the graph indicates the number of students with that pair of pre- and post-test scores. Smallest points represent individual students, slightly larger points represent two students, larger points represent three or four students, and the largest points represent five or six students. The vast majority of students score higher on the post-test than on the pretest (n = 59), a few students score the same on the post-test as on the pretest (n = 18), and fewer still score lower on the post-test than on the pretest (n = 15). The y = x line on the graph separates students who show improvement on the post-test from those who do not.

FIGURE 4: (a) Graph of post-test scores on the Purdue Visualization of Rotations Test vs. the Planes of Reference test for all students in our study who took both tests (n = 89). Although R = 0.56, indicating a statistically significant correlation between these two skills, note that some students who excel at one of these skills are very weak in the other. (b) Graph of post-test scores on the Geologic Block Cross-sectioning Test vs. the Planes of Reference test for all students in our study who took both tests (n = 32). With R = 0.55, these skills are also moderately strongly correlated, with similar scatter. Point size conventions are the same as in Figure 3; the smallest points represent individual students, while each of the largest points represents five or six students.

TABLE I: Number of study participants from each course.

TABLE II: Gender demographics.

TABLE III: Spatial thinking measures administered in each course.

TABLE IV: Normalized spatial skills average test scores and gains.

TABLE V: Normalized laboratory test-retest average scores and gains.
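Note: the exact normalization used for the gain scores in Tables IV and V is not defined on this page. A common convention in education research, which these tables may follow, is the Hake normalized gain, \( \langle g \rangle = \dfrac{\text{post} - \text{pre}}{N_{\max} - \text{pre}} \), where \( N_{\max} \) is the maximum possible score; this expresses improvement as a fraction of the improvement still available at pretest, and should be read here only as an assumed convention, not the authors' stated method.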

TABLE VI: Correlations (Pearson's r) between spatial skills post-test scores and measures of student academic success.
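For reference, the Pearson's r reported in Table VI (and as R in Figure 4) is the standard product-moment correlation, \( r = \dfrac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_i (x_i - \bar{x})^2}\,\sqrt{\sum_i (y_i - \bar{y})^2}} \), computed over paired observations \( x_i, y_i \) (e.g., a student's post-test score paired with a course grade). The specific "measures of student academic success" are given only in the full text, so the pairing described here is illustrative.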
