Pedagogical and Curricular Innovations

Misuse of Data as a Teaching Tool

Pages 47-68 | Received 05 Oct 2021, Accepted 03 Apr 2023, Published online: 25 Jul 2023
 

Abstract

This work reports on the implementation of a self-contained data literacy exercise designed for use in undergraduate classes to help students practice data literacy skills such as interpreting and evaluating evidence and assessing arguments based on data. The exercises use already developed data visualizations to test and develop students’ ability to evaluate arguments based on data presented visually. Moreover, the exercises are designed to teach students positive lessons from evaluating flawed examples of data usage. We show that repeated use of these exercises has the potential to help students develop tools they can use across multiple contexts when evaluating data. Student feedback and panel survey data show that students enjoy learning with these tools and report increased comfort in working with data. The paper argues that instructional tools like these data visualization exercises are a quick and effective way to teach data literacy skills and that they have significant utility in non-research-methods courses.

Notes

1 Consider, for example, how part of the electorate views the role of universities and how those views have turned sharply negative since 2015. http://www.people-press.org/2017/07/10/sharp-partisan-divisions-in-views-of-national-institutions/

2 The data on which the study reports was collected with permission from the University of Southern California IRB (#UP 15-00303).

3 Some research methods courses may incorporate data visualization techniques to train students on how to communicate their findings clearly and succinctly to a broader audience, but these skills typically have a lower priority or are taught in later segments of a multi-course methods series (Rom Citation2015).

4 Out of 35 respondents, the answers were: not at all important (0%), slightly important (14.71%), moderately important (20.59%), very important (44.12%), extremely important (20.59%).

5 Given that this was the first time the instructor taught students outside of the major, it was difficult to gauge students’ prior research methods experience and data analysis skills. As a result, there was reasonable concern that the exercises would disproportionately and negatively impact students without prior background if included in required learning assessments like the final exam.

6 The example for the first “bad” visualization was sourced from Quartz, which has collected several examples of misleading charts online. https://qz.com/580859/the-most-misleading-charts-of-2015-fixed/.

7 The following is an example of a group response that was scored “full competence” across all categories. This group’s response was cut and pasted from an email sent to the instructor and is thus unedited. “Here are our data visualization answers: (1) national gun ownership has an inverse relationship with national homicide rate over the last 20 years (2) chart that shows percent changes since 1993: data presented is from a source called Carpe Diem AEI (unknown/not necessarily credible) (3) not very confident—we also don’t agree with what’s being said; source (Carpe Diem AEI) is unknown/not necessarily ; causation vs. correlation: there could be other factors that we aren't considering because they're not on the graph (stricter police enforcement); privately-owned firearms are not the only source of ammunition that people have (people could have more than 1 gun); this guy is a relatively known conservative journalist.”

8 Students may be driven by ideological bias to reject the data regardless of its accuracy.

9 Marfleet and Dillie (Citation2005) were able to administer an information literacy assessment instrument modeled after one developed by Mesa Community College, but we were unable to find this instrument or an alternative designed specifically for data literacy skills. While there is great interest in incorporating data literacy skills into information literacy curricula, research in library sciences has not coalesced on what specific standards and competencies this would include (Dai Citation2020).

10 It is possible that, although groups were supposed to stay the same for the whole semester, some students were absent from class on the days when we did data visualization exercises. While we have no way to cross-reference attendance and group membership, we also have little reason to expect a correlation between data skills and absence during the class in which exercise 1 was discussed. Such a correlation, if present, could potentially explain why students perform better on exercise 2 (when the “experienced” students return to class), but such a coincidence seems very unlikely.

11 In a class of 48 students, we collected 35 pre- and post-response pairs, so our N for the individual responses is 35.

12 This probably explains why all groups demonstrated strong competence in reading the information offered in data visualizations. Given this background, it is understandable why there was little room for the exercises to improve performance in this particular area of data literacy skills (columns 1 and 2 under “Reading and Understanding” in the table).

13 A value of −2 means that a student’s comfort level declined 2 points on a 5-point scale in the post-survey. A value of −1 indicates a decline of 1 point. A value of zero indicates no change in reported comfort levels. Values of 1 and 2 indicate a 1- or 2-point increase, respectively, in the reported level of comfort when working with data.
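For readers who wish to reproduce this coding, the following is a minimal sketch (not the study’s own code) of how the post-minus-pre change scores on the 5-point comfort scale could be computed; the response values shown are hypothetical.

# Minimal sketch (not the authors' code): change scores as defined in note 13.
# The pre/post values below are hypothetical examples on the 5-point comfort scale.
pre_comfort = [3, 2, 4, 5, 1]    # hypothetical pre-survey responses (1-5)
post_comfort = [4, 2, 3, 5, 3]   # hypothetical post-survey responses (1-5)
# Change score = post minus pre; note 13 reports observed values between -2 and 2.
changes = [post - pre for pre, post in zip(pre_comfort, post_comfort)]
print(changes)  # [1, 0, -1, 0, 2]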

14 The panel data also include information about changes in self-reported skill levels for finding good sources of data, organizing large amounts of data effectively, using data as a tool in support of arguments, translating data and results into words, and translating data into visual representations and images. The results across these categories are mixed: anywhere between 20% and 30% of students report improvements, but 5–15% report a decline in their assessment of these skills. The data set does not reveal why some students report worse assessments as a result of completing this exercise.

15 More comprehensive feedback on whether and how students implement the tools that they learn in the course of repeating these exercises across different programs would be beneficial.

Additional information

Notes on contributors

Iva Božović

Iva Bozovic is an Associate Professor (Teaching) in the Department of Political Science and International Relations at the University of Southern California. Her research explores the interaction of formal and informal institutions, with implications for private sector development, governance, and corruption. Her published works have examined the role of social capital and social networks in the growth of small and medium-sized businesses in post-communist economies. Her pedagogical work explores ways to improve data literacy among undergraduate students. Prof. Bozovic also consults with international development organizations and their partners working on issues related to private sector development. She teaches international political economy, development, economic institutions, corruption, and trade. She received her Ph.D. in Political Economy and an MA in Economics from USC.
