Research Article

Critical views of 8th grade students toward statistical data in newspaper articles: Analysis in light of statistical literacy

Article: 1268773 | Received 30 Jun 2016, Accepted 01 Dec 2016, Published online: 27 Dec 2016

Abstract

Understanding and interpreting biased data, making decisions in accordance with data, and critically evaluating situations involving data are among the fundamental skills necessary in the modern world. To develop these skills, the emphasis on statistical literacy in school mathematics has gradually increased in recent years. In this descriptive study, the critical views of 8th graders with respect to statistical data presented in several newspaper articles were investigated from various aspects. The study was conducted in a middle school with nine students at the 8th grade level. Statistical data included in news articles selected from different national newspapers published in Turkey were used to collect the data. Through clinical interviews, the students’ evaluations were examined from the aspects of consistency, sampling, data collection, data analysis, data presentation, and data support; the reasoning behind the students’ evaluations was also discussed. As a result, it was determined that the students did not perform well in their critical evaluations of newspaper articles that included research data.

Public Interest Statement

In recent years, educating statistically literate people has become increasingly crucial for societies. Statistical literacy is required to read, interpret, and criticize statistical messages in various contexts such as radio, TV, magazines, newspapers, or the web. Moreover, the information-laden world in which we live has forced policy-makers to develop contemporary curricula. Thus, statistical literacy is defined as a key concept of statistics education in the K-8 curriculum. The current study aims to investigate the critical views of middle school students about news items included in the media. To reach this aim, we first selected articles from national newspapers. Clinical interviews were then conducted with 8th graders, in which each student was required to examine all of the articles in terms of sampling, data collection, data presentation, and data support.

1. Introduction

Rapid development and transformation in the fields of science and technology have affected human life in a radical manner and led to an increase in our knowledge. Therefore, instead of memorizing new information produced in different fields, one should develop fundamental literacy in specific fields to cope with the emerging data. To address this issue, the concept of mathematical literacy has arisen over the last century as a means for improving skills in interpreting the growing amount of data.

Although mathematical literacy has been described in different ways by various researchers and institutions, the implied meaning remains the same. In this respect, the Programme for International Student Assessment has identified mathematical literacy as the skills used in engaging with, understanding, and describing mathematics, along with using those skills to find solutions for problems encountered in social life (OECD, Citation2006). According to Mkakure and Mokoena (Citation2011), mathematical literacy, as derived from the practical applications of mathematics, is a concept that provides individuals with the confidence to solve problems through interpretation and critical evaluation of situations encountered in daily life. On the other hand, mathematical literacy has been interpreted by McCrone and Dossey (Citation2007) as an understanding of the function of mathematics in the world that allows for forming sound decisions and constructing responses to individual needs. As can be inferred from all of these descriptions, mathematical literacy is a broad term including numerous skills, such as problem solving, analytical ability, judgment, and effective communication in the various types of situations that individuals may encounter.

Statistical literacy may be considered as a separate discipline due to fundamental aspects that are unique to statistics. According to Moore (Citation1998), an understanding of statistics concepts such as data, change and chance is necessary for coping with situations that arise in everyday life; this importance has led to an increased emphasis on statistical literacy. According to Gal (Citation2004a), statistical literacy refers to the skills that people use for interpreting data, evaluating them critically, explaining ideas stemming from statistical information, arguing the data, and being aware of related phenomena. Watson (Citation1997) notes that these skills include understanding of probabilistic and statistical terminology, understanding of concepts in social discourse and statistical language, and inquiring into the attitudes formed for counter-situations. Likewise, according to Wallman (Citation1993), statistical literacy entails understanding statistical results related to everyday life and the ability to make evaluations through a critical approach; proficiency in statistical thinking enables individuals to make professional and individual decisions, both in society and internally.

An examination of the related literature reveals frequent references to four dimensions of statistical literacy. These may be summarized as (1) understanding of statistics; (2) interpreting statistical data; (3) decision-making; and (4) developing critical positioning skills. In this respect, an understanding of statistics denotes having knowledge of statistical concepts in given situations, articulating these concepts, and analyzing and giving responses to problems related to these concepts (Chance, Citation1997). Furthermore, interpreting statistical data refers to the process undertaken by individuals both to understand the meaning of data that they encounter and to interpret data they have themselves collected when required for a particular situation (Rumsey, Citation2002). Since making inferences concerning the possibilities that may arise based on interpreted data entails making predictions for the future, people tend to interpret the statistical concepts they understand by using statistical relationships, and then reach decisions by making predictions in consideration of these relationships (Mosenthal & Kirsch, Citation1998).

However, whether purposefully or not, data that people encounter may include misleading, unilateral, and biased opinions. Therefore, in order to develop a more objective view, it has been emphasized that passively interpreting data is not satisfactory, and that it is important to take an interrogative and critical stance toward potentially misleading data (Frankenstein, Citation1989; Wallman, Citation1993). In this respect, the skill of critical positioning requires knowing which questions should be asked in a particular situation (Snell, Citation2002).

The results of the International Adult Literacy Survey indicate that societal change has led to an increasing demand for statistical literacy skills. For instance, as Gal (Citation1994) claims, workers may need statistical literacy in order to increase the quality of a process so as to respond to rapidly increasing demand. Likewise, this requirement applies to data consumers, who may need to interpret discount rates in television and newspaper advertisements, newspaper circulars, statistical information related to political decisions, and so on. In this type of reporting, all of the information carries a message and aims to form a picture in the minds of consumers; these messages may be intended to make consumers believe in an opinion or to convince them to accept a unilateral view by hiding pieces of the whole picture and presenting manipulated information (Clemen & Gregory, Citation2000). Therefore, in order to evaluate the validity and conclusiveness of such information, critical thinking skills are necessary (Beyer, Citation1987).

1.1. Critical thinking

Dewey (1909, as cited in Fisher, Citation2001) identified critical thinking as the process of thinking about a situation, as well as the information and the beliefs that support it, in an active and careful manner. In this sense, Norris (Citation1985) explained critical thinking as the evaluation of one’s own thinking skills and the behavioral changes that occur as a result of this evaluation. Furthermore, Ennis (Citation1985, p. 45) described critical thinking as a logical and reflective process used in making decisions about “what to do” and “what to believe,” which Norris and Ennis (Citation1989) outline as a three-stage process wherein (1) critical thinking begins with a problem statement and an interaction with the environment to solve the problem; (2) an inference is made through reasoning and establishing a relationship using previous information; and (3) a decision is made about what to believe as a result of the process.

According to Aizikovitsh-Udi, Kuntze, and Clarke (Citation2016), people who are able to think critically and are statistically literate are a prerequisite for democratic countries. Although there is a direct relation between these two important areas (Royalty, Citation1995), researchers in the literature mainly focus on either critical thinking or the competencies of statistical literacy. However, being critical in a statistical context is not only a matter of attitude, but is also related to interpreting and evaluating the given statistical data using certain “abilities” (Aizikovitsh-Udi et al., Citation2016, p. 119). Considering that the average Turkish person watches more than four hours of television per day, that 5 million newspapers are sold per week, and that there are 32 million Internet subscribers (TurkStat, Citation2013), the average person is likely to be bombarded with information. This information, as Wallman (Citation1993) contends, frequently contains statistical data that is biased, whether it originates from private or public institutions. Among the reasons for these biases, the use of unfamiliar technical terms and misleading presentation of the news have been cited (Gal, Citation1999; Laborde, Citation1990). Therefore, the ability to evaluate the statistical data that people confront matters.

1.2. Literature review

In a study conducted by Gelman, Nolan, Men, Warmerdam, and Bautista (Citation1998) in the context of an introductory statistics course, the researchers collected articles over a one-month period from two different newspapers published in the United States (the New York Times and the San Francisco Examiner); the articles reported scientific studies and included statistical data (e.g. graphics, tables, etc.). The articles were given to students, who were asked to analyze them. The students’ critiques of the given articles were examined according to previously assembled and prepared questions. The students were then required to find a newspaper article and write a critique of its content. The students who participated in the study stated that the data presented in the newspaper articles included incomplete information used to sway their opinions.

In a similar study, Cerrito (Citation1999) distributed various news articles and asked students to analyze them in the scope of a statistics lesson. It was observed that the students interpreted the data provided in the articles according to their personal beliefs. In this case, the researcher found that the students neglected the persuasiveness of the articles; furthermore, they failed to infer the purpose of the articles at the beginning of the lesson. As a result, Cerrito suggested that these types of articles may be useful in developing students’ critical thinking skills.

An investigation carried out by Watson and Moritz (Citation2000) focused on student understanding and evaluation of newspaper articles around the concept of “sample” within the dimensions of the SOLO taxonomy: i.e. pre-structured, single-structured, multi-structured, and establishment of relationship. The consistency rate of the student responses to questions asked at the 3rd, 6th, and 9th grade levels was higher at the upper grade levels; it was concluded that the vast majority of the students had a lower level understanding of the concept of sample. Looking at the issue from another angle, Pfannkuch (Citation2005) aimed to identify the types of problems that students encountered in the course of statistical evaluations and to determine the ability levels of the students with respect to the SOLO taxonomy. In this case, data found in newspaper articles were converted into tables and graphs and given to students, along with some critical questions. According to the results of the study, most of the students were ranked at the first and second level, or the pre-structured and single-structured level, respectively, and a limited number of students were able to reach the relationship and abstract structure level with respect to the SOLO taxonomy.

In another study, Schield (Citation2006) presented results from an international project conducted by the W. M. Keck Statistical Literacy Project. The focus of the survey was reading statistics, or rates and percentages, given in tables and graphs. The results showed that the participants of the study, who were college teachers, college students, and data analysts, had difficulties in reading statistical data. For instance, in reading an X-Y plot, about four-fifths of the college teachers misread a “times more than” comparison. Similarly, Özmen (Citation2015) observed instructors and students teaching or attending statistics courses in 9 programs of 7 faculties over the course of a year. In order to evaluate the outcomes of the courses, she developed a statistical literacy test and conducted a Rasch analysis. Consequently, she found that many of the participants showed low performance on the test. In addition to identifying the current issue, some researchers have focused on how to improve statistical literacy. Koparan (Citation2012) investigated the effect of a project-based learning approach on 8th grade students’ statistical literacy. In his experimental research, he demonstrated the effectiveness of project-based learning on students’ literacy levels concerning the concept of probability. Merriman (Citation2006) also designed a statistics course for 14-year-old students using media reports and investigated the effectiveness of the intervention on student achievement. The results of that study likewise revealed a significant improvement in students’ statistical literacy at the end of the implementation process. In addition to the articles summarized above, some studies have introduced classroom activities or theoretical backgrounds (e.g. English & Watson, Citation2015; Watson, Citation2013).

A limited number of studies have been carried out in different countries in relation to students’ understanding, interpreting, and critical evaluation of data presented in the media (e.g. Gelman et al., Citation1998; Merriman, Citation2006; Pfannkuch, Citation2005). These studies reveal a need for further research to be carried out with different populations. Accordingly, the present study aimed to investigate the critical perspectives of 8th grade students toward newspaper articles including statistical data.

1.3. Theoretical framework

As a result of his research related to knowledge and disposition elements, Gal (Citation2004a) classified statistical literacy according to two aspects, as shown below in Figure 1. The interests and attitudes of individuals toward situations involving statistical data comprised the dispositional element, while the knowledge element comprised literacy skills, statistical knowledge, mathematical knowledge, content knowledge, and a critical perspective.

Figure 1. A model for statistical literacy (Gal, Citation2004a, p. 67).


According to Gal (Citation2004a), knowing and interpreting statistical information is not enough in itself. It also requires accompanying reading-writing skills, as well as mathematical and content knowledge; and even these are not enough to perform evaluations. Rather, in addition to these, individuals need to ask “critical questions.” Gal identified the critical questions needed to evaluate statistical information in newspaper articles as follows (Table 1):

Table 1. Critical questions

Following Gal’s (Citation2004a) suggestion, some of the critical questions were simplified in consideration of the purpose of the study and of the existing knowledge and life context of the students who formed the sample, based on the opinions of experts (two university mathematics educators). In this study, the students’ interpretations of statistical data were investigated in terms of how often they used the simplified critical questions classified under the knowledge element of Gal’s model.

2. Methods

This study, which aims to investigate the critical views of students about news items included in the media, is descriptive in nature; the most important characteristic of the descriptive model is that the researcher identifies an event or situation as it exists (McMillan & Schumacher, Citation2001). The case study method, a form of qualitative descriptive research, was adopted in this study.

2.1. Research group

The research sample for the study included nine 8th graders, students of approximately 15 years of age, who were enrolled in a public school located in Trabzon, Turkey, in the 2011–2012 academic year. This school was chosen among other middle schools with respect to the results of the Level Determination Exam for high school entrance. The views of the students’ mathematics teacher were considered during the selection of the research sample. In this respect, the teacher categorized the students in a class according to three levels of mathematics achievement: low, intermediate, and high. Three students were randomly chosen from each group by the researchers as a means to increase the variation of the sample; in this respect, purposive sampling was performed. The learning objectives related to statistics in the Turkish middle school mathematics curriculum were examined in order to determine the pre-knowledge and academic proficiency of the students. The objectives previously achieved by the middle school students comprising the research sample (Ministry of National Education [MoNE], Citation2005) are presented in Table 2.
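To make the selection procedure concrete, the sketch below illustrates the kind of stratified random draw described above: three students chosen at random from each of the teacher’s three achievement groups. The roster, group sizes, and names are hypothetical placeholders, not data from the study.

```python
import random

# Hypothetical class roster grouped by the teacher's achievement categories;
# the names and group sizes are illustrative only.
achievement_groups = {
    "low":          ["student_01", "student_02", "student_03", "student_04"],
    "intermediate": ["student_05", "student_06", "student_07", "student_08"],
    "high":         ["student_09", "student_10", "student_11", "student_12"],
}

def select_sample(groups, per_group=3, seed=None):
    """Randomly draw the same number of students from each achievement level."""
    rng = random.Random(seed)
    return {level: rng.sample(students, per_group) for level, students in groups.items()}

sample = select_sample(achievement_groups, per_group=3, seed=42)
for level, students in sample.items():
    print(level, students)
```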

Table 2. Distribution of objectives for statistics according to grade level in the middle school teaching program

2.2. Data collection tools and data collection

Articles published in Turkish national newspapers were used to collect the data. The newspaper articles were selected over a one-week period by experts; articles that included text, tables, and charts containing incorrect or missing information were chosen for the purposes of the study. The articles were given to the students, who were allowed one class period (approximately 45 min) to read and think about them. Each student was required to examine all of the newspaper articles provided in terms of sampling, data collection, data presentation, and data support. So as not to interfere with their critical thinking, the students were guided through open-ended questions, such as: “Do you believe this information? Do you have any doubts about the accuracy of this information?” The data were collected through interviews, which were conducted individually and audiotaped. In regard to ethical considerations, permission to conduct the study was granted by the school principal, and the participants were asked for their consent before data collection.

2.2.1. Selected articles

Within the scope of the study, four articles were chosen that included statistical data with the potential to create misunderstandings. The article titles are listed below, in Table 3.

Table 3. Selected newspaper articles

The first article, entitled “One in three smart ticket holders has an automobile” (see Appendix 1), contains the results of a questionnaire and reports that an increasing number of people prefer public transport, and that 32% of rail users have an automobile. In the title of the article, the claim is made that one in three smart ticket holders is an automobile owner. However, a smart ticket can be used for all public transport systems, including buses as well as rail, especially in metropolitan areas. Furthermore, according to the table in the article giving the percentages of smart ticket holders by level of education, the rate of high school graduates among smart ticket holders was 46.2%, yet the rate of those categorized only as literate was 0.7%. Another variable mentioned referred to occupational categories. The sum of the percentages for this variable is 107.6%. While it is possible that some of the surveyed participants could have more than one occupation, no explanation was offered for the excess 7.6 percentage points. Furthermore, there was no information concerning the research sample to which the questionnaire was applied, nor was the population described.

In the second article, entitled “The most comfortable shopping city of Europe is Istanbul” (see Appendix 2), it was written that The Economist, a respected UK journal, evaluated 33 large cities with respect to 38 criteria in 5 different categories, selecting Istanbul as the winner in the category of “comfortable shopping.” However, while the categories used to evaluate the cities were mentioned in the article, there was little information about the related criteria. Furthermore, there was no information on the sample size, sample selection, adequacy of the sample to represent the population, or how the measurements were performed during data collection.

In the third article, entitled “The number of ATMs has increased twofold in 6 years” (see Appendix 3), a table displays the distribution of ATMs across the given years. When the values provided in the table are considered, it can be seen that the number of ATMs in 2011 was almost double the number of ATMs in 2006. However, saying that a value has increased twofold means that the value has actually become three times as large. Furthermore, there was no information about from whom or how these data were collected. The remainder of the article explains that there was a decrease in the use of common ATMs, but there was no information on how sharp the decrease was, either in percentages or in general.
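A brief worked restatement may help here. Reading “increased twofold” as the authors do, namely as an increase of two times the original value, the 2006 and 2011 counts ($x_{2006}$ and $x_{2011}$) would have to satisfy

$x_{2011} = x_{2006} + 2\,x_{2006} = 3\,x_{2006},$

whereas the table shows roughly $x_{2011} \approx 2\,x_{2006}$, i.e. a doubling, which is an increase of only one times the original value.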

The fourth article, entitled “20 thousand new mobile customers per month” (see Appendix 4), relates that due to the increase in smart phone usage, banks have begun to offer new smart phone applications to their customers; information about the increase in the number of customers is provided. However, there is no information about whether the stated data referred to an entire population, nor was there any description of the population. In the content of the article, it was stated that the number of active users increased by almost 100 thousand in six months, but it can be seen in the related table that this number was closer to 90 thousand. In addition, it was stated in the table that 20 thousand new customers had used this service each month. Although this was presented as the average value, it was found to be closer to 15 thousand when the average was calculated.
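One plausible reconstruction of that calculation, assuming the stated monthly average refers to the roughly 90 thousand six-month increase shown in the article’s table (the exact table values are not reproduced here), is

$\frac{90{,}000}{6} = 15{,}000$

new customers per month, rather than the 20 thousand claimed in the article.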

2.3. Data analysis

In the data analysis phase, Gal’s (Citation2004a) 10 critical questions that should be asked in order to interpret and understand statistical information were categorized into 6 dimensions according to expert opinion (Table 4). The researchers then independently listened to the audio recordings of the interviews to identify which of the dimensions, and how often, were reflected in the critical questions the students used in analyzing the statistical data in the articles. To increase the reliability of the research, the data were coded by two researchers, and the inter-coder reliability was found to be greater than .80, which indicates a reliable analysis (Lombard, Snyder-Duch, & Bracken, Citation2002). Finally, a discussion was held between the coders to reach a consensus on the disagreements.
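The article reports an inter-coder reliability above .80 but does not specify which agreement index was used. As an illustration only, the sketch below computes two common indices for this kind of binary coding (whether a student used a given critical question for a given article): simple percent agreement and Cohen’s kappa. The coding vectors, function names, and values are hypothetical, not the study’s data.

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Proportion of items on which the two coders assigned the same code."""
    assert len(coder_a) == len(coder_b)
    agree = sum(1 for a, b in zip(coder_a, coder_b) if a == b)
    return agree / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement for two coders on nominal codes."""
    n = len(coder_a)
    p_o = percent_agreement(coder_a, coder_b)
    # Expected agreement under independence of the two coders
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(coder_a) | set(coder_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical coding: 1 = the student used the critical question, 0 = did not.
coder_1 = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
coder_2 = [1, 0, 1, 0, 0, 1, 0, 0, 1, 1]
print(f"agreement = {percent_agreement(coder_1, coder_2):.2f}")
print(f"kappa     = {cohens_kappa(coder_1, coder_2):.2f}")
```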

Table 4. Distribution of the critical questions with respect to dimensions

Because the first and eighth questions asked in the interviews were related to whether the study in the given newspaper article was logical, these items were evaluated in the “consistency” dimension. The second question was related to the participants of the studies presented in the newspaper articles as a whole; therefore, it was evaluated in the “sampling” dimension. The third question was evaluated in the “data collection” dimension, as it related to the use of data collection tools as a whole, while the fourth and fifth questions were evaluated as “data analysis,” since they inquired about the data distribution and the statistical calculations presented in the articles. Questions six and seven were evaluated in the “data presentation” dimension, since they related to the presentation of the data obtained in the studies, and questions 9 and 10 were evaluated in the “data support” dimension, because they were related to whether the existing or supplementary explanations were sufficient for the presentation of the findings or data. The findings of the analysis are presented in frequency tables in the following section.

3. Findings

After completion of the interviews, the students’ recorded responses were analyzed to determine how often and/or whether the participants used the critical questions relating to the dimensions of consistency, sampling, data collection, data analysis, data presentation, and data support. Table 5 presents a matrix of the students’ critiques of the four newspaper articles with respect to the dimensions of Gal’s critical questions (as given in Table 4).

Table 5. Critical questions used by students pertaining to the newspaper articles

Table 5 gives a general picture of how the students evaluated the newspaper articles in terms of the critical questions. Each tick in the table signifies that the corresponding question was addressed by students for the related article. As can be seen, the students mostly focused on the first item of the consistency dimension and on the 9th question, related to data support, while the data presentation dimension was considered the least. To allow an in-depth investigation, each of the dimensions is handled under a separate subtitle below. The calculated frequencies are presented in the tables that follow.

3.1. Findings of the consistency dimension

The frequency table below illustrates how often the consistency dimension was used among the critical questions as the students interpreted the newspaper articles.

As illustrated in Table 6, approximately half of the students considered the critical question “what kind of study is it?” in relation to articles 1, 3, and 4, and all of the students considered the same question in relation to article 2. In addition, the question “is this kind of study reasonable in this context?” was considered by almost half of the students in relation to articles 1, 2, and 4, while none of the students considered this question in relation to article 3. Moreover, the question “are the claims made here meaningful and supported by the data?” was also considered under the consistency dimension; nearly half of the students asked this question in relation to articles 2, 3, and 4, while 2 students asked it in relation to article 1. Some examples of the student responses within the consistency dimension are presented below:

S1: Maximum number of people taking the bus, people who are at the high school level and not literate … (Question 1a—Article 1)

S4: … here, it was written that the shopping centers in Istanbul took 13th place in the general ranking among a lot of cities, but took first place in comfort … (Question 1a—Article 2)

S3: … how much did the number of ATMs increase from the year 2006 to the year 2011 … (Question 1a—Article 3)

S2: … they determined three things here, and all are true. Why are they true? First, the level of education was considered, then smart ticket usage was taken into account (owner of a car or not), and finally, occupation was regarded (whether they are bosses or workers). (Question 1b—Article 1)

S5: The study was performed in England, so ranking London was really normal for this. (Question 1b—Article 2)

S3: … for example, whether these were one-time customers or they were continuously using mobile banking … these data were not given … (Question 8—Article 4)

Table 6. Frequency of critical questions usage related to the consistency dimension

On examining the students’ responses in Table 6, it can be seen that about half of the students paid attention to the questions related to the consistency dimension. In addition, it can be seen that for some of the articles, all of the students considered a critical question (Article 2-Q.1a), while for other articles, none of the students applied a critical question (Article 3-Q.1b).

3.2. Findings of the sampling dimension

The frequencies and percentage values given below in Table 7 illustrate how often the sampling dimension was used among the critical questions in the students’ interpretations of the newspaper articles.

Table 7. Frequency of critical questions usage related to the sampling dimension

As the frequencies in Table 7 indicate, half of the students took note of how the sample was identified in articles 1, 2, and 4, and more than half of the students questioned the sampling technique used in article 3. On the other hand, only a limited number of students focused on the number of participants in the studies described in articles 2, 3, and 4, while 7 students considered this point for article 1. The question under the sampling dimension, “is the sample large enough?”, was considered by 3 of the students for article 1 and by 2 for articles 3 and 4, while none of the students took this question into consideration for article 2. In addition, none of the students mentioned the question “could this sample reasonably lead to valid inferences about the target population?” in relation to articles 1, 2, and 3, while only 2 students remarked on this issue for article 4. Some examples of the student responses within the sampling dimension are presented below:

S5: I could not understand where the study took place, or who were the people included in the sample and how they were selected … (Question 2a—Article 1)

S9: 33 large cities were selected from Europe, but it is not clear how they identified the size of the cities … (Question 2a—Article 2)

S4: … was the number presented for smart phone users only in Istanbul, or in general, so in all of Turkey? (Question 2a—Article 4)

S3: … the number of people who participated in the study was not clear … (Question 2b—Article 2)

S7: … it is going to be closer to reality, as many more people were asked … (Question 2d—Article 1)

S9: … for example, we cannot find so many ATMs in Hakkari, but hundreds of them in Istanbul … (Question 2d—Article 3)

An examination of Table 7 and the related student responses shows that the students paid attention to how the research samples were selected and how many participants were included in the samples. However, it can be seen that the students did not give attention to whether the selected sample was large enough and representative of the population.

3.3. Findings of the data collection dimension

Another dimension of the critical questions investigated in terms of the students’ reviews of the articles related to data collection. The frequencies and percentages belonging to the responses in this dimension are presented in Table 8.

Table 8. Frequency of critical questions usage related to the data collection dimension

Table 8 indicates that 3 of the students considered the critical question “how reliable were the instruments or measures (tests, questionnaires, and interviews) used to generate the reported data?” in articles 1, 3, and 4, while 4 students considered this question in article 2. In other words, it can be said that less than half of the students considered how the data were collected. Some examples from the student responses are presented below:

S4: … if I met the person who wrote this article, I would ask, “Did you calculate these values by going and asking people one by one, or did you just make them up out of your head to deceive the nation?” (Question 3—Article 1)

S8: … I wish he could say how he collected the data … (Question 3—Article 4)

According to Table 8 and the related responses, the students did not take into account how the data were collected or what kinds of data collection tools were used when they evaluated the data.

3.4. Findings of the data analysis dimension

The frequency and percentage values given below in Table 9 represent how often the data analysis dimension was used among the critical questions in the students’ interpretations of the newspaper articles.

Table 9. Frequency of critical questions usage related to the data analysis dimension

The data presented in Table 9 demonstrate that 2 students considered the critical question “how was the data distribution accomplished?” for articles 1, 2, and 4, and only one student considered this issue for article 3. The question “are the reported statistics appropriate for this kind of data?” was considered by a limited number of students. Almost all of the students neglected the question “could outliers cause a summary statistic to misrepresent the true picture?”, which falls under the data analysis dimension. Some examples from the student responses are presented below:

S3: … I can’t be sure whether [the statistics] are true or not when I look at this, because only some data were given. Nothing like standard deviation was given; it would be better if the median, range were given … (Question 4—Article 1)

S8: … it was given only for 2011; the other years were not given. That is to say, there were no mobile applications; if there were, it was not stated. This is a deficiency. If I could, I would give the data from 2010 and compare them … (Question 5a—Article 4)

S2: … maybe there are world renowned department stores in London; maybe they prefer it because of this. For instance, 2 or 3 department stores in London… maybe there are more department stores in Istanbul, but not well-known. Presumably, there are more admirable brands there … (Question 5b—Article 2)

When Table 9 and the related student responses are considered, it can be seen that the students did not focus on or investigate the data analysis during their evaluations of the newspaper articles. In other words, the students considered the results of the analysis, but they did not evaluate how the analysis was carried out.

3.5. Findings of the data presentation dimension

The frequency and percentage values given below in Table 10 represent how often the data presentation dimension was used among the critical questions in the students’ interpretations of the newspaper articles.

Table 10. Frequency of critical questions usage related to the data presentation dimension

Table 10 illustrates whether the students gave attention to whether the charts presented in the articles were drawn accurately or whether there was any distortion in the drawings. None of the students focused on this point in articles 1 and 3, while 2 students took note of this issue in article 2 and 3 students mentioned it in relation to article 4. Furthermore, the question “how was this probabilistic statement derived? Are there enough credible data to justify the estimate of likelihood given?” was neglected by all of the students who participated in the study. Some of the students’ responses are presented below as examples:

S1: I look at this, and it does not look like twofold the other. So, it is close to being double, but indeed, it is not … (Question 6—Article 4)

S9: The graphics look OK to me … (Question 2—Article 2)

When the data presented in Table 10 and the related student responses are considered, it can be seen that the students did not take note of the data presentation during their evaluations of the newspaper articles. In other words, the students interpreted the graphics as they saw them, without investigating whether anything in the presentation could have had an effect on the study.

3.6. Findings of the data support dimension

Another dimension of the critical questions examined during the students’ evaluations of the newspaper articles related to the support provided for the data. The frequency and percentage values related to this dimension are presented in Table 11.

Table 11. Frequency of critical questions usage related to the data support dimension

Table 11 demonstrates that nearly all of the students gave no consideration to the critical question “should additional information or procedures be made available to enable me to evaluate the sensibility of these arguments?” in relation to articles 2, 3, and 4, while 3 of the students considered this question in relation to article 1. On the other hand, nearly half of the students considered the critical question “are there alternative interpretations for the meaning of the findings or different explanations for what caused them?” in relation to articles 2, 3, and 4, while 2 students took this into account for article 1. Some examples from the student responses are presented below:

S1: … they could have made a chart for this article … (Question 9—Article 1)

S2: … in the cities where more students are using [public transport], for instance, there could be a district where fewer students are living, as a much more private sector. I would ask about the number of recorded plates; I would ask about the number of schools in the district … (Question 10—Article 1)

S3: … we were 13th in the general ranking, but how … how did they rank this, that data (criterion) is not clear … (Question 9—Article 2)

S9: …I guess that too many burglaries happen in London, because I assume that they referred to all kinds of comfort when they said [that], but the criterion is missing. There is shopping in London, but no chance to roam. The department stores in Istanbul are more comfortable; presumably they are not crowded … (Question 10—Article 2)

S7: … I would add the district in which they were conducted and the number of banks … (Question 9—Article 3)

S5: … I could do a comparison, such as writing 2009–2010–2011, and show the increase in the number of smart phone usage up to this … (Question 10—Article 4)

In consideration of Table 11 and the related student responses, it can be seen that the students did not think that the given data were enough to explain the results. Furthermore, the students evaluated what additional information could be provided and inquired about alternative explanations.

4. Discussion and conclusion

In this study, the opinions of 8th graders about newspaper articles including statistical information were examined according to the dimensions of consistency, sampling, data collection, data analysis, data presentation, and data support. In the consistency dimension, it was seen that most of the students considered the context of the article, while less than half of the students inquired into whether the study was consistent in itself. With respect to the consistency of the study, whether the situations presented in the articles were supported by the data was neglected by most of the students. The rate of students interested in the consistency of the articles varied from one article to another; for this reason, the form of presentation of the articles may have had an effect on the students’ consideration of critical situations. The situations presented in the articles were seen as logical and accepted without any inquiry; the students did not demonstrate a critical stance on this point. In this respect, it appears that the students made decisions based on their personal experiences while investigating the statistical data, as Cerrito (Citation1999) likewise concluded.

When the findings concerning the sampling dimension were investigated, it was seen that the students mainly questioned how the sample was formed and how many people participated; in other words, they investigated the sample size. However, whether or not the sample size was large enough and generalizable was not questioned by most of the students. Although the students were aware of the concept of sample, they did not effectively question the sample size or the relationship between the sample and the target population. Watson and Moritz (Citation2000) revealed similar results in their research. One reason for this situation may relate to the emerging picture derived from the results-based exam systems in Turkey, the construction of nationwide multiple-choice exams, and students’ lack of familiarity with implementations such as critical evaluation of statistical data in newspapers. As a result, it was believed that the students were unable to construct a relationship between the sample and the population, although they were able to handle these two concepts in different dimensions.

Under the dimension of data collection, it was concluded that most of the students did not question how the data were collected or what kinds of data collection tools were used. Although the objective “Create research questions for a problem, identify an appropriate sample, and collect data” is included in the curriculum for 6th graders, the students neglected the data collection dimension during their evaluations of the articles. The reason for this may be that the students had not been exposed to real-life experiences; in other words, they had not engaged in activities such as collecting data themselves, but had instead solved questions prepared for them. As a result, the students were not experienced in statistical processes. Because of the well-known effects of student attitudes toward statistics on their achievement, such experiences should be provided at the elementary education level (Mills, Citation2004).

When the results related to the data analysis dimension were investigated, it was seen that only a limited number of students questioned the articles with respect to the distribution of data. The issue of whether the statistics presented in the articles were appropriate for the data, and the possibility of the news being falsified, were neglected by most of the students. Similar results were also seen under the data presentation dimension. In the articles including data representations such as graphics, only a limited number of students considered whether these graphics were drawn properly. In this respect, the results obtained in this research parallel the study by Pfannkuch (Citation2005). One possible reason for these results is that misleading graphics, especially pictorial and figure graphs, are included in teaching programs but given little attention, which may lead to misinterpretations. Moreover, it was noted that none of the students investigated the statistical situations. The fact that the possibility of misleading information, which is part of statistics, is not conveyed either in the teaching programs or in the related teaching activities conducted in schools (Gal, Citation2004b) could be another reason for this issue.

When the data support dimension was investigated, it was seen that, with the exception of article 1, most of the students questioned whether there was a need for additional information or calculations to establish the reasonability of the evidence. In the first article, the students did not examine this dimension closely, owing to some of the demographic features of the research sample that were already included. When the student responses were investigated from the aspect of the data support dimension, it was seen that the responses generally concerned calculations that could be presented to increase the credibility of the articles. Most of the students failed to note whether there was a need for additional information and explanations in order to understand the data. In this respect, it was observed that the students did not question the given data closely; they only recommended some additional calculations that could be made to increase the credibility of the evidence.

In general, the students reported that they had never before experienced an implementation like this, in which they were asked to examine newspaper articles including statistical data. A review of the middle school teaching program showed that there were 15 objectives related to statistics that largely coincide with the critical questions delineated in Gal’s (Citation2004a) model. Although many of the objectives do not directly refer to statistical literacy (see Table 2), they are open to interpretation, and teachers are expected to make connections between the statistical concepts in order to foster both statistical literacy and critical thinking. However, it is believed that the content and the applications given in schools may be restricted by teacher beliefs concerning the greater time requirement of these kinds of implementations (Duru & Korkmaz, Citation2010), as well as by the need to present questions related to comprehension and application (see Krathwohl, Citation2002) for the level determination examination for high school entrance (İskenderoğlu, Erkan, & Serbest, Citation2013). Similar results were obtained by Yolcu (Citation2012), who, during observations of classroom activities, noted that teachers spent more time on the application of statistical ideas, while only a small amount of time was dedicated to developing conceptual understanding and critical evaluation. This situation may prevent students from improving their statistical thinking (Wild & Pfannkuch, Citation1999). Instead, a student-centered learning environment should be created that gives students opportunities to develop their statistical conceptual understanding and critical thinking as they carry out statistical studies themselves, connected to real-life examples and situations.

4.1. Limitations

There were limitations associated with the methodology. First, although case studies are not designed for large groups and the researchers tried to ensure variation among the students with respect to their achievement, the number of participants in this study prevents the results from being generalized. The second limitation concerns the data collection tools: newspaper articles were selected as the instrument to determine the students’ statistical literacy, so the findings are restricted to the context of these articles. Finally, the statistical literacy of the students was investigated considering only Gal’s (Citation2004a) critical questions.

4.2. Educational implications

Today, in the age of information, many researchers agree on the necessity of statistical literacy (e.g. Gal, Citation2004a, Citation2004b; Wallman, Citation1993). As a consequence of this study, it was concluded that the students did not perform well in their critical analysis of newspaper articles including research data, due to their lack of related experience. For this reason, in addition to exposing students to theoretical knowledge and concepts such as central tendency and distribution, an instructional environment should be constructed for statistics education using original data from real-life situations. In this way, instruction may change students’ views toward statistics and may allow them to develop a critical standpoint toward statistical data. In order to reach this aim, policy-makers and curriculum designers should clearly identify the aims of statistics teaching. In-service teacher training programs should also support teacher development in relation to statistical-pedagogical content knowledge.

Additional information

Funding

The authors received no direct funding for this research.

Notes on contributors

Mustafa Guler

Mustafa Guler and Kadir Gursoy are research assistants at Karadeniz Technical University, Institute of Educational Sciences. Their research areas are teacher training and mathematics learning and teaching for elementary school students. The two authors participated in a national project aiming to explore the teaching knowledge, beliefs about mathematics, and learning opportunities of Turkish pre-service elementary mathematics teachers and to compare them with those in TEDS-M countries.

Bulent Guven

Bulent Guven is a professor at the Faculty of Education of the same university. In addition to the elementary level, his research area includes secondary school mathematics teaching and learning as well as curriculum development. He has participated in many national projects on mathematics curriculum development and in textbook commissions.

References

  • Aizikovitsh-Udi, E., Kuntze, S., & Clarke, D. (2016). Connections between statistical thinking and critical thinking: A case study. In The teaching and learning of statistics: International perspectives (pp. 83–94). New York, NY: Springer.
  • Beyer, B. K. (1987). Practical strategies for the teaching of thinking. Boston, MA: Allyn & Bacon.
  • Cerrito, P. B. (1999). Teaching statistical literacy. College Teaching, 47, 9–13.
  • Chance, B. (1997). Experiences with alternative assessment techniques in introductory undergraduate statistics courses. Journal of Statistics Education [Online], 5(3). Retrieved June 5, 2012, from www.amstat.org/publications/jse/v5n3/chance.html
  • Clemen, R., & Gregory, R. (2000). Preparing adult students to be better decision makers. In I. Gal (Ed.), Adult numeracy development: Theory, research, practice (pp. 73–86). Cresskill, NJ: Hampton Press.
  • Duru, A., & Korkmaz, H. (2010). Teachers’ views about a new mathematics curriculum change difficulties encountering curriculum change. Hacettepe University Journal of Education, 38, 67–81.
  • English, L. D., & Watson, J. M. (2015). Statistical literacy in the elementary school: Opportunities for problem posing. In Mathematical problem posing (pp. 241–256). New York, NY: Springer.
  • Ennis, R. H. (1985). A logical basis for measuring critical thinking skills. Educational Leadership, 43, 44–48.
  • Fisher, A. (2001). Critical thinking: An introduction. Cambridge: Cambridge University Press.
  • Frankenstein, M. (1989). Relearning mathematics: A different “R”—Radical mathematics. London: Free Association Books.
  • Gal, I. (1994, September). Assessment of interpretive skills. Summary of working group, Conference on Assessment Issues in Statistics Education, Philadelphia, PA.
  • Gal, I. (1999). Links between literacy and numeracy. In D. A. Wagner, R. L. Venezky, & B. Street (Eds.), Literacy: An international handbook (pp. 227–231). Boulder, CO: Westview Press.
  • Gal, I. (2004a). Statistical literacy: Meanings, components, responsibilities. In D. Ben-Zvi & J. Garfield (Eds.), The challenge of developing statistical literacy, reasoning and thinking (pp. 47–78). Dordrecht: Kluwer Academic.
  • Gal, I. (2004b). Towards “Probability Literacy” for all citizens: Building blocks and instructional dilemmas. In G. A. Jones (Ed.), Exploring probability in school: Challenges for teaching and learning (pp. 43–70). New York, NY: Springer.
  • Gelman, A., Nolan, D., Men, A., Warmerdam, S., & Bautista, M. (1998). Student projects on statistical literacy and the media. The American Statistician, 52, 160–166.
  • İskenderoğlu, T. A., Erkan, İ., & Serbest, A. (2013). Classification of SBS mathematics questions between 2008-2013 years with respect to PISA competency levels. Turkish Journal of Computer and Mathematics Education, 4, 147–168.
  • Koparan, T. (2012). The effect of project based learning approach on the statistical literacy levels and attitude towards statistics of student (Unpublished doctoral dissertation). Karadeniz Technical University, Institute of Educational Sciences, Trabzon.
  • Krathwohl, D. R. (2002). A revision of Bloom’s taxonomy: An overview. Theory Into Practice, 41, 212–218.
  • Laborde, C. (1990). Language and mathematics. In P. Nesher & J. Kilpatrick (Eds.), Mathematics and cognition (pp. 53–69). New York, NY: Cambridge University Press.
  • Lombard, M., Snyder-Duch, J., & Bracken, C. C. (2002). Content analysis in mass communication: Assessment and reporting of intercoder reliability. Human Communication Research, 28, 587–604.
  • McCrone, S. S., & Dossey, J. A. (2007). Mathematical literacy—It’s become fundamental. Principal Leadership, 7, 32–37.
  • McMillan, J. H., & Schumacher, S. (2001). Research in education: A conceptual introduction (5th ed.). New York, NY: Longman.
  • Merriman, L. (2006). Using media reports to develop statistical literacy in Year 10 students. Proceedings of the 7th International Conference on Teaching Statistics. Retrieved September 5, 2016, from https://www.stat.auckland.ac.nz/~iase/publications/17/8A3_MERR.pdf
  • Mills, J. D. (2004). Students’ attitudes toward statistics: Implications for the future. College Student Journal, 38, 349–361.
  • Ministry of National Education [MoNE] (2005). İlköğretim Okulu Ders Programları: Matematik Programı 6-7-8 [Elementary curricula programs: Mathematics curricula program for middle grades]. Ankara: Author.
  • Mkakure, D., & Mokoena, M. A. (2011). A comparative study of the FET phase mathematical literacy and mathematics curriculum. US-China Education Review, 3, 309–323.
  • Moore, D. S. (1998). Statistics among the liberal arts. Journal of the American Statistical Association, 93, 1253–1259.
  • Mosenthal, P. B., & Kirsch, I. S. (1998). A new measure for assessing document complexity: The PMOSE/IKIRSCH document readability formula. Journal of Adolescent and Adult Literacy, 41, 638–657.
  • Norris, S. P. (1985). Synthesis of research on critical thinking. Educational Leadership, 8, 40–45.
  • Norris, S. P., & Ennis, R. H. (1989). Evaluating critical thinking. Pacific Grove, CA: Midwest Publications.
  • OECD (2006). Assessing scientific reading and mathematical literacy: A framework for PISA 2006. Paris: Author.
  • Özmen, Z. M. (2015). Farklı lisans programlarında okuyan öğrencilerin istatistik okuryazarlığının incelenmesi [Examination of the statistical literacy levels of students from different undergraduate programs] (Unpublished doctoral dissertation). Karadeniz Technical University, Trabzon.
  • Pfannkuch, M. (2005). Characterizing year 11 students’ evaluation of a statistical process. Statistics Education Research Journal, 4, 5–26.
  • Royalty, J. (1995). The generalizability of critical thinking: Paranormal beliefs versus statistical reasoning. The Journal of Genetic Psychology, 156, 477–488.
  • Rumsey, D. J. (2002). Statistical literacy as a goal for introductory statistics courses. Journal of Statistics Education [Online], 10(3). Retrieved June 5, 2012, from http://www.amstat.org/publications/jse/
  • Schield, M. (2006). Statistical literacy survey analysis: Reading graphs and tables of rates and percentages. Proceedings of the Sixth International Conference on Teaching Statistics. Retrieved November 25, 2016 from http://www.statlit.org/pdf/2006SchieldICOTS.pdf
  • Snell, J. L. (2002). Discussion: But how do you teach it? International Statistical Review, 70, 45–46.
  • TurkStat. (2013). Turkish statistical institute. Retrieved December, 2013 from http://www.tuik.gov.tr/UstMenu.do?metod=bilgiTalebi
  • Wallman, K. K. (1993). Enhancing statistical literacy: Enriching our society. Journal of the American Statistical Association, 88, 1–8.
  • Watson, J. (1997). Assessing statistical thinking using the media. In I. Gal & J. Garfield (Eds.), The assessment challenge in statistics education (pp. 107–121). Amsterdam: IOS Press.
  • Watson, J. (2013). Statistical literacy, a statistics curriculum for school students, the pedagogical content needs of teachers and the Australian Curriculum. Curriculum Perspectives, 33, 58–69.
  • Watson, J. M., & Moritz, J. B. (2000). Development of understanding of sampling for statistical literacy. The Journal of Mathematical Behavior, 19, 109–136.
  • Wild, C. J., & Pfannkuch, M. (1999). Statistical thinking in empirical enquiry. International Statistical Review, 67, 223–248.
  • Yolcu, A. (2012). An investigation of eighth grade students’ statistical literacy, attitudes towards statistics and their relationship (Unpublished doctoral dissertation). Middle East Technical University, Ankara.

Appendix 1

Selected article 1

Appendix 2

Selected article 2

Appendix 3

Selected article 3

Appendix 4

Selected article 4