Research Article

Diving into data: Developing the capacity for data literacy in teacher education

Article: 1132526 | Received 23 Oct 2015, Accepted 11 Dec 2015, Published online: 08 Jan 2016

Abstract

Educators today are required to utilize a variety of student data to shape the decisions they make and the lessons they teach. As accountability standards become more stringent and teachers face increasingly diverse student populations within their classrooms, they often struggle to adequately meet the needs of all learners. When instructional decisions are grounded in student data, the rationales for those decisions rest on best practices. Unfortunately, some administrators and teachers lack the confidence and/or training needed to successfully engage with and interpret data results. This may be especially true for early career educators and those just entering the field. Indeed, for novice teachers to be successful in the current accountability culture, they must possess, understand, and effectively utilize data literacy skills, something quite difficult to accomplish without adequate training. The research in this article explored how pre-service educators determined what worked in a data literacy intervention and the potential impact this had on their instructional decision-making process. Implications for instructor professional development are offered for consideration.

Public Interest Statement

In this article, the authors discuss an action research study with teachers who participated in a Data Chat in an assessment class. The purpose of this research was to explore a data literacy intervention embedded in a pre-service teacher education instruction and assessment course from the perspectives of the participants. We studied our own instructional intervention to better understand how the participants were experiencing data literacy instruction in a pre-service teacher education course and how their experiences could structure and inform future instructional choices. The results from the qualitative study suggested that pre-service teachers used a data literacy intervention to assist them in comprehending and analyzing data and in using it for instructional purposes.

1. Introduction

One pedagogical challenge of the twenty-first century is the continually changing data literacy landscape that currently exists in today’s society. Data literacy, defined by Mandinach and Gummer (Citation2013), is “the ability to understand and use data effectively to inform decisions … composed of a specific skill set and knowledge base that enables educators to transform data into information and ultimately into actionable knowledge” (p. 30). It should be noted that novice teachers, from the first day on the job, are held accountable by both campus/district administration and local-/state-mandated requirements for the utilization of multiple data sources in their formulation of decisions about student learning/progress for their own professional evaluations (Coburn & Turner, Citation2012; Marsh, Citation2012; Piro, Wiemers, & Shutt, Citation2011). Despite such data literacy mandates, pre-service teachers often report a lack of confidence/foundational knowledge in data-driven knowledge/skills (DeLuca & Bellara, Citation2013; Piro & Hutchinson, Citation2014). Novice educators are expected to use a variety of data sources to determine whether or not certain students need remediation, additional support, and/or enrichment in their quests for specific skill mastery (Dunn, Airola, & Lo, Citation2013). Therefore, it is incumbent upon teacher preparation programs to provide opportunities for novice teachers to improve data literacy skills before they begin their careers.

The purposes of this research were twofold: (1) to explore how pre-service educators determined what worked in a data literacy intervention and (2) to investigate the implications for our own practice.

2. Pathways to data literacy

Though a comprehensive understanding of interventions which specifically target pre-service teacher data literacy is fairly restricted in the literature, components have been identified which, if utilized, positively impacted either student achievement or teacher disposition toward data usage (Reeves & Honig, Citation2015). Those components included: (1) cooperative relationships with colleagues (Gambell & Hunter, Citation2004); (2) the use of a knowledgeable consultant as a resource (Wayman & Jimerson, Citation2013); (3) the use of step-by-step protocol (Gearhart & Osmundson, Citation2009); and (4) the close linkage of the intervention with classroom instruction (Marsh, Citation2012).

2.1. Cooperative relationships

According to Means, Chen, DeBarger, and Padilla (Citation2011), educators expressed a higher confidence level when analyzing data if they were allowed to do so with their colleagues—in effect, if they were permitted to socially construct knowledge. Such practices resulted in the formation of formal and informal learning networks which primarily focused on critical dialogs surrounding effective instruction (Putnam & Borko, Citation2000; Warren Little, Citation2002).

In response to the collaborative educational data landscape, teacher preparation programs must turn away from the stereotypical practice of including data instruction in stand-alone coursework assignments and embrace the notion of collaborative data inquiry (Mandinach, Friedman, & Gummer, Citation2015). Participation in purposeful collaborative inquiry gives pre-service educators a venue by which, through collegial discussions, a better understanding of causal connections between instructional practices and student outcomes may occur (Gallimore, Ermeling, Saunders, & Goldenberg, Citation2009).

2.2. Knowledgeable resources

An essential criterion for successful data-driven choices is that teachers have adequate skills to correctly interpret the data’s meaning. However, data have stereotypically been a worrisome topic for many pre-service teachers and teacher preparation professionals (Pierce, Chick, & Wander, Citation2014). Many have low data self-efficacy and express negative feelings about using data, even though an abundance of data resides at their fingertips (German, Citation2014). A concern is how to increase a pre-service teacher’s confidence level when both collecting and interpreting the message and meaning of data.

To aid pre-service teachers and professors who have data-angst, research suggests that instructors with professional development in the area, colleagues with specific data expertise, or consultants could provide necessary legitimacy to the process (Marsh & Farrell, Citation2015). Students need to practice under the guidance of an instructor who is confident in his/her ability to use and interpret the data involved and who has developed trust and rapport with students to the extent that they believe the information disseminated will serve an authentic role in their future aspirations (Gulamhussein, Citation2013).

2.3. Step-by-step protocol

Data literate educators generate data-based questions, disaggregate data for answers, and analyze interpretations (Means et al., Citation2011). Pre-service teachers need to understand that data-driven decisions and subsequent implications for improved instruction do not simply occur by “checking off” the steps in a process. Instead, these new educators need to experience and practice each step until they become routine, creating a smooth transition of skills into professional practice (Costa & Kallick, Citation2008). Additionally, professors should teach each routine explicitly prior to engaging students in activities which require holistic application/analysis (Bocala & Boudett, Citation2015). If teachers deal with data on an inconsistent basis, the “provision of timely and efficient access to reminders of basic concepts” (Pierce et al., Citation2014, p. 295) is lost, thereby hindering the development of true data/assessment literacy.

2.4. Linkages to classroom instruction

Linking data results to instruction is critical to effective teaching and learning. Educators must do better than simply guess a student’s academic need. To understand their students, teachers must rely on data collected in order to utilize appropriate instructional strategies to support/enhance each child’s unique skill and knowledge levels (Tomlinson, Citation2007; Tuttle, Citation2009). For example, authentic performance assessments provide educators data which serve as (1) avenues by which to examine current skills and knowledge prior to instructional decision-making and (2) precise connections to authentic instruction.

The research in this article explored how pre-service educators determined what worked in a data literacy intervention and the potential impact this had on their instructional decision-making process. Implications for instructor professional development are offered for consideration.

3. Method

3.1. Purpose statement and research question

The purpose of this research was to explore a data literacy intervention embedded in a pre-service teacher education instruction and assessment course from the perspectives of the participants. The research questions were: What do participants view as working in a data literacy intervention, and in what ways? What are the implications for our own work?

3.2. Design

This research was conducted using teacher-researcher methodology (Cochran-Smith & Lytle, Citation1993, Citation2009), specifically within practical action research design (Leitch & Day, Citation2000), in that the research will guide our future instruction, but we also focus upon the process of our reflective practice. We studied our own instructional intervention (Piro, Dunlap, & Shutt, Citation2014; Piro & Hutchinson, Citation2014) to better understand how the participants were experiencing data literacy instruction in a pre-service teacher education course and how their experiences could structure and inform future instructional choices. This pedagogical inquiry (Cochran-Smith & Lytle, Citation1993, Citation2007; Dewey, Citation1910/1933; Gore & Zeichner, Citation1991; Schön, Citation1983; Zeichner & Noffke, Citation2001) served as a reflective practice into curriculum development and reform at the program level in teacher education. IRB protocols for working with human subjects were followed. All participant names have been given a pseudonym.

3.3. Description of the participants and sampling procedures

Purposeful sampling (Patton, Citation2015) of students within a pre-service teacher candidate instruction and assessment course was used. Participants for the study were students in the course and were recruited through an online invitation to participate at a public university in the southwest of the USA. There were a total of 54 participants from two sections in two semesters of an academic year of the hybrid-format class which contained primarily undergraduate seniors in the semester prior to student teaching. Each of the 54 participants completed the data collection. Thirty hours of classroom observation were required as part of the course containing content in instructional methods and assessment procedures. Participant certification areas included: Early Childhood-Grade 6, some with additional English as a Second Language (ESL) or Special Education (SPED) certification; Grades 4–8, some with content area specialization and/or ESL or SPED; Grades 8–12, with content area specialization and/or ESL or SPED; or other specialized content areas such as Dance, Art, Music, Health, Family Consumer Sciences, Deaf Education, and Theatre.

3.4. Instructional intervention

The instructional intervention utilized to increase students’ abilities to translate data findings into more effective instruction for all students was entitled the Data Chat. This intervention was grounded in the premise that learning is fundamentally a social phenomenon in which individuals construct new knowledge based on (1) prior experiences, (2) social interactions in authentic contexts (Vygotsky, Citation1978; Wenger, Citation1998), and (3) reflective practice (Buysse, Sparkman, & Wesley, Citation2003). Additionally, Data Chats incorporated adult learning theory, which purports that adults are more apt to effect sustainable change if they garner the support of other adults, rather than trying to “go it alone” (Lomos, Hofman, & Bosker, Citation2011; McLaughlin & Talbert, Citation2001; Showers, Citation1996). Teacher candidates collaborated in content-specific groups as they worked to comprehend, disaggregate, and interpret actual classroom sets of standardized test score data. The Data Chat intervention was informed by three primary principles from the Understanding by Design framework (Wiggins & McTighe, Citation2005): (1) use of accredited standards to establish educational focus, (2) determination of the assessments to be used in monitoring student progress and goal/objective attainment, and (3) creation of instruction that addresses and enhances student needs.

3.4.1. Instruction

Initial instruction within the Data Chat intervention took place in one three-hour session of an undergraduate-level instruction/assessment class. In this setting, students engaged in tasks focused on understanding the definitions of statistical terms and procedures needed for future numerical analysis and interpretation. In the second session of the intervention, the professor focused on the reading and comprehension of sample data-sets. Within this session, groups were assigned based on similar content or grade levels and were provided with state-level standardized testing data reports. The third session of the intervention included self-directed inquiry learning that utilized group blogs, wikis, and discussion boards as the groups began inquiry into the type of data-set and initial analysis of the numbers, looking for strengths and weaknesses of the outcomes based on the numerical data. In the last session of the intervention, the groups of participants finalized their analyses and instructional interventions. Subsequently, each collaborative content-focused group presented the results, including graphical representations of their numerical analysis. It should be noted that since the data used were obtained with permission from a local school district (student names and all identifying information were removed from the data-sets prior to release by the district), participants created pseudonyms and ethnicities in their final reports to mirror the demographics reported in the classroom-level data.

3.4.2. Data Chat steps

The Data Chat instructional intervention comprised eight steps [The following steps were taken from Piro et al. (Citation2014, pp. 5–6) and Piro and Hutchinson (Citation2014)]:

(1)

Enlist support from the local school districts. Procure data-sets from local school districts. Ensure that copies of the state-standardized test classroom-level data have all been de-identified.

(2)

Purposefully target statistical literacy and interpretation through explicit instruction. Once basic statistical terminology has been introduced, provide multiple opportunities for students to practice reading sample data-sets.

(3)

Create grade-level or content-oriented teams. Teams of four–five teacher education candidates simulate grade- or content-level teams. In this configuration, students collaboratively analyze the data-set most closely aligned with their grade level/content area.

(4)

Analyze data-set to discern both strengths and weaknesses. Using numeric data (by percentage) to support their analyses, the participants analyze the data-sets for strengths and weaknesses. Specifically, participants analyze content-reported categorical information (RCAT). For each test given, specific skill groups are targeted by the state. For example, on the Mathematics Grades 3–8 standardized tests, RCATs include: Number Operation & Quantitative Reasoning, Patterns & Algebraic Reasoning, Geometry, Measurement, and Probability & Statistics (Texas Education Agency, Citation2012). Each of these large categories comprises a targeted skill subset that is dependent upon the academic level of the student being assessed. Students with a Level III (Advanced Academic Performance) have demonstrated the ability to think critically and transfer knowledge across a variety of contexts. Students receiving a score of Level II (Satisfactory Academic Performance) have a reasonable likelihood of success in the next course or grade, yet may need targeted academic intervention at some point(s) along the way (Young, Citation2012). Finally, students receiving a score of Level I (Unsatisfactory Academic Performance) are viewed as unlikely to be successful in the next course or grade without significant academic interventions (Young, Citation2012).

(5)

Incorporate state standards and local curriculum guides. Standards and content drive the selection of assessments and the development of instructional strategies following the previous two steps (Wiggins & McTighe, Citation2005). The participants in this study investigated the state standards and local curriculum guides that applied to their data-set generally and the sub-standards for weakness areas.

(6)

Create both formative and summative assessments. In this step of the Data Chat, participants consider the assessment procedures they would incorporate as interventions based on the strengths and weaknesses of the assessment data. After student weaknesses are identified in the class data-set, participants create assessments to address those areas where students were found to be struggling.

(7)

Create specific instructional strategies as interventions to address weaknesses. The participants decide how they would address weaknesses found within the classroom data through instructional interventions. The research supporting the instructional strategy choice was cited if used (Marzano, Citation2009), thus promoting the use of research-based instructional strategies. In addition, participants are required to detail differentiated instruction for each identified weakness. Instructional strategies are correlated to state standards.

(8)

Write a final presentation. The participants create a presentation of their data analyses and plan following the Data Chat intervention. The Data Chat intervention report for this research included: names of literacy group members; the type of data-set; the specific test; when the test was given; strengths and weaknesses of student performance; numeric, graphical, and narrative descriptions of the weakness areas; formative and summative assessments to be given prior to the next testing period; and instructional strategies for interventions. Final presentations are reflective of a format/language that could be utilized in a parent meeting, at a grade-level or faculty meeting, etc. to explain results in verbiage that could be easily understood by the targeted educational audience.

3.4.3. Assessments used in the intervention

Authentic State of Texas Assessment of Academic Readiness (STAAR)-standardized test data-sets at the classroom level were used for the participants to analyze in the Data Chat intervention. The data-sets included item analysis of specific content questions from a released version of grade-specific STAAR tests. Actual student names on the sample data-sets were erased prior to participant use in the Data Chat.

Participants analyzed these state-wide mandated achievement tests, given for accountability purposes, at the 3–8 grade levels and end-of-course assessments in varying content areas at the 9–12 grade levels. Data-sets were focused at the classroom level. At the 3rd–8th grade levels, content areas included Reading, Math, Science, and Social Studies. At the 9th–12th grade levels, content areas included Algebra I and II, English I and II, Biology I, and US History. Data-sets were not available in every teacher candidate content area at the secondary level; when data were not available in a candidate’s content area, the participant chose a content group in which to work. Participants thus ended up working in a grade-level or content area team.

3.5. Instrumentation

The post-intervention survey consisted of open-ended responses to questions regarding data understanding, analysis, and use. Five questions comprised this survey and were collected through standardized, open-ended questions within Psych Data, an online, secure survey instrument platform, after the instructional intervention. Table 1 presents the open-ended questions from the survey.

Table 1. Post-intervention survey question content

3.6. Data collection and analysis procedures

All data were collected through a survey following the five-week Data Chat intervention in a pre-service undergraduate assessment and instruction class. This survey was available electronically to participants in Psych Data. Coding was conducted by hand using an inductive content analysis method (Bogdan & Biklen, Citation2003; Gall, Gall, & Borg, Citation2007; Hsieh & Shannon, Citation2005; Patton, Citation2015) with an inductive analysis approach initially guided by the survey questions (Hatch, Citation2002). Initial themes and patterns (Gall et al., Citation2007) were developed as the data were reduced, leading to the final codes. There were no outlying data. A constant comparative method (Glaser & Strauss, Citation2009) was used throughout the analysis of the data. In order to retain verisimilitude, the students’ own words were used (Tracy, Citation2010). Bracketed words were added sparingly to enhance comprehension. Inter-rater agreement between the two researchers added trustworthiness to the interpretation of the qualitative data (Creswell & Miller, Citation2000).

4. Findings

Findings of the post-intervention survey indicate several themes in participant responses. This section demonstrates the final themes: pre-intervention beliefs, understanding and analyzing data, classroom data and instructional practices, and contextual uses of data.

4.1. Pre-intervention beliefs

Participants unanimously stated their discomfort with understanding data prior to the Data Chat. For example, Karrie stated, “I knew nothing about data or what it was.” Shawna mirrored, “I didn’t know what constituted data. I also didn’t know you could read data.” Frank commented, “I had no ideas about what the numbers meant or really that I needed to be concerned [with the data].”

All participants reported they perceived they had limited ideas regarding data for instruction prior to the Data Chat intervention. For example, Ana stated, “I did not know that data could ‘predict’ things like students passing in later years.” Lynn stated, “I did not know that classroom data was given to the teacher. I didn’t know they had to interpret scores for parents, sometimes.” Two participants understood that data were used at the school level, but not by classroom teachers. Gina said, “I thought standardized tests were a tool the administrators used to get an overall perspective of the performance of the school. I didn’t think the classroom used it.” Sara stated, “I honestly had no idea that teachers looked at the data to plan instruction.” Dan said, “I never would have guessed that one could look at the grades of a class, find the low questions, high questions and change instruction based on those.”

4.2. Understanding and analyzing data

Participants’ perceptions regarding understanding the data and analyzing them were evident after the intervention. General skills were noted. For example, Fiona stated, “Data may easily be skewed or read incorrectly; so I learned to be conscious of common mistakes in order to read data correctly and use it effectively.” Specific skills were also gained through the Data Chat intervention. For example, Janice learned, “How to correctly read and interpret scale scores, RCATs and percentages.” Marcia stated she learned that “Data exists on many levels: district, school, grade level and classroom.” Jerry commented that he learned “how to read the results of standardized tests and make sense of them in the class.” Jeff said he felt it was important to “have a wide range of assessments and use the data collectively to give a detailed picture of classroom potential.” Judith commented, “My biggest take away was how to read the results of a standardized test and make sense of them and their impact on student instruction.” Dan stated, “I learned you can disaggregated the data in many ways: SES, ethnicity, gender, etc.” Sara concurred. “I never realized how much could be gathered from taking apart data.” Kathleen reported, “I learned how to extract information and to use it.” Michael said, “I need to be able to interpret or decode data to be able to use it to my advantage and to shape instruction.”

Other participants mentioned that data were “a valuable resource;” were “used to remediate instruction;” that “data is intense;” that “data contains an incredible amount of information;” and that the general public “often misinterprets it.”

4.3. Classroom data and instructional practices

Specific outcomes regarding instruction from data analysis were expressed by all participants. Specific skills gained by learning to analyze data for instruction were noted by some. Analysis of the data led Cathy to state that “the data help me to see the gaps that existed between instruction and assessment.” John noticed that test item analysis was a useful way to understand next instructional practices by helping the teacher “identify areas of instruction the teacher needs to work on.” Jolie stated that data analysis helped her “identify objectives [that] students master and those that are challenging to students.”

Other participants recognized how data analysis would inform and adapt their future instruction. The importance of data in relation to adapting instruction for specific learners was noted. Jeff reflected that, “Showing the teachers what students do not understand, thereby giving the teacher the opportunity to differentiate lessons and facilitate cognitive development” was an outcome of analyzing the data. Shawna agreed. She stated that data analysis led her to “giv[e] the teacher information for reflection on effectiveness of her practice” and to “promote student and teacher self-monitoring.” Frank also concurred. He stated, “Giving the teacher opportunities to reteach a concept from a different perspective through the identification of intervention strategies” was an outcome of the data analysis process. Gina stated that data analysis helped her “illuminate the need for accommodations and/or modifications in future lessons.” Cathy said analyzing data, “Giv[es] the teacher information for reflection on effectiveness of her practice.” Katie commented, “I will differentiate instruction and teach concepts using multiple mediums to help my students grasp the concepts.” Judith agreed. “I will apply different strategies to my teaching to help equip my students with the learning they need to be successful.” Tim stated, “Data analysis will help me find holes in my instructional practice as well as highlight those areas I am teaching well.”

Several participants focused upon how their beliefs had changed regarding understanding the connection between instruction and assessment. The Data Chat intervention led Fiona to state: “Working with data taught me the importance of assessment and the importance of analyzing data to improve instruction.” Simi said, “[Data chatting] taught me how to find areas of weakness and strength and how content, assessments and teaching interventions are related and aligned.” Laura provided an emotional response to using data for instruction when she stated she did not grasp the importance of understanding how data could inform her instruction prior to the Data Chat. “I think it would be horrible if a student got swept under the rug and the teacher moved on and left a gap in the student’s learning”.

4.4. Contextual uses of data

Several participants reflected upon the contextual uses of data, such as when certain data have more value, or when certain forms of assessments informed them for future instruction. For example, Jolie reflected upon how in-class student behaviors could be interpreted as data: “What I love is that any targeted behavior a student does can be used as data. Using data insures your teaching is authentic and student-centered.” Hannah demonstrated her understanding that multiple forms of assessments should guide instructional decisions when she stated:

Standardized test scores is one thing … but formative and summative assessments play a role as well. Using the appropriate intervention to address a skill that a student struggles with helps to make sure the objective of the lesson, the assessment, the teaching, and the strategies are all aligned and scaffolded.

Frank’s final belief comments are worth quoting in whole and demonstrate an overall perception regarding the contextual uses of data.

In order for data to be especially useful, teachers must identify the student’s problem, decide a data collection method, collect data to calculate the baseline, determine the timespan, set a goal, decide how the student’s progress will be assessed and summarize the outcome. This is the way I would correlate and construct lesson plans based on empirical evidence.

Contextual uses of data indicate an advanced cognitive awareness of data literacy for teaching. They indicate that the uses of data depend upon the context and a cognizance of when certain data are more significant than others. Jolie’s comment that other forms of data, such as classroom attendance data, have value; Hannah’s belief that multiple forms of assessments can provide valuable information; and Frank’s appraisal that empirical evidence is necessary to guide successful uses of data demonstrate a form of conditional knowledge (Jacobs & Paris, Citation1987; Schraw & Moshman, Citation1995) that rises above declarative (the what of using data) or even procedural knowledge (applying data to instruction) to an awareness of when to apply data in specific instances.

5. Discussion of findings

The data suggested four themes: pre-intervention beliefs, understanding and analyzing data, classroom data and instructional practices, and contextual uses of data. Initial perceptions regarding using data for instruction suggested that many of the participants had limited knowledge regarding data use at the classroom level in schools. Further, they expressed a sense of discomfort with using data before the intervention. Previous research has found that teachers’ confidence with data analysis and interpretation rested largely on their sense of self-efficacy with those skills and that teachers with higher confidence levels were more likely to use data for instructional interventions (Bettesworth, Alonzo, & Duesbery, Citation2008; Tschannen-Moran & Hoy, Citation2007; U.S. Department of Education, Office of Planning, Evaluation & Policy Development, Citation2008). In teacher education data literacy intervention research, Reeves and Honig (Citation2015) found moderately higher self-efficacy results after an in-course data literacy intervention. The implications from this and previous research suggest that providing opportunities for pre-service teacher candidates to practice varying data literacy behaviors during their teacher preparation program may result in greater self-efficacy with data use in teachers’ professional contexts.

An exploratory study that investigated teachers’ thought processes as they worked through data scenarios found several data literacy behaviors (Means et al., Citation2011) in common with the present study. Specifically, the authors found that teachers use data for data location, data comprehension, data interpretation, instructional decision-making, and posing questions about present and future inquiries. Similar to previous inquiries into the Data Chat (Piro & Hutchinson, Citation2014; Piro et al., Citation2014), the present study found that qualitative responses to post-intervention data literacy instruction demonstrated some comparable responses from the participants. Results from the current study suggest that the participants used a data literacy intervention to assist them in comprehending and analyzing data and in using it for instructional purposes. Previously (Piro et al., Citation2014), we suggested that conditional uses of data—the when of using data in context—is an area of inquiry needed regarding data interventions in pre-service teacher education.

Beginning teachers have reported feeling unprepared as they initiate their teaching practice (Levine, 2006; National Council for the Accreditation of Teacher Education, 2010a, 2010b). Because new teachers are expected to use data for instructional purposes in an environment that demands data-driven decision-making, a stronger sense of preparedness in using data during their first year of teaching would be valuable. Training novice teachers to investigate inquiries into classroom data uses (Athanases, Bennett, & Wahleithner, 2013) is necessary so that new teachers may improve the effectiveness of their practice (Means et al., 2011). Consequently, teacher preparation programs have been charged with developing data literacy practices in their programs (Mandinach & Gummer, 2013), thus reducing the reality shock (Huberman, 1989) beginning teachers experience as they encounter the complexities of using data for accountability and instructional purposes in their professional practice. Reeves and Honig (2015) found few pre-service data literacy interventions and argued that this paucity calls for further research to identify effective interventions with this population. The current study adds to that scant literature. More research is needed on instructional interventions in teacher education to understand how pre-service teachers come to use data in later professional practice. For example, do content- or grade-level groupings work best for collaborative learning teams? For learning to analyze data in teams, do common summative assessment data function best, or does specific, contextual data from real students in clinical placements provide more situated learning (Reeves & Honig, 2015)? Does collaboration scaffold the learning process? Each question provides a focus for follow-up studies.

5.1. Implications for practice

Based upon our action research methodology, we reflect upon how this research will shape our instruction, conceptions, and future curricular choices. It is clear that connecting data to instruction is an important component of data literacy, and interventions in teacher education should connect data interpretation with instructional decisions. To this end, we offer several suggestions. First, teach statistical literacy prior to the Data Chat. Reviewing statistical terminology, including test literacy, measurement terms, location of data, reporting categories, and demographic data, will likely expedite the analysis, interpretation, and application of data in the data teams. Second, the findings suggest that participants perceived understanding how data, standards, assessment, and instruction inter-relate as important. Classroom data matter for instructional decisions, and students need to understand that data can have implications for instruction, not just for accountability or whole-school analysis. Using a backward-mapping framework for teaching data literacy, such as Understanding by Design (Wiggins & McTighe, 2005; Yelland, Cope, & Kalantzis, 2008), reinforces the interrelationships among data, standards, assessment, and instructional interventions for pre-service teachers. Third, because self-efficacy in data usage is crucial for actually using data in future educational contexts, continue providing opportunities for students to become familiar with the data literacy behaviors they will use as practicing teachers. Last, the contextual uses of data are capacities to develop with students; however, these advanced cognitive skills should be addressed after students learn the declarative and procedural uses of data.

6. Conclusion

Today’s educational climate places immense pressure on teachers at all levels to collect and analyze student data. Most commonly, this burden takes the form of examining standardized test scores with an eye toward the skills giving students the most difficulty. What is gleaned from such a practice becomes meaningful only when combined with purposeful actions that appropriately address targeted student learning outcomes (Means et al., 2011). To a novice educator, this process is often intimidating.

To help alleviate such perceptions, this research explored a data literacy intervention, called a Data Chat, embedded in a pre-service teacher education instruction and assessment course from the perspectives of the participants. The findings suggested that pre-intervention beliefs, understanding and analyzing data, classroom data and instructional practices, and contextual uses of data were each significant outcomes for the participants after the Data Chat. Beginning teachers have expressed that they felt unprepared to use data as they initiated their teaching practice. Explicitly teaching data literacy knowledge and skills to teacher candidates may therefore develop the data literacy self-efficacy and content knowledge they need to achieve desired improvements in student learning and to be successful in their chosen profession.

Additional information

Funding

The authors received no direct funding for this research.

Notes on contributors

Karen Dunlap

The authors’ current research agendas have focused on accountability measures for teacher educators and administrators. This current article reflects an ongoing research project aimed at understanding how to implement data literacy in teacher and administrator preparation programs.

References

  • Athanases, S. Z., Bennett, L. H., & Wahleithner, J. M. (2013). Fostering data literacy through preservice teacher inquiry in English language arts. The Teacher Educator, 48, 8–28. doi:10.1080/08878730.2012.740151
  • Bettesworth, L., Alonzo, J., & Duesbery, L. (2008). Swimming in the depths: Educators’ ongoing effective use of data to guide decision making. In T. J. Kowalski & T. J. Lasley (Eds.), Handbook on data-based decision making in education (pp. 286–303). New York, NY: Routledge.
  • Bocala, C., & Boudett, K. (2015). Teaching educators habits of mind for using data wisely. Teachers College Record, 117(4), 1–20.
  • Bogdan, R. C., & Biklen, S. K. (2003). Qualitative research for education: An introduction to theory and methods. Boston, MA: Pearson.
  • Buysse, V., Sparkman, K., & Wesley, P. (2003). Communities of practice: Connecting what we know with what we do. Exceptional Children, 69, 263–277. doi:10.1177/001440290306900301
  • Coburn, C., & Turner, E. (2012). The practice of data use: An introduction. American Journal of Education, 118, 99–111. doi:10.1086/663272
  • Cochran-Smith, M., & Lytle, S. L. (Eds.). (1993). Inside/outside: Teacher research and knowledge. New York, NY: Teacher’s College Press.
  • Cochran-Smith, M., & Lytle, S. L. (2007). Everything’s ethics. An ethical approach to practitioner research: Dealing with issues and dilemmas in action research. New York, NY: Routledge.
  • Cochran-Smith, M., & Lytle, S. L. (2009). Inquiry as stance: Practitioner research for the next generation. Practitioner’s inquiry. New York, NY: Teachers College Press.
  • Costa, A., & Kallick, B. (Eds.). (2008). Learning and leading with habits of the mind: 16 Essential characteristics for success. Alexandria, VA: Association for Supervision and Curriculum Development.
  • Creswell, J., & Miller, D. (2000). Determining validity in qualitative inquiry. Theory into Practice, 39, 124–130. doi:10.1207/s15430421tip3903_2
  • DeLuca, C., & Bellara, A. (2013). The current state of assessment education: Aligning policy, standards, and teacher education curriculum. Journal of Teacher Education, 64, 356–372. doi:10.1177/0022487113488144
  • Dewey, J. (1910/1933). How we think. Buffalo, NY: Prometheus Books.
  • Dunlap, K., & Piro, J. (2015). Data literacy intervention for preservice teachers post Fall 2015 final. Retrieved from: https://www.psychdata.com/s.asp?SID=160782
  • Dunn, K., Airola, D., & Lo, W. (2013). Becoming data driven: The influence of teachers’ sense of efficacy on concerns related to data-driven decision making. The Journal of Experimental Education, 81, 222–241. doi:10.1080/00220973.2012.699899
  • Gall, M. D., Gall, J. P., & Borg, W. R. (2007). Educational research: An introduction (8th ed.). Boston, MA: Pearson/Allyn & Bacon.
  • Gallimore, R., Ermeling, B., Saunders, W., & Goldenberg, C. (2009). Moving the learning of teaching closer to practice: Teacher education implications of school‐based inquiry teams. The Elementary School Journal, 109, 537–553. doi:10.1086/599340
  • Gambell, T., & Hunter, D. (2004). Teacher scoring of large‐scale assessment: Professional development or debilitation? Journal of Curriculum Studies, 36, 697–724. doi:10.1080/0022027032000190696
  • Gearhart, M., & Osmundson, E. (2009). Assessment portfolios as opportunities for teacher learning. Educational Assessment, 14(1), 1–24. doi:10.1080/10627190902816108
  • German, J. (2014). Teachers’ perceptions of self-efficacy: The impact of teacher value-added (Doctoral dissertation). Ashland University. Retrieved from https://etd.ohiolink.edu/!etd.send_file?accession=ashland1398439686&disposition=inline
  • Glaser, B. G., & Strauss, A. L. (2009). The discovery of grounded theory: Strategies for qualitative research. Piscataway, NJ: Aldine Transaction.
  • Gore, J. M., & Zeichner, K. M. (1991). Action research and reflective teaching in preservice teacher education: A case study from the United States. Teaching and Teacher Education, 7, 119–136. doi:10.1016/0742-051X(91)90022-H
  • Gulamhussein, A. (2013). Teaching the teachers: Effective professional development in an era of high stakes accountability. Alexandria, VA: Center for Public Education, National School Board Association.
  • Hatch, J. A. (2002). Doing qualitative research in education settings. Albany: State University of New York Press.
  • Hsieh, H., & Shannon, S. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15, 1277–1288. doi:10.1177/1049732305276687
  • Huberman, M. (1989). The professional life cycle of teachers. Teachers College Record, 90, 31–57.
  • Jacobs, J. E., & Paris, S. G. (1987). Children’s metacognition about reading: Issues in definition, measurement, and instruction. Educational Psychologist, 22, 255–278. doi:10.1080/00461520.1987.9653052
  • Leitch, R., & Day, C. (2000). Action research and reflective practice: Towards a holistic view. Educational Action Research, 8, 179–193. doi:10.1080/09650790000200108
  • Levine, A. (2006). Educating school teachers (Education Schools Project). Retrieved from http://www.edschools.org/pdf/Educating_Teachers_Report.pdf
  • Lomos, C., Hofman, R., & Bosker, R. (2011). Professional communities and student achievement—A meta-analysis. School Effectiveness and School Improvement, 22, 121–148. doi:10.1080/09243453.2010.550467
  • Mandinach, E., Friedman, J., & Gummer, E. (2015). How can schools of education help to build educators’ capacity to use data? A systemic view of the issue. Teachers College Record, 117(4), 1–20.
  • Mandinach, E., & Gummer, E. (2013). A systemic view of implementing data literacy in educator preparation. Educational Researcher, 42, 30–37. doi:10.3102/0013189X12459803
  • Marsh, J. (2012). Interventions promoting educators’ use of data: Research insights and gaps. Teachers College Record, 114(11), 1–48.
  • Marsh, J., & Farrell, C. (2015). How leaders can support teachers with data-driven decision making: A framework for understanding capacity building. Educational Management Administration & Leadership, 43, 269–289.
  • Marzano, R. (2009). Setting the record straight on “high-yield” strategies. Phi Delta Kappan, 91, 30–37. doi:10.1177/003172170909100105
  • McLaughlin, M., & Talbert, J. (2001). Professional communities and the work of high school teaching. Chicago, IL: University of Chicago Press.
  • Means, B., Chen, E., DeBarger, A., & Padilla, C. (2011). Teachers’ ability to use data to inform instruction: Challenges and supports (Office of Planning, Evaluation and Policy Development, U.S. Department of Education). Washington, DC: Author.
  • National Council for the Accreditation of Teacher Education. (2010a). Assessment as a critical element in clinical experiences for teacher preparation. Retrieved from http://www.ncate.org/LinkClick.aspx?fileticket=oo50CSYDEFM%3D&tabid=715
  • National Council for the Accreditation of Teacher Education. (2010b). Transforming teacher education through clinical practice: A national strategy to prepare effective teachers. Retrieved from http://www.ncate.org/LinkClick.aspx?fileticket=zzeiB1OoqPk%3D&tabid=715
  • Patton, M. Q. (2015). Qualitative evaluation and research methods. Thousand Oaks, CA: Sage.
  • Pierce, R., Chick, H., & Wander, R. (2014). Improving teachers’ professional statistical literacy. In H. MacGillivray, B. Philips, & M. Martin (Eds.), Topics from Australian Conferences on Teaching Statistics: OZCOTS 2008–2012 (pp. 295–309). New York, NY: Springer.
  • Piro, J., Dunlap, K., & Shutt, T. (2014). A collaborative Data Chat: Teaching summative assessment data use in pre-service teacher education. Cogent Education, 1(1), 968409. doi:10.1080/2331186X.2014.968409
  • Piro, J., & Hutchinson, C. (2014). Using a Data Chat to teach instructional interventions: Student perceptions of data literacy in an assessment course. The New Educator, 10, 95–111. doi:10.1080/1547688X.2014.898479
  • Piro, J., Wiemers, R., & Shutt, T. (2011). Using student achievement data in teacher and principal evaluations: A policy study. International Journal of Educational Leadership Preparation, 6(4). Retrieved from http://cnx.org/contents/f289b4cb-e79c-4827-a249-59042ed9b5f1@2
  • Putnam, R., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher, 29, 4–15. doi:10.3102/0013189X029001004
  • Reeves, T. D., & Honig, S. L. (2015). A classroom data literacy intervention for pre-service teachers. Teaching and Teacher Education, 50, 90–101. doi:10.1016/j.tate.2015.05.007
  • Schön, D. A. (1983). The reflective practitioner: How professionals think in action (Vol. 5126). New York, NY: Basic Books.
  • Schraw, G., & Moshman, D. (1995). Metacognitive theories. Educational Psychology Review, 7, 351–371. doi:10.1007/BF02212307
  • Showers, B. (1996). The evolution of peer coaching. Educational Leadership, 53, 12–16.
  • Texas Education Agency. (2012). State of Texas assessments of academic readiness (STAAR): Questions and answers. Austin, TX: Texas Education Agency, Student Assessment Division. Retrieved from http://www.tea.state.tx.us/student.assessment/staar/faq.pdf
  • Tomlinson, C. (2007). Learning to love assessment. Educational Leadership, 65, 8–13.
  • Tracy, S. (2010). Qualitative quality: Eight “Big-Tent” criteria for excellent qualitative research. Qualitative Inquiry, 16, 837–851. doi:10.1177/1077800410383121
  • Tschannen-Moran, M., & Hoy, A. W. (2007). The differential antecedents of self-efficacy beliefs of novice and experienced teachers. Teaching and Teacher Education, 23, 944–956. doi:10.1016/j.tate.2006.05.003
  • Tuttle, H. (2009). Formative assessment responding to your students. Larchmont, NY: Eye on Education.
  • U.S. Department of Education, Office of Planning, Evaluation and Policy Development. (2008). Teachers’ use of student data systems to improve instruction: 2005 to 2007. Washington, DC: Author. Retrieved from http://www2.ed.gov/rschstat/eval/tech/teachers-data-use-2005-2007/teachers-data-use-2005-2007.pdf
  • Vygotsky, L. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
  • Warren Little, J. (2002). Locating learning in teachers’ communities of practice: Opening up problems of analysis in records of everyday work. Teaching and Teacher Education, 18, 917–946. doi:10.1016/S0742-051X(02)00052-5
  • Wayman, J., & Jimerson, J. (2013). Teacher needs for data-related professional learning. Studies in Educational Evaluation, 42, 25–34. doi:10.1016/j.stueduc.2013.11.001
  • Wenger, E. (1998). Communities of practice. Cambridge: Cambridge University Press. doi:10.1017/CBO9780511803932
  • Wiggins, G., & McTighe, J. (2005). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development.
  • Yelland, N., Cope, B., & Kalantzis, M. (2008). Learning by design: Creating pedagogical frameworks for knowledge building in the twenty‐first century. Asia-Pacific Journal of Teacher Education, 36, 197–213. doi:10.1080/13598660802232597
  • Young, V. (2012). State of Texas assessments of academic readiness: An overview of the program (PowerPoint slides). Retrieved from http://tea.texas.gov/student.assessment/staar/
  • Zeichner, K. M., & Noffke, S. E. (2001). Practitioner research. In V. Richardson (Ed.), Handbook of research on teaching (4th ed., pp. 298–332). Washington, DC: American Educational Research Association.