Teacher Development
An international journal of teachers’ professional development
Volume 14, 2010 - Issue 4

The use of dialogue and tools to develop students’ mathematical language and meta‐cognition

Julie Oxenford O’Brian, Honorine Nocon & Deanna Iceman Sands
Pages 447-466 | Received 22 Jun 2009, Accepted 22 Sep 2010, Published online: 14 Dec 2010

Abstract

This paper presents the analysis of two representative lessons drawn from a cross case study of K–12 classrooms in which students and their teachers engaged in a collaborative dialogic inquiry process (data‐driven dialogue) to co‐construct meaning from assessment results. The representative lessons illustrate a significant shift away from teacher‐centered uses of assessment results (determining student mastery of content, contributing to a course grade, sorting students into remediation groups, and so on). In these lessons, by contrast, the first use of formal classroom assessment results is to engage students in explaining their results and in identifying what actions they will take. The purpose of this analysis is to better understand student participation in a dialogic process of making meaning of assessment results, to document interaction among the teacher and her students, and to investigate the use and role of mediating artifacts in this emergent practice. The study suggests shifts in the roles of the teacher and learner in the production of learning; students demonstrating agency in their learning process; peer‐to‐peer interaction and collaboration focused on learning; student use of mathematical language in describing the meaning of their assessment results; and student use of meta‐cognitive strategies including self‐evaluation, self‐monitoring and goal setting.

This article reports on the analysis of two lessons from one classroom drawn from a larger cross case study of K–12 classrooms in which students and their teachers engage in a collaborative dialogic inquiry process (data‐driven dialogue) to co‐construct meaning from classroom assessment results. We used a socio‐cultural historical approach, which considers learning to be both situated and distributed, to investigate a classroom‐based emergent practice that can be characterized as a formative use of summative assessment results. This analysis contributes to the growing body of evidence related to practices that engage students in making formative use of assessment results.

The purpose of the larger study as well as this analysis is to better understand student participation in a classroom‐based dialogic process that mediates formative use of assessment data. In this analysis, we (1) document the interaction between and among a teacher and her students as they engage in the emergent practice of dialogic formative use of summative assessments; (2) characterize the associated activity system in this classroom; and (3) describe the role of various mediating artifacts. This study brings together empirical research on student‐engaged formative assessment and a socio‐cultural historical perspective on learning. Below we frame our analysis in the literature on student‐engaged formative assessment and provide a description of one particular student‐engaged formative assessment practice that uses data‐driven dialogue as it has emerged as part of classroom activity. We then locate this emergent practice within socio‐cultural historical theories of learning to develop the conceptual framework that guided the study.

Student‐engaged formative assessment

Assessment is formative when information from assessment (formal or informal) is used by educators to adjust instruction with the intent of better meeting the needs of the assessed students and to provide feedback to students that shapes their actions (Black and Wiliam 1998; Council of Chief State School Officers 2008; Popham 2006; Sadler 1989; Shepard 2009; Third International Conference on Assessment for Learning 2009). Bell and Cowie (2001) describe the process of formative assessment as including three distinct actions: (1) gathering information about learning; (2) analyzing/interpreting the gathered information about learning; and (3) acting on/using this information with the intention of improving student learning. When students are the subjects engaging in these actions, the resulting formative assessment can be described as student engaged. This conceptualization of student‐engaged formative assessment is consistent with Sadler’s (1989) description of the three components necessary for learners to benefit from assessment. According to Sadler, a learner must ‘come to hold a concept of quality roughly similar to that of the teacher, be able to compare his/her current level of performance with the standard, and be able to take action to close the gap’ (119).

In their 1998 meta‐analysis of formative assessment practices, Black and Wiliam identified five specific practices that support formative assessment, for which they found substantial evidence of improvements in student learning outcomes. These included: (1) teachers sharing the criteria for evaluating learning with their students; (2) teachers using descriptive (as opposed to evaluative) feedback; (3) students self‐assessing; (4) student‐to‐student peer assessing; and (5) using questioning in classrooms to learn about learning. In an implementation study conducted across six schools in the United Kingdom, a sixth practice emerged: making formative use of summative tests (Black et al. 2003). Before taking a summative assessment that would primarily be used for grading, these educators had students predict the areas where they thought they would perform the strongest and the areas where they anticipated not doing as well. This became the students’ study guide to prepare for the summative assessment. After the summative test, the students were asked to work together to investigate their incorrect responses. Others have identified similar formative assessment practices (see, for example, Nicol and Macfarlane‐Dick 2006; Rodriguez 2004). The six practices identified by Black and Wiliam and their co‐authors can be characterized as student engaged, because each engages the student in gathering, analyzing/interpreting and/or acting on/using information about his/her own learning.

There is a substantial and growing body of evidence that when teachers engage students in using assessment formatively, student learning improves (Andrade, Du, and Wang 2008; Black and Wiliam 1998; Crooks 1988; Rodriguez 2004; Sebba et al. 2008; Torrance and Pryor 2000). Black and Wiliam’s (1998) meta‐analysis showed that innovations that included the practices described above produced significant, and often substantial, learning gains. The effect sizes of the studies reviewed by these authors ranged between 0.4 and 0.7, ‘among the largest ever reported for sustained educational interventions’ (Black et al. 2003, 9).
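For readers unfamiliar with the metric, these figures are standardized effect sizes: differences between the mean outcomes of intervention and comparison groups expressed in standard‐deviation units. A minimal sketch, assuming the common pooled‐standard‐deviation (Cohen’s d) formulation (the individual studies reviewed may have computed the statistic somewhat differently):

\[
d = \frac{\bar{X}_{\text{intervention}} - \bar{X}_{\text{control}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}
\]

On this scale, an effect of 0.4 corresponds to moving an average student from the 50th to roughly the 66th percentile of an equivalent comparison group, which helps convey why gains of 0.4–0.7 are described as substantial.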

Data‐driven dialogue as a tool for student‐engaged formative assessment

Data‐driven dialogue is a term coined by Wellman and Lipton (2004) to describe a process through which educators collaboratively inquire about the meaning of data with other educators. A statewide teacher professional development initiative, the Colorado Consortium for Data‐Driven Decisions (C2D3), expanded on the Wellman and Lipton definition to describe a four‐step process through which teams of educators co‐construct meaning from data. The four steps include: Predict, Explore, Explain, and Take Action. Greater detail regarding what is included in each step is provided in Table 1. While there is little research that supports a causal link between data‐driven discussion/dialogue and student outcomes (Shepard 2010), the US Department of Education’s Institute of Education Sciences recently published a peer‐reviewed practice guide strongly encouraging teachers, administrators, and districts to implement data‐driven discussions among educators and among teachers and their students in order to improve instruction and learning (Hamilton et al. 2009). Based on strict validity standards for studies suggesting causal relations, the authors made five recommendations for using data, including guiding students in the formative use of summative assessment data.

Table 1. Steps in the data‐driven dialogue process.

Through the C2D3 project, over 2000 K–12 educators in Colorado have been trained to use data‐driven dialogue to collaboratively analyze and interpret student educational data with other educators. A C2D3 project staff member began working with a high school teacher to use this process with students in making meaning of their assessment results. That teacher then presented the idea of using data‐driven dialogue with students to approximately 200 educators participating in an institute on student‐engaged formative assessment practices. Several of the educators attending this institute had also been using data‐driven dialogue with their peers as part of school‐level improvement planning, grade‐level, and content‐area teams. Following the institute, these educators began to adapt data‐driven dialogue as a tool for their students in the formative use of summative tests. The teacher in the classroom that was the focus of the present analysis came from that group of educators, four of whom participated in the larger cross case study from which this analysis is drawn.

Within the K–12 classroom context, data‐driven dialogue has been used by teachers and students to co‐construct meaning from classroom‐level formal assessment results or end‐of‐unit tests. Data‐driven dialogue has emerged as a useful tool for organizing discussion of predicted performance and collective inquiry into actual performance on summative assessments, the sixth student‐engaged formative assessment practice identified by Black et al. (2003). When used with students, the four steps of data‐driven dialogue remain the same as with adults, but how these steps are executed changes slightly. Table 1 provides a comparison of the four steps of data‐driven dialogue when used with educators and when used with students. The materials and symbolic tools that mediate the dialogic process of social learning are of particular importance with both groups.

Socio‐cultural historical theories of learning

Our larger study applied socio‐cultural historical theories of learning to investigate data‐driven dialogue within K–12 classrooms. This view of learning asserts that individuals’ learning cannot be considered in isolation from the social, cultural and historical context within which the learning takes place. From this perspective one must consider the joint mediated activities found within classrooms, broader cultural and historical practices, e.g. schooling, of which they are part (Lave and Wenger 1991; Rogoff 2003), and the tools with which teachers and learners engage. This conceptualization of learning evolved from the work of Lev Vygotsky (1978). Three of Vygotsky’s most significant contributions substantively shape the theoretical and conceptual framework guiding this analysis: his concepts of mediation, the zone of proximal development, and the social nature of learning.

Mediation

Vygotsky (1978) asserted that mediation is an essential feature of any higher mental process. The Vygotskian triangle in Figure 1 illustrates that the relationship between individual learners and higher mental processes (learning) is always ‘mediated’ by culturally and historically developed artifacts/tools. As Wells (2002) put it: ‘Whereas other species act directly upon the object of interest to them, humans on most occasions interpose an artifact between themselves and the object of interest, thereby enabling them to act more effectively’ (46).

Figure 1. Vygotsky’s mediated action.

As historical products of the culture in which they are created, mediating artifacts or tools shape learning. Thus, it is through the use of mediating artifacts that learning is cultural. Furthermore, according to Vygotsky (1978), mediating artifacts can be physical tools or symbolic tools (such as sign systems in language, mathematical symbols, or steps in a procedure). Wells (2002) builds on Vygotsky’s conception of symbolic mediation in learning to ‘advance the exploration of the role of dialogue in the activity of learning and teaching’ (43). Wells emphasizes the importance of mediation through symbols (specifically language) as the core of learning. According to this framework, data‐driven dialogue has the potential to be a symbolic mediator of students’ and teachers’ learning.

Zone of proximal development and social learning

Vygotsky (1978) defined the zone of proximal development (ZPD) as ‘the distance between the actual developmental level [of the learner] as determined by independent problem solving and the level of potential development as determined through problem solving under adult guidance or in collaboration with peers’ (86). Put simply, an individual’s zone of proximal development is the knowledge, skills and understanding that he/she cannot demonstrate on his/her own, but can with help. Vygotsky suggested that the most beneficial learning experiences are those that are just beyond a child’s current developmental level but still within reach with help, or those that are in the child’s ZPD. This suggests that learning is not something that an individual does on her own; rather, it happens socially, through participation in activity with others. Data‐driven dialogue is a dialogic process that serves as a tool to structure student interaction with fellow students and with the teacher as they collectively and dialogically engage in teaching and learning activity. Socio‐cultural historical theorists argue that as humans engage in joint mediated activity they learn and develop new zones of proximal development (Engeström 1987; Wells 2002). At the same time, as people participate in social practices with more experienced peers and experts, their participation in the practices undergoes transformation with their growing expertise (Lave and Wenger 1991; Rogoff 2003).

Conceptual framework

The conceptual framework for this study, illustrated in Figure 2, employs a socio‐cultural historical lens on learning in order to develop understanding of data‐driven dialogue‐mediated, student‐engaged formative assessment as an emergent practice within the classroom context. In particular, it considers the dialogic inquiry (Wells 2003) that is facilitated through the use of data‐driven dialogue within the classroom that was the focus of this analysis. Data‐driven dialogue, as noted above, was originally conceived as a tool for teachers to use in negotiating meaning from students’ test scores and other performance measures. Teachers who were trained in data‐driven dialogue adapted the technique for use in their classrooms. They created tools to guide dialogic inquiry among students and themselves about assessment as a tool for learning that students can use with agency. Because the use of data‐driven dialogue ‘create[s] points of focus around which the negotiation of meaning becomes organized’ (Wenger 1998, 58), it is also a reification of several underlying meta‐cognitive strategies, specifically, self‐assessment, self‐reflection and goal setting.

Figure 2. Conceptual framework.

Consistent with a socio‐cultural historical perspective on learning, this analysis focused on the mediational means used in two representative lessons from a classroom in which student‐engaged formative assessment was a leading activity. Our evidence suggests a process in which students took on new roles as they developed expertise with mediating artifacts and appropriated the teacher’s goals as their own. We paid particular attention to the transformation of students’ participation and the mediating role of data‐driven dialogue in that transformation.

Methodology

This methodology section provides a description of the classroom that was selected for this analysis, explains why it was selected, and describes the data that were collected in the classroom. The analytical framework applied to the data is described as well.

Lesson analysis classroom

Our analysis is drawn from the case study of a fifth‐grade classroom, one of four cases in our larger study. This classroom, observed across two academic years, is located in a high‐poverty (72% free‐ and reduced‐lunch), racially diverse (66% Hispanic, 25% White, 6% Asian, and 3% American Indian and African American combined) school district. The educator whose classroom was the focus of the case study had participated in data‐driven dialogue with other educators for a year before she began using the process with her students. She also participated in a three‐day summer institute during which she was introduced to several other student‐engaged formative assessment practices and had the opportunity to hear from several educators who had implemented these practices in their classrooms, one of whom was using data‐driven dialogue with her students. The fifth‐grade teacher’s classroom was chosen for this study because evaluators of the C2D3 project had observed her use of data‐driven dialogue with her students, and judged that she had implemented the practice with fidelity (Proctor 2008).

This educator used a number of different student‐engaged formative assessment practices. However, the two classroom lessons on which this analysis focused were ones during which the leading activity involved data‐driven dialogue, as part of an emergent practice that engaged the teacher and students in making meaning of classroom‐level student assessment results. This new practice uses data‐driven dialogue, previously used among teachers, for individual and collective meaning‐making by teacher and students, mediated by semiotic and material artifacts in a classroom space. The adapted four‐step data‐driven dialogue was the primary tool used; however, a number of additional artifacts also served to mediate the learning activity. These will be described below.

Data collection

Classroom video and audio tapes as well as teacher and student interviews were the primary sources of evidence collected in this study. In addition, teacher‐created tools used to support data‐driven dialogue were collected. Video‐taping took place across two school years, beginning in the fall of year one and ending in the fall of year two. Video data were captured through convenience sampling in that the taping took place on days that were mutually convenient for the teacher and the videographer. The video data upon which this analysis is based included two math sessions during which students were engaged in the data‐driven dialogue process. Because the class sessions occurred during two different school years, while the teacher remained the same, the students were different. Teacher and student interviews were conducted both years, on the same day that the sessions were video‐taped. The interviews were loosely structured and were video‐taped.

Analytical framework

This study employed a socio‐cultural historical perspective on learning because it focuses on the individual learner situated in social and cultural context. Because we were interested in foregrounding, or drawing attention to, the classroom context in which the learning took place, we used Engeström’s (1987) expanded triangle of activity (a Cultural Historical Activity Theory, or CHAT, framework) as an analytical tool. This framework expands upon Vygotsky’s mediational triangle to include the social context. It is depicted in Figure 3. Engeström’s framework provides theoretically driven categories (mediating artifacts, object, subject, community, rules, and division of labor) and relations between those categories. We employed Engeström’s categories as a priori codes in our analysis of the video tapes of the two lessons and related interviews and transcripts from those tapes. In our analysis, we originally defined the classroom activity from the teacher’s perspective, and her goal of negotiating meaning from assessment results and what for her (and the other teachers in the larger study) was its corresponding outcome – students’ learning. We then considered both the teacher and individual students as subjects and we focused on the various material and symbolic artifacts that they used, including the data‐driven dialogue. Teacher‐created material artifacts, such as the Data‐Driven Dialogue Guide and the Student Progress Monitoring Sheet (Appendices 1 and 2), which students used to guide the data‐driven dialogue process, may or may not have been jointly used. Using Engeström’s framework, we were able to analyze the rules, or norms, evident for the different subjects in the classroom, and the division of labor, or roles, that the teacher and students played in the formative assessment activity.

Figure 3. Engeström’s analytical framework.

Though not represented in Figure 3, this study also drew on the expansion of Engeström’s triangle proposed by Wells (2002) to account for the mediating role of dialogue in the activity of teaching and learning. Wells proposed a variation of Engeström’s expanded triangle in which a dual (or plural) subject transforms and is transformed by a common object through both semiotic and material mediation. The semiotic mediation (or dialogue) is jointly enacted by the subjects (plural). In other words, learning is distributed across multiple subjects and the material and symbolic tools they use jointly in the activity.

Data analysis

In our process of data analysis, the three authors jointly viewed the videos of the two lessons and used Engeström’s triangle to analyze the lessons as activities. The authors then analyzed the transcripts of the videos and the corresponding interviews independently using the constant comparative method (Strauss and Corbin 1998). We then came back together to discuss our individual themes and to view the video tapes through the lens of those themes as well as the a priori categories suggested by Engeström’s CHAT triangle and Wells’ expansion of the subject in dialogic activity. The emergent categories suggested by individual transcript analysis that held most robustly during the collective review of the data were: mathematical language; language about one’s own learning, i.e. meta‐cognition; agency; and safety. In the section below, we have integrated the findings with discussion framed by our analytical framework.

Findings and discussion

We defined the classroom activity of which these two lessons were part, for purposes of this analysis, by the object of co‐constructing understanding of meaning from recent student assessment results with the intended outcome of improving the students’ (plural) learning. This classroom activity was considered from the perspective of the teacher and individual students as subjects, as well as what emerged as a plural subject (Wells 2002) that includes the teacher and the students, all members of the classroom community who came to share similar relations to the object. We begin with the artifacts that mediated this learning process.

Mediating artifacts

Several mediating artifacts were used by the teacher and the students during both classroom sessions as part of their data‐driven dialogue process. The first artifact was the assessment itself: the test the students had taken the day before, returned to them with their responses scored. In both classroom sessions the assessment was a unit test provided as part of the math curriculum materials used across the district in which this classroom is located. During both classroom sessions, each student used his/her own test with his/her responses and individual scores.

The second artifact used during both sessions was the four‐step data‐driven dialogue process. The data‐driven dialogue process was supported by a third artifact, a one‐page guide and note catcher that the teacher had created to facilitate use of the process. This Data‐Driven Dialogue Guide (DDD Guide), included as Appendix 1, lists the ‘unit skills’ that were assessed by the test in bullet form in the same order as they appear on the test (e.g. the first ‘skill’ was assessed by the first few items on the test). The DDD Guide also included prompts for each step in the data‐driven dialogue process, including: (1) I predict that…, (2) I see that…, (3) Why did I score this grade? Why did I miss certain problems? and (4) What is my goal and how will I reach it?

A fourth artifact was used in advance of the data‐driven dialogue process – a Student Progress Monitoring Sheet. Included as Appendix 2, this sheet provides a table that lists each of the ‘skills’ for the unit on a row with an indicator of the skill as ‘beginning,’ ‘developing,’ or ‘secure.’ In this classroom, a spiraling or looping curriculum was used in which goals were first introduced and labeled as ‘beginning.’ When the goal came up a second time it was labeled ‘developing,’ and when students were expected to have mastered the goal it was labeled as ‘secure.’ All types of goals were assessed, but students were expected to have mastered only the secure goals. Each row on the Progress Monitoring Sheet also had space for students to indicate their progress towards the skills with the following headers: ‘help,’ ‘I think I can,’ and ‘I got it.’ The teacher also used a white board as a fifth mediating artifact as she recorded, for all of the students to see, the overall class results on the assessment, leaving them there for the duration of the data‐driven dialogue.

Semiotic mediation

During the two classroom sessions that were the focus of this study, the teacher and the students in the classroom engaged in a structured dialogic inquiry – data‐driven dialogue – to make meaning of the results of a unit test. This practice included students writing on their copy of the DDD Guide and both the teacher and the students sharing information orally with one another during each of the four steps of the data‐driven dialogue practice. The semiotic mediation evident during both sessions had in common the four basic steps of data‐driven dialogue: Predict, Explore, Explain, and Take Action. In both sessions, the teacher reminded the students what step they were on, provided them with the syntax of statements they should make during that step, and modeled what their statements might include. However, the teacher made a significant change in how she facilitated the steps of the data‐driven dialogue process across the two sessions (two different school years). During both DDD sessions the teacher framed the process and initiated each step. The first year, the teacher remained in the front of the room, managing each step and interjecting between student statements. Students raised their hands to be identified to speak and the teacher controlled the flow of the dialogue by calling on different students. The second year, the teacher began the dialogue with a review of the rules for engaging in the dialogue, and she then sat down with a group of students. Though the teacher still launched each of the four steps of the dialogue, the students no longer raised their hands before speaking. Instead, students listened to and watched their peers, adding their statements when there was an appropriate break in the conversation.

We inquired about this change in the structure of the dialogue during the second teacher interview. She indicated that during the first year when she was having children raise hands to speak, ‘it was very choppy and didn’t flow and it wasn’t like we were having a conversation or a dialogue which is what we’re shooting for.’ So she decided to change her approach as she began her second year of using this process with her students. She described several benefits of this change in approach:

It actually focuses the students in on listening to their peers … it makes them really hear each other. I think that when you really hear one another and listen then you start to see the patterns unfold and you start to see kids acknowledging that their peers are having the same problems that they are having and that kind of lends itself to groups’ re‐teaching or knowing who to go to in the classroom for a certain skill area.

In effect, based on the students’ use of the data‐driven dialogue process and the agency it afforded, the teacher adapted the rules of the formative assessment activity to grant the students more agency and independence in the discussion, even though the classroom would remain, in spite of the dialogue, hierarchically structured.

Predict

In both DDD sessions, the teacher’s introduction to the first step, prediction, followed the same pattern. Her language was almost identical and included,

the first step of the data‐driven dialogue process is to make some predictions. So everyone take a moment and make a couple of predictions about how you think you did on this assessment. You want to be specific, not ‘I predict that I did good,’ but ‘I predict that I did well on which skill’ [sic].

In both sessions, prediction preceded the teacher handing back the students’ tests. After the teacher introduced how to predict, the students, individually and in relative silence, made notes on their DDD Guide about what they predicted to see once their tests were returned. After several minutes of note taking, the teacher indicated it was time to start sharing their predictions. Then students made public statements to their peers about what they expected to see in their assessment results, such as, ‘I predict I did good on addition and subtraction with like denominators,’ ‘I did well on finding the common denominator,’ and ‘I did bad in adding with unlike denominators.’ During the first data‐driven dialogue session, students indicated they wanted to share a prediction by raising their hands and were ‘called‐on’ by the teacher. During the second video‐taped data‐driven dialogue session, the teacher sat down after indicating it was time for students to share their predictions and students took turns sharing their predictions without being called on by the teacher. In this same tape, on the rare occasions that two students began talking at the same time, they would look at one another and either through a gesture or by saying ‘go ahead’ one student would decide to go first. This suggests that the teacher had guided students in the use of the new rule for turn taking and that their participation had subsequently transformed as they became agents of the discussion.

During the second tape, in which students generally led the discussion, the teacher interjected once to identify a pattern in the students’ predictions and extend their thinking,

I’m hearing a lot of conversation about division with two digit whole numbers. Since we haven’t heard all of your predictions, is there anyone that feels that you really know single digits but when it got to double digit you feel like there was really a difference? … Did anyone make predictions like that?

She joined the dialogue a second time to ask a question, ‘When we say I did well on variables, can I ask you what that means that you were able to figure out? What are you saying?’ After this question, the student clarified that she was able to figure out what the sentence was and what the variable stood for. The video of the lessons held multiple similar examples of the teacher guiding students in their zones of proximal development.

Explore

After a number of students had made their verbal predictions, the teacher initiated the second step of data‐driven dialogue – explore. The teacher again verbally described what was involved in the step (exploring the test and making statements about what they noticed). She also talked about what was not involved in that step – namely, explaining why they got certain problems right or wrong. She modeled several statements students might make about their test results, such as ‘I did well finding common denominators.’ She again reminded them to make ‘specific statements,’ then she handed the students their scored tests.

Once the test was returned to them, students looked through their tests, making notes in silence for several minutes using their DDD Guide. The teacher then shared some overall class results. During the first year this included, ‘everyone in the room got numbers 1, 2, 3, and 4 correct, they can tell me what maximum, minimum, median, and mode mean.’ Then the students were cued to share their ‘I see that…’ statements. Students made public statements related to their performance on individual items as they related to the learning outcomes being assessed. This included statements like, ‘I see that I did well on the stem and leaf plots,’ ‘I see that I did bad [sic] on adding and subtracting unlike denominators,’ and ‘I see that I got all the questions of the bar graph correctly [sic].’ During the first DDD session, students raised their hands and the teacher selected them to share. During the second DDD session, students followed one another, sharing their observations without direction from the teacher. The teacher would join the dialogue to ask a question clarifying one of the students’ statements, such as, ‘What does some division mean? Which ones did you do well on?’ In one instance the teacher reminded a student not to start explaining his answers. The teacher also interjected, noting patterns in students’ observations and asking for more detail:

I’m hearing a lot of things about number stories. I did well, or I need work on number stories, but number stories have four parts to them, the number sentence, the solution, explaining the remainder, and then what to do with it. So, as you are saying I did well on number stories, what part? All parts? Setting up the number sentence? Finding the solution? Look that over and let’s have some conversation about that. Look over your number stories and give us more information, not just I did well, but what parts of it did you do well on?

As the students continued to verbalize what they saw in more detail, they consistently used mathematical language in their statements.

Explain

The teacher verbally indicated that it was time to move to the third step in the data‐driven dialogue – explain. Her introduction of this step followed the same pattern in both DDD sessions. She prompted students with the statements from the sheet – Why did I score this grade? Why did I miss certain problems? The students again took several minutes to make notes on their DDD Guide. Then the teacher cued them to begin making public statements explaining why they thought they had received the overall score they received and why they missed certain items. In explaining why she received a certain score, one student noted that, ‘I scored this grade because I came in, and I practiced at home, and did my homework.’ During both sessions, several students noted that, ‘I scored this grade because I didn’t come in enough and get more help.’

Then students shared why they missed certain problems. One student shared, ‘I missed a certain problem because I did not see the subtraction sign.’ Another noted, ‘I guessed on unlike and common denominators because I didn’t have enough experience.’ A third explained, ‘I missed a certain problem because I did not explain the answer right.’ During the second DDD session, the teacher again asked some questions emphasizing patterns she was noticing during the student dialogue and asking for more detail from the students, ‘Is there anybody that on their Progress Monitoring Sheet colored all the way over to I got it, I understand that skill, but then you missed it on the test? You thought you did, but maybe now you realize that you don’t. So it is time to adjust your thinking?’ Several students noted that they had. In both sessions, the dialogue process was facilitated by the material artifacts, e.g., Progress Monitoring Sheet, which mediated thinking about and discussing students’ perceptions of their math learning.

Take action

Finally, the teacher indicated it was time to move to the fourth step – take action. During both sessions she modeled some responses and prompted students from the DDD Guide, ‘What is your goal, and what are you going to do about it?’ In both sessions, the teacher again reminded students to ‘be very specific.’ During the second DDD session, she explained it this way, ‘I don’t want to just hear, “I am going to get help.” I need specifics. What are you going to get help on? When are you going to get help? Be very specific.’

During this final step, students again took several minutes to make notes before they began speaking out loud about the specific steps they planned to take to improve their learning. The action steps shared by students included a variety of actions from taking more time to read items to making an appointment with their teacher to get help on a specific topic like adding fractions with unlike denominators. During the second session, students also mentioned working with other students who understood the problems with which they had difficulties.

The four steps of the data‐driven dialogue process provided organization for the teacher’s guidance as she scaffolded, with support from the material and symbolic artifacts, the students’ reflection on their math understanding and their planning for future math learning. In reviewing the tapes, we were particularly struck by what we interpreted as a sense of safety that characterized the data‐driven dialogue process. The students grew progressively less reticent to talk about their learning and were supportive of one another in the discussion. Below we return to the categories from Engeström’s framework.

Rules and norms

Several rules that shaped this dialogic formative assessment practice were explicit in one or both of the classroom sessions. The first set of rules explicit in both classroom sessions were those that helped the students interpret their assessment scores, including the definition of types of goals that were assessed – beginning, developing and secure; and definitions of what scores on the assessment meant from a ‘1’ to a ‘4.’ During the second DDD session, the teacher asked students to talk about what the different scores meant. According to one student, ‘A “4” is advanced, a “3” means you’ve got it, and a “2” means you are getting better at it.’

Several rules related to how to engage in data‐driven dialogue were also made explicit. The rules changed between the first and second session to match the change in how the dialogue was facilitated, e.g. students not raising their hands to speak. During the second classroom session, the teacher prompted students to vocalize general rules related to participating in data‐driven dialogue, and students indicated, ‘You don’t have to raise your hand … because it is a dialogue and a conversation,’ and ‘If someone is talking at the same time as you, give up and let them talk,’ and ‘Listen to a person when they are talking … they know you are listening to them when you are just quiet … you are looking at them.’ During both DDD sessions, the teacher frequently reminded students to ‘be specific’ in their statements during each step of the data‐driven dialogue and to look at their DDD Guide and follow the prompts for each step in the process.

Some rules were apparent but not stated. These included how students and the teacher were seated in the room. Students were seated at desks that were arranged to face one another in groups of four or five. During the first classroom session, the teacher remained at the front of the room, directing each step of the data‐driven dialogue and facilitating the process by selecting students to speak. During the second classroom session, the teacher stood at the front of the room when giving directions, but sat down at one of the desks during the dialogue process. Students took turns speaking without intervention from the teacher. One implicit rule that was never stated, but always evident, involved the hierarchical nature of the class. While agency was distributed, and the teacher acted very much as a guide, she remained clearly in charge, even as students demonstrated growing agency and transformation of their participation in the emergent practice.

Division of labor/roles

Traditionally teachers have the lead role in classrooms and hold the locus of responsibility for learning actions (Marshall and Weinstein 1984). This is particularly true when assessment is involved. Teachers develop, administer, score and use assessment results primarily to grade students. Even when instructional practices have shifted to more constructivist or socio‐cultural strategies, US classroom assessment practices tend to reflect behaviorist learning theories (Shepard 2000). In other words, in a typical classroom activity system in the USA, it is the teacher who is perceived as the subject whose perspective matters, assessments are mediating artifacts and the object is sorting and ranking students, rewarding them or withholding reward through grading which, in theory, leads to improved learning. The notion of using assessment not to rank and sort students, but to determine what students know, and to guide the learning experiences to be provided to students, is still strange to most US teachers, because they perceive assessment and instruction as mutually exclusive activities (Heritage and Bailey 2006). Even in classrooms where teachers have begun to use assessment results to inform their instructional decisions, students may not be involved in using the results themselves (Popham 2008). By contrast, in this fifth‐grade classroom, as students engaged in data‐driven dialogue about their assessments, they assumed the role of subject, using assessment results as mediating artifacts for evaluating their own understanding of math concepts and identifying the action steps they would take to improve their learning. Recognizing that the role of subject in Engeström’s framework is a researcher category, we suggest that the data from these two lessons clearly demonstrate a transformation of participation in the activity as students developed greater agency and their relationship to what had been the teacher’s goal changed. We argue that collectively, the teacher and the students reorganized and distributed responsibility for learning and the role of subject in joint activity among the students and the teacher.

The teacher’s roles in this process included facilitating and creating structures to support the data‐driven dialogue, although how she facilitated this process changed between the two sessions analyzed for this study (i.e. having students raise their hands to speak). The teacher initiated the data‐driven dialogue process and provided structure for each step. The data‐driven dialogue started with students engaging in an activity that was designed to gather precise information about their learning – taking a test – that was also structured and facilitated by the teacher. The teacher scored the tests and developed and provided additional artifacts to support students in the data‐driven dialogue practice. The teacher also created a structure by which students could ‘sign‐up’ for appointments with her to get additional help on topics of their choosing after the data‐driven dialogue was done. Thus, she provided at least one choice for students as they selected their action steps. During the second data‐driven dialogue session, the teacher asked questions requiring students to clarify their statements or interjected ideas about patterns she was noticing in different student statements.

The students’ roles included analyzing and interpreting their own assessment results and determining how to use them. In scaffolding the transformation in the students’ participation in these roles, the teacher and teacher‐produced materials supported the students’ actions until they were able to act independently in the new roles. The students learned to describe how they did on the assessment, not in terms of an overall score, but in terms of the teacher’s learning targets/goals that were measured by different items on the test. The students explained why they received the scores they received and why they did well or not so well on individual items on the test using mathematical language. Students also decided what actions they would take, within the structure provided by the teacher, in response to their interpretation of their assessment results.

Object

The teacher and student interviews provided evidence that the teacher and at least some of the students (only three were interviewed) came to share the same object. According to one student, ‘Data driven dialogue helps me after a test to set a goal and to try and accomplish that goal. So next time we have a test I can try to make that goal accomplished [sic].’ Another student described data‐driven dialogue this way: ‘It helps me after taking a test because I know what I need more help on. Like if I think that I know that I am good at it and I get it wrong on the test then it kind of helps me know that I didn’t really get it all that much.’ The teacher explained the purpose of data‐driven dialogue as wanting ‘students to take ownership for the process of taking a test and analyzing the results in a way that is meaningful to them so they can see what they still need to work on and what strengths they have.’ The students’ language in these quotes clearly suggests that they have appropriated the teacher’s purpose or motive regarding using the dialogic process to analyze their results in ways that will help them to meet her standards, as suggested by Sadler (1989). In essence, the teacher deployed the data‐driven dialogue in an activity organized by the object or motive of students negotiating meaning from student assessment data. In the process of using data‐driven dialogue, the students came to share that object. To invoke Wells, based on their discussion and interviews, we are able to characterize them as sharing the subject position in this activity with the teacher, or as Sadler (1989) might put it, they came to hold (and act on) a concept of quality roughly similar to that of the teacher.

Transformations of participation

Transformation of participation in formative use of summative assessment in this classroom was evident, even though the teacher still maintained a hierarchical structure in the classroom as she and the students engaged in the data‐driven dialogue process.

Locus of responsibility for learning

The division of labor within this classroom with regard to who was responsible for the learning was different from more traditional US classrooms in that students were responsible for interpreting their own assessment results, explaining what the results meant, and determining what actions to take to improve their learning. The students were the agents in their learning process. They described themselves as responsible and demonstrated through their actions that they had taken responsibility for their own learning. Contrast the student statements in which they describe clear plans for their own future action based on assessment results, such as ‘my goal is to understand unlike denominators and I will reach it by coming in and getting help,’ with a statement overheard in the home of one of the authors: ‘I don’t know why we were doing that [activity], ask my teacher, she’s in charge of my learning’ (O’Brian, personal communication).

Distribution of expertise

The video tapes of the case study classroom, interviews with the teacher and students, and tools used in the classrooms provided numerous examples of how expertise was distributed among the students and the teacher in the classroom. As they completed the data‐driven dialogue process, students were encouraged to ‘sign‐up’ for an appointment with the teacher based on the learning needs that had become evident to them through the dialogue. Soon, students began to recognize when they had common ‘needs’; they would seek help from one another first and then collectively bring any remaining questions to the teacher. During the second DDD session, the teacher encouraged students to ‘think about your options’ before describing the action steps they would take based on their assessment results. She suggested they consider ‘looking around you and seeing who is good at things … picking the right partner when we have math time.’ One student indicated that he would reach his goal by, ‘choosing a good partner.’ After the data‐driven dialogue another student explained, ‘Like if we are talking … it helps us know what I need help on, and if other people know that they need help on it too then they should both work together and work on it.’

Use of mathematical language and student meta‐cognition

In this analysis, students’ development and use of academic language to guide their learning was evident. During both classroom sessions, students used mathematical language to describe the challenges and successes they experienced in learning. For example, students would state that they had difficulties ‘adding and subtracting fractions with unlike denominators.’ According to the teacher, ‘The best part of this experience is that students are using math language that they have never used before. I have no longer heard “I don’t get it.” I hear “I don’t understand how to add fractions with unlike denominators.” So, they are actually able to articulate what they know and don’t know. That has been very helpful in teaching and very exciting to hear students say.’

Students also engaged in meta‐cognitive strategies; they were both self‐directed and overtly reflective about their learning experiences. In the data‐driven dialogue described above, students not only reflected on how they did on their classroom assessment, they explained why they got certain items correct and incorrect and then set specific goals about what they would do next to improve their learning.

One of the most remarkable characteristics of this classroom was the degree to which students talked about their learning. Students made public declarations about what they did well or did poorly on their unit test, and they provided specific information about why they got the results they got and what they needed to do to improve them. Their dialogue suggests that the learning environment they co‐constructed by engaging in the practice of data‐driven dialogue mediated formative assessment was one of safety in which to speak and act as agents of their individual and collective learning.

Conclusion

In recommending that teachers guide students in the formative use of summative assessments, Hamilton et al. (2009) note that:

Simply giving students assessment data that are accessible and constructive does not guarantee that they will know what to do with the data. Students need the time and tools to analyze the feedback; otherwise, they may simply glance at the overall score without considering why they achieved that score and what they could do to improve. (22)

The students in the two lessons analyzed here were engaged in guided reflection and individual and collective analysis of feedback using data‐driven dialogue. Based on our analysis we suggest that data‐driven dialogue has potential as a very useful tool in student‐engaged formative assessment.

The analysis we have presented is based on only two lessons. While our findings are clearly not generalizable, they are suggestive of an approach ripe for further research. The two lessons analyzed involve two different groups of students video‐taped in different academic years. Several elements of the class discussions, as well as their general quality, were shared across time and participants in the two taped sessions. Our close analysis of the discourse and recorded behaviors provides insight as to what occurred during two different math sessions in which a teacher implemented the emergent practice of student‐engaged formative assessment using data‐driven dialogue. Several artifacts, or tools, played a critical mediating role in the co‐creation of meaning from assessment results among the teacher and students in this classroom. We found evidence that the dialogic quality of this classroom as well as the roles of students and teacher engaged in assessment were markedly different from traditional US classrooms in some critical ways; participation transformed as the locus of responsibility for learning and expertise was distributed and students looked to the tools, themselves, and one another as supports for their learning. Students used mathematical language, talked about their learning, and used several meta‐cognitive strategies.

Future research can build on this analysis in several ways. We need to understand more fully how the process of introducing student‐engaged formative assessment practices develops over time in a classroom. In other words, within a given school year, what progression is evident from the beginning of the year to the end of the year in terms of the use of mediating artifacts, rules and norms, division of labor and roles when teachers and students engage in formative assessment practices in general and formative assessment using data‐driven dialogue in particular? There is also a need for more research on the outcomes of data‐driven dialogue as a tool for both educators and students to improve learning. This analysis, though limited, suggests that a discursive tool, in this case, data‐driven dialogue, can transform participation in teaching and learning activity. We look forward to future related research that will investigate the relationship between changes in classroom discourse, transformation in classroom participation frameworks, and changes in student achievement.

Notes on contributors

Julie Oxenford O’Brian is Director of the Center for Transforming Learning and Teaching at the University of Colorado Denver.

Honorine Nocon is Associate Professor and Chair of the Urban Community Teacher Education Program at the School of Education and Human Development, University of Colorado Denver.

Deanna Iceman Sands is Professor and Associate Dean for Research at the School of Education and Human Development, University of Colorado Denver.

References

• Andrade, H., Du, Y., and Wang, X. 2008. Putting rubrics to the test: The effect of a model, criteria generation, and rubric‐referenced self‐assessment on elementary school students’ writing. Educational Measurement: Issues and Practice 27 (2): 3–13.
• Bell, B., and Cowie, B. 2001. Formative assessment and science education. Dordrecht, Netherlands: Kluwer Academic Publishers.
• Black, P., Harrison, C., Lee, C., Marshall, B., and Wiliam, D. 2003. Assessment for learning: Putting it into practice. New York: Open Univ. Press.
• Black, P., and Wiliam, D. 1998. Assessment and classroom learning. Assessment in Education 5 (1): 7–74.
• Council of Chief State School Officers. 2008. Attributes of effective formative assessment. Washington, DC: CCSSO.
• Crooks, T.J. 1988. The impact of classroom evaluation practices on students. Review of Educational Research 58 (4): 438–81.
• Engeström, Y. 1987. Learning by expanding. Helsinki: Orienta‐Konsultit Oy.
• Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., and Wayman, J. 2009. Using student achievement data to support instructional decision making (NCEE 2009–4067). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, US Department of Education. http://ies.ed.gov/ncee/wwc/publications/practiceguides/
• Heritage, M., and Bailey, A.L. 2006. Assessing to teach: An introduction. Educational Assessment 11 (3/4): 145–8.
• Lave, J., and Wenger, E. 1991. Situated learning: Legitimate peripheral participation. Cambridge, UK: Cambridge Univ. Press.
• Marshall, H.H., and Weinstein, R.S. 1984. Classroom factors affecting students’ self‐evaluations: An interactional model. Review of Educational Research 54 (3): 301–25.
• Nicol, D.J., and Macfarlane‐Dick, D. 2006. Formative assessment and self‐regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education 31 (2): 199–218.
• Popham, W. 2006. Defining and enhancing formative assessment. Document prepared for Assessment for Learning: Formative Assessment Where It Counts. Los Angeles: Univ. of California, Los Angeles.
• Popham, W. 2008. Transformative assessment. Alexandria, VA: Association for Supervision and Curriculum Development.
• Proctor, D. 2008. “A summary of evaluation results for 2007–2008.” Evaluation report for the Colorado Consortium for Data‐Driven Decisions, Version 2.
• Rodriguez, M.C. 2004. The role of classroom assessment in student performance on TIMSS. Applied Measurement in Education 17 (1): 1–24.
• Rogoff, B. 2003. The cultural nature of human development. Oxford, UK: Oxford Univ. Press.
• Sadler, D.R. 1989. Formative assessment and the design of instructional systems. Instructional Science 18: 119–44.
• Sebba, J., Crick, R.D., Yu, G., Lawson, H., Harlen, W., and Durant, K. 2008. “Systematic review of research evidence of the impact on students in secondary schools of self and peer assessment. Technical report.” In Research evidence in education library. London: EPPI‐Centre, Social Science Research Unit, Institute of Education, Univ. of London.
• Shepard, L.A. 2000. The role of assessment in a learning culture. Educational Researcher 29 (7): 4–14.
• Shepard, L.A. 2009. Commentary: Evaluating the validity of formative and interim assessment. Educational Measurement: Issues and Practice 28 (3): 32–7.
• Shepard, L.A. 2010. What the marketplace has brought us: Item‐by‐item teaching with little instructional insight. Peabody Journal of Education 85: 246–57.
• Strauss, A., and Corbin, J. 1998. Basics of qualitative research: Techniques and procedures for developing grounded theory. London: Sage.
• Third International Conference on Assessment for Learning. 2009. Position paper on assessment for learning. Paper presented at the Third International Conference on Assessment for Learning, March, Dunedin, New Zealand.
• Torrance, H., and Pryor, J. 2000. Developing formative assessment in the classroom. Paper presented at the Annual Meeting of the American Educational Research Association, April, New Orleans, LA.
• Vygotsky, L.S. 1978. Mind in society: The development of higher psychological processes. Ed. M. Cole, V. John‐Steiner, S. Scribner, and E. Souberman. Cambridge, MA: Harvard Univ. Press.
• Wellman, B., and Lipton, L. 2004. Data‐driven dialogue: A facilitator’s guide to collaborative inquiry. Sherman, CT: Mira Via.
• Wells, G. 2002. Dialogue in activity theory. Mind, Culture, and Activity 9 (1): 43–66.
• Wells, G. 2003. Dialogic inquiry: Towards a sociocultural practice and theory of education. New York: Cambridge Univ. Press.

Appendix 1. Data‐Driven Dialogue Guide

Name________________________

Unit 7 skills

Understand and apply exponential notation

Understand and apply powers of ten

Understand and apply scientific notation

Decide if number sentences are true or false

Understand placement of parentheses

Understand order of operations

Compare, add, and subtract integers

Before you look at your test …

I predict that …

I predict that …

Now explore your test

I see that …

I see that …

Now explain what you saw

Why did I score this grade?

Why did I miss certain problems?

Take Action

What is my goal and how will I reach it?

Appendix 2. Student Progress Monitoring Sheet
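[The Student Progress Monitoring Sheet itself was not reproduced here. Based on the description in the Mediating artifacts section, and purely as an illustrative sketch with hypothetical entries drawn from the Appendix 1 unit skills, its layout can be assumed to resemble the following table: one row per unit skill, an indicator of the goal type, and space for students to mark their progress.]

Skill | Goal type (beginning / developing / secure) | Help | I think I can | I got it
Understand and apply exponential notation | secure | | | ✓
Understand and apply powers of ten | developing | | ✓ |
Decide if number sentences are true or false | beginning | ✓ | |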
