
Making sense of student data in teacher professional development

Pages 256-273 | Received 08 Nov 2018, Accepted 09 Nov 2018, Published online: 28 Nov 2018

ABSTRACT

In recent years, the use of student data has become increasingly concerned with the management of teacher performance. However, when teachers become aware of specific student data directly related to their approach to teaching, these data can inform them about possible strengths, weaknesses or challenges. Unfortunately, teachers generally have little time and encounter significant problems in the interpretation and use of data for change. In this article, we propose that such problems can be avoided by offering teachers practical frames aimed at the interpretation and productive use of student data. We report on an extensive study carried out in the setting of a reform implementation in which teachers were asked to change their teaching practices. Participating teachers performed multiple PDCA (Plan-Do-Check-Act) cycles in which they designed and taught lessons during which student data were collected. To interpret and make use of these student data for change, we provided participants with practical frames. We examined to what extent and in what way participants used these frames and how this influenced their professional development. Results showed that participants used the frames both to interpret student data and to change their teaching practices towards those required by the reform in a stepwise, largely independent way.

Introduction

Data about student learning as indicators of teacher and teaching quality have become increasingly important in the last decades. In most cases, such student data serve as information for school management and accountability purposes (Lingard et al. Citation2015). Student data are also collected at a larger scale, where schools, districts or even state agencies collect and monitor data, which, in turn, are used to map teacher quality by comparison to others. As a result, student data have also become increasingly important in framing the working lives and experience of teachers (Ball Citation2015, Stevenson Citation2017). However, when school management uses student data to judge or compare teachers, there are risks involved, such as datafication, that is, the use of data as a management tool in itself, and performativity, that is, a culture or system focused on maximization, targets and the worth of individuals for the greater good (Ball Citation2003, Citation2015, Lloyd and Davis Citation2018). This, in turn, can lead to teachers experiencing low trust and stress, which can result in an ethics of competition, individualism, performance and inauthenticity (‘I am doing this because it will make me look good’) instead of an ethics of professional expertise, authenticity and co-operation (Ball Citation2003).

Student data can, however, also be used to contribute to the development of teachers’ professional expertise. Linking outcomes in student learning to teacher learning is known to be an important guideline for effective professional development programs for teachers (Desimone Citation2009, Borko et al. Citation2010). As the primary objective of teaching is to promote student learning, it is important for teachers to have insight into the specific effects of their actions on student learning (Fishman et al. Citation2003). On the basis of data about student learning, teachers could adjust and/or expand their knowledge, beliefs, attitudes and teaching repertoire (Desimone Citation2009). In practice, however, most teachers make insufficient use of strategies in which they productively use student data to inform their teaching practices. Student data, and how to use them to improve teacher effectiveness, are therefore underutilized (Stecker et al. Citation2005, Schildkamp and Kuiper Citation2010), and there is a need for initiatives and support to assist teachers in how to use student data for change (Mandinach Citation2012, Marsh and Farrell Citation2014). Studies on teachers who did try to perform data-driven decision-making by using student data for improvement in their classrooms reveal significant problems. A first problem is that many teachers report practical issues, such as struggling with time constraints and encountering technical problems, for example with management information systems (Young Citation2006). A second problem is that teachers generally report trouble understanding what the large amounts of student data mean and how they can use these data to propose change (Ingram et al. Citation2004, Schildkamp and Kuiper Citation2010). A third problem is that, when thinking about change, teachers often experience that they do not have enough knowledge or prior experience to propose change (Bransford et al. Citation2005). Teachers willing to change thus face serious problems in data use and are at risk of ending up in a vicious cycle of teaching lessons and trying to make use of data without significant improvement. The primary focus of this study is therefore to understand ways in which teachers can both adequately interpret and use student data to propose change. To this end, we studied the development of teachers who followed a PD program with multiple cycles in which they (a) collected student data in their classes, (b) made sense of these student data by interpretation and (c) tried to use these student data to change their teaching practices. Support in both steps was based on ideas from Minsky (Citation1985) and Klein et al. (Citation2006), who state that new information is always examined through ‘frames’ that act as lenses or perspectives that give meaning to information. We designed three practical frames and explored to what extent and in what way participants used these frames. We also monitored participants’ professional development in terms of the influence of data use on changing teaching practices in line with the reform.

Theoretical framework

In principle, data-based improvement procedures can be understood as an application of the well-known quality improvement cycle, the Plan-Do-Check-Act (PDCA) cycle (Shewhart Citation1931, Deming Citation2000). When this cycle is applied to improve the quality of teaching, teachers design a lesson (Plan), which they teach in their own class (Do). Next, teachers collect data on the student outcomes of interest and compare them to standards or expectations set beforehand in order to determine whether a change was successful or not (Check). After this, teachers can propose a new change (Act) and incorporate this in their next lesson design (Plan). To understand how teachers can be assisted in successfully interpreting and making use of student data within a PDCA cycle, we first need a deeper understanding of how people give meaning to experiences and propose change. From the field of cognitive science, it becomes clear that whenever people encounter new experiences, they activate mental frame-structures that were acquired in the course of previous experience (Minsky Citation1985). Using such frames, a new experience can be connected to the subjective knowledge already present in that frame, which gives meaning to the new experience. Already more than 30 years ago, such frames were described as a kind of skeleton, somewhat like application forms with many blanks or slots to be filled, or a fixed set of named slots whose values vary across applications, resembling the concept of a mental schema (Bartlett Citation1932, Minsky Citation1985). An example of a frame could be a tree-like structure used for the concept ‘car’, having attributes like driver, fuel and different engine parts. Within the frame, all attributes are related to one another and can be filled differently to design many different cars. Also, when a car is malfunctioning, one can schematically examine the attributes to find the part that is broken. Barsalou (Citation1992), who built upon the work of Minsky, states that frames are the main representations of knowledge in human cognition. They provide both a strong conceptual tool to give meaning to new experiences and a powerful productive mechanism for generating specific combinations of parts within a certain field. However, frames are drawn from past experience and sometimes do not fit new situations perfectly. Therefore, frames have to be adapted to particular experiences in specific settings (Minsky Citation1985). As Klein et al. (Citation2006) wrote, ‘Making sense of data is the process of fitting data into a frame and fitting a frame around the data’. Translated to education, this provides valuable insight into the problems of teachers who have trouble understanding and using student data. If teachers want to overcome the previously mentioned problems of not being able to interpret and make use of student data, we can point towards the use of specific frames. Frames, however, are made up of broad factual and theoretical knowledge in a certain setting, which in education would be pedagogical content or specific teaching approaches. Research by Bransford et al. (Citation2005) shows that precisely such broad factual and theoretical knowledge was often found to be lacking, resulting in a situation where teachers often have frames that can be considered inadequate.
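To make the notion of a frame as a slot structure more tangible, the following minimal sketch (our illustration, not part of the cited studies; the Frame class and the slot names are assumptions) expresses the ‘car’ example above in Python. Fitting an observation into the frame fills the slots and exposes mismatches, which is analogous to fitting data into a frame and fitting a frame around the data.

```python
# A minimal sketch (illustrative only) of a Minsky-style frame as a slot structure.
from dataclasses import dataclass, field


@dataclass
class Frame:
    """A frame: a named set of slots whose values vary across applications."""
    concept: str
    slots: dict = field(default_factory=dict)  # slot name -> default/expected value

    def fit(self, observation: dict) -> dict:
        """Fit an observation into the frame: fill known slots, report mismatches."""
        filled, mismatches = {}, {}
        for slot, default in self.slots.items():
            filled[slot] = observation.get(slot, default)
            if slot in observation and observation[slot] != default:
                mismatches[slot] = (default, observation[slot])
        return {"filled": filled, "mismatches": mismatches}


# The 'car' frame from the text: attributes like driver, fuel and engine parts.
car = Frame("car", {"driver": "licensed adult", "fuel": "petrol", "engine": "working"})

# A malfunctioning car is examined slot by slot to find the part that is broken.
print(car.fit({"fuel": "petrol", "engine": "broken"})["mismatches"])
# -> {'engine': ('working', 'broken')}
```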

Frame design

In order to design effective frames that teachers can use in multiple PDCA cycles to interpret student data and propose new change, designers need to consider the specific settings in which the data-based improvement procedure takes place as well as the intended goals for teachers. Coming back to the frame example for the concept ‘car’ mentioned earlier, it is clear that this frame is only suitable to assist practitioners involved in car design, maintenance and so on. When we, in this specific study, want to design frames that can assist teachers in changing their teaching practices in single lessons towards context-based education, it is implied that frames should focus on the primary teaching-learning strategy within a single lesson. Effective frames in this study should therefore offer factual and theoretical knowledge about the primary goal of the change process at hand. Frames should, however, neither be limited to concrete actions that cannot be replicated in other classes nor offer fixed solutions; this requires a level of abstraction that is neither too generic nor too specific or limited.

As teachers are known to have little time or few options for using new tools, frames will only be used properly if they are perceived as being functional within the teacher’s own environment, time and settings (Doyle Citation2006). In their teaching practices, teachers have to meet several goals simultaneously (e.g. student learning, keeping up the momentum, covering the textbook) with limited time and resources (Fullan Citation2007, Janssen et al. Citation2013). As a result, teachers will only use something new if it is perceived as being practical (Doyle and Ponder Citation1977). Three criteria can be used to determine the level of practicality of any change proposal in educational settings: it should be instrumental for teachers to use, it needs to be congruent with what a teacher normally does, and it must be easy to implement (low cost). Thus, to further design and fill the subparts of the frames, we not only need to take the main goal of the change process (the primary teaching-learning strategy within a single lesson) into account, but also need to focus on practicality demands. With these practicality demands in mind, we adopted the ideas of Merrill et al. (Citation2008), who showed that teaching-learning strategies for single lessons can be decomposed into smaller segments such as presentation, practice or demonstration. In our research, we built upon their work and designed frames based on the concept of smaller segments of single lessons that can be ordered and reordered to represent the kind of single lessons that teachers give every day (see ). The use of specific segments in understanding and attaining innovation has been described earlier by Holland (Citation2000), who defined innovations as ‘the rearrangements of already existing building blocks’. Whenever trying to propose innovations, the first step is to come to know the predominant building blocks in a certain area and then rearrange them to propose an innovation. The concept of lesson segments has also been used in previous research by the authors (Janssen et al. Citation2013, Citation2014).

Figure 1. Graphical display of frames based on the concept of lesson segments.


We expect that the set of lesson segments will illustrate to teachers how a change proposal such as context-based education in this study would work out in a concrete lesson design and how it relates to their regular instructional approaches. Using lesson segments, we also expect teachers to approximate the proposed reform stepwise, with their own teaching practice as the starting point, which can help them to understand how a change proposal connects with their regular teaching practice (congruency). Finally, all of this must be possible within the limited time and resources available to teachers. Such low cost is accounted for by the fact that lesson segments are already in place as part of regular practices and by the expectation that they are easily understood, so that teachers do not have to spend much time and effort on designing something completely new.

Frame use

From the previous sections, it becomes clear that frames can provide both a strong conceptual tool to give meaning to new experiences and a powerful productive mechanism for generating specific combinations of parts within a certain field. The specific frames used in the present study are based on lesson segments that can be combined to make up a single lesson and serve as a tool to make sense of student data. So how can lesson segments assist in making sense of specific student data? Suppose that a teacher expects 60% of the students to answer a test question correctly and finds that only 40% answered the question correctly. He/she might then use a frame based on lesson segments to make sense of this problem by pointing to, for example, a specific non-functioning order of lesson segments in the lesson (sequence of lesson segments) or a badly structured explanation phase (content of one lesson segment). Asked in a PD setting which change would increase the student learning outcomes, the teacher might again use a frame based on lesson segments to predict that if he/she, for example, organizes the explanation phase better, the outcomes will increase as well. In this way, frames can be used in the setting of data-based improvement initiatives to interpret student data and to propose change in a specific or desired direction, in this study the arrangement of lesson segments that constitutes context-based education.
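As a purely illustrative sketch (the names, threshold and candidate interpretations below are our assumptions, not instrumentation used in the study), the comparison of observed outcomes with the expectation set beforehand, and the frame-based reading of a shortfall, could be expressed as follows:

```python
# Illustrative sketch: checking an outcome against an expectation and listing
# frame-based interpretations a teacher might consider (assumed values and labels).
EXPECTED_CORRECT = 0.60   # expectation set in the Plan phase
observed_correct = 0.40   # share of students answering the test question correctly

FRAME_BASED_INTERPRETATIONS = [
    "The sequence of lesson segments did not work (frame 1: lessons as sequences of segments)",
    "One segment, e.g. the explanation phase, was badly structured (frame 1: content of a segment)",
    "The regulation of a segment (teacher / shared / students) did not suit the class (frame 2)",
]

if observed_correct < EXPECTED_CORRECT:
    print(f"Outcome {observed_correct:.0%} below expectation {EXPECTED_CORRECT:.0%}; candidate interpretations:")
    for interpretation in FRAME_BASED_INTERPRETATIONS:
        print(" -", interpretation)
```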

In the present study, participants were instructed to collect, interpret and make use of student data obtained in their own classes to inform their change processes. The PD program specifically offered assistance in the two crucial steps of the process: the interpretation of student data and the productive use of student data. The objective was to explore to what extent and in what way these frames were used, and to monitor participants’ professional development in terms of how their teaching practices changed upon using data. This research therefore aims to answer the following research question: How do practical frames contribute to teachers’ interpretation and use of student data in the setting of a PD program, and what are the influences of this PD program on changing teaching practices?

Methods

Participants

The PD program examined in this study was performed in the setting of a national biology education reform in the Netherlands, that is, the introduction of a context-based curriculum in secondary biology education (Grades 7–12), in which teachers were asked to change their teaching practices according to the reform. Because of the explorative nature of this study, we purposively invited biology teachers who were known to the Institute’s network of schools and who varied in teaching experience, age and gender. None of them had previous experience in data-based decision-making or teaching according to the goals of context-based education. Participants (n = 5) worked at four different secondary schools in the west of the Netherlands (see ). Four participants taught grades 10–12 and one participant grades 7–9. Two participants taught in general secondary education and three in pre-university education (see ).

Table 1. Survey of participants.

Context-based reform

This research took place in the setting of implementing a reform-based curriculum in secondary biology education in the Netherlands. Context-based education is an approach to education in which subject matter is organized and taught by using contexts. The use of a context to teach subject matter is thought to bridge the gap between the often abstract and difficult scientific concepts and the world the students live in (Gilbert Citation2006). It was proposed as a solution to students’ seeing school science as disconnected from the real world, leaving them with little interest in science, little understanding of the role of science in society and little awareness of career possibilities in the field of science (Bennett et al. Citation2007, Boersma et al. Citation2007). At the classroom level, the teaching-learning process of context-based education typically focuses on a meaningful context that is presented at the start of a lesson. From this context, a problem or question naturally follows that develops a ‘need-to-know’ for scientific concepts. Subsequently, students have to gain insight into the concepts that are needed to answer the question or solve the problem (Gilbert Citation2006, Bennett et al. Citation2007, Wieringa et al. Citation2011). Several phases can be distinguished according to three forms of regulation: student self-regulation, shared regulation or teacher regulation (e.g. Vermunt Citation1998). If applied to the phase of finding answers, a first option could be to have students perform certain learning activities themselves to answer the question or solve the problem. In a second option, students and teacher work together to find answers, mostly in questioning-based classroom discourse, and in a third option, the teacher regulates learning by presenting the information needed for answering. In more general terms, the regulation of the lesson segment ‘answering questions’ can be done by either the teacher, or the students, or shared. In the same way, all lesson segments can be regulated by either the teacher, shared or students. The answering phase is then followed by a final reflection on the content and the process of learning. The context-based curriculum, with new objectives and examination requirements, will be introduced nationally, but the implications of the reform proposal in terms of instructional approaches are largely left to teachers themselves. It is precisely here that teachers are required to reconsider their instructional approaches, and that is where we performed the present PD program.

Structure of the PD program

Participants in the present study followed a PD program that was aimed at developing their instructional approaches towards the proposed context-based education. The foundations of the PD program were the data-driven steps of the PDCA quality improvement cycle (Shewhart Citation1931, Deming Citation2000). Participants were instructed to perform certain actions in each step of the PDCA cycle (see , left column). In the Plan phase, participants set learning goals for a single lesson, designed a lesson on the basis of their intention, and made up a small test questionnaire for students (SQ1) aimed at determining the extent to which students met the learning goals. They also set expectations for the effects on learning. A second questionnaire (SQ2) was designed by the researcher and aimed at determining the perceived regulation of learning processes. SQ2 was constructed as follows: each time participants designed a lesson, they emailed their lesson plan to the researcher (first author). On the basis of the lesson plan, the researcher provided the participant with a short questionnaire to investigate student views on the sequence of lesson segments and the regulation of each of the lesson segments (see for an example). Next, in the Do phase, participants taught the designed lesson, during which they collected student data by administering SQ1 and SQ2. In the Check phase, the participants summarized their students’ answers to SQ1 and SQ2 and compared these to the expectations set in the Plan phase. The guiding question in the PD program for this comparison was, ‘Did the student data match the expectation you set beforehand?’ and the important question for interpretation was, ‘Do you think that your specific lesson design has had an influence on the outcomes of SQ1, and if so, how?’ Participants then completed the PDCA cycle by proposing a change (Act), which we named an ‘intention’ after Fishbein and Ajzen (Citation2010). The guiding question for eliciting these intentions was, ‘Which next change in your practice would increase the student outcomes?’ Participants then designed a new lesson (Plan phase) on the basis of this intention and moved on to their next PDCA cycle. In the last two steps of the PDCA cycle (Check and Act), we provided participants with frames that could assist them in understanding the outcomes of the questionnaires and proposing changes (see , middle column). Participants completed four PDCA cycles in total, in which they designed, taught and reflected on four lessons. Participants worked independently and wrote down all the above-mentioned steps in an online structured reflection format. The researchers (first and second author) and participants only met at the start and the end of the PD program. In the first meeting, one of the researchers and the individual participant jointly compared the participant’s regular practice with the proposed reform, both represented in lesson segments. The researchers then asked: ‘What change would take your regular teaching practice one step towards the reform proposal?’ This change proposal was rephrased into an intention to change and served as the basis for a first lesson design (Plan phase). At the end of the PD program, participants attended a group meeting in which they evaluated the PD program and member-checked their individual developmental paths.
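For readers who prefer a schematic view, the following hypothetical sketch records one PDCA cycle of the program and its Check comparison. The class, field and function names are our assumptions, and the example values are only loosely modelled on the guiding questions and reported outcomes; none of this code was part of the PD program itself.

```python
# Hypothetical sketch of one PDCA cycle as run in the PD program (assumed names).
from dataclasses import dataclass


@dataclass
class CycleRecord:
    intention: str        # change proposal carried into the Plan phase
    lesson_plan: list     # ordered lesson segments (Plan)
    expectation: float    # expected share of correct answers on SQ1 (Plan)
    sq1_result: float     # observed learning outcomes (Do)
    interpretation: str   # answer to the Check question, ideally frame-based
    next_intention: str   # change proposal for the next cycle (Act)


def check(record: CycleRecord) -> bool:
    """Check phase: did the student data match the expectation set beforehand?"""
    return record.sq1_result >= record.expectation


cycle1 = CycleRecord(
    intention="Start the lesson with a context and a central question",
    lesson_plan=["Context with central question", "Answering questions", "Test"],
    expectation=0.80,
    sq1_result=0.80,
    interpretation="Starting with a context positively influenced the outcomes (frame 1)",
    next_intention="Let students answer the questions fully by themselves (frame 2)",
)

print("Expectation met:", check(cycle1))  # -> Expectation met: True
```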

Table 2. Survey of the PDCA cycle.

Figure 2. An example of a student questionnaire (SQ 2) (taken from George’s lesson).


Frames used in this study

All frames in this study are based on the overarching concept of lesson segments. The content and attributes of lesson segments were designed so that teachers could represent their existing approaches to instruction, context-based instruction and approximations of context-based instruction. For lesson segments that represent the most common approach to instruction within a single lesson, we followed Gage (Citation2009), who identified the following lesson structure as most common: a lesson starts with the presentation of specific information by the teacher, followed by a phase in which the teacher assigns exercises to apply or recall that information. After this, students have to answer the assigned exercises, which is sometimes concluded by a test or reflection (Explanation → Questions to Recall and/or Apply → Answering questions). The arrangements of lesson segments that constitute context-based education are twofold; both can be preceded by an orientation phase and/or followed by a reflection and/or test phase: Context with central question(s) → Answering question(s) → Explanation, and Context with central question(s) → Explanation → Answering question(s). For a survey of the lesson segments used in this study with definitions, see .

Table 3. Survey of lesson segments.
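As an illustration only (the list labels and the comparison below are our assumptions, not the authors’ materials), the common lesson structure and one context-based arrangement can be written as ordered lists of segments. This also makes Holland’s idea visible: the innovation is largely a rearrangement of existing building blocks plus one new block.

```python
# Illustrative only: lesson structures as ordered lists of lesson segments.
COMMON_LESSON = ["Explanation", "Questions to recall/apply", "Answering questions"]
CONTEXT_BASED_LESSON = ["Context with central question", "Answering questions", "Explanation"]

# Holland's view of innovation: rearrange existing building blocks, add what is missing.
new_blocks = [seg for seg in CONTEXT_BASED_LESSON if seg not in COMMON_LESSON]
reused_blocks = [seg for seg in CONTEXT_BASED_LESSON if seg in COMMON_LESSON]

print("New building block(s):", new_blocks)          # ['Context with central question']
print("Existing blocks, reordered:", reused_blocks)  # ['Answering questions', 'Explanation']
```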

In this study, three specific frames were designed to assist teachers. The first frame was defined as ‘Lessons can be seen as specific sequences of lesson segments’. Using this frame, participants can interpret certain student data by pointing at the specific chosen order of lesson segments in the lesson given. For example, when participants are used to starting a lesson by presenting knowledge, changing the sequence so that the lesson starts with an application question could affect student learning. The second frame was ‘Regulation of learning processes can be done by either the teacher, or the students, or shared’. This frame centres around the amount of regulation that is given to students in each lesson segment. Student self-regulation has become more and more important in constructivist views of teaching, such as the reform in this study (context-based education). The third frame was defined as ‘There are different types of contexts and they can have different functions’. This frame focuses on understanding the content and purpose of a context, as this was the reform setting in which this research took place. Following Gilbert’s notion, contexts can vary from the application of concepts to being authentic and having students participate in a community of practice (Gilbert Citation2006). As all types of contexts have the potential to positively affect learning (Bennett et al. Citation2007), participants in our research were free to use any type of context they wanted.

Data collection and analysis

We collected research data in this study to determine (a) if and how teachers used frames to interpret and use student data, and (b) the development of teaching practices when using student data (see , right column). To determine the use of frames in the interpretation of student data, we collected all explanations and phrasings of causal effects that participants wrote down in the online reflection format during the Check phase of the PDCA cycles. Next, we investigated whether they used frames to do so, and how they did this. To determine if and how participants utilized frames to use their student data, we collected all the participants’ intentions formulated in the Act phase of the PDCA cycles. Next, we investigated whether they used a frame to do so and how they did this. The analysis of the use of frames was done by two researchers (i.e. the first and second author). For both the interpretation and the use of data to propose change, we investigated if and how participants used the ideas or terminology of the frames that were offered in the PD program. For examples of how this analysis was done, see .

Table 4. Illustration of how the analysis on the use of frames was done.

To determine how student data influenced the participants’ professional development in terms of teaching practices, we analysed participants’ developments throughout the PD program. To this end, we first made a chronological overview for each participant on paper. In the final PD session, we presented each participant with this survey of personal data, lessons designed and given, choices made and answers given to the questions pertaining to each of the PDCA steps in the online structured reflection format. After this, we had the participants check these summaries for internal validity (Miles and Huberman Citation1994). All participants confirmed that the overview accurately represented how they developed throughout the program. We analysed these written-out developments of participants by categorizing (a) lesson designs as being in the same or another direction as the previous lesson and (b) experienced problems and successes in terms of SQ1 or SQ2 showing lower or higher scores than expected, or other problems noted in observations by the participants. Next, we related the development directions to the problems and successes. Did the participants choose to repeat the change, propose a change in the same direction or choose a completely different direction for change? In the final PD session, we also asked participants about the strengths and weaknesses of the entire PD program in an open-format questionnaire.
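A minimal sketch of this categorization step, with hypothetical codes rather than the actual study data, might look as follows; it simply tallies how often a success or a problem in one lesson was followed by a change in the same or another direction.

```python
# Hypothetical sketch of the categorization described above. The coded steps are
# invented for illustration and do not reproduce the study's data (cf. Table 6).
from collections import Counter

coded_steps = [
    ("success", "same direction"),
    ("success", "same direction"),
    ("success", "other direction"),
    ("problem", "same direction"),
    ("problem", "other direction"),
]

summary = Counter(coded_steps)
for (experience, direction), count in sorted(summary.items()):
    print(f"{experience:7s} -> {direction:15s}: {count}")
```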

Results

In this section, we first present a case study to describe the way in which one teacher’s teaching practice was influenced by her student data. We chose to present Paula for this case study, because her development pattern and use of data clearly represent the specific research approach taken in this study. In the second part of the section, we describe how often and in what ways all participants used the frames that we provided to interpret and make use of student data, and their professional development in changing teaching practices.

Case study Paula

Paula is a 52-year-old biology teacher who teaches mainly in the lower general secondary education grades (grades 7–9). Before becoming a teacher, she worked as a teaching assistant for several years. She is an enthusiastic person who wanted to participate in this research in order to expand her teaching repertoire and get to know the context-based reform. At the start of her PD program, she outlined the structure of her typical lessons as follows: (a) checking homework for approximately 10 min; (b) explanation of new topics, or students making a summary of the new topics using the textbook, for 30 min; (c) students working on exercises from their textbooks (mostly exercises to recall) for the final 10 min. When she compared her typical teaching practice to the proposed reform, she intended to start by using a context with a central question and have students work out the answer themselves. Her first lesson design (Plan) started with a biological context in which a granny wants to get rid of the aphids in her oak tree. Can she combat the aphids using chemicals without negative consequences for other organisms in the food web? She denoted the sequence of lesson segments in this lesson design as follows: Context with central question (teacher) → Answering questions (students independently) → Test (shared). She expected 80% of the students to answer the learning effect questions in SQ1 correctly and, indeed, 80% of the students did (see ). When reviewing the outcomes of SQ2, she was surprised that many students felt as if she had helped them a lot in answering the questions, whereas she had designed this to be fully student regulated. She answered positively to the question in the Check phase (‘Do you think that your specific lesson design has had an influence on the outcomes of SQ1, and if so, how?’). She answered (quotes): ‘Starting the lesson with a context and letting students find answers has had a positive influence on the learning outcomes’ and ‘Designing a lesson in which students answered the questions relatively independently had a positive influence on the learning outcomes’. In her first explanation, she attributed the good learning outcomes to the changed lesson sequence and the introduction of a context at the start. She thus used the frame ‘Lessons can be seen as specific sequences of lesson segments’ to explain the positive learning outcomes. In the second explanation, she also attributed the high learning outcomes to the students’ relatively independent search for answers, although she did encounter a problem because students scored SQ2 differently from her design. With this latter explanation, she used the frame ‘Regulation of the learning process can be done by either the teacher, or the students, or shared’. The next step in the PDCA cycle was to propose change by answering the following question in the structured reflection format: ‘Which change in your practice would positively influence student outcomes?’ (Act). She answered as follows (quote): ‘I want to let students answer the questions from the context completely by themselves, without my help’. In this intention, she again used the frame ‘Regulation of the learning process can be done by either the teacher, or the students, or shared’. Paula learned that her first lesson design did not support students in working as independently as she had intended, but that starting the lesson with a context and letting students find information themselves did indeed result in high student learning.
On the basis of that, she decided to design a new lesson (Plan) that started with a context, where students had to answer the questions without her assistance. She then moved on in the PDCA cycle by teaching that lesson, collecting student data and so on (see ).

The use of frames

Table A2 shows how other participants used frames in the Check and Act phases of their PDCA cycles. We chose to show the Check and Act phases because in these phases, participants could use frames either to interpret their student data (Check) or to formulate intentions to change (Act). It is clear from this overview that all participants used one or more frames to interpret student data. This is illustrated, for example, by Kimberley when she reflected on her successful first lesson by saying: ‘By using a context, I noticed that their thinking skills were addressed more than before. They started asking questions more deeply.’ In this way, Kimberley interpreted the expected positive learning outcomes by pointing to the important role of starting with a context and thus used the frame ‘Lessons can be seen as specific sequences of lesson segments’. The participants also used frames to formulate an intention to change in their subsequent lesson. To illustrate how teachers did this, we describe George’s interpretation and intention after teaching his first lesson. The lesson started with a context, after which he explained the main concepts. In his interpretation of the student data (Check), he stated that using a context helped students to understand the concept and had a positive influence on students’ participation in the subsequent activities. When asked for a next change to optimize student learning, he formulated two intentions, for which he used two frames.

Table 5 shows how often participants used frames in the Check and Act phases of their PDCA cycles. At the beginning of their professional development, participants mainly used frame 1 (a lesson can be seen as a series of lesson segments) and frame 2 (regulation of the learning process can be done by either the teacher, or the students, or shared). They used the third frame less, and not until later in their development (i.e. not until the second PDCA cycle).

Table 5. Survey of the numbers of frames used in this study to interpret (Check phase) or use (Act phase) student data.

Teacher professional development

The second part of the research question pertained to how using student data influences participants’ teaching practices. First, participants tended to keep the changes they had made to their teaching practice when making new changes (see ). For example, when Kimberley reflected on the first lesson in which she introduced a context, she moved on to change the regulation, but kept the context in place in subsequent lessons (see ). Second, participants did not stop developing, but persisted in their development and were mostly successful in their attempts to make changes and build further upon evident successes. Even when participants found that their student data were not as expected or when they encountered difficulty in class (problem), they still tried out new directions to make their lessons successful. This can be illustrated by Bob’s interpretation of the data after his third lesson, when he stated (quote): ‘There was not much of a working atmosphere due to the absence of many students who were on a study week’. However, he did use frame 3 to design a relevant context to motivate students in the subsequent lesson design (see ). This persistence in professional development contrasts with findings in other research, where motivation to proceed in development was low once problems were encountered.

Table 6. Summary of the development directions after experiencing success or problems in the lessons.

Conclusion and implications

The use of student data can be roughly classified according to intent. On the one hand, student data can be used for accountability purposes. On the other hand, student data can be used to support teacher learning. This study has provided insight into how this second aim can be addressed more effectively. The known problems for teachers wanting to improve their teaching by collecting, interpreting and using student data are that they encounter difficulties in the two crucial steps of interpreting data and using data productively. In this study, we therefore provided participants with three frames that they could use when reflecting on their student data, and studied if and in what way these helped them to interpret their student data and propose productive change. The frames we offered were all based on the concept of lesson segments (see Methods). The research question of this study aimed to determine if and how frames contributed to teachers’ interpretation and productive use of student data, and what the influences of using student data were on teachers’ professional development in terms of how they changed their teaching practices.

Starting off their development, all participants chose to formulate an intention for which they used the first frame. This is illustrated by Bob, who wanted to start the lesson by creating a context through elaborating on the application questions he would normally assign at the end of the lesson. Next, results on the ways in which participants interpreted their student data show that they indeed used the frames provided in doing so. Specifically, they mostly used the first frame to explain the outcomes as a product of a specific combination of lesson segments. Some also used the second frame by attributing specific results to choices made about the regulation of specific lesson segments by either the teacher, shared or students. Only one participant used the third frame for interpretation purposes (Bob in his fourth lesson), indicating that this frame might not be very instrumental for teachers starting their professional development and might rather serve as input for later, secondary developments. When participants tried to use their student data and propose change, they especially used frame 2 in making changes to the second and third lesson design. Also, frame 3 was used slightly more than in the interpretation phase (see ).

Results on professional development in terms of how data use influenced teaching practices in line with the proposed reform show that, in moving through the PDCA cycles, participants designed lessons in which they mostly kept their prior adjustments to lesson designs. For example, when they first used frame 1 to shift a context to the start of the lesson and this was found to be successful, they kept this change in successive lessons. Participants thus tended to maintain their successful changes in their next lesson design and proposed new changes either in the same direction (using the same frame) or in another direction (possibly using another frame). Interestingly, participants also persisted in their development when confronted with problems. After the second lesson, participants started to formulate intentions outside the provided frames, such as the intention to ensure a tight connection between the context with its accompanying questions on the one hand and the learning goals on the other. Such intentions formulated without using the provided frames indicate that the frames in this study might serve as a catalyst that initiates professional development in a direction of choice (in this study, context-based education), without having to be used extensively in the subsequent process of development.

What becomes clear from this study is that participants in general used the frames that we offered both to interpret their student data and to propose change. This contrasts with earlier research, in which precisely these two steps were found to be problematic (Ingram et al. Citation2004, Mandinach Citation2012). Findings in this study also show that this approach can help to overcome teachers’ reticent attitude of bypassing or reducing the use of tools and materials offered to them in a PD setting while continuing to rely on their own experience and routines (Borko et al. Citation2010). We think the reasons for the effective use of frames in the present study can be found in the design of the frames, as we provided the participants with frames that comply with the three criteria for practicality (Doyle and Ponder Citation1977). First, frames in this study indeed offered instrumental content to participants: concrete procedures at classroom level in the form of lesson segments. Knowledge that is too concrete would not be readily transferable, and knowledge that is too abstract would not be directly useable (Zeitz Citation1997). Frames in this study were designed at an intermediate level of abstraction, so that they were directly useable for teachers, but not so concrete that teachers could use them only in specific situations. The instrumental content (lesson segments) in this research also allows teachers to attribute certain outcomes to a specific part of a lesson design. And whereas teachers in other known cases attributed, for example, specific learning outcomes to external and uncontrollable factors (Janssen et al. Citation2009), participants in this study attributed outcomes to controllable and internal factors (e.g. ability, effort). Such attribution is known to promote the formulation of productive change proposals (Weiner Citation2010). Second, participants also used frames rather adaptively in connecting the proposed reform (context-based education) to their existing teaching practices, which relates to the congruency criterion. Third, frames were also found to cost little time and effort, as they were tailored to the needs of teachers who have to teach many lessons every day and desire comprehensible, useable and effective tools.

In this research, we present data from a small number of cases and portray Paula as a single case study to explore the role of practical frames in the interpretation and use of student data in depth and in authentic teacher settings. Limitations of case studies in general are the limited possibilities of scientific generalization to populations as well as the risk of researchers letting equivocal evidence and biased views influence the direction of the findings (Yin Citation2014). The latter issue was addressed in this research by having multiple researchers analyse the data and by providing member checks with participants. As to generalization issues, studies with an exploratory motive in smaller groups, such as the present study, focus on generalizability to theoretical propositions (e.g. mechanisms, rationale) and not so much to populations. However, now that we have found practical frames to be valuable for a small group of teachers, it will be interesting to set up a quasi-experimental research design with a larger group of teachers to study generalizability to populations.

The use of student data in this study sharply contrasts with the more conventional approach in which student data are collected for accountability purposes. In accountability settings, student data are collected by standardized modes of student assessment on a large scale and used to map teacher quality by comparison to others. The risks of such use are what Ball (Citation2003, Citation2015) described as datafication, performativity and the tyranny of numbers, leading to a decreased sense of professional expertise and authenticity among teachers. In the approach taken in this study, we explicitly took another stance and aimed to promote teachers’ professional expertise. The approach takes the teachers’ existing situation and intentions for each sequential development step as starting points within the context of educational reform. Also, teachers collected data themselves, tailored to the goals they set beforehand, and were assisted with practical frames to interpret and use these data. In the process, teachers made incremental steps that connect to both their volition and their capability. Furthermore, the entire PD program took place in the authentic teaching settings of participants, thus taking all kinds of situational characteristics, such as the school context, the classroom, the teacher’s schedule and the teacher’s resources, into account. In this way, using student data does not limit professional expertise, but rather enhances it. In conclusion, this research can provide directions for further conceptualization of data use in improvement procedures within PD settings in terms of what data should be collected, how these should be collected, and, especially, how teachers should be supported in their interpretation and use of student data.

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

Funding was received from the Dutch Ministry of Education, Culture and Science.

References

  • Ball, S.J., 2003. The teacher’s soul and the terrors of performativity. Journal of education policy, 18 (2), 215–228. doi:10.1080/0268093022000043065
  • Ball, S.J., 2015. Education, governance and the tyranny of numbers. Journal of education policy, 30 (3), 299–301. doi:10.1080/02680939.2015.1013271
  • Barsalou, L.W., 1992. Frames, concepts, and conceptual fields. In: A. Lehrer and E.F. Kittay, eds. Frames, fields, and contrasts: new essays in semantic and lexical organization. Hillsdale, NJ: Lawrence Erlbaum Associates, 21–74.
  • Bartlett, F.C., 1932. Remembering: a study in experimental and social psychology. Cambridge: Cambridge University Press.
  • Bennett, J., Lubben, F., and Hogarth, S., 2007. Bringing science to life: a synthesis of the research evidence on the effects of context-based and STS approaches to science teaching. Science education, 91 (3), 347–370. doi:10.1002/(ISSN)1098-237X
  • Boersma, K.T., et al., 2007. Leerlijn biologie van 4 tot 18 jaar. Uitwerking van de concept-contextbenadering tot doelstellingen voor het biologieonderwijs [Proposing a learning track from 4 to 18 years. Elaboration of the concept-context approach into curriculum objectives for biology education]. Utrecht: CVBO.
  • Borko, H., Jacobs, J., and Koellner, K., 2010. Contemporary approaches to teacher professional development. In: E. Baker, B. McGaw, and P. Peterson, eds. International encyclopedia of education. 3rd ed. Oxford: Elsevier Scientific Publishers, 548–555.
  • Bransford, J., et al., 2005. Theories of learning and their roles in teaching. In: L. Darling-Hammond and J. Bransford, eds. Preparing teachers for a changing world. San Francisco, CA: Jossey-Bass, 40–87.
  • Deming, W.E., 2000. The new economics, for industry, government, education. 2nd ed. Cambridge, MA: MIT Press.
  • Desimone, L.M., 2009. Improving impact studies of teachers’ professional development: toward better conceptualizations and measures. Educational researcher, 38 (3), 181–199. doi:10.3102/0013189X08331140
  • Doyle, W., 2006. Ecological approaches to classroom management. In: C. Evertson and C. Weinstein, eds. Handbook of classroom management: research, practice and contemporary issues. New York: Lawrence Erlbaum, 97–125.
  • Doyle, W. and Ponder, G., 1977. The ethic of practicality and teacher decision-making. Interchange, 8 (3), 1–12. doi:10.1007/BF01189290
  • Fishbein, M. and Ajzen, I., 2010. Predicting and changing behavior: the reasoned action approach. New York: Psychology Press (Taylor & Francis).
  • Fishman, B.J., et al., 2003. Linking teacher and student learning to improve professional development in systemic reform. Teaching and teacher education, 19 (6), 643–658. doi:10.1016/S0742-051X(03)00059-3
  • Fullan, M., 2007. The new meaning of educational change. 4th ed. New York: Teachers College Press.
  • Gage, N.L., 2009. A conception of teaching. Dordrecht: Springer.
  • Gilbert, J., 2006. On the nature of ‘context’ in chemical education. International journal of science education, 28 (9), 957–976. doi:10.1080/09500690600702470
  • Holland, J.H., 2000. Emergence: from chaos to order. Oxford: Oxford University Press.
  • Ingram, D., Louis, K.S., and Schroeder, R.G., 2004. Accountability policies and teacher decision making: barriers to the use of data to improve practice. Teachers college record, 106 (6), 1258–1287. doi:10.1111/tcre.2004.106.issue-6
  • Janssen, F.J.J.M., De Hullu, E., and Tigelaar, D., 2009. Using a domain-specific model to improve student teachers’ reflections on positive teaching experiences. Action in teacher education, 31 (2), 86–98. doi:10.1080/01626620.2009.10463520
  • Janssen, F.J.J.M., et al., 2013. How to make innovations practical. Teachers college record, 115 (7), 1–43.
  • Janssen, F.J.J.M., Westbroek, H.B., and van Driel, J.H., 2014. How to make guided discovery learning practical for student teachers. Instructional science, 42 (1), 67–90. doi:10.1007/s11251-013-9296-z
  • Klein, G., Moon, B., and Hoffman, R.R., 2006. Making sense of sensemaking 2: a macrocognitive model. IEEE intelligent systems, 21 (5), 88–92. doi:10.1109/MIS.2006.100
  • Lingard, B., et al., 2015. Globalizing educational accountabilities. New York: Routledge.
  • Lloyd, M. and Davis, J.P., 2018. Beyond performativity: a pragmatic model of teacher professional learning. Professional development in education, 44 (1), 92–106. doi:10.1080/19415257.2017.1398181
  • Mandinach, E.B., 2012. A perfect time for data use: using data-driven decision making to inform practice. Educational psychologist, 47 (2), 71–85. doi:10.1080/00461520.2012.667064
  • Marsh, J.A. and Farrell, C.C., 2014. How leaders can support teachers with data-driven decision making: a framework for understanding capacity building. Educational management administration & leadership, 43 (2), 269–289. doi:10.1177/1741143214537229
  • Merrill, M.D., Barclay, M., and van Schaak, A., 2008. Prescriptive principles for instructional design. In: J.M. Spector, et al., eds. Handbook of research on educational communications and technology. 3rd ed. New York: Lawrence Erlbaum, 173–184.
  • Miles, M. and Huberman, A., 1994. Qualitative data analysis: an expanded sourcebook. Thousand Oaks, CA: Sage.
  • Minsky, M., 1985. The society of mind. New York: Simon & Schuster.
  • Schildkamp, K. and Kuiper, W., 2010. Data-informed curriculum reform: which data, what purposes and promoting and hindering factors. Teaching and teacher education, 26 (3), 482–496. doi:10.1016/j.tate.2009.06.007
  • Shewhart, W.A., 1931. Economic control of quality of manufactured product. New York: D. Van Nostrand.
  • Stecker, P.M., Fuchs, L.S., and Fuchs, D., 2005. Using curriculum-based measurement to improve student achievement: review of research. Psychology in the schools, 42 (8), 795–819. doi:10.1002/(ISSN)1520-6807
  • Stevenson, H., 2017. The “Datafication” of teaching: can teachers speak back to the numbers? Peabody journal of education, 92 (4), 537–557. doi:10.1080/0161956X.2017.1349492
  • Vermunt, J.D., 1998. The regulation of constructive learning processes. British journal of educational psychology, 68 (2), 149–171. doi:10.1111/bjep.1998.68.issue-2
  • Weiner, B., 2010. The development of an attribution-based theory of motivation: a history of ideas. Educational psychologist, 45 (1), 28–36. doi:10.1080/00461520903433596
  • Wieringa, N., Janssen, F.J.J.M., and Van Driel, J.H., 2011. Biology teachers designing context-based lessons for their classroom practice—the importance of rules of thumb. International journal of science education, 33 (17), 2437–2462. doi:10.1080/09500693.2011.553969
  • Yin, R.K., 2014. Case study research design and methods. 5th ed. Thousand Oaks, CA: Sage.
  • Young, V.M., 2006. Teachers’ use of data: loose coupling, agenda setting and team norms. American journal of education, 112 (4), 521–548. doi:10.1086/505058
  • Zeitz, C.M., 1997. Some concrete advantages of abstraction: how experts’ representations facilitate reasoning. In: P.J. Feltovich, K.M. Ford, and R.R. Hoffman, eds. Expertise in context. Cambridge, MA: MIT Press, 43–65.

Appendices

Table A1. Paula’s development.

Table A2. Survey of the frames used in the interpretation and use of students’ data in the Check and Act phases of the PDCA cycles.