Research Article

With or without you? Exploring student perception survey assessment data as a teacher-student (un)shared process

Pages 227-240 | Received 16 Feb 2023, Accepted 08 Jul 2023, Published online: 30 Aug 2023

ABSTRACT

Student voice initiatives such as Student Perception Surveys (SPS) are increasingly promoted by education authorities to assess teaching for learning improvement. Nevertheless, considerable ambiguity surrounds teacher and student consultation and involvement during the implementation of and response to SPS, suggesting students’ scepticism about the power of their voice and teachers’ struggle to act upon this important assessment data. Utilising Fielding’s Students as Researchers framework, complemented with Dewey’s Problem-Solving method, this study explores Australian secondary teachers’ experiences of researching SPS with their students, as co-researchers, in their efforts to act on such surveys. Teachers’ interviews and their SPS-based Participatory Action Research (PAR) projects reveal two key findings regarding teacher-student consultation and involvement in the survey unpacking process. Firstly, the teachers promoted students as SPS discussants and co-researchers but avoided them as researchers (initiators); advancing democratic participation for educational development yet employing an active, conscious strategy to protect themselves and/or their students in the SPS research process. Secondly, the findings portray assessment puzzles of practice, demonstrating how, despite declaring ‘survey fatigue,’ the teachers administered additional surveys as a heuristic SPS problem-solving approach. Implications include guidance for education authorities regarding the improvement of teacher learning through the dissemination of student voice assessment data-based research.

Introduction

The importance of promoting student voice in educational processes is increasingly stated in the literature, noting how schools that students view as responsive have better grades and reduced rates of chronic absenteeism (e.g. Kahne et al., 2022). Scholars note how Student Perception Surveys (SPS)— as student voice initiatives designed to assess teaching— are reliable and valid sources to inform teacher practice (Finefter-Rosenbluh et al., 2021; Kuhfeld, 2017) and can be better predictors of student scores on criterion-referenced tests than teacher self-assessment or principal assessments (Wilkerson et al., 2000). The involvement of students in everyday education fundamentals highlights their value as active partners/participants who work alongside and can research with teachers to improve educational processes—becoming agents of change (Fielding & Bragg, 2003).

Such participation is addressed in Fielding’s four-fold model (2001), outlining different levels of involvement in research for schooling improvement, associated with teacher assumptions and values. The model distinguishes between students as sources of data, discussants/active respondents, co-researchers, and researchers/initiators; noting how practices move in and out of the different modes. Such practices, we acknowledge, can take place as Participatory Action Research (PAR), namely, a form of practitioner action research in which teachers and students collaborate in research into their educational experiences for the purposes of personal or collective improvement (Smit et al., 2020).

Yet, whilst there is much interest in research with and by students to develop educational fundamentals, it is unclear how meaningful student consultation could be or how it might be achieved in these investigative processes (Fielding & Bragg, 2003), let alone how teachers research SPS with students to improve educational experiences. To address this gap, and to understand the involvement of members affected by SPS research, we utilised Fielding’s Students as Researchers framework (2001), supplemented with Dewey’s Problem-Solving method (1910), asking the following:

  1. How do teachers research SPS with students in their efforts to act upon such assessment data?

  2. What limits, or conversely enables, teachers to support students in the SPS-based PAR process?

Students as researchers: Dewey’s problem solving in participatory action research

The United Nations Convention on the Rights of the Child (UNCRC, 1989), ratified by the Australian government in 1990, has enshrined key rights for children, including their right to express their views on all matters affecting them (Article 12). In the state of Victoria, Australia, where this study was conducted, there has been strong policy prioritising student voice initiatives for quality teaching (VicDET, 2018). Such a policy endorsement highlights student input into educational processes; acknowledging how teachers increasingly face multifactorial problems that need to be (re)solved.

Dewey (1910) has long framed such problem solving in terms of the relationship between thinking and action, where both information seeking and problem solving are considered a learning process. The Problem-Solving method advocates a heuristic approach that captures one’s attention in a particular situation: the puzzling, curious or engaging instance encouraging one to look again at the situation, unpack it and learn from it (Loughran et al., 2016). Addressing problems (or puzzles) of practice requires more than the simple application of a technical-rational approach of providing a solution to a problem. Rather, the problem-solving method stresses the need to become aware of a difficulty, identify the problem, assemble and classify data, formulate hypotheses, accept or reject tentative hypotheses, and formulate and evaluate conclusions (Dewey, 1910).

Contemplating pathways that can enhance educational problem-solving capacities, Alderson (2000) suggests several ways to involve students in related research: Firstly, through investigating classwork topics—utilising different methods such as interviews and questionnaires. Secondly, being involved in adult-led research: students can assist in gathering and analysing data, including by offering appropriate language use that facilitates reliable perspectives to enhance research processes for learning. Thirdly, encouraging research directed by children, with the assistance of experienced adults; emphasising shared structures and democratic participation where teachers and students collaborate for educational development (Dewey, 1916; Kirby, 1999). The concept of Students as Researchers, Fielding and Bragg (2003) argue, is reflected in these three traditions; promoting ideas of students as partners who work alongside teachers to improve learning and become agents of change. By investigating key issues related to their schooling experiences, students develop with their teachers, Fielding and Bragg note, ‘a sense of shared responsibility for the quality and conditions of teaching and learning, both within particular classrooms and more generally within the school as a learning community’ (p. 4).

The idea of Students as Researchers, then, positions students as ‘resources’ and ‘producers of knowledge’; assuming they can, and should, undertake such work as they have particular skills and unique knowledge. Involving students as researchers rests on the belief that they have valuable educational perspectives that can serve as sources of creativity rather than unproductive conflict (Thomson & Gunter, 2007). Highlighting students’ needs while supporting their capacity to explore can provide new knowledge about teaching and learning; however, for such processes to be productive, teachers should foreground a mutual democratic dialogue in which they listen to and learn from each other (Dewey, 1910, 1916, 1933).

Indeed, the concept of Students as Researchers ranges in terms of their roles/involvement in research, as Fielding (2001) suggests in his model:

  1. Data sources (recipients): facilitating teacher learning about student attitudes towards learning. Teachers engage with students by acknowledging their input, and meaning is made by dissemination, e.g., through recognising data about student performance.

  2. Active respondents (discussants): prompting teacher engagement so they could improve educational processes. Teachers engage with students by hearing their input, and meaning is made by discussion, e.g., shaping teaching agendas based on SPS data.

  3. Co-researchers: deepening teachers’ understanding of teaching and learning. Teachers engage with students by listening in order to learn, and meaning is made by teacher-led feedback or dialogue, e.g., students co-research pedagogical aspects.

  4. Researchers (initiators): engaging with teachers and peers to deepen their own understanding of educational processes. Teachers engage with students by listening in order to contribute, and meaning is made by student-led feedback or dialogue, e.g., students guide teachers through teaching to shape learning.

In the first three levels, teachers mainly take the initiative and activities are teacher-led, while in the last level, students are the initiators and activities are student-led. Transformations of students as researchers lie in the application of Participatory Action Research (PAR), which purposely aims at giving voice to the participants and improving social justice outcomes; enabling teachers to investigate problems from different perspectives. The PAR process offers democratic means by which students and teachers collaborate towards positive educational outcomes (Bland & Atweh, 2007).

In this respect, practitioner research focuses on addressing contextual issues; designed in ways that can inform one’s practice and contribute on a broader level to collective work. When conducted in a school, practitioner research involves shifts in teacher-student power structures, from a hierarchical position where the teacher decides about content, goals, methods, norms and standards, to a more equal position of students as partners in shared learning processes (Taylor & Robinson, 2009). Students as Researchers, as a teacher-student collaborative PAR, can prompt teachers to enable student participation, develop a participatory school practice through modelling ideals of democracy, strengthen teacher-student relationships, provide a safe space for professional development, enhance practice, and devise engaging learning environments (Smit et al., 2020). This study offers a glimpse into students’ consultation and involvement in teachers’ PAR projects designed to examine SPS assessment data to improve teaching and learning.

Student perception surveys: apparatuses of student voice-based assessment data

SPS are increasingly endorsed by policy agencies, with at least 17% of the largest US districts implementing them in their schools (Steinberg & Donaldson, 2016). Similarly, Australian states have been promoting the use of SPS (e.g., VicDET, 2020a, 2020b), acknowledging how such student voice initiatives can alleviate concerns associated with isolated teacher assessment measures made by principals and peers, which often differ substantially from teachers’ own view of teaching effectiveness or student standardised outcomes (e.g. Jacob & Lefgren, 2008). Intriguingly, principals’ observations are rarely associated with other measures of teacher effectiveness (Kraft & Gilmour, 2017); principals report preferences for teachers with strong communication skills, although the aspect they consider most important (ahead of teaching skills and subject knowledge) is caring, or teacher-student relationships (Harris et al., 2014). Likewise, peer assessments can mitigate teacher autonomy and entangle various ethical ramifications (Finefter-Rosenbluh, 2016).

While another common way to assess teaching is through students’ perspectives (Kuhfeld, 2017), previous studies have reported resistance and little change in teachers’ practice following student feedback (Finefter-Rosenbluh et al., 2021), noting how teachers view SPS as tools limited by power relations and accountability standards (e.g. Finefter-Rosenbluh, 2020, 2022; Gehlbach et al., 2018). While SPS can help deepen teachers’ insights into where their practice could be improved, teachers struggle to act on such data, stressing the need for institutional support and training. Equally, students doubt their teachers’ ability and readiness to respond to SPS assessment data; questioning the power of their voice to change teachers’ practices and their capability and willingness to translate survey feedback into tangible actions (Finefter-Rosenbluh et al., 2021). It is unsurprising, therefore, that studies call for shared student voice understanding where community members unite around a set of practices for the field to become more cohesive (e.g., Conner, 2022).

Method

This study investigates how, together with students, teachers explore SPS— surveys crafted to identify learner experiences in the classroom— in their efforts to act upon such assessment data. The study took place in a school with a solid culture of student voice, aligning with the Victorian policies for teacher education, which call on teachers to engage with student voice and constructive feedback initiatives in their professional development (VicDET, 2021). The study— approved by the first author’s institutional review board (Project 14271) and the Department of Education and Training, Victoria— draws upon 11 experienced Year 7–12 teachers’ (see Note 1) semi-structured interviews and PAR projects examining SPS (see Note 2) with the intention to act on such assessment data. Of the 11 participants, eight were female, following the gender distribution characterising the teaching profession in Australia. All the participants identified as white, acknowledging English as the primary language spoken at home— like 61.2% of the students who took the SPS. We note, in this respect, that further research should examine the experiences of more diverse groups of teachers; unpacking how such equity-oriented assessment processes are shaped by racialised dynamics.

Furthermore, the teachers’ PAR projects were supported by five one-hour PD workshops held at the school from March to September 2021 (see Note 3). The workshops included an introduction to SPS-based PAR work, unpacking teachers’ initial experiences with SPS, discussing PAR data collection and analysis, and sharing final outcomes. Each workshop was supported by the school (cultural facilitation) and the authors (external assistance), led by the teachers themselves and driven by SPS data-related questions and strategies/solutions they wished to investigate. The structure of the workshops, which may have influenced teacher expectations about the extent of student involvement, is another limitation that future studies could tackle (e.g., illuminating how teacher/student positionality shapes participation in such voice-based assessment processes).

To avoid power-related effects, school leaders were not involved in the workshops. Participants were assured that they could not be identified and were informed that their participation in the workshops was completely voluntary and that they could withdraw at any time. The teachers, who agreed to participate in a follow-up interview, received a $20 gift certificate and were provided with pseudonyms. Their teaching subjects and their school’s name are also not mentioned. Interviews lasted 30–45 minutes and were conducted on Zoom at the end of 2021, after the teachers’ PAR projects were completed. Notes were taken during the interviews, which were recorded and transcribed by the platform. Transcripts were double-checked by an independent research assistant and compared with notes taken during the interviews to ensure clarity and accuracy. In their interviews, teachers were presented with open-ended questions about their SPS-based PAR experiences, including how such exploration shaped their approach to SPS and what their students’ responses to, and roles in, the PAR project were.

Interviews and PAR projects were thematically analysed through an interactive interpretive process considering the key theoretical frameworks employed in this study (Elliott & Timulak, 2005), namely, Fielding’s Students as Researchers framework (2001) and Dewey’s Problem-Solving method (1910). Categorising processes included an initial coding tree referring to teachers’ overall SPS-based PAR work with students. To ensure reliability, each author reviewed interview transcriptions and PAR projects independently and recorded ideas for a coding scheme. Disagreements were resolved by discussion to reach consensus.

Findings and discussion

Teachers’ interviews and PAR projects capture SPS-driven research processes providing novel opportunities for students to make more choices in their learning. Students were positioned as discussants and co-researchers, but not as SPS researchers/initiators. In addition, the teachers employed assessment puzzles of practice: while lamenting ‘survey fatigue,’ they actively administered more student surveys, reflecting a heuristic SPS problem-solving approach.

Promoting students as SPS discussants and co-researchers but avoiding them as researchers (initiators)

The teachers acknowledged the importance of tuning in to students, by involving and listening to them in research processes designed to respond to their SPS assessment data. Engaging with SPS-driven PAR, they moved from recognising their students’ input as a given exercise of voice, to positioning them as active respondents/discussants and/or co-researchers of SPS (compare, Finefter-Rosenbluh, 2022; Finefter-Rosenbluh et al., 2021). For example, concerned that her subject is not seen as valuable by students— as suggested in their SPS— Jill sought to further examine the ‘valuing of subject’ measure, unpacking valuable 21st century skills in her class. She designed a PAR project with her students, involving them as discussants/active respondents and later as co-researchers of the ‘valuing of subject’ aspect to dissect her teaching and students’ learning experiences (Fielding, 2001). In this PAR process, she became more aware of ‘particular issues’ concerning her class; not only by listening to her students’ discussions about the valuing of different school subjects in contemporary society, but by creating with them, as co-researchers, a document that outlines and illustrates valuable 21st century skills reflected in her subject. Following up on her SPS ‘valuing of subject’ assessment data, Jill developed another survey to extend her and her students’ understanding about how the subject corresponds with valuable 21st century skills. As she described:

I talked with the students about their surveys … saying how I was interested in the ‘valuing’ section and would like to drill into that further to learn what they see as valuable … from that conversation the second survey was created … students reflected on their experiences in class, thinking specifically about 21st century skills … it was a combination of them thinking, realising and discussing how they’re subconsciously building and developing these skills in class, through collaboration, communication, making creative choices through taking risks, taking on leadership positions like their conversations with me about changing pedagogical aspects and so on … so when we examined the results of that second survey they were more aware of their learning and developing these skills … this [process] inspired me to create the ten employability skills in [my subject].

Concluding her SPS-based PAR experiences as ‘beneficial,’ Jill witnessed how she and her students had ‘become more aware of the value of the subject.’ Such shared research processes prompted her to explore ways that strengthen awareness of the complexity and the importance of ‘valuing of subject’ among other teachers in Victorian schools; contacting the Victorian Department of Education and Training, and informing them of the issue to secure support in conducting further research with students and related activities on the matter. Put differently, utilising SPS assessment data in ways that motivate teacher-student-research-based activism, Jill was looking to fit ‘new pieces’ into students’ and teachers’ understanding about the intricacies of subject value. Such pieces fall into place through processes of acting and observing in the research site (e.g. the classroom), and then evaluating and making sense of the results towards a given goal (e.g. viewing schooling as a promoter of 21st century skills) (Dewey, 1910). Dissecting measurable components with students as active discussants and co-researchers (Fielding, 2001), Jill became more aware of the problem and how she talks about her subject (Dewey, 1910)— exemplifying how it informs 21st century skills. Treating SPS as an apparatus of social change grounded in principles of democracy (Greenwood & Levin, 2007), she modelled citizenship and democracy in practice (Dewey, 1916). Such findings extend studies suggesting how teacher-student collaborations can increase the richness of data and contribute to socio-educational participation (Gillett-Swan, 2018), extending possibilities for deeper change (Mitra, 2018).

Similarly, Richard designed his PAR project to move from ‘working on autopilot’— or viewing students as ‘data sources’— to actively examining with them ways to address their class disengagement, as suggested in their SPS. Working with students as partners (Fielding & Bragg, 2003) or discussants and co-researchers (Fielding, 2001), Richard’s PAR project hypothesised and examined how different assessment tasks and criteria modifications, suggested by and collaboratively devised with students, could improve student engagement. He involved students with class assessment data interpretation, including self and peer assessment; seeking to provide further opportunities for decision making. Positioned as agents of change, Richard’s students were consulted as co-researchers of a key educational issue— assessment. He found that such collaborative processes— envisioned as an exercise of shared educational responsibility— improved students’ attendance, especially among underperforming students. Such processes prompted students to put more effort into their participation and learning, corresponding with higher grades. Concluding his experience, he stated, ‘the PD showed people the need for being a bit more tuned in and collaborative in your classes.’ Becoming aware of student (dis)engagement, identifying the problem at hand (i.e., the need to modify assessment tasks to improve engagement), and assembling, classifying and utilising SPS data through formulating and accepting hypotheses and conclusions (i.e., how different assessment tasks correspond with student engagement) (Dewey, 1910), Richard and his students combined forces, through democratic processes, to promote positive educational outcomes (Bland & Atweh, 2007; Dewey, 1916).

Taking a similar approach, Tommy— who became aware of a problem reflected in his students’ low perceptions of ‘valuing of subject’— involved his students as discussants and co-researchers of such SPS assessment data, seeking to use it through formulating and accepting related hypotheses that could help ‘make the subject more engaging and valuable.’ As he explained:

The students focused on five [SPS] questions in the breakdown on the valuing of subject … we wrote them together on a big butcher’s paper, and as they walked around and discussed their meaning, I showed them the data breakdown of the results and we chatted about it … they wrote more ideas on those bits of paper, linking it to class engagement and how it can be improved, and I came up with a podcast idea where all these ideas were shared broadly [with other students, including in other schools].

Thus, increasingly consulting and involving their students in their PAR projects (e.g., in processes of utilising, planning, collecting, classifying and analysing data), the teachers promoted teacher-student collaborative research processes, highlighting democratic participation for educational development (Dewey, 1910, 1916; Fielding, 2001; Kirby, 1999).

However, while encouraging their students to ‘move into’ and ‘between’ the first three levels of research participation (Fielding, 2001), the teachers were mainly taking and leading SPS-oriented research initiatives, with no active push into the last level, where the student is the initiator, and SPS-related activities are, in fact, student-led. The teachers did not position students as full researchers (e.g., Leitch et al., 2007) or research initiators in their SPS-based PAR projects. Such avoidance or overlooking of students as SPS researchers (initiators) seemed like a conscious strategy intended to protect the students or the teachers themselves in their research process. This approach might be seen as ‘teacher ownership’ of an educational change strategy in a space increasingly shaped by student voices (Finefter-Rosenbluh, 2022; Kirk & MacDonald, 2001). While positioning teachers as experts, Joseph also lamented how ‘staff don’t feel like they have a lot of understanding of the survey themselves so that can impact how seriously they expect students to engage with it’; suggesting resistance and the need for institutional support in SPS assessment processes (see also, Finefter-Rosenbluh et al., 2021). Equally, Stella declared:

We may be reluctant to push the survey forward to certain classes, depending on what our relationship is with that class … they can take it the wrong way, like we don’t know what we’re doing … group dynamics can both facilitate and hinder the implementation of student surveys.

Furthermore, several teachers avoided positioning students as SPS research initiators to protect the latter from potential violence and abuse— seemingly respecting their right to not participate. As Kelly illustrated:

There are kids who don’t normally have a voice in class, so they might be scared … kids must feel comfortable enough to speak in front of others without fear of being ridiculed and all that sort of stuff.

For Debbie, it was a conscious decision to dissect the SPS results together with the students, as co-researchers, so that the research process did not come across ‘as an attack-y way,’ rather, it allowed everyone ‘to see whether we had different opinions around what [the results] meant, so we can come to a conclusion together.’ Such SPS-based research conversations are not just about making ‘the teacher more aware of things,’ but also about ensuring that ‘students feel comfortable and care for each other like we care about them.’

These findings align with studies highlighting rare opportunities for students to assume leadership and responsibility for educational change (Finefter-Rosenbluh et al., in press; Mitra, 2007). In the same vein, they demonstrate teachers’ understanding of how children’s participation may impede other rights, such as their right not to participate, to be protected from violence and abuse, and not to be discriminated against (Perry-Hazan, 2021). The findings highlight the importance of empowering teachers in SPS-based research processes intertwined with children’s best interests. Additionally, for Dewey (1916), any kind of dualism, such as between thinking and action and/or theory and practice, is artificial and ineffective; that is, thoughts without action can be seen as barren and action without thinking as routine, while the ideal is thoughtful action. For the teachers in this study, avoiding the positioning of students as SPS researchers could be seen as this kind of action/thinking dualism. Teachers’ limited understanding of SPS or familiarity with/exposure to working alongside students as full researchers, and their duty of care (e.g. protecting students from potential harm), may have kept them from positioning students as SPS researchers/initiators.

Assessment puzzles of practice

While most interviewees noted their own and their students’ ‘survey fatigue’— prompted by policy moves to gain student voice-based measures of teaching effectiveness (VicDET, 2020a)— they designed and employed additional student surveys in their SPS-based PAR projects. Such practice captured their further exploration, intended to produce additional data that could help ‘solve’— by learning more about and from— a puzzling problem reflected in their SPS assessment data. Such assessment puzzles of practice capture a heuristic problem-solving approach, reflecting the relationship between teachers’ thinking (about their SPS assessment data) and action (producing more SPS assessment data to solve their practice puzzle): a learning process where information seeking and problem solving of a puzzling instance are entangled (Dewey, 1910, 1933). For instance, Gail designed another survey to examine the puzzle behind her SPS results suggesting students’ low sense of ‘valuing of subject.’ Seeking to ‘better the students’ experiences and benefit them,’ she crafted a survey to assemble and clarify the skills that students think should be developed, and assist in formulating implications for her pedagogy (Dewey, 1910). Such participatory explorations, she noted, were ‘not so much about catering to students but really about trying to make as many kids’ learning experiences as positive as humanly possible.’ In doing so, she promoted shared learning processes, grounded in strong communication skills between groups (teacher/students-students/students), which can be viewed as a way of deepening common interests and beliefs— a collaborative research process (Fielding, 2001) that has the potential to develop awareness of one’s actions and their impact on a group (Dewey, 1916). One may argue, however, that such initiation of more assessment practices corresponds with teachers’ narrowed attention to measurable approaches for students’ learning (Hardy, 2018).

Similarly, declaring her ‘survey fatigue which can be quite exhausting,’ Lucy admitted how teachers ‘often look at a survey as very dry and confusing [data],’ but the SPS-based PAR process prompted her ‘to investigate it a bit.’ Puzzled by her students’ poor perceptions of ‘rigorous expectations’ in her class, she was left confused as to how to solve this problem— hoping to learn more about it to find ways that could help ‘improve everybody’s experiences.’ Seeking to address this puzzle of practice, she designed another student survey— to produce more assessment data— about ‘what would it look like to have rigorous expectations and how can we make sure that’s happening in the classroom?’ In their responses, her students asked to receive explicit feedback illustrating classroom expectations. Analysing these additional assessment data with her students, Lucy later implemented a feedback form, hypothesising and formulating conclusions (Dewey, 1910) over the use of such an assessment tool to improve her practice. As she explained:

It was sort of drawing out from students a bit more evidence about their expectations, which could be the key to keeping them accountable and get that affirmation and know how I can do better … I actually surveyed all my classes using a quick survey on a Google form … there were lots of interesting and helpful comments, especially about feedback which became part of my practice … they wanted to get more feedback on their work to show what high expectation mean … so I’ve created a streamlined way where I can make that feedback meaningful to everyone.

Dissecting with students more assessment data— uncovering entanglements of feedback and expectations— prompted Lucy to create another assessment tool for solving the puzzling low expectation problem. Such assessment puzzles of practice for educational development correspond with studies showing how collaborative PAR projects can function as tools for generating feedback to support teachers in making sense of and developing their assessment practices (Harrison, 2013). In Dewey’s terms (1916), Lucy made ‘a backward and forward connection’ between what she does and what her students experience ‘in consequence,’ demonstrating how ‘under such conditions, doing becomes a trying; an experiment with the world to find out what it is like; the undergoing becomes instruction—discovery of the connection of things’ (ibid., p. 107).

Likewise, Rikki sought to gather more SPS assessment data to further examine her pedagogical effectiveness and a ‘problem’ of students’ low sense of engagement, to help devise steps for improvement:

Students get survey fatigue particularly in this school which is very much about student voice … [so] it gets to a point where they’re just clicking buttons to kind of move through it … [but] I moved past it, I was thinking about argument analysis stuff [and] sent them out another survey specifically asking about what makes them engage … like what sort of topics they’re interested in and what we could look at so it has a personal tie in and they can see the direct result of it … it was really tangible the buy-in for them to give really detailed and helpful responses.

Rikki gathered additional data, engaging with voice-based professional learning and growth ensuing from genuine critique (Mockler & Groundwater-Smith, 2015). Such assessment puzzles of practice echoed the relationship between her thinking about the initial SPS assessment data and her action focused on producing more assessment data to help solve the engagement and pedagogical effectiveness practice puzzle. This Deweyan-inspired action research required her students’ involvement and open communication about key matters relating to their learning (Stark, 2014).

Conclusion

According to Fielding (2001), the concept of ‘Students as Researchers valorises and extends a transformative notion of education at the heart of which lies the commitment to teaching and learning as a genuinely shared responsibility’ (p. 137). Supplementing literature on SPS, teaching assessment, student voice, and teacher education for enhancing teaching and learning, this study portrays the complexities that lie in the way teachers research SPS with students in their efforts to respond to such important assessment data. Mapping out SPS-driven (un)shared teacher-student research initiatives, the study illustrates how such participatory action research processes can extend possibilities for deeper educational change. The study provides guidance for teachers, teacher educators, and policymakers seeking to respond to ‘voices’ and improve the experiences of both teachers and students. Future studies are warranted to deepen our understanding of the kinds of teacher-student partnerships that generate lasting educational changes and support democratic aims.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

The work was supported by the Victorian government school fund [291810187].

Notes

1. Teachers with over five years of experience teaching different subjects, such as History, English, Maths, Drama, and Science.

2. Survey implementation/data was collected from N = 1393 students online, utilising the Panorama Student Survey (2020), which underwent rigorous psychometric testing to provide convincing evidence of reliability and content, structural, substantive, convergent and discriminant validity (Panorama Education, 2015). In this study, students took the surveys in March 2021, almost two months after the beginning of the school year in Australia. The SPS were open for three weeks, with the teachers administering them in their colleagues’ classrooms. The study employed the Grades 6–12 version of six (out of ten) of the survey’s Classroom and Teaching scales. Selected scales were: Belonging, Engagement, Pedagogical Effectiveness, Rigorous Expectations, Teacher-Student Relationships, and Valuing of Subject. Such constructs assisted in reflecting areas covered by the Victorian Practice Principles for Excellence in Teaching (VicDET, 2019). Overall, the survey comprised 38 closed-ended items, of which 33 were construct-measure focused. Each closed-ended item within these scales was measured using a 5-point response scale with labels that were unique to the specific item wording. Higher scores on each scale indicate more positive student perceptions, regardless of the response labels used. Additional survey items measured demographic and education characteristics, including features of grade level and subject area information for each student; data which were provided by the school and appended to SPS responses. Such features, example items, and response options for all scales are described elsewhere (Finefter-Rosenbluh et al., in press).

3. Such professional development workshops align with the Victorian policies for teacher education (VicDET, 2021). Victorian teachers are required to participate in PD programs to receive advancement in rank and wages. They can choose a PD from the Education Department’s offerings or receive explicit recommendations from their educational leaders/colleagues, who may choose to work with subject associations or non-state companies. The teachers who participated in this study chose this study’s workshops as their PD for that year.

References

  • Alderson, P. (2000). Children as researchers: The effects of participation rights on research methodology. In P. Christensen & A. James (Eds.), Research with children (pp. 241–257). Falmer Press.
  • Bland, D., & Atweh, B. (2007). Students as researchers: Engaging students’ voices in PAR. Educational Action Research, 15(3), 337–349. https://doi.org/10.1080/09650790701514259
  • Conner, J. O. (2022). Educators’ experiences with student voice: How teachers understand, solicit, and use student voice in their classrooms. Teachers & Teaching, 28(1), 12–25. https://doi.org/10.1080/13540602.2021.2016689
  • Dewey, J. (1910). How we think. D.C. Heath. https://doi.org/10.1037/10903-000
  • Dewey, J. (1916). Democracy and education: The collected works of John Dewey, middle works (Vol. 9). Southern Illinois University Press.
  • Dewey, J. (1933). How we think. DC Heath and Co.
  • Elliott, R., & Timulak, L. (2005). Descriptive and interpretive approaches to qualitative research. In J. Miles & P. Gilbert (Eds.), A handbook of research methods for clinical and health psychology (pp. 147–159). Oxford University Press.
  • Fielding, M. (2001). Students as radical agents of change. Journal of Educational Change, 2(2), 123–141. https://doi.org/10.1023/A:1017949213447
  • Fielding, M., & Bragg, S. (2003). Students as researchers: Making a difference. Pearson Publishing.
  • Finefter-Rosenbluh, I. (2016). Behind the scenes of reflective practice in professional development: A glance into the ethical predicaments of secondary school teachers. Teaching & Teacher Education, 60, 1–11.
  • Finefter-Rosenbluh, I. (2020). ‘Try walking in my shoes’: Teachers’ interpretation of student perception surveys and the role of self-efficacy beliefs, perspective taking and inclusivity in teacher evaluation. Cambridge Journal of Education, 50(6), 747–769.
  • Finefter-Rosenbluh, I. (2022). Between student voice-based assessment and teacher-student relationships: Teachers’ responses to ‘techniques of power’ in schools. British Journal of Sociology of Education, 43(6), 842–859.
  • Finefter-Rosenbluh, I., Berry, A., & Ryan, T. (in press). Acting upon student voice-based teaching assessment initiatives: An account of participatory action research for teacher professional learning. Journal of Teacher Education. https://doi.org/10.1177/00224871231200278
  • Finefter-Rosenbluh, I., Ryan, T., & Barnes, M. (2021). The impact of student perception surveys on teachers’ practice: Teacher resistance and struggle in student voice-based assessment initiatives of effective teaching. Teaching & Teacher Education, 106, 103436.
  • Gehlbach, H., Robinson, C. D., Finefter-Rosenbluh, I., Benshoof, C., & Schneider, J. (2018). Questionnaires as interventions: Can taking a survey increase teachers’ openness to student feedback surveys? Educational Psychology, 38(3), 350–367.
  • Gillett-Swan, J. K. (2018). Children’s analysis processes when analysing qualitative research data: A missing piece to the qualitative research puzzle. Qualitative Research, 18(3), 290–306. https://doi.org/10.1177/1468794117718607
  • Greenwood, D. J., & Levin, M. (2007). Introduction to action research, social research for social change (2nd ed.). Sage.
  • Hardy, I. (2018). Governing teacher learning: Understanding teachers’ compliance with and critique of standardization. Journal of Education Policy, 33(1), 1–22. https://doi.org/10.1080/02680939.2017.1325517
  • Harris, D. N., Ingle, W. K., & Rutledge, S. A. (2014). How teacher evaluation methods matter for accountability: A comparative analysis of teacher effectiveness ratings by principals and teacher value-added measures. American Educational Research Journal, 51(1), 73–112. https://doi.org/10.3102/0002831213517130
  • Harrison, C. (2013). Collaborative action research as a tool for generating formative feedback on teachers’ classroom assessment practice: The KREST project. Teachers & Teaching, 19(2), 202–213. https://doi.org/10.1080/13540602.2013.741839
  • Jacob, B. A., & Lefgren, L. (2008). Can principals identify effective teachers? Evidence on subjective performance evaluation in education. Journal of Labor Economics, 26(1), 101–136. https://doi.org/10.1086/522974
  • Kahne, J., Bowyer, B., Marshall, J., & Hodgin, E. (2022). Is responsiveness to student voice related to academic outcomes? Strengthening the rationale for student voice in school reform. American Journal of Education, 128(3), 389–415. https://doi.org/10.1086/719121
  • Kirby, P. (1999). Involving young researchers: How to enable young people to design and conduct research. Joseph Rowntree Foundation.
  • Kirk, D., & MacDonald, D. (2001). Teacher voice and ownership of curriculum change. Journal of Curriculum Studies, 33(5), 551–567. https://doi.org/10.1080/00220270010016874
  • Kraft, M. A., & Gilmour, A. F. (2017). Revisiting the widget effect: Teacher evaluation reforms and the distribution of teacher effectiveness. Educational Researcher, 46(5), 234–249. https://doi.org/10.3102/0013189X17718797
  • Kuhfeld, M. (2017). When students grade their teachers: A validity analysis of the tripod student survey. Educational Assessment, 22(4), 253–274. https://doi.org/10.1080/10627197.2017.1381555
  • Leitch, R., Gardner, J., Mitchell, S., Lundy, L., Odena, O., Galanouli, D., & Clough, P. (2007). Consulting pupils in assessment for learning classrooms: The twists and turns of working with students as co‐researchers. Educational Action Research, 15(3), 459–478. https://doi.org/10.1080/09650790701514887
  • Loughran, J., Keast, S., & Cooper, R. (2016). Pedagogical reasoning in teacher education. In J. Loughran & M. L. Hamilton (Eds.), International Handbook of Teacher Education: Vol. 1. (pp. 387–421). Springer.
  • Mitra, D. (2007). Student voice in school reform: From listening to leadership. In D. Thiessen & A. Cook-Sather (Eds.), International handbook of student experience in elementary and secondary school (pp. 727–744). Springer Netherlands. https://doi.org/10.1007/1-4020-3367-2_29
  • Mitra, D. (2018). Student voice in secondary schools: The possibility for deeper change. Journal of Educational Administration, 56(5), 473–487. https://doi.org/10.1108/JEA-01-2018-0007
  • Mockler, N., & Groundwater-Smith, S. (2015). Seeking for the unwelcome truths: Beyond celebration in inquiry-based teacher professional learning. Teachers & Teaching, 21(5), 603–614. https://doi.org/10.1080/13540602.2014.995480
  • Panorama Education. (2015). Validity brief: Panorama student survey. https://go.panoramaed.com/hubfs/Panorama_January2019%20/Docs/validity-brief.pdf.
  • Panorama Student Survey. (2020). User guide. https://panorama-www.s3.amazonaws.com/files/panorama-student-survey/User-Guide.pdf.
  • Perry-Hazan, L. (2021). Conceptualising conflicts between student participation and other rights and interests. Discourse: Studies in the Cultural Politics of Education, 42(2), 184–198. https://doi.org/10.1080/01596306.2019.1599324
  • Smit, B., Meirink, J., Berry, A., & Admiraal, W. (2020). Source, respondent, or partner? Involvement of secondary school students in participatory action research. International Journal of Educational Research, 100. https://doi.org/10.1016/j.ijer.2020.101544
  • Stark, J. L. (2014). The potential of Deweyan-inspired action research. Education and Culture, 30(2), 87–101. https://doi.org/10.1353/eac.2014.0013
  • Steinberg, M. P., & Donaldson, M. L. (2016). The new educational accountability: Understanding the landscape of teacher evaluation in the post-NCLB era. Education Finance and Policy, 11(3), 340–359. https://doi.org/10.1162/EDFP_a_00186
  • Taylor, C., & Robinson, C. (2009). Student voice: Theorising power and participation. Pedagogy, Culture & Society, 17(2), 161–175. https://doi.org/10.1080/14681360902934392
  • Thomson, P., & Gunter, H. (2007). The methodology of students-as-researchers: Valuing and using experience and expertise to develop methods. Discourse: Studies in the Cultural Politics of Education, 28(3), 327–342. https://doi.org/10.1080/01596300701458863
  • UN Convention on the Rights of the Child. (1989). U.N. Doc. A/RES/44/25.
  • VicDET. (2018). Practice principles for excellence in teaching and learning. https://fusecontent.education.vic.gov.au/45f340de-f24a-4df3-97f1-f928402fcae9/practiceprinreflection.pdf
  • VicDET. (2019). Victorian practice principles for excellence in teaching. https://www.education.vic.gov.au/Documents/school/teachers/support/practiceprinciples.pdf
  • VicDET. (2020a). Data collection and surveys. https://www2.education.vic.gov.au/pal/data-collection-surveys/guidance/attitudes-school-survey.
  • VicDET. (2020b). Student voice. https://www.education.vic.gov.au/school/teachers/teachingresources/discipline/humanities/civics/Pages/studentvoice.aspx.
  • VicDET. (2021). Performance and development for teacher class employees. https://www2.education.vic.gov.au/pal/performance-and-development-teacher-class-employees/overview
  • Wilkerson, D. J., Manatt, R. P., Rogers, M. A., & Maughan, R. (2000). Validation of student, principal, and self-ratings in 360° feedback® for teacher evaluation. Journal of Personnel Evaluation in Education, 14(2), 179–192. https://doi.org/10.1023/A:1008158904681