
Eliciting student teachers’ views on educational research to support practice in the modern diverse classroom: a workshop approach

Pages 342-372 | Received 04 Nov 2016, Accepted 06 Jul 2018, Published online: 13 Sep 2018

ABSTRACT

Teachers’ professionalism includes using educational research to support their work in the modern diverse classroom. Student teachers’ views as they enter the profession are therefore important. Within a Higher Education Academy social science priority research strand, ‘Supporting research-informed teacher education in a changing policy environment’, this study developed workshops to ascertain student teachers’ views on educational research, preparing materials suitable for primary and secondary sectors. These could be updated, and used by other higher education courses. Face-to-face or email workshops asked participants about their current uses of educational research, and to read and comment upon one policy research extract and one ‘what works’ research review. Small-scale piloting suggested the workshops readily elicited views, and students identified some personal changes following participation. Participants were generally unfamiliar with the principles of ‘what works’ research. Thematic analysis suggested students considered educational research was often inaccessible, but wanted accessible research to inform their practice.

Introduction

School teaching takes place in a rapidly changing world of globally available information, where international requirements for high educational attainment impact on the knowledge and skills required of teachers, and therefore on teacher education. A summit on the teaching profession, representing education systems deemed to be ‘high-performing and rapidly improving’ on outcomes from the Organisation for Economic Co-operation and Development (OECD) Programme for International Student Assessment (PISA) (Schleicher, 2012, p. 11), identified teacher professionalism as a key factor in achieving such high-quality outcomes. Teacher professionalism includes personal characteristics leading to ethical and responsible action, but the review also identified the need to strengthen the ‘technical core’ of teachers’ professional practice by the creation, accumulation and diffusion of professional knowledge ‘which draws inter alia on research findings’ (Schleicher, 2012, p. 52). This has implications for the pre-service education of teachers to develop their understandings of educational research and its applications.

Within the UK, the British Educational Research Association (BERA), alongside the Royal Society for the Encouragement of Arts, Manufactures and Commerce (RSA), responded by commissioning an enquiry into research and teacher education (BERA-RSA, 2014a, 2014b). Updated papers from the enquiry also appear in a special edition of the Oxford Review of Education. One conclusion of the BERA-RSA enquiry was that teachers should be discerning consumers of, and engage with, research (BERA-RSA, 2014b, p. 5), being able to interpret research evidence and apply it to their working context. A detailed rationale for this is given by Winch, Oancea, and Orchard (2015, p. 211):

… professional practice makes the following demands of teachers: practical understanding and know-how; a good conceptual understanding of education and teaching; and the ability to understand, interpret and form critical judgments on empirical research and its relevance to their particular situation. The professional teacher exercises discretion and judgment to evaluate educational research. S/he mediates their research-based knowledge drawing on awareness of the particular needs of the class(es) taught, as well as individual pupils. These observations suggest that good teachers need to engage actively with educational research; rather than replacing the irreducibly craft-based elements of their work, an iterative research–teaching relationship can support and expand them.

The authors further stress the need for teachers to understand and evaluate the relevance of research to their own situation, applying both personal experience and research findings to plan teaching activities.

Given the implications of such needs for pre-service teacher education, the UK Higher Education Academy (HEA), which champions learning and teaching in higher education, augmented the BERA-RSA review by commissioning research to explore the distinctive contribution that educating teachers within higher education establishments makes to the development of teacher professionalism. This was also a response to alternative models of teacher training outside higher education in parts of the UK. The resulting report (Florian & Pantić, 2013) stressed the diversity of modern classroom settings, which requires teachers to respond effectively to developmental, social, cultural and/or linguistic factors that impact, often adversely, on child attainment and well-being. This demands knowledge of such adverse factors and how to counteract them, which is at least partly based on research findings. The HEA followed up the report by developing a strategic social science priority research strand, ‘Supporting research-informed teacher education in a changing policy environment’, to develop understanding of the needs of initial teacher educators in preparing teachers for the complex classroom environment. The project reported here was commissioned as one study within this strand. It took place in a Scottish university with a large teacher-education programme. The project aimed to elicit the views of pre-service teacher students on engaging with and evaluating research on educational practice, to provide teacher educators with preliminary information about student thinking.

Strands in educational practice research

Two strands of educational research aim to influence classroom practice. One is critical policy research that undertakes systematic literature review to uncover the impact of schooling, and uses this to develop policy on practices likely to be effective. The other, related, strand reviews studies on the outcomes of educational pedagogies that conform to defined study quality standards and may use counterfactual (control) conditions. This approach is often characterised as ‘what works’ research. Leat, Reid, and Lofthouse (2015, p. 272), in their BERA-commissioned review, suggest that the importance of ‘what works’ evidence has recently increased in the UK:

‘… school improvement – has evolved in conjunction with a political desire for evidence-based practice with a focus on [pupil] outcomes. Given increased emphasis on accountability, it seems likely that [this] last mentioned purpose has increased in importance. As a result the most prominent face of educational research involving teachers is the school effectiveness paradigm – related to the aphorism ‘what works’. This is reflected in the popularity of meta-analyses of evidence relating to the impact of interventions on [pupil] outcomes.’

In this context, student teachers’ views on both policy and ‘what works’ education research are likely to be relevant as they embark on their professional lives.

Student and teacher views of classroom research

Accessing and applying research to practice is a relatively new expectation placed on UK teachers as they develop into ‘expert practitioners’ (Tatto & Furlong, 2015, pp. 149–150). During pre-service education, student teachers develop and reflect upon their beliefs, including those about social inclusion within classrooms, and undertake targeted pre-service learning experiences to expand and foster competence in teaching diverse pupils. A policy review by the European Agency for Development in Special Needs Education (2011) discusses the importance of pre-service education in developing students’ attitudes, values, beliefs and understandings in this area, and provides comprehensive examples of learning activities intended to expand confidence as students become teachers. There are further recent examples internationally, for example, Sharma (2012) in Australia, Tournaki and Samuels (2016) in the US and Bi, Wu, Su, and Roberts (2017) in China. However, no studies specifically reporting UK student teachers’ views of engaging with classroom research could be retrieved, although a recent study from South Africa suggested that student teachers who undertook a course in research methods attributed low value to its ability to help them realise the importance of research in the field of education (Lombard & Kloppers, 2015).

Leat et al. (2015, p. 271) note a similar dearth of systematic surveys of qualified teachers’ engagement with research, with teachers’ voices emerging largely through quotes in papers related to other issues. Earlier studies suggest that qualified teachers’ views are not unproblematic, but that teachers (unsurprisingly) tend to find research that is relevant to their practice most useful. A systematic review by Hemsley-Brown and Sharp (2004) identified facilitators and barriers to teachers’ use of research, reporting studies where teachers found the volume of research daunting, did not subscribe to the academic journals in which research was published, and/or rarely had access to academic libraries. Some teachers considered that research could have ambiguous results or untrustworthy findings; was often full of jargon and statistics they did not understand; was too theoretical, and so unhelpful or irrelevant to their teaching; and that their individual teaching situation prevented application of research findings. Some had limited interest in research, and tended to become defensive if they believed that the intention in sharing research evidence was to impose a particular style or model on their teaching. When this happened, they tended to question whether the findings were valid within their personal context, even when the person sharing evidence was a peer or colleague. Hemsley-Brown and Sharp (2004) cited Zeuli (1994), who asked teachers to comment on selected research papers and concluded that they had some difficulty understanding research issues, and responded best to concrete examples, considering them credible if they matched their own experience. Teachers valued research identifying strategies and techniques with a direct effect on teaching, and considered that research should concentrate on identifying procedures that work in classrooms. Hemsley-Brown and Sharp (2004, p. 460) noted that at the time there appeared to be little incentive for teachers to access or use research, and concluded that:

Teachers perceive educational research to be quantitative in nature and frequently challenge the validity of the research, arguing that their unique situations invalidate the application of its findings. Practitioners are identified as seeking new solutions to operational matters whilst the researchers are characterised as seeking new knowledge. Findings from this review suggest that many teachers judge a study’s merits on the basis of whether the findings can be translated into procedures that work in classrooms.

Ratcliffe et al. (2004) interviewed UK science teachers who considered that research had influenced science education, but that it needed to be mediated through research-based teaching materials that translated the research findings into practical classroom strategies. Adoption of research evidence in the classroom was also considered to require a professional culture that encouraged change in practice, and professional networks, suggesting that lack of these would inhibit engagement. In focus group discussions, some teachers found grounds for rejecting or questioning research findings, and considered that research should resonate with their prior beliefs and practice.

A survey of UK teachers by Williams and Coles (2007a, 2007b) found that only 13.5% of respondents regularly used the Internet to source research information, although the date of this study should be noted: Internet access has become generally more widespread since its publication. Only 11.2% of respondents in this study used research journals, preferring ‘pre-digested’ or informal sources, such as discussion with colleagues, professional magazines and in-service/education authority information. Respondents also indicated that they had concerns about their ability to evaluate research information.

More recently, Connolly (2014) suggested a major barrier to teachers’ engagement with research may be an underlying philosophical resistance to, and a fundamental mistrust of, ‘what works’ research, perceiving it to be against practitioners; to undermine professional autonomy by using large-scale surveys, randomised controlled trials and quantitative analyses; and to be characterised as oppressive, dictatorial, descriptive and theoretically naïve, stifling reflective teacher practice.

It is possible that student teachers hold views similar to those of South African students or of earlier practising teachers that will serve as barriers to engaging with research. In particular, if students mistrust ‘what works’ research, asking them to engage with evidence-informed practice from a ‘what works’ perspective may be difficult. Alternatively, if they welcome its focus on the utility or otherwise of pedagogic practices and understand its scientific assumptions, they may be comfortable in critiquing and using its findings. Students are training at a time when, as argued above, engagement with educational research has become a recognised and necessary part of teacher competence, and may have different exposure to, and experiences of, research during their pre-service education than earlier respondents. The views of current pre-service students therefore require detailed and separate consideration as they move towards engaging with research evidence in the workplace, and the present study sought to pilot the identification of such views.

The pilot study

The importance of engaging with research in order to work in the diverse modern classroom, and existing research reporting teachers’ views, informed the present study. We developed workshops inviting student teachers to read summaries of practice-focussed educational research reporting attempts to ameliorate factors inimical to learning; to give their views on whether reading these had developed their understandings and could influence their practice; and to comment on the workshop experience. Workshop materials were designed so that alternative research summaries could be substituted, facilitating updating and allowing pre-service teacher educators in higher education to ascertain their own students’ views using relevant materials. The development of the workshop materials and the student views emerging from pilot workshops are described and discussed. As an exploratory study, a qualitative analytical approach was taken.

Aims

The overall aim of the research study was to prepare, pilot and make a preliminary evaluation of workshops, usable across higher education establishments, that elicit student teachers’ views on research evidence designed to influence classroom practice.

Specific aims were:

  1. to identify and select, for workshop use, examples of educational research evidence designed to influence classroom practice in ameliorating developmental, social, cultural or linguistic factors that may impact adversely upon child educational attainment and well-being;

  2. to prepare workshop activities and materials that engage student participants in appraising and commenting upon the selected research examples, and upon how they might use such research findings;

  3. to pilot the workshops, eliciting and recording participants’ reflections and views on the selected research and on barriers and facilitators to using research;

  4. to identify student views of participating in the workshops, and any resulting personal changes they reported;

  5. to identify and summarise preliminary themes concerning student awareness and views of education research.

Methods and procedures

Ethics, participants and recruitment

Ethical approval for the project was obtained from the University of Strathclyde School of Psychological Sciences and Health Ethics Committee, including approval of participant information and consent sheets that explained the project and participation, and approval to record group discussions.

Participants were recruited as volunteers from the University of Strathclyde as an opportunity sample for this preliminary investigation. They were recruited via adverts placed in online learning environments and disseminated via tutors and student forums, from early and later years of a four-year undergraduate (BA/BEd) degree for primary teachers and from a one-year post-graduate (PGDE) course for primary and secondary teachers. Only small numbers of participants could be accommodated within the scope of the pilot study.

Two group workshops were held towards the end of academic session 2014–15 with six BA/BEd participants from years one, three and four (5 female, 1 male). Six PGDE students (5 female, 1 male) completed email workshops towards the end of their course in 2015–16.

The study addressed five aims as follows:

Aim 1. Identifying and selecting examples of educational practice research

We were interested in student participants’ views of both critical policy research and ‘what works’ research, and sought an example of each to be discussed in workshops. Examples were to focus on classroom approaches attempting to counteract factors that may impact adversely on child attainment. They were sourced by the research team using targeted searches of published literature, including research reports, journals and websites. Further, and in line with the project’s interest in the distinctive contribution of HE to teacher education, the expertise of social science, health and education academics in the University of Strathclyde Faculty of Humanities and Social Sciences was accessed by inviting them by email to suggest research examples appropriate for workshop use. Sourcing evidence in this way gave a snapshot of the information that could be collated rapidly within a university scholarly community. Research examples were selected for pilot workshops in relation to their currency and relevance to Scotland and to students’ planned practice level, primary or secondary. It was intended that future workshops and other institutions would update the research examples to ensure the relevance of topics to later student participants.

Each workshop was to present brief and accessible research summaries on relevant topics, one reporting critical policy research and one a ‘what works’ summary of a pedagogic practice, as vehicles to elicit views and to allow some commonality of student experience. Faculty staff suggested examples of critical policy but not ‘what works’ research. Critical policy evidence suggested by staff or identified by targeted literature search was classified as applicable to developmental, social, cultural and/or linguistic factors affecting learning, cross-referred across categories where applicable. Relevant and accessible research summaries were sought that could be read easily during or before workshops. The executive summary of a recent publication from the Joseph Rowntree Foundation (JRF), Sosu and Ellis’s (2014, pp. 1–4) report Closing the Attainment Gap in Scottish Education, was chosen. This report discussed the relationship between poverty and educational attainment, a topic relevant to all participants and courses and a major educational policy focus in Scotland. It gave specific pointers for practice, presenting accessible, clearly summarised findings relevant to practitioners at classroom (and other) levels, recommended actions for schools and classrooms, and summarised effective approaches. The report was suggested by colleagues, had received television and other media coverage, and had been introduced to some students in class.

‘What works’ studies were sourced by the research team from the main US and UK websites that provide evidence synopses designed for teacher use: the US Department of Education Institute of Education Sciences What Works Clearinghouse (WWC) Quick Reviews (http://ies.ed.gov/ncee/wwc) and the UK Education Endowment Foundation (EEF) Teaching and Learning Toolkit (https://educationendowmentfoundation.org.uk). Research summaries were selected on topics relevant to participants’ intended practice level that discussed interventions for students at risk of educational disadvantage:

One summary, from the WWC, reviewed studies of the application of a widely used reading comprehension approach, Reciprocal Teaching, for pupils with learning difficulties; it was expected that some students would be familiar with this approach. The other, from the EEF Teaching and Learning Toolkit, summarised the evidence on Setting or Streaming (see Appendix 1).

Aim 2. Developing workshop activities and materials

Workshops were to be completed either in small groups, lasting around 90 min including breaks, or by individuals via email. Workshop activities were devised by brainstorming amongst the research team, considering diverse classrooms and earlier studies of teachers’ views on research. The resulting activities asked questions about participants’ views of educational research and of the selected research summaries; about how research could be better mediated by higher education establishments for student use; and about any personal changes arising from the workshops.

Workshop materials used in the PGDE email workshops are appended – please see Appendix 1. Workshops comprised the following activities:

  • Pre-workshop Questionnaire: Participants completing group workshops were sent a Pre-workshop Questionnaire to be completed in writing before or at the beginning of the workshop. Participants undertaking email workshops were asked to complete this first. It asked about participants’ current uses of research evidence; where they sourced evidence; how useful they found it; barriers and facilitators to evidence use, and their overall views on educational research.

  • Views of Statements about Educational Research: Ten statements about educational research, derived from the literature on teachers’ views, were judged by participants, who marked whether each statement was or was not close to their personal views. This was followed by brief discussion within group workshops and written comments from email participants.

  • Views of Research Summaries: Web links to the selected policy research summaries were sent in advance of group workshops, with printed copies supplied at the meeting, and web links were sent in email workshop materials. The aims of the organisations publishing the summaries – the Joseph Rowntree Foundation (JRF) and either the WWC or the Education Endowment Foundation (EEF) – were presented as background information. Participants were asked six questions on each summary: whether these sources of evidence were new to them; whether they were surprised by the research findings; whether they were aware of the issues discussed in the research; whether they might use the research evidence to inform their practice; whether there were ways the research could become more informative for teachers’ use; and whether the research raised questions they would like to have answered by further research. Yes/no responses were ticked by individuals in group workshops, and then the questions were discussed. They were answered in writing by email respondents.

  • Post-workshop Questionnaire: This was completed in writing as the last workshop activity, asking whether research could be made more useful; about participants’ views on participating in the workshop; and about any personal changes resulting from workshop participation.

  • Follow-up activity: A list of six research websites was provided so that participants could follow up their learning after the workshop.

  • Follow-up Questionnaire: This was sent about one month later to be completed in writing, asking if there had been any changes following the workshop in: participants’ intended use of research evidence in their practice; the kind of research evidence they would consider; other sources of information their practice was/would be based on; the usefulness of research evidence in informing practice; the sources they used to access research evidence; and their view of barriers to or problems in using research evidence to inform practice.

When the follow-up questionnaire was returned, participants were sent a certificate recognising their participation as equivalent to three hours of study towards their continuing professional development portfolios, recognising the time required for reflection and for completing the follow-up questionnaire.

Aim 3. Piloting workshops

The initial draft of the materials was piloted with a primary-school teacher colleague who had experience teaching children with additional support needs and who was also a university tutor studying for a master’s degree. She undertook the workshop acting as a participant and commented upon the experience and materials, suggesting minor changes to wording that were incorporated into the final version.

The face-to-face pilot workshops were held in summer 2015 for undergraduate students. They proceeded to time, obtained responses from all participants, and produced clear audio and written records. One undergraduate did not return the follow-up questionnaire. Email workshops for post-graduate students in summer 2016 produced extensive comments, and responses were returned promptly by participants. One post-graduate did not return the follow-up questionnaire.

Aim 4. Student views of participating in the workshops, and personal changes reported

Student views of the workshop experience and its impact were based on analysis of responses to questions 5–9 of the post-workshop questionnaire and the follow-up questionnaire (please see Appendix 1).

Aim 5. Student awareness and views of educational research: quantitative responses and themes

This pilot study was not scaled to provide comprehensive information on students’ views, but rather to assess the workshops as a means of accessing views. However, an inductive semantic thematic analysis was undertaken to identify themes arising from qualitative responses in these early workshops. There were no predetermined codes or themes, although participants were responding to the workshop questions and materials as detailed.

Group discussions were recorded, with notes taken to clarify speaker turns, and transcribed soon after the workshops, and written responses to questions were collated. The six phases of thematic analysis described by Braun and Clarke (2006) were followed. Procedures are outlined in Table 1, which presents the revision and development of themes post-coding. Ten potential themes were identified at Phase 3, reduced to eight themes in Phase 4 when workshops were completed and all data were available. In Phase 5 there was further discussion and re-checking of the complete data set, and two themes were combined to give seven final themes. These encompassed the quotations analysed and covered all participants. This final list of themes was considered to be a full and appropriate representation of the data. Table 2 in the ‘Findings’ section presents the final seven themes, with illustrative quotations.
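As a purely illustrative aside (not part of the study’s procedure), the collation step described above can be sketched in a few lines of Python: coded extracts are grouped under candidate theme labels, and the grouping is checked for how many extracts and participants support each theme and whether all participants are represented, mirroring the Phase 4–5 review. The theme labels are taken from the final theme list reported below; the participant identifiers and quotations are hypothetical placeholders, not data from the study.

from collections import defaultdict

# Each coded extract: (participant ID, candidate theme, quotation) - placeholders only
coded_extracts = [
    ("PGDE-01", "Role barriers to using research", "Research is the job of academics, not teachers."),
    ("BEd-03", "In/accessibility of the research message", "There is too much jargon to follow."),
    # ... further coded extracts from transcripts and written responses
]

# Group extracts under candidate themes (Phase 3 collation)
themes = defaultdict(list)
for participant, theme, quote in coded_extracts:
    themes[theme].append((participant, quote))

# Phase 4/5-style check: support for each theme, and coverage of all participants
for theme, extracts in themes.items():
    supporters = {participant for participant, _ in extracts}
    print(f"{theme}: {len(extracts)} extracts from {len(supporters)} participants")

participants_covered = {participant for extracts in themes.values() for participant, _ in extracts}
print("Participants represented:", sorted(participants_covered))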

Table 1. Development of themes.

Table 2. Themes and illustrative quotations. Participants identified by course and project number.

Findings

Aim 1. Identifying and selecting examples of educational practice research

Aim 1 was met in that accessible research examples were identified on topics relevant to the national policy context and to students’ practice sector. It was intended that future workshops would update the research summaries, using higher education resources where possible to swiftly source examples relevant to their courses. Experience in the pilot study suggests this would be possible. However, university staff in the pilot study forwarded examples of critical policy research but no pedagogic ‘what works’ evidence; the latter was sourced by the research team from the literature.

Aims 2 and 3. Preparing workshop activities and piloting workshops

Piloting showed that the questionnaire, discussion and research-statement workshop activities were appropriate for small group face-to-face or individual email workshops, and analysable responses were obtained. Piloting thus suggested that the workshop activities were feasible, with appropriate timings, and gave insights into student views. Workshop activities therefore appear suitable for use by other higher education establishments and Aims 2 and 3 were met.

Aim 4. Student views of participating in the workshops, and personal changes reported

Participant views of the impact of the workshops were summated across workshops, with responses collated from answers to Post-workshop Questionnaire questions 5–9 and all questions on the Follow-up Questionnaire. Post-workshop questionnaires were completed by all 12 participants. Some participants did not answer all yes/no questions, or gave equivocal or contradictory answers, at times accompanied by an explanation for the respondent’s uncertainty. The range of responses to individual questions is therefore 11–13. Quantitative results are summarised in Table 3 with illustrative quotations. Post-workshop questionnaire quotations are from PGDE students only, as their email versions gave full responses, whereas the group workshops for BA/BEd students provided limited notes.

Table 3. Summary of quantitative responses to post-workshop and follow-up questionnaires with illustrative quotations.

Immediately after the workshop, responses to post-workshop questionnaires showed all participants agreeing that workshop participation had helped them to reflect on the use of research evidence (Q5), with 11 helped to identify or clarify relevant issues in using research (Q6). Nine reported that workshops had made them feel better equipped to use research evidence (Q7), and seven felt more able to identify potential solutions to using or providing research evidence (Q8). Participants were more evenly split on identifying how universities might provide research evidence for pre-service teachers (Q9).

Around a month later, follow-up questionnaires were returned by 10 participants. These reported few changes or planned changes in participants’ practices. Overall, a lack of opportunity and intention to apply research was signalled. Aim 4, identifying student views on workshop participation and changes effected, was therefore met.

Aim 5. Student awareness and views of educational research: quantitative responses and semantic themes

Quantitative responses to the yes/no questions on pre-workshop Q1, the six questions about Research Summary 1 and the six about Research Summary 2 were summated across workshops and are reported in Table 4.

Table 4. Summary of quantitative responses to Pre-Workshop Q1 and questions on research summaries.

As Table 4 shows, most participants were unsurprised by the findings of Research Summary 1 (Sosu & Ellis, 2014), were aware of the issues it raised and reported that they could use the evidence in their practice, although most also considered that it could become more informative for practice and that it raised further research questions.

The ‘what works’ Research Summaries 2 were much less familiar to participants. As Table 4 indicates, none of the students were familiar with the WWC or EEF as sources of research evidence, and around half were unaware of the approaches discussed (Reciprocal Teaching or Setting and Streaming), although 10 reported that they could use the findings. Again, most considered that the summaries could become more informative for practice, and that they raised further research questions.

Responses to the 10 statements in the Views of Statements about Educational Research activity were collated and are summated in Table 5.

Table 5. Participants’ views of statements about educational research.

One participant indicated both agreement and disagreement with Statement 9, so responses to individual statements range from 12 to 13. The strongest consensus was on Statement 1, whose negative implication for educational research was not close to participants’ views, and on Statements 2, 9 and 10, whose positive statements were endorsed. Other statements showed mixed views.

Thematic analysis of written comments and transcribed group discussion was undertaken as described above. The seven final themes were:

  1. Role barriers to using research (including work pressure, time constraints, the belief that research is the job of academics; and that it is not the teacher’s role to adapt research to their individual work context).

  2. Research findings are evaluated in relation to prior personal experience.

  3. Expressing the need for the practical application/s of the research to be clear.

  4. Expressing the need to extend the applicability of the research beyond the restricted research population and characteristics sampled.

  5. In/accessibility of the research message and the need for clarity and readability.

  6. Lack of/opportunities to apply research.

  7. Students engaging with research outside their course requirements.

These themes are illustrated by quotations in Table 2, with grammar in quotations conventionalised.

Thematic analysis of the data suggested that participants considered research would preferably be more informative for teachers to use in practice (Theme 5), and stressed the need for teachers to have time to learn, understand and ‘keep up’ with current research thinking (Theme 1). Mistrust of evidence and challenges to research methodologies were expressed in relation to (unnamed) studies participants had read whose results were not clearly expressed (Theme 5), or were not useable or applicable to actual classroom populations (Theme 4) or practices (Theme 3). Some participants considered that research papers expressed the authors’ views rather than presenting evidence, and so were open to differences of opinion (Theme 5) and to studies contradicting each other. Students stressed teachers as users, rather than producers, of research (Theme 1) and a resulting need for research to be accessible to teachers and jargon-free (Theme 5), which was considered frequently to be far from the case. Research was considered in terms of its potential for application within the personal work context (Themes 2 and 3): teachers required information on how it could be adapted to classrooms, and for materials and organisational issues to be explained. Participants noted (Theme 6) that schools rather than individual teachers were responsible for introducing research-based practices and that, as new teachers, they might have a limited role to play here. However, some students were also identifying relevant research evidence outside their course requirements (Theme 7).

The themes identified require further evidence and exploration. However, Aim 5, to make a preliminary analysis of student views on education research, was met.

Discussion of findings

Methodological issues

The number of participants attending pilot workshops was small, from one institution, and participants self-selected. However, they did represent entrants to primary and secondary sectors, and were from all stages of under- and post-graduate courses. The study achieved its aims by producing workshop activities that elicited analysable student views, using research examples from critical policy and ‘what works’ approaches that could be updated and used by other higher education establishments. However, some further methodological issues should be noted.

The research summaries used were judged by the research team and their staff colleagues to be accessible and suitable for student use. They may not be representative of the full range of educational research encountered by students.

The use of yes/no questions was intended to force a binary choice from participants, but some felt unable to choose either option. Reluctance to commit was at times accompanied by a comment explaining the uncertainty, which the non-categorical response accurately represented. More finely graded response scales should be considered.

Respondents in the email workshop returned full individual written answers, and in particular fuller responses to the post-workshop questionnaire than were obtained by group workshops. Written responses saved researcher transcription time, allowing rapid collation of responses. This might be useful when planning course content or offering prompt feedback to student cohorts. Timetabling difficulties were avoided, and email workshops have the potential to reach more participants. Email respondents also have greater anonymity than afforded by face-to-face group discussion if their names are concealed by their institutional email addresses, and this may encourage the expression of individual views unaffected by group interaction. Emails thus offer an efficient way to seek students’ views, especially for larger cohorts, albeit at the expense of group discussion. It is not possible to determine from this pilot whether there were any systematic differences in responses between group and email formats: numbers are small and only post-graduate students towards the end of their courses were asked to use the email version. However, at present email appears to be a viable option for such experienced students, with group workshops perhaps preferable if supportive discussion or clarification from a workshop leader is required for less experienced students.

Impact of the workshops

In both workshop formats the focus and reflective tasks with which participants were asked to engage reportedly led to some feeling better equipped to understand and use research. In the case of email respondents this occurred without feedback or discussion, and so any developments must have been self-generated through personal reflection.

Responses to Post-workshop Questionnaires, summarised in Table 3, showed most participants agreeing that workshop participation had helped them reflect on the use of research evidence; to identify or clarify relevant issues in the use of evidence; to feel better equipped to use evidence; and better able to identify potential solutions regarding using or providing research evidence. But Follow-up Questionnaires a month later recorded few changes or planned changes in their practice. This was partly due to limited practice opportunities, e.g. over the vacation period, and to the uses they were already making of research evidence, but some fundamental problems were also signalled, related to mistrust and inaccessibility of research.

Such mistrust may partly be due to student knowledge of what constitutes a valid empirical research study. Although written for a professional audience, ‘what works’ research summaries in particular require some understanding of the quality of the studies reviewed, and of how systematic reviews are constructed. As indicated, none of the students were familiar with WWC or EEF as sources of research evidence, and the fact that these bodies review studies of high scientific quality may not have been appreciated. Similarly, the policy research summary was welcomed for its clear conclusions, but its underlying evaluative principles were not mentioned. An understanding that study and review quality standards will affect the application and limits of research findings might be expected to influence views, but it was not clear that participants possessed such understandings, although they were suspicious of research that appeared to reflect only the authors’ views. Throughout the workshops, only one comment on methodological adequacy was recorded, where a BEd 4 student had been introduced to the concept of study quality in the final year of their degree programme. Students in early years of their courses may be introduced to research evaluation later, but PGDE participants at the end of their studies did not raise issues of research quality or empirical standards. When sourcing evidence, faculty staff did not suggest any ‘what works’ examples, although it is possible that further explanation to colleagues would have elicited them. The alternative possibility is that ‘what works’ evidence did not feature greatly in teacher education courses at the time. This issue could be explored in future research studies. However, workshop participants did not express the negative views about ‘what works’ research reported by Connolly (2014): as Table 4 shows, 10 participants thought the ‘what works’ summary they read could be used to inform their practice.

The study is too small to report conclusive findings on student views of research, but some pointers and areas for further study emerge. New themes would no doubt emerge in further workshops, and from students on different teacher education programmes. But as stated, thematic analysis showed some discontent with educational research reported by these pilot students, and their comments perhaps provide messages for teacher educators preparing students to transition into a work environment where evidence-based teaching practice is strongly encouraged by policy, but where individual teachers may lack agency in implementing research findings. Students are likely to succeed better in this context if they possess sound tools with which to evaluate research quality, and the workshops suggested that some current students may lack these skills. Higher education establishments are in an excellent position to provide relevant information and de-mystify what constitutes ‘good’ research studies. Further research on students’ knowledge and understanding of the constructs of applied research and of implementation science would be useful in seeking to influence the uptake, adoption, and implementation of evidence-based policy and interventions.

The themes identified in the project enhance understandings of the positioning of teachers as research users, and form a sound basis for discussions with student teachers. They show resonances with the views of earlier teachers reported in studies reviewed above, and offer messages for educational researchers. Participants’ identification of the teacher as research user is close to the BERA-RSA (2014a) view of teachers as ‘discerning consumers’ of research. A main message for academics is therefore to write accessible research summaries for professional audiences, with further clarity and explanation to ensure even better understandings. Student comments in Table 2, Themes 1, 3 and 5, on how universities and academics could make research findings more accessible are thoughtful and apposite, and the need to consider teachers as one potential audience for research evidence, and to report accordingly, is clear. However, teachers do not independently decide on the use of research evidence, and schools rather than individual teachers may be responsible for introducing research-based practices (Theme 6). Considering how evidence can be better mediated for and introduced to schools is necessary. Current accessible research syntheses of evidence on pedagogic and organisational factors, as disseminated by the EEF Teaching and Learning Toolkit and WWC Quick Reviews, are intended to support such teacher understanding, and further research on how far such approaches in fact meet student and teacher needs, and what else is required, would be welcome.

Conclusion

The study workshops identified facilitators and barriers to students’ understandings of educational research, and provided a feasible means of identifying views across student cohorts that could also be used by other higher education establishments. The study produced materials that allowed the student voice to be heard, and participants expressed a clear, appropriate and achievable need to access useable educational research that is easy to interpret and apply, in order to support their work with diverse pupils. The role of higher education in delivering this requirement is also clear, appropriate and achievable.

Acknowledgments

The authors would like to acknowledge the support of the Higher Education Academy in funding this project, and the contributions to education of the student teacher pilot participants.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the Higher Education Academy (HEA) as a Social Science Priority Research Strand.

References

  • BERA-RSA. (2014a). The role of research in teacher education: Reviewing the evidence. London: British Educational Research Association. https://www.thersa.org/globalassets/pdfs/reports/bera-rsa-interim-report.pdf
  • BERA-RSA. (2014b). Building the capacity for a self-improving education system: Final report of the BERA-RSA Inquiry into the role of research in teacher education. London: British Educational Research Association. https://www.bera.ac.uk/wp-content/uploads/2013/12/BERA-RSA-Research-Teaching-Profession-FULL-REPORT-for-web.pdf
  • Bi, H.Y., Wu, H.P., Su, X.Y., & Roberts, S.K. (2017). An examination of Chinese preservice and inservice early childhood teachers’ perspectives on the importance and feasibility of the implementation of key characteristics of quality inclusion. International Journal of Inclusive Education, 21, 187–204.
  • Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3, 77–101.
  • Connolly, P. (2014). The advance of evidence-based approaches: Key lessons from Ireland. Keynote address, ‘Better Evidence for a Better World’, Campbell Collaboration Colloquium 2014, Queen’s University Belfast, Belfast, 16–19 June 2014. Retrieved 15 June 2017 from http://paulconnolly.net/talks/index.html
  • European Agency for Development in Special Needs Education. (2011). Teacher education for inclusion across Europe - Challenges and opportunities. Odense, Denmark: Author.
  • Florian, L., & Pantić, N. (Eds). (2013). Learning to teach. Part 2: Exploring the Distinctive Contribution of Higher Education to Teacher Education. York: The Higher Education Academy. https://www.heacademy.ac.uk/learning-teach-exploring-distinctive-contribution-higher-education-teacher-education
  • Hemsley-Brown, J.V., & Sharp, C. (2004). The use of research to improve professional practice: A systematic review of the literature. Oxford Review of Education, 29, 449–470.
  • Leat, D., Reid, A., & Lofthouse, R. (2015). Teachers’ experiences of engagement with and in educational research: What can be learned from teachers’ views? Oxford Review of Education, 41, 270–286.
  • Lombard, B.J.J., (Kopus), & Kloppers, M. (2015). Undergraduate student teachers’ views and experiences of a compulsory course in research methods. South African Journal of Education, 35, 1–14.
  • Ratcliffe, M., Bartholomew, H., Hames, V., Hind, A., Leach, J., Millar, R., & Osborne, J. (2004). Science education practitioners’ views of research and its influence on their practice. Evidence-based Practice in Science Education (EPSE) Research Network Project 4. York, UK: Department of Educational Studies, University of York. http://eprints.soton.ac.uk/58240/1/epse_sci_ed_res.pdf
  • Schleicher, A. (Ed.). (2012). Preparing teachers and developing school leaders for the 21st Century: Lessons from around the world. Paris: OECD Publishing. http://www.oecd.org/site/eduistp2012/49850576.pdf
  • Sharma, U. (2012). Changing pre-service teachers’ beliefs to teach in inclusive classrooms in Victoria, Australia. Australian Journal of Teacher Education, 37, 53–66.
  • Sosu, E., & Ellis, S. (2014). Closing the Attainment Gap in Scottish Education. York: Joseph Rowntree Foundation. https://www.jrf.org.uk/report/closing-attainment-gap-scottish-education
  • Tatto, M.T., & Furlong, J. (2015). Research and teacher education: Papers from the BERA-RSA Inquiry. Oxford Review of Education, 41, 145–153.
  • Tournaki, N., & Samuels, W.E. (2016). Do graduate teacher education programs change teachers’ attitudes toward inclusion and efficacy beliefs? Action in Teacher Education, 38, 384–398.
  • Williams, D., & Coles, L. (2007a). Teachers’ approaches to finding and using research evidence: An information literacy perspective. Educational Research, 49, 185–206.
  • Williams, D., & Coles, L. (2007b). Evidence-based practice in teaching: An information perspective. Journal of Documentation, 63, 812–835.
  • Winch, C., Oancea, A., & Orchard, J. (2015). The contribution of educational research to teachers’ professional learning: Philosophical understandings. Oxford Review of Education, 41, 202–216.
  • Zeuli, J. (1994). How do teachers understand research when they read it? Teaching and Teacher Education, 10, 39–55.

Appendix 1: Text of PGDE Email Workshop [logos and identifying information removed]

Participant Information Sheet

Name and address of department: [added]

Title of the study: The use of workshop materials summarising evidence-based classroom approaches to support student teachers in responding effectively to issues of diversity and inclusion.

Introduction

My name is [name] and I am [job title]. My colleague [name, job title] and I have been funded by the Higher Education Academy to explore the role that research-informed teaching plays in enabling student teachers to respond effectively to issues of diversity and inclusion in their practice and to closing the achievement gap.

What is the purpose of this investigation?

This investigation aims to investigate the use of workshop materials summarising evidence-based classroom approaches to support student teachers in responding effectively to issues of diversity and inclusion.

What would you do in the project?

This study would involve participants completing a workshop and returning a survey by email. In this workshop participants will be asked to consider research evidence summaries and to fill in two questionnaires. Completion of the workshop will take about 90 min. A follow-up questionnaire will be sent out a month after the workshop: this will take around 25 min to complete. The questionnaires will ask about participants’ current use and views on the role of research evidence in practice and their reflections on barriers and facilitators in relation to the use of research evidence in practice. Those participating via email will receive materials, questionnaires and research summaries individually via email, and will also return their responses by email. Email participants will consent to the study by returning their responses.

When each participant completes and returns their follow-up questionnaire, we will forward a certificate recognising completion for their individual CPD files.

Why have you been invited to take part?

We wish to include participants from [university name] teacher educator courses relating to different educational stages. All students in years 1–4 of the BA/BEd, and all students on PGDE courses, are eligible to take part in the study. Completion of the workshop can be included in your list of CPD activities.

Do you have to take part?

Participation in this study is entirely voluntary and it is your decision whether or not you would be willing to take part. If you decide to take part, you can still withdraw from the study if you wish at any point without having to give a reason. If you decide not to participate in this research, or to withdraw from the research, please be assured that this will not affect your experience with your university course in any way.

What are the potential risks to you in taking part?

There are no potential risks identified in taking part in this research.

What happens to the information in the project?

The results of this study will be presented in a report to the Higher Education Academy, and it is hoped that the findings will be published in academic journals and presented at conferences. Anonymity of the participants is assured. The data will be stored securely and securely destroyed after the completion of the study.

The University of Strathclyde is registered with the Information Commissioner’s Office who implements the Data Protection Act 1998. All personal data on participants will be processed in accordance with the provisions of the Data Protection Act 1998.

Thank you for reading this information – please contact me to ask any questions if you are unsure about what is written here.

What happens next?

If you are happy to be involved in this study, please read the attached consent form to confirm this, read the workshop programme, and complete the workshop activities. Please send the ‘Please Return’ Section to [name] at the email address below.

If you do not want to be involved in the study, thank you for your attention.

Researcher contact details: [added].

This investigation was granted ethical approval by the [name of] Ethics Committee.

If you have any questions/concerns, during or after the investigation, or wish to contact an independent person to whom any questions may be directed or further information may be sought from, please contact:

Convener of the [name] Ethics Committee

[Contact details added]

Participant Consent Form

Name of department: [added]

Title of the study: The use of workshop materials summarising evidence-based classroom approaches to support student teachers in responding effectively to issues of diversity and inclusion.

  • I confirm that I have read and understood the information sheet for the above project and the researcher has answered any queries to my satisfaction.

  • I understand that my participation is voluntary and that I am free to withdraw from the project at any time, without having to give a reason and without any consequences.

  • I understand that any information recorded in the investigation will remain confidential and no information that identifies me will be made publicly available.

  • I consent to being a participant in the project, by returning my responses.

Please Keep

Email Workshop Programme

Title of the study: The use of workshop materials summarising evidence-based classroom approaches to support student teachers in responding effectively to issues of diversity and inclusion.

Instructions

This version of the workshop is designed to be completed individually with responses returned by email. You are welcome to keep this Programme, but please return the whole section marked ‘Please Return’.

You can either add responses using Word and email, or print out the ‘Please Return’ section, write your responses and scan to pdf, and return the scanned version attached to the email. Returning the ‘Please return’ section constitutes consent, as detailed on the attached participant information sheet and consent form.

Research Summaries 1 and 2 can also be downloaded from the addresses given.

1   Pre-workshop Questionnaire – 10 min.

Please complete the pre-workshop questionnaire, taking about 10 min, adding responses under each question.

2   Views on Research – 10 min.

A list of statements about educational research is below. The same list is in your ‘Please Return’ section. Please tick one box in your ‘Please Return’ section next to each statement to indicate whether it is ‘Close to your views’ or ‘Not close to your views’, and add comments.

List of statements

  • 1   Education research isn’t helping people live with daily reality.

  • 2   In order to influence teachers’ practice, research-based teaching materials that translate findings into practical strategies are required.

  • 3   Teachers have concerns about their ability to evaluate research information.

  • 4   Teachers are less interested in research if they believe that the intention in sharing the research evidence is to impose a particular style or model on their teaching.

  • 5   Having research evidence for practice prevents inappropriate or time-wasting activities in class.

  • 6   Without strong research evidence for good practice, teachers can be pushed into doing whatever politicians dictate.

  • 7   Educational research is often not applicable to individual classroom situations.

  • 8   Research is often full of jargon and statistics that are hard to understand.

  • 9   Having research evidence for practice allows teachers to justify their professional decisions.

  • 10   Theory without practice is empty; practice without theory is blind.

3   Research Summary 1 – 25 min.

The next activity involves reading a piece of policy research published by the Joseph Rowntree Foundation and writing answers to questions.

Notes from the Joseph Rowntree Foundation website:

  • The Joseph Rowntree Foundation is an endowed foundation funding a UK-wide research and development programme.

  • We are independent, but we are not neutral: we are on the side of people and places in poverty.

  • We search out the underlying causes of poverty and inequality, and identify solutions – through research and learning from experience.

  • We demonstrate solutions – by developing and running services, stewardship of our land and buildings, innovating and supporting others to innovate.

  • We influence positive and lasting change – by publishing and promoting evidence, and bringing people together to share ideas.

Please read the Executive Summary of Sosu & Ellis (2014) Closing the Attainment Gap in Scottish Education. York: Joseph Rowntree Foundation (available at http://www.jrf.org.uk/files/jrf/education-attainment-scotland-summary.pdf) and write answers to these six questions on the Research Summary 1 sheet in your ‘Please Return’ section:

  • 1   Is this source of research evidence (the Joseph Rowntree Foundation) new to you?

  • 2   Are you surprised by the research findings?

  • 3   Were you aware of the issues discussed in the research?

  • 4   Do you think you could use this research evidence to inform your practice?

  • 5   Are there ways in which this research evidence could become more informative for teachers to use in practice?

  • 6   Does this raise questions you would like to have answered by further research?

This should take around 25 min.

4   Research Summary 2 – 25 min.

The next activity involves reading a ‘what works’ research review published by the Education Endowment Foundation and writing answers to questions.

Notes from the Education Endowment Foundation website:

  • The Education Endowment Foundation (EEF) is an independent grant-making charity dedicated to breaking the link between family income and educational achievement, ensuring that children from all backgrounds can fulfil their potential and make the most of their talents.

  • We aim to raise the attainment of children facing disadvantage by:

  ○ identifying and funding promising educational innovations that address the needs of disadvantaged children in primary and secondary schools in England;

  ○ evaluating these innovations to extend and secure the evidence on what works and can be made to work at scale;

  ○ encouraging schools, government, charities and others to apply evidence and adopt innovations found to be effective.

  • We share evidence by providing independent and accessible information through the Sutton Trust-EEF Teaching and Learning Toolkit, summarising educational research from the UK and around the world. This Toolkit provides guidance for teachers and schools on how best to use their resources to improve the attainment of pupils.

The paper is the EEF Teaching and Learning Toolkit (2015) summary of Setting or Streaming (available at https://educationendowmentfoundation.org.uk/toolkit/toolkit-a-z/ability-grouping). Please write answers to these six questions on the Research Summary 2 sheet in your ‘Please Return’ section:

  • 1   Is this source of research evidence (the Education Endowment Foundation) new to you?

  • 2   Are you surprised by the research findings?

  • 3   Were you aware of the approach (Setting or Streaming) discussed in the research?

  • 4   Do you think you could use this research evidence to inform your practice?

  • 5   Are there ways in which this research evidence could become more informative for teachers to use in practice?

  • 6   Does this raise questions you would like to have answered by further research?

5   Follow-up information

Here is a list of key websites relevant to educational research that you might find useful.

The Campbell Collaboration Library of Systematic Reviews

http://www.campbellcollaboration.org/lib/

The Education Endowment Foundation

http://educationendowmentfoundation.org.uk/toolkit/

The Joseph Rowntree Foundation publications

http://www.jrf.org.uk/publications

The National Foundation for Educational Research publications

http://www.nfer.ac.uk/publications/

The What Works Clearinghouse

http://ies.ed.gov/ncee/wwc/findwhatworks.aspx

6  Post-workshop Questionnaire – 10 min.

Please complete the post-workshop questionnaire, taking about 10 min, adding responses under each question.

Your ‘Please Return’ section follows. Please complete it and return it with your responses. We will send a follow-up questionnaire by email around a month after receiving your completed ‘Please Return’ section. When you return your completed follow-up questionnaire, we will send you a certificate confirming your participation in the workshop, which might be useful for your CPD activity file.

Please send the ‘Please Return’ section to: [email added]

Please Return

Email Workshop Responses:

Title of the study: The use of workshop materials summarising evidence-based classroom approaches to support student teachers in responding effectively to issues of diversity and inclusion.

Please respond by typing answers, or by printing out this ‘Please Return’ section, writing answers and scanning to pdf. Please return by email to [added].

Pre-workshop Questionnaire

1.  How far would you say that your current practice is based on research evidence?

Please tick   Very little   Quite a lot    A lot

Can you tell us more?

2. What kinds of research evidence do you use?

Responses

3. What other sources of information, if any, is your practice based on?

Responses

4. How useful do you find research evidence in informing your practice?

Responses

5. What sources do you use to access research evidence?

Responses

6. Are there any barriers to or problems in using research evidence to inform practice?

Responses

7. Are there any factors that help/facilitate using research evidence to inform practice?

Responses

8. Could you explain why you do, and/or why you don’t, use research evidence?

Responses

9. Could you briefly sum up your views on educational research?

Responses

Views on Statements about Educational Research

Here is a list of 10 statements about educational research. Please complete the chart, ticking whether each statement is ‘Close to your views’ or ‘Not close to your views’, and adding comments.

Research Summary 1

Please answer these six questions regarding Research Summary 1, Sosu & Ellis (2014).

1. Is this source of research evidence (the Joseph Rowntree Foundation) new to you?

Please tick    Yes    No

Can you tell us more?

2. Are you surprised by the research findings?

Responses

3. Were you aware of the issues discussed in the research?

Responses

4. Do you think you could use this research evidence to inform your practice?

Responses

5. Are there ways in which this research evidence could become more informative for teachers/practitioners to use in practice?

Responses

6. Does this raise questions you would like to have answered by further research?

Responses

Research Summary 2

Please answer these six questions regarding Research Summary 2, the EEF Teaching and Learning Toolkit summary of ‘Setting or Streaming’.

1. Is this source of research evidence (the Education Endowment Foundation) new to you?

Please tick    Yes    No

Can you tell us more?

2. Are you surprised by the research findings?

Responses

3. Were you aware of the approach (Setting or Streaming) discussed in the research?

Responses

4. Do you think you could use this research evidence to inform your practice?

Responses

5. Are there ways in which this research evidence could become more informative for teachers/practitioners to use in practice?

Responses

6. Does this raise questions you would like to have answered by further research?

Responses

Post-workshop Questionnaire

1. Could you suggest steps that you could take to help you use research-based evidence/approaches in practice?

Responses

2. Could you suggest steps that you could recommend to other student teachers to help them use research-based evidence in practice?

Responses

3. Could you suggest steps researchers could take to help their research to be used in practice by teachers?

Responses

4. Could you suggest steps universities could take to help student teachers use research-based evidence in practice?

Responses

5. Has participating in this workshop helped you to reflect upon your use of research evidence?

Please tick    Yes      No.

Can you tell us more?

6. Has participating in this workshop helped you to identify or clarify potential issues in using research evidence?

Please tick    Yes      No.

Can you tell us more?

7. Has participating in this workshop helped you to feel more equipped to use research evidence?

Please tick    Yes      No.

Can you tell us more?

8. Has participating in this workshop helped you to identify potential solutions for yourself/other students in using and providing research evidence?

Please tick     Yes      No.

Can you tell us more?

9. Has participating in this workshop helped you to identify potential solutions for universities in providing research evidence?

Please tick    Yes      No.

Can you tell us more?

This is the end of the email workshop; thank you for completing it. We will send a follow-up questionnaire around a month after receiving your completed ‘Please Return’ section. When you return your completed follow-up questionnaire, we will send you a certificate confirming your participation in the workshop, which might be useful for your CPD activity file.

Please return to: [added]

Follow-up Questionnaire

Title of the study: The use of workshop materials summarising evidence-based classroom approaches to support student teachers in responding effectively to issues of diversity and inclusion.

Please return by email to: [added]

As part of the email-workshop, you completed questionnaires about your views on and uses of research. The following questions ask whether there have been any changes following the email-workshop.

  1. Has your intended practice based on research evidence changed at all, and if so how?

  2. Has the kind of research evidence you do/will use changed at all, and if so how?

  3. Have other sources of information that your practice is/will be based on changed at all, and if so how?

  4. Has the usefulness of research evidence in informing your practice changed at all, and if so how?

  5. Have the sources you use to access research evidence changed at all, and if so how?

  6. Has your view of barriers to or problems in using research evidence to inform practice changed at all, and if so how?