
‘It’s hard to grow when you’re stuck on your own’: enhancing teaching through a peer observation and review of teaching program

Pages 54-68 | Received 07 Nov 2019, Accepted 06 Apr 2020, Published online: 13 Sep 2020

ABSTRACT

Rapid technological and social changes have prompted a strong focus on teaching practices in higher education. Among the assortment of programs and approaches aimed at developing teaching practices, peer review and observation of teaching remain widespread for their efficiency and potential to be transformative. Though such programs are well described in the literature, whether or how they affect practices remains under-researched. This study reports on the accounts of past participants – with respect to changes in their teaching practices – in a peer review and observation of teaching program run in the Faculty of Science at a large, research-intensive university. Results from a questionnaire and interviews across several years’ cohorts indicate that participants overwhelmingly believed that the program enhanced their teaching practice and that they continued to practice specific teaching strategies developed during their participation in the program. Particular features of the program were associated with its success, such as the role and experience of the ‘reviewer’ and the review cycle. The program was thought to have changed the perspective of participants in the way they think about teaching.

Background

Providing support for university teachers to enhance their teaching practice has the potential to improve student learning and graduate outcomes. Peer review of teaching is a supportive process designed for this purpose (Gosling, Citation2014; Harvey & Solomonides, Citation2014). Peer observation of teaching, a process in which an observer watches a colleague in their practice with the main aim of learning about teaching strategies through that observational experience, has also led to self-reported improvements in teaching (Hendry et al., Citation2014). Evidence for the effectiveness of peer review for professional development includes improved teaching practice, increased confidence and self-reflection, and engagement in the scholarship of teaching and learning (Barnard et al., Citation2011; M. Bell & Cooper, Citation2013; Chalmers & Hunt, Citation2016; Fletcher, Citation2018; Hammersley‐Fletcher & Orsmond, Citation2004; McMahon et al., Citation2007). Effective peer review programs also benefit the institution by improving the collegial culture, bringing staff closer together, and disseminating effective teaching methods (M. Bell & Cooper, Citation2013; A. Bell & Mladenovic, Citation2008).

Despite the reported benefits of peer review, Sachs and Parsell (Citation2014) claim that peer review is ‘neither systematically supported nor generally perceived to be a high-quality developmental activity’ (p. 2). Problematic issues with the implementation of peer observation and review programs include: being seen as a threat to academic freedom and autonomy; reviewers being considered subjective; reviews being inaccurate or misrepresentative; and a lack of incentive for staff to participate (Hammersley‐Fletcher & Orsmond, Citation2004; Shortland, Citation2004). In the higher education context, collaborating across different disciplinary cultures also poses a particular challenge (Neumann, Citation2001; Taylor et al., Citation2000).

Researchers have characterized several different models of peer observation or review, offering flexibility for institutions to tailor an approach to suit specific needs. For example, a model might have either a ‘review’ or ‘observation’ focus, include different reviewers (e.g., a peer or an expert), be used for formal and informal processes (such as accreditation versus development), or be compulsory or optional (Gosling, Citation2002).

However, although there is considerable detail available in the literature regarding the various characteristics of programs, there is less research on whether these programs have actually achieved their aims (Hammersley‐Fletcher & Orsmond, Citation2004; Yiend et al., Citation2014). For example, according to Gosling (Citation2009) participants were ‘ill equipped … to evaluate and provide feedback on the effectiveness of others’ teaching’ (p. 7), while according to Hammersley‐Fletcher and Orsmond (Citation2004): ‘The way in which schemes have been implemented has been well documented, but how the PoT [Peer Observation of Teaching] scheme develops and grows has not been so well researched’ (p. 501). Yiend et al. (Citation2014) also question whether engagement in peer review and observation programs actually develops participants’ critical reflection (Hatzipanagos & Lygo-Baker, Citation2006).

One key question appearing in many studies is related to ‘who’ is doing the reviewing. Shortland (Citation2010), for example, supports the ‘peer review’ model as an ‘equal’ is more likely to create an accommodating environment, thus limiting the possibility of intimidation. However, Hatzipanagos and Lygo-Baker (Citation2006) found that peer reviewers (as opposed to expert reviewers) are often motivated to offer only positive reviews, thus limiting any benefits to be gained by receiving and understanding the content and purpose of constructive criticism. In a similar juxtaposition Hatzipanagos and Lygo-Baker (Citation2006) found that a reviewer external to the discipline, an academic developer or central teaching and learning academic, is better suited to the role, while Georgiou et al. (Citation2018) found that both educational expertise and familiarity with the broad disciplinary culture are key factors in establishing a successful relationship between peer and participant.

Since its inception in 2014, the program that is the focus of this study underwent evaluation and development over a period of three years. As part of this evaluation, it was established that the program was extremely well-received by participants (Georgiou et al., Citation2018). Factors that were found to be critical for the success of the program, where success related to the aim of the program to support the professional development of the participant, included participant appreciation of feedback, specialized support from an experienced discipline teacher as ‘peer’, and the collegiality experienced. The main aims of the present study build on and extend this previous research, to explore teachers’ experiences of developing their mainly face-to-face teaching practices through participation in the program and their views on whether these developments endured afterwards.

The peer observation and review of teaching (PORT) program

The PORT program (originally named Peer Review of Teaching, or PRT) has been running in the Faculty of Science at a research-intensive university in Australia since 2014 (Georgiou et al., Citation2018; Sharma & Georgiou, Citation2017). While initially the program was purely voluntary, it is now also an essential requirement for all new staff joining the Faculty of Science, who must complete a compulsory three-day program run by the University’s academic development unit.

The PORT program discussed here is unique in Australian universities in that it combines a process of collegial peer review and observation. The main aim of the program is to enhance teaching practice to enable quality student learning. It involves: (1) initial goal setting, (2) reviewer constructive feedback, and (3) participant observation of peers.

Figure 1. Outline of the peer observation and review of teaching (PORT) program

The program begins with a goal-setting workshop in which participants, in discussion with each other and workshop facilitators, identify specific objectives for enhancing their teaching practice. Participants are also asked to identify ‘indicators’ or ‘measures’ of how they will know their objectives have been met. These objectives and indicators are recorded and serve to inform subsequent peer review discussions. The workshop is designed to give participants ownership of the professional development process and provide focus to their subsequent peer review discussions.

Following the workshop, each participant is observed by a reviewer – experienced in pedagogical practices and with some disciplinary expertise or awareness – on at least two occasions. A peer review and observation form is used during observations to record activities relevant to the participant’s objectives. On the first occasion the participant is encouraged to teach ‘as normal’. In the post-observation meeting, the participant is then asked to reflect on ‘how they went’ in light of their stated objectives. In the ensuing discussion, various new strategies and approaches are discussed. The participant then chooses one or more of these strategies to trial in their second observation/review. In the post-observation meeting that follows this round, the participant is again asked to reflect on whether their objectives were met in relation to the agreed indicators. These post-observation discussions are designed to resonate with the ‘evidence-based’ expectations of STEM lecturers. For example, a participant’s objective might be to increase student engagement. Reviewers can report on whether this objective has been met by undertaking an intermittent visual scan and recording student behaviours.

Participants are also required to perform at least two observations of other teaching sessions during the program. These observations are made available by way of a timetable that is compiled by the PORT team and includes different types of sessions with a range of teaching strategies and techniques. Wherever possible in terms of time commitments, participants are encouraged to choose sessions that include strategies they wish to try. They are also given a specially designed form, the ‘Lecture Activity and Student Engagement’ (LASE) form, to complete during their observations in an effort to encourage a critically reflective eye.

The Colloquium is a forum held at the end of the year, designed for participants to reflect on their experiences throughout the program (see Georgiou et al., Citation2018 for more details).

Aims

This study aims to evaluate the impact of a hybrid PORT program on university lecturers’ practice. Specifically, we are interested in addressing the following three research questions:

  1. What, if any, are the changes participants made to their teaching practices as a result of engagement in the PORT program?

  2. Were these changes sustained after engagement in the PORT program?

  3. Which elements of the program were important in encouraging these changes?

Methods

The sample and two main survey methods, a questionnaire and individual interviews, are detailed below. Broadly, the questionnaire provided answers to research questions 1 and 2, which were related to identifying the changes in teaching practices and determining whether they were sustained, whilst the interviews provided an in-depth understanding of how the program impacted teaching practices, thus providing answers for research question 3. However, as will be evident, both tools were utilized to answer all three research questions overall.

Recruitment and sample

Forty-five academic staff completed the PORT program between 2014 and 2016. All but two of these participants were invited, by email, to complete an online questionnaire, with a further option of taking part in a post-questionnaire interview. Two could not be contacted. Of the remaining 43 participants, 16 completed an online questionnaire. This equates to a 37% response rate, which is the minimum required for questionnaires (Nulty, Citation2008, p. 310). Three of the participants were from the 2014 cohort, nine from 2015, and four from 2016. In terms of level of appointment, there were two level A (associate lecturer), eight level B (lecturer), three level C (senior lecturer), two level D (associate professor), and one unknown (participant did not provide their level of appointment). Most (ten) had up to two years of teaching experience. Five staff participated in an individual interview about their experience of the program in addition to completing the online questionnaire. We utilized convenience sampling and thus did not collect demographic information from interview participants.

Questionnaire

A 35-item online questionnaire was designed by the research team and a link was emailed to those who responded to the initial invitation. The questionnaire consisted of a mixture of 30 Likert-style closed questions and five open-ended response questions. The questions primarily focused on distinct aspects of the PORT program and included the following areas:

  • Demographics and context (Q1-Q5)

  • Initial workshop experience (Q6-Q8)

  • Peer observation (Q9-Q18: including two open-ended questions)

  • The review process/cycle (Q19-Q28: including two open-ended questions)

  • Experience of the final colloquium (Q29-Q30)

  • Whether any changes in teaching practice were sustained (Q31-Q33)

  • Whether participants would recommend the program and had any suggestions for improvement (Q34 & Q35: including one open-ended question)

Likert-style responses are reported as numbers of responses (or percentages) for each given question. For open-ended responses, quotations are provided, as appropriate, for illustrative purposes.

Interviews

Semi-structured, 60-minute, face-to-face interviews were conducted with each of the five participants by the lead author. The recordings of these interviews were transcribed verbatim. Following this, three members of the research team analyzed the interview transcripts in the five phases of thematic analysis outlined by Braun and Clarke (Citation2006), namely: (1) becoming familiar with the data; (2) generating initial codes; (3) searching for themes; (4) reviewing themes; and (5) defining and naming themes. Each member of the team individually carried out the first three phases, by closely reading each transcript and distilling the main points. Together, themes were reviewed and agreed upon, and quotes were selected to illustrate each theme. Interview quotations are ascribed to individual interviewees with the labels I1, I2, and so forth.

Results

Did participants change their practice and were these changes sustained (research questions 1 and 2)?

Overall, all questionnaire respondents (100%) reported that they are still using new teaching strategies that they developed in the PORT program (Q31; 13 of 16 stating they ‘frequently’ do, with the other 3 indicating ‘always’), and all (100%) would recommend the program to other colleagues (Q34).

In terms of the development of teaching practices, respondents overwhelmingly noted that they felt confident in applying a new strategy, with 15 of the 16 respondents to the questionnaire selecting ‘agree’ or ‘strongly agree’ on this question (Q23; 93.75%) and most agreeing that the trial of the new strategy was successful (Q25; 87.5%) (Figure 2).

Figure 2. Selected responses related to teaching practices from questionnaire (n = 16). Note that there were no responses in the Disagree or Strongly Disagree categories

Supporting the findings in the questionnaire, all five interview participants were very satisfied with their experience in the PORT program, finding it enjoyable, helpful, and valuable for their teaching.

For instance, one participant (I2) discussed implementing a practice they had observed in the following semester:

I’m teaching next semester [and I’m planning to] put a little more material online and get them to do some pre-readings and those sorts of things and then spend more time going over detailed stuff in the lecture rather than try to cram it all. Which was something I think that I got from the chemistry lectures I’ve sat in.

Which elements of the program helped encourage a change in teaching practice (research question 3)?

From the questionnaire we were able to glean an idea of which aspects of the program participants valued most and were most significant in terms of changing their teaching practice. All respondents to the questionnaire agreed or strongly agreed that the workshop, held at the beginning of the program, was useful for explaining the program (Q6; 100%) and for reflecting on their teaching (Q7; 100%), and 70.58% agreed or strongly agreed that discussion with colleagues was useful for setting teaching objectives (Q8). Fewer respondents agreed/strongly agreed that the colloquium, held at the end of the program, was useful for reflecting on teaching (Q29; 56.25%) or learning how to enhance teaching (Q30; 37.5%). Most participants responded ‘neutral’ for these questions (one respondent selected ‘disagree’ for Q30) and notably, several commented they were unable to attend the end-of-year colloquium.

When asked about the review cycle, all 16 respondents agreed or strongly agreed that being observed in their teaching (Q20; 100%) was useful for learning how to enhance their practice and that they were satisfied with the review process (Q26; 100%). With regard to feedback, all respondents felt reassured by feedback received from their reviewer about their teaching (Q22; 100%) (Figure 3) and most were motivated to try a new teaching strategy (Q23; 93.75%).

Figure 3. Selected responses related to review process from questionnaire (n = 16). Note that there were no responses in the Disagree or Strongly Disagree categories

There was some additional detail in the open response section, with most respondents saying they valued the feedback that they received from their peer reviewer because it gave them information about their students’ levels of engagement or an ‘audience perspective’, or new strategies to try:

I really found it useful to know what students were doing while I was lecturing and at what points I had them most engaged and times when I lost them. It was very helpful to get this feedback.

While respondents valued the guidance that they received, they also felt a little anxious initially about being observed or ‘reviewed’ (Q28; open-ended question related to challenges experienced as part of program):

It is quite daunting have someone come to observe you teaching even if the atmosphere is that of support and guidance – you are still putting yourself out there for some kind of evaluation.

When considering participants’ roles as observers, 88.24% thought that observing colleagues in their teaching situations was useful for learning how to enhance their practice (Q10) and most agreed or strongly agreed that they were also generally satisfied with their experience as an observer (Q16; 93.75%).

Most respondents valued observing a colleague’s teaching situation because it enabled them to watch students’ reactions to their colleague’s practices and monitor students’ levels of engagement, and experience the situation from a student’s perspective. They used this experience to reflect further on their own practices (Q17; open-ended questions related to what was useful about the peer observation program).

[The most useful thing was] being able to focus on the teaching strategies and how students react to them.

Watching others helped me to benchmark my teaching – its effectiveness and the types of strategies I use.

In order to understand in greater depth the way the program impacted participants’ teaching practices, and to answer research question 3, four themes were identified in the interview data: (1) enhancement of teaching practice (through review); (2) enhancement of teaching practice (through observation); (3) development of reflective practice; and (4) creating confidence and collegiality. These are further described below.

Enhancement of teaching practice (through review)

The questionnaire results indicated that all surveyed participants made changes in their teaching practice by adopting and trying out new strategies to engage their students. Interviews demonstrated that participants valued receiving specific feedback about their teaching ‘style’ and practical, tailored ‘tips’ or advice about new teaching strategies, and then having the opportunity to connect these suggested strategies to students’ learning experience. New strategies ranged from trying straightforward techniques to implementing more complex and involved approaches, which often involved some element of ‘active learning’, a popular movement in the science education research field (Freeman et al., Citation2014; Georgiou & Sharma, Citation2015). Examples of straightforward techniques included the teacher facing their students when they were talking, slowing the rate of their speech, using an appropriate font size on slides, or using ‘think-pair-share’ questions and activities with a clear and specific time limit.

More complex strategies included a fundamental change to the lecture structure, removing ‘content’ to include set breaks and periods of time for student activities, for example, using lecture response systems or worksheets to provide students with practice questions and problems (particularly in relation to difficult concepts).

Disciplinarity was part of the review process because many of the suggested strategies were discipline-specific pedagogies. As one participant put it:

That might have also been helpful because [the reviewer] understood a lot of the concepts and [they] understood when I was explaining a difficult concept and [they] could gauge how well the students were doing with that (I2)

Though disciplinary familiarity was important, participants also felt that it was important that the ‘peer’ was somewhat detached from their immediate network or department, partly because the focus of advice would then shift away from curriculum or discipline content. Participants who had previously been reviewed by a colleague from within their discipline or department – in a process which was not part of the PORT program – found that their colleague’s feedback focused mainly on content (or ‘content coverage’) and, for example, included advice about pace or the rate of content delivery, rather than teaching strategies that engage students with content:

My supervisor did come to a couple of my lectures and … gave me feedback. A lot of the feedback was also content related [whereas] my [PORT reviewer] probably focused more on style and things like that. (I2)

it was useful getting someone outside of chemistry because they concentrate more on my teaching style, which was good … Yeah. I guess it’s like a fine balance in that sense. (I2)

There was also one interviewee who noted that sometimes the feedback provided was not critical or meaningful due to the personal (friendly) relationship between colleagues in the same department:

I do but I don’t know how much – it’s hard to know how much meaningful feedback to take back from them. They’re all very positive but I don’t know if because we have a personal relationship, it’s a bit different. (I2)

Enhancement of teaching practice (through observation)

For most participants, their peer observations included seeing colleagues use some or all of the strategies recommended by their reviewer, and some teachers also learned additional new strategies during this phase of the program:

One of the tricks I picked up [was] when it’s time to do the worksheets, [my colleague would] increase the light in the room and then [students] automatically know it’s time to work or they can see their worksheet for one. When [my colleague had] given them ample time and [wanted] to bring their attention back [they would] just dim the lights and that automatically brings their attention [back], like very naturally … I use that all the time and it works fantastically. (I2)

Importantly, participants also saw the degree to which students were engaged and motivated, and the levels to which students’ attention was being captured in response to their colleague’s techniques, and this became a focus of their observational experience:

I was sitting at the back of the lecture hall … and then I was observing all the students, like they [had] lecture notes, like [on their] laptop, iPad lecture notes were open and they were not involved in other activities. Everyone was having [a] laptop, but they were not browsing [the] Internet … they were actually paying attention. (I3)

Development of reflective practice

There was evidence of an improvement of reflective practices amongst interviewees, with several references indicating increased awareness of students’ level of engagement in the learning and teaching situation. Participants valued the perspective on student engagement provided by their reviewer. This perspective from a ‘second pair of eyes’ was facilitated by the reviewer’s use of the LASE form (Georgiou et al., Citation2018), to note down what students were doing throughout the teaching session. Participants thought that this perspective was useful because it was one that they were not normally able to experience:

I think it was very useful to get that accurate feedback into what are students doing because I can’t look at them, what they are doing. To certain extent I can see people attempting something on the worksheet but that does not mean that actually they are focused. (I5)

The participants all mentioned their appreciation of the ‘structured’ parts of the program, specifically, setting their objectives, receiving their reviewer’s notes and using the LASE form in their peer observations. One participant mentioned that they were shown the notes from their reviewer, which were comprehensive and referred to general ‘style things but also specific incidents’ (I2), making the process more empirical and dispassionate. Another two respondents highlighted that taking on the student perspective was not a ‘one-off’, but involved a change in their point of view. For example, Interviewee 1 recounts: ‘So yes, you start to take on board these kinds of insights – it never occurred to me that the people [students] at the back would have a different view or perspective on the slides’ (I1).

Creating confidence and collegiality

Some participants felt ‘alone’ or isolated in their teaching role, and consequently they felt that any discussion about teaching practice was beneficial: ‘You’re very siloed … you have really no idea in terms of feedback from other people as to how you’re doing, and what are the things you could change … you’re very much on your own’ (I4). Another participant reflected that taking part in the PORT program meant that, despite teaching at the university for 20 years, they were able to meaningfully discuss teaching and learning for the first time: ‘[What] I do remember vividly was the ability to interact with my colleagues from a lot of faculties to talk about teaching and focus for a while through this process’ (I1).

Participants also valued reassuring or encouraging comments made by their peer reviewer (particularly in the second meeting) indicating that they were ‘doing a good job’, or engaging their students and being a successful teacher. This reassurance or reinforcement enhanced participants’ confidence to teach.

My [reviewer] was really good in pointing out the things that I was doing well, which was really encouraging … Some lectures I noticed it was getting a bit noisy in quite a large lecture hall but [they were] encouraging in [that they] pointed out that a lot of the talk, most of the talk was actually on topic. That was reassuring that – I do like that [the students] talk to each other about the topic, it shows they’re engaged. (I2)

Discussion

This study reports on the accounts of past participants of a Faculty-based peer review and observation program to explore university teachers’ experiences of developing their mainly face-to-face teaching practices. Past participants were surveyed to determine the impact of the program on teaching practices and whether any reported changes were sustained. Our quantitative data demonstrate that engagement with the program was overwhelmingly positive and all respondents indicated that they felt confident with trying a new teaching strategy. Furthermore, all respondents to the questionnaire indicated that they still either ‘frequently’ or ‘always’ employ the changed teaching practice(s) on completion of the program.

Our thematic analysis of the interviews explored how the program impacted practices. First, the relationship between the reviewer and participant was found to be vitally important. It is important, for instance, that feedback received as part of a program like this is accepted as valid. Rather than receive uncritical advice (Hatzipanagos & Lygo-Baker, Citation2006) or advice that was considered too critical (Gosling, Citation2002), a compromise between these two is necessary. To achieve this compromise, there needs to be a strong and trusting reviewer-participant relationship. The reviewer needs to offer explicit and immediate feedback that is appropriate to the participant’s teaching context. This requires some awareness of the disciplinary content and associated teaching approaches. In the present study, participants noted that some disciplinary familiarity was valuable. As Clarke and Reid (Citation2013) suggest, staying connected to the discipline and engaging in discipline-specific pedagogies is important for developing teaching practices and engaging academics in teaching-related activities. Participants also noted that some detachment is necessary, to prevent too strong a focus on ‘content coverage’, or subject and course-specific details. This ‘fine balance’, as one participant put it, between content knowledge and detachment, resulted in a strong inclination to immediately try the techniques that were suggested.

Another important factor in the take-up of new teaching practices is exposure. In the review process, participants were introduced explicitly to new teaching practices that might work in their context; however, observation of others’ teaching sessions also helped participants become aware of different ways of doing things. This result is consistent with previous research on university teachers’ experience of peer observation, which showed that teachers could see whether colleagues’ strategies were effective, and felt reassured that they were already doing some things well by watching students in a teaching situation (Hendry et al., Citation2014).

Importantly, once implemented, new teaching practices would not be likely to be sustained without reflection on the part of the participants. Our results show that participants highly valued – in both the peer observation and peer review phases – the opportunity to focus on what students were doing in the learning and teaching situation (lecture, practical class, or tutorial). This opportunity to view ‘through students’ eyes’ – either first-hand during peer observation or second-hand via a reviewer’s feedback in peer review – how teaching strategies were experienced by students was a key factor in motivating participants to try new strategies. Specifically, participants appreciated seeing tangible evidence of students’ engagement through use of the LASE form. Two interviewees gave examples of how they ‘see’ things differently now, and one described how they were able to go through a process of evaluating the implementation of their teaching practices at their new institution.

Finally, these elements seemed to be tied up with the final theme of ‘collegiality and confidence’. As our results show, participants felt reassured by feedback from their reviewer that some aspects of their teaching were good, and particularly in the second peer review meeting that the new strategies they had tried were working well, thereby increasing their confidence. This reassurance or positive affirmation enhanced participants’ confidence in themselves as teachers, or their self-belief in their ability to teach successfully, known as their teacher self-efficacy. The development of faculty members’ confidence as teachers is particularly important: a recent review of research in education shows that teacher self-efficacy is positively related to student motivation and achievement (Zee & Koomen, Citation2016). As one participant in our study commented, ‘confidence is really important. That really impacted on me, when I was a lot more confident, I think the students responded well’ (I2). Participants also very much appreciated the opportunity to feel part of a community. Academic staff in universities commonly do not hold a teaching qualification and often teach alone, with little interaction with their colleagues about teaching practices – this was noted by a number of interviewees. As one participant in the PORT program commented, this ‘makes it hard to grow [as a teacher] when you’re stuck on your own’ (I4).

Many studies of teaching development concede that legitimate and lasting changes in practice are difficult to achieve. In our study, participants not only changed their practice, but also their way of thinking about teaching, and we argue that this development represents a transformation. We posit that such change arises from being informed about and seeing similar strategies being implemented, which, together with experiencing the student perspective and increased confidence, leads to enhanced practice and self-efficacy (Figure 4). Acknowledging that the ultimate goal of developing teaching is to facilitate student learning and engagement, we note that in our particular context, several elements were identified, some of which agree with extant research on peer review and observation programs, such as the value of collegiality, while others seem to contradict accepted principles, such as the importance of expert review. Though much is known about the different models of peer observation and review programs, less is known about whether or how these programs achieve their aims and what role the discrete elements played in this achievement. This paper makes a contribution in this space by identifying the particular elements of one program and relating them to the program’s goal of transforming teaching practices.

Figure 4. A model of how factors that feature in the PORT program could achieve the aim of engaging students in disciplinary content


Limitations and further research

This study has limitations in terms of potential biases related to sample selection and self-reporting. In terms of the survey, there is the possibility that only respondents who had positive experiences chose to respond. Further, our selection of interview participants through convenience sampling might also be subject to bias. However, the triangulation between the two data sets does provide evidence of changes in teaching practice. Though self-reported, the detail in these accounts, particularly in the interviews, is significant. It is certainly not straightforward to encourage such changes in teaching practice, and the link between these changes and specific elements of the program could be useful for designers and users of peer review and observation of teaching programs in higher education, and is worth exploring further. Additionally, it would be of interest to explore the benefits of being a reviewer, with respect to how the role might affect reviewers’ own teaching. And finally, research aimed at testing our predictions about teachers’ enhanced self-efficacy and students’ engagement with discipline content (see our model, Figure 4) would be of considerable interest given the elusive nature of student engagement.

Conclusions

Though peer review and observation of teaching programs have been well described in the literature and exist in some form or another in most tertiary institutions, the durability and success of these programs are rarely the focus of research. In particular, genuine changes in practice are rare for one-off programs within universities, where teaching is still considered to reside in the shadow of research. In this study, we consider a program that demonstrates how structured and collegial collaboration between academics can result in improvements in teaching practice. This program has been sustained since 2014 and continues to be well received by all involved, participants and reviewers alike. Participants’ positive experience of our PORT program and their positive outcomes provide reasons to be optimistic about the value and feasibility of supporting and maintaining professional development programs for improving teaching within the higher education sector.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

Graham D. Hendry

Graham D. Hendry is a Senior Lecturer in higher education. His research interests include educational design and academic staff professional learning.

Helen Georgiou

Helen Georgiou is a Lecturer in Science education. Her research interests include conceptual understanding and development in school science and high quality teaching and learning practices in higher education.

Hilary Lloyd

Hilary Lloyd is a Senior Lecturer in the discipline area of Pharmacology. She has 25 years of teaching undergraduate and postgraduate students and holds a position as an education-focused academic. Her educational research interests include collaborative learning, pedagogical innovation and academic development.

Vicky Tzioumis

Vicky Tzioumis is a Lecturer (Education-focused) with an interest in student-centred learning and teaching, and extensive experience in the management and delivery of academic staff professional learning programs.

Sharon Herkes

Sharon Herkes is a Senior Lecturer in Physiology at the University of Sydney. Her research interests include gamification, academic assessment practices and peer review.

Manjula D. Sharma

Manjula D. Sharma is a Professor in Physics Education with research interests in higher education teaching and learning, with a focus on the discipline of physics.

References