Research Article

More than a checkpoint: the pedagogic potential of a dialogic approach to doctoral progression assessment


Abstract

In the UK, all doctoral programmes are expected to include some form of periodic progression assessment, with individual institutions having autonomy to design and implement their own structures. Yet, despite the potential significance of this assessment to individual doctoral journeys, the design of progression assessment processes has previously received very limited attention. This paper reports on a study which investigated doctoral students’ experiences of progression assessment at one UK university, where the process involved both written and oral components. Utilising the concept of assessment for learning to support the analysis of narrative interviews with six doctoral students studying in the social sciences and humanities, the paper considers the pedagogic potential of doctoral progression assessment. The findings of the study indicated that the students perceived the dialogic aspects of the assessment to have the most significant potential for supporting learning and understanding, particularly where invitational, reflective, coaching format questions were utilised. The potential of assessment dialogue to present opportunities for reframing and disrupting thinking is explored, as well as the significance of this assessment in supporting autonomy, permission and motivation. The paper advocates careful consideration of both assessment design and practice in relation to these aspects.

Introduction

The purpose of this paper is to illustrate, and provoke consideration of, the pedagogic potential of doctoral progression assessment, through consideration of the experiences and perceptions of doctoral students. The paper draws on narrative data from a small-scale study at one university in the United Kingdom (UK), where periodic progression assessment involves both written and oral components and includes assessors who are independent of the supervisory team.

In the UK, all doctoral programmes are expected to include ‘some form of regular progress review’ in order to assess doctoral candidates’ progress and identify ‘gaps in knowledge or skills’ (Quality Assurance Agency for Higher Education [QAA] 2020, 15). Typically taking place on an annual basis, the significance of this assessment may vary depending on the institution and stage, but the initial progression assessment often represents a more formal event determining transfer to, or confirmation of, doctoral status (QAA 2020, 12). Whilst a national doctoral characteristics statement exists (QAA 2020), there are no standardised progression assessment criteria and individual institutions have autonomy to design and implement their own processes. Yet, despite the potential significance of progression review structures to the doctoral journey (Smith McGloin 2021), the design of progression assessment has previously received very limited attention (Dowle 2023). Where progression assessment has been the subject of academic discussion, this has often centred on predominantly pragmatic, rather than pedagogic, considerations such as attention to its role in completion rates (Clarke 2013) and/or the framing of reviews as functioning for purposes such as institutional quality control (Sillence 2023). In this context, this paper focuses on a study which aimed to complement and develop this pragmatic focus by investigating progression assessment from a primarily pedagogic perspective. In particular, it sought to explore student perceptions of their experiences of a two-stage assessment model which included a viva voce component.

This study was framed by the following research questions:

RQ1: What are doctoral students’ perceptions of their progression assessment experiences?

RQ2: To what extent do doctoral students perceive their progression assessment as supporting their learning and development?

Utilising theory relating to the concept of assessment for learning (Wiliam 2011), the study sought to position the progression assessment as a potential learning event and to explore students’ perceptions of its impact on their thinking and understanding. In relation to RQ2, learning was differentiated from notions of performance by focusing on student perceptions of ‘changes in comprehension, understanding and skills’ (Soderstrom and Bjork 2015, 176). Taking a qualitative approach, an interview format informed by the design of ‘event-focused interviews’ (Jackman et al. 2022) was used to elicit narrative accounts, supporting a detailed understanding of the student journey through the process. Interviews were conducted with six doctoral students, who were all studying within the social sciences or humanities. Whilst limitations in terms of scale and disciplinary coverage are acknowledged, the purpose of the work was to investigate, illustrate and provoke consideration of the potential value and role of the doctoral progression review process rather than to provide any form of representative or generalisable understanding.

In provoking consideration of the purpose and potential value of progression assessments, this paper is intended to have utility for three key audiences. Firstly, postgraduate research leaders who oversee design, organisation and training relating to progression processes at an institutional level. Secondly, academic staff involved in supervising doctoral students and/or acting as examiners for progression assessments. Finally, doctoral students who are preparing for progression assessments and are interested in how their engagement may support their understanding.

It is worth noting that whilst this study was conducted in the UK, forms of doctoral progression assessment are common in a range of countries internationally (Viđak et al. 2017; Bartlett and Eacersall 2019). Despite the specific context of this study, consideration of the learning potential of doctoral assessment is likely to have utility in many international contexts.

Doctoral progression assessment – purpose and structure

The existence and purpose of doctoral progression assessment processes is framed by a context of wider political attention on the role of doctoral education as part of the increasing importance of the ‘knowledge economy’ (McAlpine 2017; Smith McGloin 2021; Dowle 2023). Driven by the positioning of doctoral completion rates as a key performance metric for institutions (Arts and Humanities Research Council [AHRC] 2020; Smith McGloin and Wynne 2022; Higher Education Statistics Agency [HESA] 2023), the primary, pragmatic, purpose of progression assessment appears to be grounded in an assumption that it plays a key role in improving continuation and completion rates. Despite this focus, empirical evidence relating to the impact and effectiveness of progression assessment processes is actually very limited (Dowle 2023). With many institutions positioning the initial progression assessment as conditional to the transfer to doctoral status (QAA 2020) and some metrics excluding students who withdraw in the first year (AHRC 2020), for early stage progression reviews in particular it may be argued that there is increased potential for pressure to privilege bureaucratic and deficit-based assessment models. From this perspective processes may be designed primarily to identify unsatisfactory progress, as opposed to functioning to support all doctoral students’ learning and development. Indeed, it has been argued that progression assessment is often connected to the management of performance indicators and considered as an aspect of procedures for ‘institutional quality control’ (Sillence 2023), rather than as a significant feature of doctoral pedagogy and curriculum.

Terminology used within national guidance and policy documents in the UK similarly reflects the potential for these competing perspectives. For example, the ‘Doctoral Degree Characteristics Statement’ frames the progression review as ‘a gateway’ (QAA 2020), whilst the related ‘Guidance for Research Degrees’ summarises its main purpose as relating to students’ ‘likelihood of completing the research programme’ (QAA 2018, 10). Both documents then proceed to outline its significance in supporting students and its role as ‘an important part of the learning process’ within doctoral degrees (QAA 2020, 15).

As a result of both institutional autonomy and the existence of potentially competing understandings of the function of doctoral progression reviews, there are significant variations in approaches and terminology nationally (Clarke 2013; Smith McGloin 2021). For example, various descriptors including ‘probation review’, ‘annual monitoring’ and ‘progression examination’ are used to define procedures in different contexts (Sillence 2023). A significant variation in the design of progression assessment processes is then the extent to which individual institutions prioritise a focus on assessment of written outputs over a focus on oral examination processes. Whilst in some institutions progression assessment points are limited to a desk-based review of a written report, in others they include a form of viva voce led by a panel including academic staff who are independent of the supervision team. This variation is reflected in literature aimed at students, where progression reviews are conceptualised in different ways ranging from the production of a ‘mini thesis’ (Cryer 2006) to engagement in a form of ‘mock viva’ (Trafford and Leshem 2008). Decision making in relation to this also sits within a wider context where additional research is required to further address the value of dialogic viva-based assessment, if we are to avoid cost and time being the dominant drivers in institutional decision-making regarding its use (Dobson 2008).

Existing understandings of doctoral progression assessment

The role, significance and particularly the pedagogical features of doctoral progression assessment have previously been subject to very limited research (Dowle 2023). Despite a QAA report highlighting concerns surrounding a lack of consistency in approaches (2007), only very recently have questions regarding the effectiveness (Dowle 2023), design (Sillence 2023) and role within the doctoral journey (Smith McGloin 2021) of progression assessment been subject to any sustained research interest in the UK. Whilst some additional studies have been undertaken in wider international contexts (Mewburn, Cuthbert, and Tokareva 2014; Bartlett and Eacersall 2019), and progression monitoring has been addressed in some studies concerned with broader questions relating to completion (Viđak et al. 2017), there is clearly scope for additional consideration of its pedagogical aspects.

It has been widely acknowledged that doctoral progression assessment has the potential to act as a key formative assessment opportunity for students (Bartlett and Eacersall 2019; Sillence 2023). Yet, despite its significant relational and pedagogical complexities, literature surrounding the progression assessment is often predominantly framed by pragmatic and technical perspectives (Crossouard and Pryor 2009). These perspectives risk underestimating the elusive, contextual and disciplinary nature of the concept of ‘doctorateness’ in assessment (Denicolo, Duke, and Reeves 2019) and its transformational potential in relation to student understanding, practice and identity (Crossouard and Pryor 2009). In this context, whilst emerging research regarding the role and effectiveness of doctoral progression assessment has significant utility, its focus in relation to pedagogy remains narrow. For example, the most relevant recent UK study which addresses ‘effectiveness’ of doctoral progression assessment (Dowle 2023) focuses largely on pragmatic, and arguably institutional and political, questions regarding relationships between assessment, progress and ‘timely completion’. Supported by 28 semi-structured interviews with doctoral researchers, supervisors and professional services staff, it does consider the potential for ‘intellectual development’ to be supported by dialogue with independent examiners in assessment, and the risks of its reduction to a ‘bureaucratic’ process; however, it offers limited insight into the pedagogical intricacies of this.

Recent empirical work in the UK is seemingly limited to just two additional small-scale studies which seek to address specific aspects of the disciplinary, developmental and potentially affective nature of doctoral progression assessment (Smith McGloin 2021; Sillence 2023). The insight offered into specific pedagogical aspects is limited; however, alongside the work of Dowle (2023) what is most significant about this small body of work is that it frames an understanding of tensions between the nature of doctoral study, as complex and necessitating aspects of ‘intellectual autonomy’ (McCulloch and Bastalich 2023), and the neoliberal university’s growing focus on measuring, managing and codifying performance (Smith McGloin 2021). Despite differing in approach, both Sillence’s (2023) work, which focuses on assessment design through four in-depth interviews with examiners, and Smith McGloin’s study (2021), which explores the reflective diaries of six doctoral researchers from the perspective of mobility, arrive at a conclusion that the learning potential of progression assessment may be contingent on both examiners’ and students’ perceptions of purpose. This understanding is also echoed internationally; for example, Bartlett and Eacersall (2019) offer an autoethnographic account tellingly framed by the question ‘why do I have to do it?’ and Mewburn, Cuthbert, and Tokareva’s (2014) analysis of interviews with 20 doctoral students concludes with concerns about the potential framing of progression assessment as a managerial, rather than pedagogic, tool and the resulting inconsistencies in expectations this creates.

Thus, a key understanding in relation to existing work focused on doctoral progression assessment is that, whilst it is acknowledged as complex and relational, clarity regarding its pedagogical and formative potential is limited. As a result, there is seemingly a growing risk that it may be conceptualised as a primarily administrative (Sillence 2023), bureaucratic (Dowle 2023), managerial (Mewburn, Cuthbert, and Tokareva 2014) and potentially even restrictive (Smith McGloin 2021) process. This understanding was key to the aims, approach and design of the present study.

Theoretical framework: assessment for learning

In order to offer new insights into doctoral students’ perceptions of the pedagogical role of progression assessment, the study used the concept of assessment for learning (Wiliam 2011) as a lens for analysing students’ experiences and perceptions. Focussing on the extent to which the progression examination was seen to influence thinking and understanding, this approach aimed to differentiate its potential in terms of teaching and learning from more pragmatic outcome or performance focused notions of progression.

Whilst originally associated with classroom learning in school environments (Black and Wiliam 1998), assessment for learning - understood as a shift from positioning assessment as ‘evaluating effectiveness’ at the end of a sequence of learning to having a function in ‘guiding learning’ (Wiliam 2011) - has become a common feature in literature focused on ‘learning oriented’ (Carless 2015) assessment in higher education (Yang and Carless 2013; Winstone et al. 2017). Within this context, it has been argued that the complex, nuanced, conceptual and often tacit nature of intended learning creates a need for assessment processes to go beyond narrow behaviourist models and to frame assessment as a communicative and contextual event (Sadler 2010; Winstone et al. 2017).

One key feature of this understanding is advocacy of dialogic forms of assessment for learning (Sadler 2010; Yang and Carless 2013). Dialogic assessment is understood as utilising dialogue to support learning, but being about ‘more than conversation’ (Yang and Carless 2013) and necessarily involving aspects of shared reasoning, relationships and responsibilities (Yang and Carless 2013; Winstone et al. 2017) to go ‘beyond feedback’ (Sadler 2010). Its role in assessment for learning is to improve skills and performance and support students’ next steps (Wiliam 2011; Carless 2015) through maximising understanding, engagement and action and the potential for what Winstone et al. (2017) frame as agentic ‘proactive recipience’.

A conceptualisation of assessment for learning was adopted which framed an aspiration for a process which could be understood as responsive, adaptive, constructive and affective (Sadler 2010; Wiliam 2011; Yang and Carless 2013). That is to say assessment for learning should be informed by what ‘the learner already knows’ (Wiliam 2011, 6), ‘respond to the needs of the learner’ (1) and ‘meet learning needs’ (10), ‘effect future performance’ and be supported by a ‘quality of interaction’ (6) which has the potential to enable the learner to take ownership and be ‘active in managing the process’ (5).

Research design

Methodology and positionality

The focus for this study emerged from an interest in the progression assessment process in my position as an early career researcher, acquiring experience of the role of both doctoral supervisor and progression examiner. In the context of limited empirical research in this area, and alongside anecdotal experience of differing ideas of process, purpose and utility, I sought to investigate and better understand student experiences of the progression assessment process and potential implications for practice.

The ‘conceptual architecture’ (Nichol, Hastings, and Elder-Vass 2023) underpinning the methodological approach adopted was informed by three key understandings arising from my own standpoint and engagement in the literature review. Firstly, that in response to existing understandings of doctoral progression assessment, the intended focus of this study was primarily pedagogic, rather than pragmatic. Secondly, that an in-depth, relational approach offered potential to engage with the complex, contextual and individualised nature of investigating this pedagogic role of the assessment within doctoral journeys (Crossouard and Pryor 2009). Finally, that the purpose of this small-scale study was to provoke reflection in relation to the potential of doctoral progression assessment, not to provide a generalisable understanding of its processes.

Following careful consideration of the study’s purpose, theoretical framework and my own philosophical positioning (Chilisa and Kawulich 2012), a qualitative, narrative methodology was adopted for the study. This was seen as significant in addressing these understandings by approaching assessment experiences from a humanist, rather than purely technical, stance (James 2014), and in increasing opportunities to maximise contextual detail and understanding. This aligned with an interpretivist standpoint and an epistemological assumption that ‘knowledge’ is unavoidably socially constructed and subjective, and thus understanding is limited to interpretation and re-presentation.

Participants, context and process

The study explored the progression assessment experiences of six doctoral students, four of whom had recently undertaken their first progression assessment and two their second. The inclusion of students at different stages provided a useful insight into their developing understanding of their role within, and the potential value of, the process. The students were enrolled at one post-1992 UK university and the initial recruitment was open to students across the social sciences and humanities disciplines. More responses were received from students studying in education, which is reflected in the sample. The study included students on both professional doctorate and PhD routes, with Table 1 providing a basic profile of each student’s doctoral route and stage. Students were identified by circulating an email to doctoral supervisors within the relevant schools and research centres, and through sharing project information via the university’s graduate school.

Table 1. Doctoral student profiles.

The university’s progression assessment process requires students to undertake a progression examination at stage one, followed by periodic progression reviews at stage two and beyond. For full-time students a progression examination must be completed by month 12 post-registration and for part-time students by month 18. The progression examination has two components, a 3,000-6,000-word report evidencing progress and a viva voce examination conducted by two internal assessors, who are independent of the project. This examination typically consists of dialogue which is supported by questions prepared in response to the initial report. The university uses qualification descriptors as a benchmark for progress alongside a list of milestone objectives, including statements such as the researcher has ‘developed an appropriate knowledge of research methods, relevant to the area of research’. The possible outcomes at stage one are pass, fail or resubmission. Subsequent progression reviews are conducted on a 12-month cycle; evidence of progress here may include submission of draft chapters or a thesis outline. At this stage a progress meeting is then held, normally with one of the original independent examiners from the stage one progression examination. Again, milestone objectives are used for assessment and at these stages the reviewer must assess whether the researcher’s progress is satisfactory. In the event of an unsatisfactory outcome, students have an automatic right to resubmission. At this stage withdrawal of registration may occur in the event of continued unsatisfactory progress or non-submission. The university’s graduate school provides initial staff training for progression assessment as part of its mandatory doctoral supervision induction, and the stated purpose of the process includes the wording ‘to provide formative feedback’. Doctoral students can also attend optional progression assessment information sessions which are delivered periodically via the graduate school.

Ethical approval for the study was obtained through the university’s ethics committee and pseudonyms are used. Written consent was secured prior to each interview and students were provided with information clearly communicating that their decision to participate would not impact on any aspect of their assessment or progress. The study did not include students for whom I had been involved in either supervision or the progression assessment itself. Interviews were conducted with an awareness of the personal and potentially emotive nature of any individual stories (Eakin 2004) and participant information included details of the university’s student wellbeing provision.

Event focused interviews

In line with the methodology adopted, the study utilised a narrative event-focused interview model (Jackman et al. 2022) to explore the students’ experiences of their most recent progression assessment. This approach conceptualised the assessment as a potential learning event and focused on an in-depth, chronological account of their experience. The event-focused format is used to capture ‘rich data’ about human experience of ‘episodic phenomena’ and, whilst Jackman et al. (2022) focus on its adoption in relation to sports psychology, it was determined that its narrative format, detailed focus on the ‘how’ and ‘why’, and conceptualisation of the assessment as an ‘event’ offered value for use in this context.

Taking a semi-structured format, the approach was informed by Jackman et al.’s (2022) model, seeking to elicit a chronological description of individual experiences in three stages (preparation and writing, the viva/review meeting, feedback and reflection). Pre-prepared questions inquired about experiences at each stage and follow-up questions were then used to explore, clarify and extend aspects focusing on experience, context and process. A time limit of one hour was applied for each interview, the interviews were audio-recorded and transcribed, and notes were taken to support follow-up questioning. Students were offered the option of online or in person interviews, predominantly to ensure the study was accessible to distance and part-time doctoral students. Three interviews took place in person and three online.

Analysis

The interviews were analysed using an iterative two-stage process to develop an understanding of the students’ individual and collective experiences, and the relationship between these and the pedagogical function of assessment. In line with Srivastava and Hopwood’s (2009) conceptualisation, the top-level focus of the analysis process was to ‘visit and revisit’ the data through an iterative process to bring together the data, research questions and theoretical framework. This was achieved by first transcribing and re-listening to each interview to support a process of writing as an analytic tool (Richardson and St. Pierre 2005) by developing a short vignette for each student’s experience. The transcribed interviews were then approached again thematically, using Wiliam’s (2011) conceptualisation of assessment for learning as a lens to identify aspects of pedagogical potential and practice. As this analysis developed, the vignettes provided a useful reference point to understand the ideas in relation to the context and sequence they had captured (Silverman 2013), thus avoiding reducing understandings purely to face-value readings of isolated individual sections of the text (Eakin and Gladstone 2020). This aspect of the approach was particularly important in prioritising methodological, as opposed to methodical, coherence through this part of the study. This supported a focus on trustworthiness in narrative research as being premised on criticality, contextualisation and an acknowledgement of the need to recognise the subjectivity of interpretations (Moss 2004).

Findings

The experiences of the doctoral students in this study positioned their progression assessment, particularly the oral component, as having significant potential to support learning and development. This is explored below in relation to three key themes: the potential of dialogic assessment; the potential for re-framing and disrupting understandings; and the significance of autonomy, permission and motivation. The student narratives included significant examples of conceptualisation of their progression assessment as having the potential to act as a form of active ‘dialogic assessment’, where invitational, reflective coaching format questions supported and extended thinking. Within this space the progression assessment process was then seen to have the potential to present necessary opportunities for reframing and disrupting existing understandings and thinking. This included examples where progression assessment was perceived to have the potential to remove barriers in thinking or provide new pathways or concepts to consider. This process appeared to be most effectively enabled where the pedagogical approach of the assessor paid careful attention to encouraging the autonomy of the researcher, supporting a sense of permission, leading to positive impacts on motivation.

It is worth noting that all six doctoral students reflected on successful outcomes of their progression assessment, with no experiences of a fail or unsatisfactory outcome. In all but one case, their narratives also captured an experience which could generally be understood as providing significant enhancement to their learning experience. The exception to this was the experience of Beth, who had passed her progression examination, but characterised her experience as causing her confidence and motivation to ‘come crashing down’. Beth’s narrative provided interesting insight into perceptions of conflicting ideas of purpose and pedagogy, and had some consistency with Smith McGloin’s (2021) notion of the risk of such structures restricting mobility in the doctoral journey.

The potential of dialogic assessment

Whilst the process of developing the written report was typically framed as having pragmatic value in supporting organisation and progress, the oral component of the assessment was frequently referred to as having the most significant potential to impact on learning. The value of constructive dialogue here was a dominant theme throughout the narratives and could be characterised as a form of dialogic teaching and learning. This had consistency with the notion of utilising dialogue for assessment relating to nuanced and complex content (Winstone et al. 2017), and wider research regarding the value of this form of assessment in higher education (Hill and West 2020). In many cases, the students conceptualised and framed the dialogue arising in their oral assessments by utilising terms such as ‘mentoring’, ‘coaching’ and ‘discussions with experts’ to highlight its potential to support their learning:

Judith: I think having that almost like mentoring, which is like a mini mentoring or mini coaching session, in these examination spots is just really helpful for someone like me who thinks, right, okay that’s something else for me to think about.

It was noteworthy that for the two students at stage 2 (Judith and Ruth), their previous experiences of stage 1 had led them to purposefully dedicate additional time to plan how to fully utilise this opportunity at stage 2, for example by bringing their own questions to the discussions to put to their reviewer.

The parallels between oral assessment and forms of coaching also extended to reflections on the nature of discussion, and particularly the framing of the types of assessment questions being asked. In this respect, from the perspective of the students, their notions of what represented ‘effective’ questioning to support this dialogue typically consisted of a format which was characterised as open, invitational, immanent and reflective. This has some synergy with previous discussion regarding embracing ideas of coaching in doctoral supervision discussions (Wilson and James 2022), but also positioned the dialogue as responsive to the report, as well as to contributions within the discussion. For example, Andrew explicitly pointed this out as an expectation of a skilled examiner ‘inviting conversation’ rather than putting the student onto ‘the back foot’:

Andrew: [They responded] Did you consider capturing that? It’s different to saying why aren’t [they] being interviewed as part of your study and that’s the trick of the questions. Don’t get me wrong, the examiner who asked it is extremely experienced in doing the exam, so they understand the kind of questions you should be asking.

This was further reinforced by the contrasting experience of Beth, who outlined her frustration at being asked questions which she felt were focused primarily on recall and memory, and thus on assessment of learning, rather than inviting this more valuable reflection and justification:

Beth: [They said] can you recall the theory? Tell me everything… And I was like. What? And so, I spent like the first half an hour not really being able to answer many questions feeling really stupid.

There appeared to be two key aspects to unlocking and embracing the potential for learning and reflection here. Firstly, the expertise and insight (both methodological and disciplinary) of an independent 'expert' assessor was seen as significant by the students – indeed, in some cases they explicitly recalled this being highlighted to them by their supervisors prior to the assessment. Secondly, the provision of an extended opportunity to reflect on, justify and explain their research to a new audience was seen to create an opportunity to think about it in a different way:

Judith: Of course, you’ve got the expertise of researchers. Academics who just know their stuff so well… you know, have you thought about doing this that and the other?

Ruth: I have to talk about things. Talking about stuff helps me understand stuff… it was a well-informed professional conversation… [I] articulated the choices I’ve made and the fact that was then validated by the process.

Where previous studies have positioned the inclusion of a researcher independent from the project (Dowle 2023) and a clarity of purpose (Sillence 2023; Smith McGloin 2021) as having the potential to enhance the process, these perspectives begin to illustrate the pedagogic aspects. The students' perspectives indicate that a responsive and constructive dialogic approach, supported by invitational and reflective questioning and led by a researcher with disciplinary and methodological expertise, may maximise the potential of progression assessment as a learning opportunity.

The potential for re-framing and disrupting understandings

Where these conditions were in place, a key affordance of the oral component of the examination was then seen as its capacity to help re-frame and disrupt thinking, thus contributing to potential ‘breakthroughs’. This was particularly evident where the doctoral students’ narratives addressed their experience of later reflection on their assessments. Sometimes this was seen as supporting them to get ‘unstuck’ when they were aware of an existing challenge but previously unable to resolve it, but in other cases the process was credited with supporting the identification of a potential issue or conflict, which they had not previously been aware of.

David: But the thing is, I was stuck because I was a little bit directionless. You know, where could this go?… so, this is where this [the progression examination] now provides very clear, succinct direction.

Judith: I’m pootling along and it’s going okay and now this bombshell’s just dropped, but it was also helpful because, yeah, absolutely it completely shifted it.

This aspect was seen as demonstrating the potential of the dialogue to act in a manner consistent with the adaptive function of assessment for learning, and thus to impact future performance (Wiliam 2011).

In reflecting on these realisations, whilst there were numerous references to the additional work and challenges these created, there was also a consistent sense that, if introduced in a constructive, invitational (rather than instructional) and supportive way, they had significant value. For example, Jenny talked of her thinking being 'disrupted' and Judith characterised being sent 'into a spin', but both then proceeded to reflect on their later understanding of the magnitude and importance of that learning.

Judith: Without that… I’ve no doubt I’d be in a very different place.

The nature of this re-framing or disruption was typically conceptual or methodological, rather than pragmatic or substantive, relating to in-depth consideration of aspects of methodology, theoretical framework and approaches to analysis. David credited this with leading to a 'framework shift', whilst Ruth summarised it as 're-framing the problem' and Jenny outlined it as 'conceptually changing her approach to the data'. Whilst in some cases this re-framing had arisen from a direct suggestion, such as the introduction of a particular theory, or a question highlighting a potential challenge, in others this change in thinking appeared to be much less tangible and specific. Given the potential complexity of these aspects, this is perhaps unsurprising, but it implies that articulating the approach, and having the opportunity to explore and justify it to an independent, expert audience, may itself act as a provocation for considering it differently. For example, in articulating his framework shift David struggled to pin this down to a specific contribution, instead seeing it as a product of the opportunity as a whole in provoking 'new thinking'.

The final aspect of this theme was that, where existing approaches and thinking were disrupted, it appeared to be important that conceptual understanding was supported and explored in depth in the discussion. All of the researchers spoke of being open to challenge and of the need to defend their work, but what distinguished the more positive experiences from Beth's more challenging narrative was the sense that the process had supported the students to understand why a re-think might be necessary and to consider alternatives. In contrast, Beth, who indicated that the approach of one of her examiners left her feeling discomfort and uncertainty surrounding her progress, succinctly characterised her perception of their approach to challenging her current thinking as: 'If you don't know the answer, move on.'

Thus, the doctoral researchers indicated that dialogic assessment has significant adaptive potential (Wiliam 2011; Carless 2015) when framed as constructive, purposeful dialogue (Bartlett and Eacersall 2019) which seeks to move beyond administrative assessment of the student's likelihood to complete (QAA 2018) to embed pedagogic purpose (Sillence 2023).

The significance of autonomy, permission and motivation

The final theme generated by the analysis was that dialogic progression assessment had a significant affective component, again consistent with theory relating to assessment for learning (Wiliam 2011; Yang and Carless 2013). In order for the doctoral students to develop their thinking, research skills and approach, instilling a sense of autonomy and ownership of their project (and the challenges within it) was of particular importance. Where this was present, the students spoke of a sense of permission to experiment and apply new ideas and of enhanced motivation. It was also noteworthy that this was situated within a context where many openly spoke of the anxiety induced by oral assessment, and the importance of supportive and respectful approaches in creating a sense of 'psychological safety' (Clark 2020). This supportive climate appeared to be integral to their capacity and willingness to engage in honest dialogue about challenges and barriers.

Closely linked to the previous two themes, an invitational rather than instructional approach to the oral assessment was credited with helping to instil this sense of autonomy and ownership over research decisions. This may be summarised as part of the process of learning that, rather than there being one 'correct' approach to their work, social research is premised on expectations of justification, coherence and academic argument. Ruth illustrated this position and the potential value of the progression assessment in supporting and reinforcing it:

Ruth: No, it’s my study… as long as I can justify academically and back it up. Why I’ve chosen to do what I’ve done, then that’s okay.

In applying this invitational approach, where questioning and dialogue introduced opportunities to explain and reflect on research decisions, this also exposed, and created permission in relation to, previous assumptions about academic study which had the potential to act as constraints. For example, Jenny spoke of her progression assessment as giving permission to engage with ‘grey literature’ where previously she had understood that she ‘wouldn’t have expected’ to be able to use this, despite its potential relevance – perhaps reflecting ideas of exposing elements of the ‘hidden curriculum’ (Elliot et al. 2020). This permission was frequently referred to as being supported by the creation of a respectful, trusting and safe environment, where the students felt comfortable to be honest about existing understandings.

Ruth: I was put at ease because I knew no one was there to catch me out.

Andrew: I found that really refreshing, although I did feel at one point have I been too honest?

With this sense of ownership and permission, where the researchers reflected on positive experiences, the outcome was often an enhanced sense of motivation in relation to developing their work and their understanding. Jenny described this as a ‘renewed sense of purpose’, David repeatedly used the word ‘direction’ in his reflections, while Judith and Ruth both relayed experiences of being motivated to invest significant time after their oral assessments in developing their understanding of a particular aspect of their methodological approach. This motivation was also clearly about more than pragmatic ideas of progression and completion, reflecting a desire to develop the personal understanding needed to conduct high quality research:

Ruth: That consistency… linking it together. Not rushing… rather than just going off doing analysis, right, job done, tick right next chapter, right can have my hat now please! You know?

Judith: One thing I remember… He said [in the review meeting]: ‘What you do is the difference between a mediocre piece of work and an excellent piece of work’.

These understandings are consistent with assessment for learning as having an active, affective function (Wiliam 2011), and help to illustrate factors which may underpin previous research highlighting the importance of a positive, constructive and supportive approach (Bartlett and Eacersall 2019; Sillence 2023).

Implications

In the context of limited empirical attention to doctoral progression assessment (Dowle 2023; Sillence 2023), this paper aims to act as a provocation for consideration of its learning potential. In doing so, it presents an argument that, given the complexity of doctoral study, a dialogic approach which promotes shared reasoning (Yang and Carless 2013) and acknowledges the relational nature of assessment (Winstone et al. 2017) may offer significant potential as assessment for, rather than of, learning (Wiliam 2011). This position has implications at an institutional level, where aspects of pedagogic value and student experience should be factored into assessment design, and where it may support reflection on the rationale for the additional investment of resources required to adopt, or retain, a dialogic viva-based model (Dobson 2008).

The perspectives presented here also have utility in supporting reflection for academics involved in doctoral assessment and supervision. They support previous discussion regarding the necessity for assessment to be framed and practised by supervisors and examiners in a supportive, positive and purposeful manner (Bartlett and Eacersall 2019; Smith McGloin 2021; Sillence 2023), but also extend this to further illustrate aspects of pedagogic practice. Effective practice was understood as involving dialogue which carefully responded to the student’s written report and oral responses in order to elicit and extend understanding. This included the introduction of new ideas and perspectives, whilst retaining a focus on the importance of the student owning their project. In doing so, despite the potentially one-off nature of the progression viva as an encounter, the perceived expertise of examiners, alongside a supportive, reflective and invitational approach, appeared to generate sufficient trust to enable honest and purposeful dialogue. Through focusing on supporting progress, rather than solely ‘testing’ it, this also aligns with the need for assessment design, particularly at doctoral level, to embrace the potential to support students as autonomous and proactive learners (Winstone et al. 2017) in becoming independent researchers.

Finally, for doctoral students there may be value in reflecting on progression assessment as an opportunity to support thinking and decision making, particularly where a viva-based model is offered. This was perhaps best illustrated by the stage 2 students in the study who, drawing on experience of an earlier progression assessment, saw the chance to take a more active and autonomous role by considering what they hoped to achieve from the dialogue with outside ‘experts’.

Conclusions

Utilising narrative data from a small-scale study at one university in the UK, this paper has presented a discussion regarding the potential pedagogic value of applying a dialogic approach to doctoral progression assessment. Through the lens of assessment for learning, the experiences of six doctoral students studying in the social sciences and humanities are presented to illustrate aspects of the learning potential of this process. The perceptions of these students demonstrate that a two-stage assessment process, which includes both a written and an oral component, offers an opportunity for students to encounter invitational, immanent and reflective questioning, which can support and extend thinking and understanding. In the context of such an approach, students were seen to encounter necessary, and potentially significant, opportunities for reframing and disrupting their existing understandings and approaches. This process appeared to be most effectively enabled where the approach of the assessor paid careful attention to encouraging the autonomy of the student, supporting a sense of permission and aiming to have a positive impact on motivation. The study builds on a small volume of existing work, illustrating the pedagogic considerations required to avoid bureaucratic (Dowle 2023) and managerial approaches (Mewburn, Cuthbert, and Tokareva 2014) in favour of purposeful and supportive processes (Smith McGloin 2021; Sillence 2023).

Disclosure statement

No potential conflict of interest was reported by the author.

Additional information

Notes on contributors

Timothy Clark

Timothy Clark is Director of Research and Enterprise at the University of the West of England, Bristol, UK. His research focuses on aspects of doctoral pedagogy, doctoral student experience and researcher development, particularly in relation to academic writing and methodological decision making.

References

  • AHRC. 2020. Training Grant Funding Guide 2019–20. Arts Humanities Research Council. Accessed July 19, 2023. https://www.ukri.org/wp-content/uploads/2021/10/AHRC-08102021-TrainingGrantFundingGuide-2020.pdf
  • Bartlett, C. L., and D. Eacersall. 2019. “Confirmation of Candidature: An Autoethnographic Reflection from the Dual Identities of Student and Research Administrator.” In Traversing the Doctorate: Reflections and Strategies from Students, Supervisors and Administrators, edited by T. M. Machin, M. Clarà, and P. A. Danaher, 29–56. Cham, Switzerland: Palgrave Macmillan.
  • Black, P., and D. Wiliam. 1998. Inside the Black Box: Raising Standards through Classroom Assessment. London: Granada Learning.
  • Carless, D. 2015. Excellence in University Assessment: Learning from Award-Winning Practice. Abingdon: Routledge.
  • Chilisa, B., and B. Kawulich. 2012. “Selecting a Research Approach: Paradigm, Methodology and Methods.” Doing Social Research: A Global Context 5 (1): 51–61.
  • Clark, T. 2020. The 4 Stages of Psychological Safety: Defining the Path to Inclusion and Innovation. Oakland, CA: Berrett-Koehler.
  • Clarke, G. 2013. “Developments in Doctoral Assessment in the UK.” In Critical Issues in Higher Education, edited by M. Kompf and P. Denicolo, 23–36. Rotterdam, Netherlands: Sense Publishers.
  • Crossouard, B., and J. Pryor. 2009. “Using Email for Formative Assessment with Professional Doctorate Students.” Assessment & Evaluation in Higher Education 34 (4): 377–388. doi:10.1080/02602930801956091.
  • Cryer, P. 2006. The Research Student’s Guide to Success. Maidenhead, UK: Open University Press.
  • Denicolo, P., D. Duke, and J. Reeves. 2019. Delivering Inspiring Doctoral Assessment. London: Sage.
  • Dobson, S. 2008. “Theorising the Academic Viva in Higher Education: The Argument for a Qualitative Approach.” Assessment & Evaluation in Higher Education 33 (3): 277–288. doi:10.1080/02602930701293272.
  • Dowle, S. 2023. “Are Doctoral Progress Reviews Just a Bureaucratic Process? The Influence of UK Universities’ Progress Review Procedures on Doctoral Completions.” Perspectives: Policy and Practice in Higher Education 27 (2): 79–86. doi:10.1080/13603108.2022.2077855.
  • Eakin, J., and B. Gladstone. 2020. ““Value-Adding” Analysis: Doing More with Qualitative Data.” International Journal of Qualitative Methods 19: 160940692094933. doi:10.1177/1609406920949333.
  • Eakin, P. J., ed. 2004. The Ethics of Life Writing. London: Cornell University Press.
  • Elliot, D., S. Bengtsen, K. Guccione, and S. Kobayashi. 2020. The Hidden Curriculum in Doctoral Education. Cham: Springer Nature.
  • HESA. 2023. “Higher Education Student Data.” Higher Education Statistics Agency. Accessed July 19, 2023. https://www.hesa.ac.uk/data-and-analysis/students
  • Hill, J., and H. West. 2020. “Improving the Student Learning Experience through Dialogic Feed-Forward Assessment.” Assessment & Evaluation in Higher Education 45 (1): 82–97. doi:10.1080/02602938.2019.1608908.
  • Jackman, P., M. Schweickle, S. Goddard, S. Vella, and C. Swann. 2022. “The Event-Focused Interview: What is It, Why is It Useful, and How is It Used?” Qualitative Research in Sport, Exercise and Health 14 (2): 167–180. doi:10.1080/2159676X.2021.1904442.
  • James, D. 2014. “Investigating the Curriculum through Assessment Practice in Higher Education: The Value of a ‘Learning Cultures’ Approach.” Higher Education 67 (2): 155–169. doi:10.1007/s10734-013-9652-6.
  • McAlpine, L. 2017. “Building on Success? Future Challenges for Doctoral Education Globally.” Studies in Graduate and Postdoctoral Education 8 (2): 66–77. doi:10.1108/SGPE-D-17-00035.
  • McCulloch, A., and W. Bastalich. 2023. “Commencing Research Students’ Expectations and the Design of Doctoral Induction: Introducing Inflections of Collaboration and Pleasure.” Journal of Further and Higher Education 47 (5): 687–698. doi:10.1080/0309877X.2023.2185772.
  • Mewburn, I., D. Cuthbert, and E. Tokareva. 2014. “Experiencing the Progress Report: An Analysis of Gender and Administration in Doctoral Candidature.” Journal of Higher Education Policy and Management 36 (2): 155–171. doi:10.1080/1360080X.2013.861054.
  • Moss, G. 2004. “Provisions of Trustworthiness in Critical Narrative Research: Bridging Intersubjectivity and Fidelity.” The Qualitative Report 9 (2): 359–374.
  • Nichol, A. J., C. Hastings, and D. Elder-Vass. 2023. “Putting Philosophy to Work: Developing the Conceptual Architecture of Research Projects.” Journal of Critical Realism 22 (3): 364–383. doi:10.1080/14767430.2023.2217054.
  • QAA. 2007. Report on the Review of Research Degree Programmes: England and Northern Ireland: Sharing Good Practice. Gloucester: Quality Assurance Agency. Paragraphs 84–93.
  • QAA. 2018. “UK Quality Code for Higher Education: Advice and Guidance Research Degrees.” Quality Assurance Agency for Higher Education. Accessed July 11, 2023. https://www.qaa.ac.uk/docs/qaa/quality-code/advice-and-guidance-research-degrees.pdf
  • QAA. 2020. “Characteristics Statement: Doctoral Degree.” Quality Assurance Agency for Higher Education. Accessed July 11, 2023. https://www.qaa.ac.uk/docs/qaa/quality-code/doctoral-degree-characteristics-statement-2020.pdf?sfvrsn=a3c5ca81_14.
  • Richardson, L., and E. St. Pierre. 2005. “Writing: A Method of Inquiry.” In The Sage Handbook of Qualitative Research, edited by N. Denzin and Y. Lincoln, 959–978. London: Sage.
  • Sadler, D. R. 2010. “Beyond Feedback: Developing Student Capability in Complex Appraisal.” Assessment & Evaluation in Higher Education 35 (5): 535–550. doi:10.1080/02602930903541015.
  • Sillence, M. 2023. “Understanding Doctoral Progress Assessment in the Arts and Humanities.” Arts and Humanities in Higher Education 22 (1): 45–59. doi:10.1177/14740222221125621.
  • Silverman, D. 2013. “What Counts as Qualitative Research? Some Cautionary Comments.” Qualitative Sociology Review 9 (2): 48–55. doi:10.18778/1733-8077.09.2.05.
  • Smith McGloin, R. 2021. “A New Mobilities Approach to Re-Examining the Doctoral Journey: Mobility and Fixity in the Borderlands Space.” Teaching in Higher Education 26 (3): 370–386. doi:10.1080/13562517.2021.1898364.
  • Smith McGloin, R., and C. Wynne. 2022. Structures and Strategy in Doctoral Education in the UK and Ireland. Lichfield, UK: UK Council for Graduate Education.
  • Soderstrom, N. C., and R. A. Bjork. 2015. “Learning versus Performance: An Integrative Review.” Perspectives on Psychological Science: A Journal of the Association for Psychological Science 10 (2): 176–199. doi:10.1177/1745691615569000.
  • Srivastava, P., and N. Hopwood. 2009. “A Practical Iterative Framework for Qualitative Data Analysis.” International Journal of Qualitative Methods 8 (1): 76–84. doi:10.1177/160940690900800107.
  • Trafford, V., and S. Leshem. 2008. Stepping Stones to Achieving Your Doctorate: Focusing on Your Viva from the Start. Maidenhead, UK: Open University Press.
  • Viđak, M., R. Tokalić, M. Marušić, L. Puljak, and D. Sapunar. 2017. “Improving Completion Rates of Students in Biomedical PhD Programs: An Interventional Study.” BMC Medical Education 17 (1): 144. doi:10.1186/s12909-017-0985-1.
  • Wiliam, D. 2011. “What is Assessment for Learning?” Studies in Educational Evaluation 37 (1): 3–14. doi:10.1016/j.stueduc.2011.03.001.
  • Wilson, J., and W. James. 2022. “PhD Partnership: Effective Doctoral Supervision Using a Coaching Stance.” Journal of Further and Higher Education 46 (3): 341–353. doi:10.1080/0309877X.2021.1945555.
  • Winstone, N. E., R. A. Nash, M. Parker, and J. Rowntree. 2017. “Supporting Learners’ Agentic Engagement with Feedback: A Systematic Review and a Taxonomy of Recipience Processes.” Educational Psychologist 52 (1): 17–37. doi:10.1080/00461520.2016.1207538.
  • Yang, M., and D. Carless. 2013. “The Feedback Triangle and the Enhancement of Dialogic Feedback Processes.” Teaching in Higher Education 18 (3): 285–297. doi:10.1080/13562517.2012.719154.