
Matching final assessment to employability: developing a digital viva as an end of programme assessment

Pages 373-384 | Received 12 Mar 2018, Accepted 07 Aug 2018, Published online: 07 Oct 2018

ABSTRACT

While the viva voce examination traditionally had a central role in student assessment, it fell out of favour as higher education expanded. This paper describes the development of a digital video viva examination to promote a more authentic and lower stakes method of assessment for students in their final undergraduate module. The paper presents a case study using a module from pre-registration nursing, but the approach could be useful for other practice-based and vocational disciplines in the health sciences, social work, business management and law. The paper describes the challenges of developing a truly authentic assessment when faced with the academic requirements of the programme. The problems of video assessment, including broadband speeds and file sharing, are discussed. The authors were able to develop a lower stakes assessment, with students on average recording and re-recording their viva submission 3.41 times and rehearsing it 3.67 times.

Introduction

Historically, the viva voce, or oral examination, played a central role in the assessment of learning outcomes, with its multiplicity of purposes applied to such programmes as clinical dentistry, medical studies and the social sciences, to name but a few (Kehm, 2001; Madaus, Russell, & Higgins, 2009). Viva voce is from the Latin meaning ‘live voice’: an oral examination in which a student or candidate answers questions from an examiner (Watts, 2012). Viva voce as a term was first used in the sixteenth century, but the practice of oral examination may pre-date this common usage (Merriam-Webster Dictionary, 2018). With the exception of the defence of doctoral degrees, the viva largely fell out of favour as higher education expanded, often being recognised as resource intensive for staff and inherently unreliable and anxiety inducing for students (Framp, Downer, & Layh, 2015; Knight, Dipper, & Cruice, 2013). Written assessment methods, considered the strongest measure of learning outcomes (Pearce & Lee, 2009), dominated higher education institutions despite the longevity of oral examination techniques (Joughin, 1998). The traditional viva examination is recognised as a relatively high-stakes form of assessment, with the student’s performance on a single day being central to their grade or outcome. Meakim et al. (2013, p. S7) define high-stakes as ‘an evaluation process associated with a [simulation] activity that has a major academic, educational, or employment consequence’. As the stakes are high in terms of the outcome of the process, uncertainty and anxiety are natural by-products of the viva examination, arising from its ambiguity and ritualistic approach (Carter & Whittaker, 2009), diverse agendas, and the difference in power between the candidate and assessor (Doloriert & Sambrook, 2011; Kehm, 2001).

However, in certain disciplines the viva examination has had a resurgence, now being used in health professional education to test clinical reasoning, decision-making and requisite professional life skills (Framp et al., 2015). It is recognised that this particular form of assessment is well suited to the evaluation of reflective and critical-thinking competencies alongside problem-solving and analytical abilities (Rahman, 2011). Chadwick (2004) asserts that the viva can provide the opportunity to assess learning outcomes that would otherwise be difficult to assess by other methods, complementing part of a total examination schedule (Kehm, 2001). Furthermore, oral exams have the potential to measure students’ achievements in course outcomes not restricted to knowledge, but related to individual professionalism, ethics, interpersonal competence, qualities and the relevance of these to the workplace setting (Harden, 2002). There is a growing body of evidence suggesting that the rigour and validity of this form of examination are well established, with the viva known to provide unique insights into the capacity of the novice to think critically (Hungerford, Walter, & Cleary, 2015; Knight et al., 2013; Naqvi & Aheed, 2014; Orrock, Grace, Vaughn, & Coutts, 2014; Rahman, 2011). Rahman (2011) contends that the oral exam format has the potential to assess the student on all five cognitive domains of Bloom’s taxonomy (Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956), adding that whilst many of the domains can be assessed through the written exam, the oral exam covers several cognitive domains as well as the student’s psychomotor skills of oral expression.

Rationale

Within the nursing programmes, the final module focuses on the transition to the role of a registered nurse. Previously the module was assessed by a written assignment which required the student to respond to a leadership scenario. The leadership scenarios were very broad, contrived and lacked authenticity (Rally, 2015) in terms of the type of day-to-day problems experienced in the workplace. When considering the concept of authenticity in relation to assessment, there is a spectrum of opinion, ranging from the extent to which assessment replicates the context of professional practice or ‘real life’ (Hart, 1994; Torrance, 1995) to Reeves and Okey’s (1996) argument that it is the degree of fidelity of the task and the context in which performance would normally occur. For the purpose of this discussion Birenbaum (1996) offers the most appropriate insight, suggesting that authentic assessment needs to consider problem-solving and critical-thinking competencies alongside meta-cognitive competencies such as reflection, and social skills such as communication and collaboration. This view is supported by Rally (2015), who argues that an authentic assessment is one in which the student is required to make active use of theoretical material, rather than just remembering it, by applying it to actual contemporary and practical contexts.

Undoubtedly, the end of programme assessment is high-stakes: it is not only pivotal in securing employment and entering the workforce, but a pre-requisite to professional registration. Failure to meet the outcomes of the assessment has several implications for the student nearing the end of the nursing programme, some of the most significant being withdrawal of a job offer and the inability to register as a nurse with the professional body. Although student-learning objectives were constructively aligned (Biggs, 1996) to the learning outcomes, it is recognised that for many students the assessment drives learning (Osborne, Dunne, & Farrand, 2013; Wass, Van der Vleuten, Shatzer, & Jones, 2001). With the assessment being scenario based, many students had not encountered the problem that was to be solved and were unable to contextualise factors of the situation. Failure rates were high and comparable between cohorts, presenting issues that needed to be considered, such as extensions to training and bursaries, job offers being retracted and no guarantee of success following resubmission. It was not uncommon for this final assessment to be the only assessment that a student failed throughout the 3-year programme. Problem-based learning (PBL), as an active learning pedagogy, starts with the premise of a problem or query that needs to be solved and assists student learning by integrating theory and practice (Pearce & Lee, 2009). Students need to understand client requirements and contextual factors in situ before they are able to solve the problem, through the acquisition of knowledge and professional and academic skills (Pinto Pereira, Telang, Butler, & Joseph, 1993). Some students were unable to solve the leadership scenario as they had never encountered the situation in practice or acted in such a senior position within a team. As a result, students struggled to make links between theory and practice. It was evident, over time, that the leadership scenario did not provide the appropriate focus on patients in specific situations, missing opportunities to rehearse the care response for patients and their families. The majority of students were able to demonstrate appropriate higher order skills in terms of problem-solving and critical thinking whilst in practice, and it became apparent that there was a need for the students to be able to articulate these skills in a summative assessment. Borin, Metcalfe, and Tietje (2008) claim that a critical component of PBL is the ability of the student to demonstrate mastery of content, concepts and skills by focusing on what has been learned. The leadership scenario did not provide the student with the opportunity to do this, and a single-authored manuscript as a summative assessment provided little chance to assess competence or graduate attributes. Employers naturally have a vested interest in the graduate nurse, seeking someone that is fit for practice and purpose to join the workforce. Ponder, Beatty, and Foxx (2004) claim that the ability to analyse, synthesise and evaluate are higher order skills that are essential graduate attributes; graduates at the point of registration should be able to apply critical thinking to new experiences that have not been encountered before. Osborne et al. (2013) outline how there is increasing focus in higher education on developing the knowledge and skills in graduates which promote employability. The written assessment provided little insight, if any, into how profession-ready the student was in terms of entering the workforce as a graduate and a registrant, or whether they matched employers’ needs.

With the advent of a new curriculum and a shift towards an all-graduate programme, an innovative approach was required in relation to the end of programme assessment. This provided the opportunity for change in terms of aligning the assessment to employer-based, real-world requirements and asking students to demonstrate meaningful application of essential knowledge and skills. The authors decided to develop the end of programme assessment as a digital viva focusing on common interview questions. Table 1 details the module learning outcomes, assessment criteria and viva questions. For the purpose of this case study a digital viva is defined as ‘the recording by the student, using video, of their responses to pre-set questions’. It is recognised that a digital viva does not provide a ‘live voice’ and, therefore, the examiner is unable to ask follow-up questions or to probe answers. To some extent a digital viva is a half-way house between a written submission and a full face-to-face viva. Why then did the authors decide to compromise by developing a digital viva rather than a face-to-face viva? The answer lies in the fact that students were on an extended management placement for 23 weeks, so they would either have had to come into the University for their assessment or the examiners would have had to visit the students on placement. Busy NHS clinical areas do not lend themselves to controlled viva voce examinations, and the size of the programme, with 700 students, meant that the workload implications of doing assessments in a clinical setting would prove exceptionally challenging.

Table 1. Relationship between module learning outcomes, assessment criteria and questions asked during digital viva.

The pre-set interview questions related to leadership, management and quality of care, asking students to identify their personal strengths and weaknesses. Students were provided with a series of six questions which were developed to align with the assessment criteria. It is well recognised that by employing this approach, clearly defining the assessment criteria and asking students the same questions that reflect rubrics and criteria for answers, internal consistency and reliability are much more likely (Birley, 2001; Davis & Karunathilake, 2005; Oakley & Hencken, 2005). Students then researched their answers, rehearsed, and subsequently recorded and, if necessary, re-recorded their viva as a 20-min video (MP4 file). This rehearsal and recording/re-recording was considered an important element of authenticity as it mirrors the preparation which students are encouraged to do prior to a real-life job interview. Zimmerman (1997) outlines top tips for job interviews, which include practising the answers to the kind of questions the student believes they will be asked. While the ‘stakes’ associated with the final assessment remained the same, students were able to refine their submission by rehearsing, recording and re-recording their responses rather than giving the higher stakes performance at a single time associated with a face-to-face viva.

The authors quickly discovered that the Virtual Learning Environment (VLE) was problematic in terms of uploading MP4 files, with constraints such as the speed of upload and the inability to manage the volume of students uploading on one specific submission date. As a result, a cloud-based file sharing application called Filemail was initially used. This allowed students to upload their file and academic staff to then download the video to complete the assessment. Feedback and results were provided via Turnitin and rubric functions in the VLE. More recently the team have identified WeTransfer, which has faster upload and download capabilities depending on the user’s broadband speeds.
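To give a sense of the scale of the problem, the back-of-envelope calculation below estimates the upload time for a 20-min MP4 recording. The bitrate and upstream speed are illustrative assumptions, not measured values from the study:

    # Rough upload-time estimate for a 20-minute MP4 viva recording.
    # Both figures below are assumptions for illustration: a camera
    # recording at ~8 Mbit/s and a home broadband upstream of ~1 Mbit/s.
    video_minutes = 20
    bitrate_mbit_s = 8        # assumed recording bitrate
    upstream_mbit_s = 1.0     # assumed home upload speed

    file_size_mbit = video_minutes * 60 * bitrate_mbit_s
    upload_hours = file_size_mbit / upstream_mbit_s / 3600

    print(f"File size: {file_size_mbit / 8 / 1024:.1f} GB")   # ~1.2 GB
    print(f"Estimated upload: {upload_hours:.1f} hours")      # ~2.7 hours

At lower upstream speeds, or with higher recording bitrates, this estimate quickly reaches the multi-hour uploads that students later reported.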

The team selected a digital viva as an alternative to the programme’s other vivas, which were face-to-face, in an attempt to reduce the stakes in the end of programme assessment. The face-to-face vivas came with inherent issues, particularly when assessing large cohorts of students. Room booking, timetabling and staff resourcing issues were not uncommon, alongside quality concerns about assessors who were examining for 2 to 3 days consecutively (Framp et al., 2015; Knight et al., 2013). Many of these issues were resolved through a digital viva, as room bookings and timetables were no longer required. Each academic was assigned 25 digital vivas to mark over a 21 working day period, thereby overcoming the potential quality issues of examining for 2–3 days consecutively.

As described earlier, one of the main drivers for wanting to change the final module assessment was to improve its authenticity. Gulikers, Bastiaens, and Kirschner (2004, p. 69) describe an authentic assessment as one ‘requiring the students to use the same competencies, or combination of knowledge, skills and attitudes that need to be applied in the criterion situation in professional life’. Authentic assessment can be difficult to design, and it is sometimes easier to consider authenticity on a continuum from more traditional methods of assessment to truly authentic assessments which mirror real-life situations (Mueller, 2006). More traditional methods of assessment such as essays and examinations have little or no application to real life, whereas authentic assessment aims to assess students’ ability to apply knowledge and skills to solve real-world and often complex problems (Rally, 2015). Gulikers et al. (2004) and Poikela (2004) describe some of the characteristics of an authentic assessment; these include:

  • perceptions amongst students that the assessment is authentic

  • similar to real work problems and tasks

  • based on performance

  • integration between workplace skills and university academic requirements

  • placing an emphasis on assessment for learning

  • motivational for the student

The above characteristics present some challenges in terms of assessment design, not least the challenge of integrating workplace skills and university academic requirements into a single assessment. Whilst this can be challenging, it is not impossible, as problem solving and the integration of knowledge and skills can demonstrate the student’s ability to analyse, critically review and synthesise, which are often the requirements of undergraduate assessments. In order to enhance assessment for learning, the team designed a programme assessment strategy which involved iteratively revisiting similar assessment methods in each year of the programme. This enabled the students to specifically utilise the feedback when being assessed by a similar method. This resulted in a group viva in year one, an individual face-to-face viva in year two and a digital viva in year three. Given that the feedback from these vivas related to application of theory to practice and the level of analysis and synthesis, the digital viva, although different from a face-to-face viva, still enabled students to utilise previous assessment feedback. Therefore, there was progression between years not only in academic level but also in how the students responded to a similar assessment task.

The other key motivation for changing the assessment was to promote employability. Dacre Pool and Sewell (2007) describe how employability encompasses employability assets (e.g. knowledge, skills and other attributes), deployment (career management skills) and presentation (job-getting skills). While this assessment would contribute to knowledge and skills, and therefore employability assets, it was mainly focused on presentation and job-getting skills by preparing students for their forthcoming interviews. Watts (2006) described the DOTS employability model, with DOTS standing for Decision, Opportunity, Transition and Self-awareness. Whilst the decision to embark on a nursing career had been made, the wide scope of nursing roles meant that students still had decisions to make as they reached the end of their programme. Nursing as a profession provides a considerable number of opportunities, with various specialties and roles available. The module was designed to assist students with transition, raising their awareness of employment possibilities and supporting them in their preparations for applying for their first job. The digital viva also enables students to develop their self-awareness in terms of how their current knowledge, skills and experience fit with the role they are applying for. This is achieved through a process of reflecting on what they have learnt during the course and how this fits with what an employer is looking for in a newly qualified registered nurse.

Evaluation

The aim of the evaluation of the new digital viva was to explore students’ perceptions of the new assessment as well as to examine their previous experience of using technology to record and upload videos. Once the first cohort had undertaken their digital vivas they were invited to take part in an evaluation by completing an online survey.

The online survey examined previous experience of using online and social media, previous video recording and the use of cloud storage, as well as perceptions of the difficulty of recording and uploading the video viva. To examine whether students had taken advantage of the assessment being lower stakes than a face-to-face viva, students were asked how many times they had recorded and re-recorded the viva video. They were also asked how many times they had rehearsed their viva responses before finally recording the video. Students were also asked whether they felt the questions asked in the viva were similar to those which would be asked at a job interview. Each question allowed participants to offer qualitative feedback, and the final question asked students to provide their feedback about this method of assessment.

Results

A total of 310 nursing students were invited to participate in the survey via an invitation email. A total of 68 students completed the survey, representing a 21.9% response rate. The evaluation was in addition to the standard university module evaluation and the National Student Survey (NSS), which may have impacted on the response rate. Students had wide experience of using social media and applications. Table 2 shows that the most common types of social media were related to sharing updates, photographs and videos. However, when asked if they had recorded and uploaded a video to a social media platform, under half of the respondents (46.27%, n = 31) had and the majority (53.73%, n = 36) had not. Despite this, the majority of respondents reported on a Likert scale that they found working with applications, recording video and uploading video easy or very easy. Table 3 shows the specific results against the five-point Likert scale for these questions.

Table 2. Student reported use of social media and applications (used more than once per month).

Table 3. Experience of information technology, recording and uploading video.

The average number of times a video was recorded was 3.41, with a range between a single recording (n = 16, 23.53%) and more than 10 recordings (n = 3, 4.41% of participants). The average number of rehearsals was 3.67, with a range from a single rehearsal (n = 23, 34.33%) through to more than 10 rehearsals (n = 2, 2.99% of the total participants).

When asked to rate on a scale of 0 to 100 how difficult they found the recording and uploading, where 0 is very easy and 100 is very difficult, the average score was 55. A number of students provided qualitative feedback in addition to their rating. Comments on how stressful it was to record and compress the MP4 file were common.

the recording, uploading and compressing of the files was very stressful

Most of the comments expressed a preference for this type of lower stakes assessment over a face-to-face viva.

I prefer this method to the face-to-face as there is more pressure to remember material. This is a fairer system

I would feel more pressurised and nervous if the viva was face-to-face

Some students welcomed the opportunity to rehearse and re-record their submission.

It gave me the time to rehearse and include exactly what I wanted to say but the uploading of the viva was extremely difficult

When asked to score a statement about whether the digital viva questions mirrored those which would be, or were, asked at a job interview, using a scale of 0 (strongly disagree) to 100 (strongly agree), the average score was 48, suggesting that the questions did not mirror a real-life job interview. One respondent commented:

Although the questions had some similarities to interview questions the job interview questions were more generic and did not require hugely detailed answers

General feedback could be broadly grouped into two categories, those who liked the lower stakes approach and those who found the whole experience of managing the technical and IT issues very stressful.

I found it easier to just speak and say what I needed to rather than someone asking questions and watching me

Some respondents commented directly on the ability to rehearse and re-record, allowing them to revise their viva submission several times before submission.

Having the ability to review your answers enables you to identify any errors or missing information and correct them in order to achieve a higher grade

You don’t feel as anxious presenting a digital viva, as if you make a mistake, you can re-record it.

For many respondents the advantages of having a lower stakes assessment were offset by the technical problems with uploading their video.

It enables us to record without the pressure of others watching, and feel that it would be easier to mark. The only problem that I had was that it was not highlighted by the lecturer that the viva could take several hours to download, originally mine was going to take 20 hours but as my partner is good on the computer he was able to compress it so it only took 3 hours, which is still a long time

Discussion & implications for practice

The results suggest that the new digital viva assessment has successfully moved the end of programme assessment from a high-stakes to a lower stakes assessment, allowing students to rehearse, record and re-record their viva rather than giving a single high-stakes performance on a single day. The implementation of the digital viva was not without its problems, not least those related to the handling of MP4 files and the uploading of the file for assessment. The authors were surprised that the VLE at that time was not really suitable for the uploading of MP4 files other than very short recordings. Large files, even when compressed, either took a considerable time to upload, were timed out, or were rejected because the file size was too large. Even with a cloud file sharing arrangement some students struggled to upload their files, and despite the fact that many had prior experience of uploading video to social media they often struggled to zip and compress their files before uploading. This raises important issues about the digital skills of students, particularly given that digital skills are often a key graduate attribute. Clearly, throughout the programme there is a need to ensure that students develop their digital skills in a more structured way rather than relying on them developing through a process of trial and error. Commonly encountered problems could have been avoided by, for example, building file compression guidance into the module. The problems with uploading from home were undoubtedly compounded by the UK broadband infrastructure in some geographical locations. Some students reported that they could only upload their files if they came into the University and used open-access PCs. Some students reported that they liked the assessment approach but would rather submit their assessment on a USB drive than have the stress of uploading the file. With a significant shift to online submission of assessment and feedback, the use of USB drives as a mode of submission does not address the key issues in terms of digitally enabling graduates for working life, uncertainty about broadband infrastructure and the capacity of VLEs.
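As a minimal illustration of the kind of file compression guidance that could have been built into the module, the sketch below re-encodes a recording with the free ffmpeg tool before upload. The filenames and the quality setting are illustrative assumptions rather than guidance actually issued on the programme:

    import subprocess

    # Re-encode a viva recording as H.264 at a lower quality setting to
    # shrink the file before upload; the audio stream is copied unchanged.
    # Filenames are placeholders. CRF 28 trades some visual quality for a
    # substantially smaller file (lower CRF = higher quality, larger file).
    subprocess.run([
        "ffmpeg", "-i", "viva_original.mp4",
        "-vcodec", "libx264", "-crf", "28",
        "-acodec", "copy",
        "viva_compressed.mp4",
    ], check=True)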

One of the most significant issues with the digital viva as an authentic assessment was the difficulty of balancing the academic requirements of assessing learning outcomes at level 6 (final year undergraduate) with producing an authentic assessment with high fidelity. As one student commented, in a job interview you often do not need to refer to published work, theories and so on, and interview questions by their very nature usually require shorter and more succinct answers. The authors believe that what they have in fact developed is a more authentic assessment than that previously used, and this would fit with Mueller’s view of authenticity existing on a continuum (Mueller, 2006).

The authors consider that the digital viva addresses the scheduling, timetabling and quality issues associated with having to organise 700 face-to-face vivas over a 1-week period. The system of assessment weeks on a programme with a large number of students is a logistical nightmare, with a large number of rooms needed for assessment and staff facing long days of assessment over 2–3 consecutive days. With a digital viva, staff have 21 working days to mark the work, and moderation and external examining are made easier online. Each academic would normally have to mark 25 vivas, whether face-to-face or online.

Finally, managing a large-scale assessment without the benefit of the assignment function of a VLE was exceptionally challenging, as ensuring that all the MP4 files were downloaded and allocated to the correct academic markers required considerable time. Staff often found moving between watching the MP4 file and marking using a rubric and Turnitin in the VLE difficult, although several staff found innovative ways of doing this, including watching the videos on smart TVs or using dual screens. This method of assessment also raised issues with the digital skills of staff in terms of opening and managing compressed files, alongside the departure from paper-based script submissions and annotated feedback, which again required a different skill set. With the benefit of hindsight, these issues could have been overcome with staff training in advance of the first assessments.

As highlighted earlier, this type of assessment may be applicable to other disciplines, and the authors would offer the following suggestions to anyone planning on implementing it. Ensuring staff buy-in from the outset and providing initial and ongoing training to staff is essential if this type of assessment is to be successful. In addition, the method for recording and uploading video files needs to be identified early and tested prior to large-scale roll out. Importantly, file transfer speeds should not simply be checked within a University setting but from a variety of locations, as students will often try to upload from home. Moving forward, the team plan to make use of lecture capture video capability to assist students with recording and uploading/sharing of files. Linking video recording folders to the VLE can easily overcome many of the problems encountered by the programme team with video recording and file transfer.
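One simple way to carry out such checks is to time the upload of a dummy payload from each location. The sketch below assumes a reachable HTTP endpoint that accepts POST requests; the URL is a placeholder, not a service used in this study:

    import os
    import time
    import requests  # third-party library: pip install requests

    # Time the upload of a random payload to estimate effective upstream
    # speed from the current location. The endpoint URL is a placeholder;
    # substitute the file sharing service or server actually in use.
    URL = "https://example.org/upload"
    PAYLOAD_MB = 25

    data = os.urandom(PAYLOAD_MB * 1024 * 1024)
    start = time.monotonic()
    requests.post(URL, data=data, timeout=600)
    elapsed = time.monotonic() - start

    print(f"Uploaded {PAYLOAD_MB} MB in {elapsed:.1f} s "
          f"(~{PAYLOAD_MB * 8 / elapsed:.2f} Mbit/s)")

Running this from the campus network, a suburban home connection and a rural home connection would quickly reveal the disparities in upload times that caught the programme team by surprise.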

Disclosure statement

No potential conflict of interest was reported by the authors.

References

  • Birenbaum, M. (1996). Assessment 2000: Towards a pluralistic approach to assessment. In M. Birenbaum & F.J.R.C. Dochy (Eds.), Alternatives in assessment of achievements, learning processes and prior knowledge. Boston, MA: Kluwer Academic Publishers.
  • Birley, H. (2001). The society of Apothecaries diploma examination in genitourinary medicine: Death of the viva voce? Sexually Transmitted Infections, 77(3), 223–224.
  • Bloom, B.S., Engelhart, M.D., Furst, E.J., Hill, W.H., & Krathwohl, D.R. (Eds.). (1956). Taxonomy of educational objectives. The classification of educational goals, handbook I: Cognitive domain. New York, NY: David McKay.
  • Borin, N., Metcalfe, L.E., & Tietje, B.C. (2008). Implementing assessment in an outcome-based marketing curriculum. Journal of Marketing Education, 30(2), 150–159.
  • Carter, B., & Whittaker, K. (2009). Examining the British PhD viva: Opening new doors or scarring for life? Contemporary Nurse, 32(1–2), 169–178.
  • Chadwick, S.M. (2004). Curriculum development in orthodontic specialist registrar training: Can orthodontics achieve constructive alignment? Journal of Orthodontics, 31(3), 267–274.
  • Dacre Pool, L., & Sewell, P. (2007). The key to employability: Developing a practical model of graduate employability. Education and Training, 49(4), 277–289.
  • Davis, M.H., & Karunathilake, I. (2005). The place of the oral examination in today’s assessment system. Medical Teacher, 27(4), 294–297.
  • Doloriert, C., & Sambrook, S. (2011). Accommodating an auto-ethnographic PhD: The tale of the thesis, the viva voce, and the traditional business school. Journal of Contemporary Ethnography, 40(5), 582–615.
  • Framp, A., Downer, T., & Layh, J. (2015). Using video assessments as an alternative to the Objective Structured Clinical Examination (OSCE). Australian Nursing and Midwifery Journal, 23(1), 42.
  • Gulikers, J.T.M., Bastiaens, T.J., & Kirschner, P.A. (2004). A five-dimensional framework for authentic assessment. Educational Technology Research and Development, 52(3), 67–86.
  • Harden, R.M. (2002). Learning outcomes and instructional objectives: Is there a difference? Medical Teacher, 24(2), 151–155.
  • Hart, D. (1994). Authentic assessment: A handbook for education. Menlo Park, CA: Addison-Wesley.
  • Hungerford, C., Walter, G., & Cleary, M. (2015). Clinical case reports and the viva voce: A valuable assessment tool, but not without anxiety. Clinical Case Reports, 3(1), 1–2.
  • Joughin, G. (1998). Dimensions of oral assessment. Assessment & Evaluation in Higher Education, 23(4), 367–378.
  • Kehm, B.M. (2001). Oral examinations in German universities. Assessment in Education, 8(1), 25–31.
  • Knight, R., Dipper, L., & Cruice, M. (2013). The use of video in addressing anxiety prior to viva voce exams. British Journal of Educational Technology, 44(6), E217–E219.
  • Madaus, G., Russell, M., & Higgins, J. (2009). The paradoxes of high stakes testing: How they affect students, their parents, teachers, principals, schools and society. Boston, MA: Information Age Publishing.
  • Meakim, C., Boese, T., Decker, S., Franklin, A.E., Gloe, D., Lioce, L., … Borum, J.C. (2013). Standards of best practice: Simulation standard I: Terminology. Clinical Simulation in Nursing, 9(6), S3–S11.
  • Merriam-Webster Dictionary. (2018, July 29). Time traveller: Word first usage 1654. Retrieved from https://www.merriam-webster.com/time-traveler/1654
  • Mueller, J. (2006, March 12). Authentic assessment toolbox. Retrieved from http://jfmueller.faculty.noctrl.edu/toolbox/whatisit.htm#looklike
  • Naqvi, A., & Aheed, B. (2014). Introducing an innovative viva format for assessment and integrated knowledge. Journal of the Pakistan Medical Association, 64(7), 824–825.
  • Oakley, B., & Hencken, C. (2005). Oral examination assessment practices: Effectiveness and change with a first-year undergraduate cohort. Journal of Hospitality, Leisure, Sport and Tourism, 4(1), 3–14.
  • Orrock, P., Grace, S., Vaughn, B., & Coutts, R. (2014). Developing a viva exam to assess clinical reasoning in pre-registration osteopathy students. BMC Medical Education, 14, 193.
  • Osborne, R., Dunne, E., & Farrand, P. (2013). Integrating technologies into ‘authentic’ assessment design: An affordances approach. Research in Learning Technology, 21, 1–18.
  • Pearce, G., & Lee, G. (2009). Viva voce (oral examination) as an assessment method: Insight from marketing students. Journal of Marketing Education, 31(2), 120–130.
  • Pinto Pereira, L.M., Telang, B.V., Butler, K.A., & Joseph, S.M. (1993). Preliminary evaluation of a new curriculum – Incorporation of problem-based learning (PBL) into the traditional format. Medical Teacher, 15(4), 352–365.
  • Poikela, E. (2004). Developing criteria for knowing and learning at work: Towards context-based assessment. The Journal of Workplace Learning, 16(5), 267–274.
  • Ponder, N., Beatty, S., & Foxx, W. (2004). Doctoral comprehensive exams in marketing: Current practices and emerging perspectives. Journal of Marketing Education, 26(3), 226–235.
  • Rahman, G. (2011). Appropriateness of using oral examination as an assessment method in medical or dental education. Journal of Education and Ethics in Dentistry, 1(2), 46–51.
  • Rally, B. (2015). Authentic assessment: Using assessment to help students learn. Relieve, 21(2). doi:10.7203/relieve.21.2.7674
  • Reeves, T.C., & Okey, J.R. (1996). Alternative assessment for constructivist learning environments. In B.G. Wilson (Ed.), Constructivist learning environments: Case studies in instructional design (pp. 191–202). Englewood Cliffs, NJ: Educational Technology Publications.
  • Torrance, H. (1995). Evaluating authentic assessment. Buckingham: Open University Press.
  • Wass, V., Van der Vleuten, C., Shatzer, J., & Jones, R. (2001). Assessment of clinical competence. Lancet, 357(9260), 945–949.
  • Watts, A.G. (2006). Career development, learning and employment. York: Higher Education Academy.
  • Watts, J.H. (2012). Preparing doctoral candidates for the viva: Issues for students and supervisors. Journal of Further and Higher Education, 36(3), 371–381.
  • Zimmerman, P. (1997). Job interviews: Tips and techniques. Accident and Emergency Nursing, 5, 189–192.