
Abstract

Historically, written and oral assessments such as essays and group presentations have been the predominant modes used to assess the pedagogical content knowledge of preservice physical education students and sport coaching students. Yet, at a time when the call for more practice-based teacher education is commanding more and more attention in the literature, key consideration should be given to the authenticity of the assessment modes being used. Thus, one particular mode of assessment that could be considered is the use of Practically Assessed Structured Scenarios (PASS). This article presents a four-way conversation conducted by university educators from four different countries who discuss the benefits and challenges of using PASS as a mode of assessment within physical education teacher education (PETE) and sport coaching education programs. An example of how a PASS assessment session might be organized is also included.

Historically, written assessments in the form of essays, unit outlines, question–response examinations and observation reports, and oral assessments in the form of group presentations, individual vivas and round-table discussions have been used to assess preservice physical education (PE) students’ and sport coaching students’ teaching and coaching knowledge (Capel et al., Citation2009; Semiz & Ince, Citation2012). Yet, at a time when the call for more practice-based teacher education is getting louder (Ward et al., Citation2020), we believe that further consideration should be given to the authenticity of assessment modes being used to develop and determine preservice PE students’ and sport coaching students’ understanding of the students they teach, the learning environments they are responsible for, and the pedagogical choices they are making.

One particular mode of assessment that could be considered is practically assessed structured scenarios (PASS). This predominantly spoken word mode of assessment is similar to objective structured clinical examinations used in medical and veterinary education (Kirton & Kravitz, Citation2011) in that it utilizes practical scenarios to test practical knowledge. This article presents comments from a four-way professional conversation conducted by university educators from four different countries who discussed the suitability and/or implementation of PASS as a mode of assessment within physical education teacher education (PETE) and sport coaching education programs. Thus, the purpose of this article is to introduce and develop readers’ understanding of PASS and provide access to university educators’ views on the benefits and challenges of PASS implementation. Furthermore, it is intended that discussion within the article will stimulate debate among teacher educators on the range of assessments currently offered within PETE and sport coaching programs around the world.

Assessment in Sport Coach Education and PETE Programs

Over the past decade there has been increasing commentary by academics on the need for assessment literacy development for preservice PE teachers and sport coaches (Dinan-Thompson, Citation2013; Dinan-Thompson & Penney, Citation2015; Hay & Penney, Citation2013; Hurley, Citation2018). Across PETE programs, assessment is often viewed as “a particularly troublesome process” given the practical nature of the subject (Lorente-Catalán & Kirk, Citation2016, p. 66) and drives the perceived need for teachers of PE to receive additional assessment support (MacPhail & Halbert, Citation2010). With assessment practices in PE remaining relatively underdeveloped and arguably “off-trend” (Starck, Citation2018; Ward & Cho, Citation2020), investigating whether and how tertiary PE students and sport coaching students learn about assessment continues to be of importance to PETE and sport coaching program faculty (Lorente-Catalán & Kirk, Citation2016). Moreover, there still appears to be a dearth of published research exploring alternative forms of assessment in PETE programs (Lorente-Catalán & Kirk, Citation2014).

In addition, commentary on modes of assessment being utilized within sport coach education and development programs reveals a similar trend. This includes inadequate attention being given to the quality of assessment practices in coach accreditation programs (Hay et al., Citation2012) and a questioning of the purpose of practical assessments that require participants to imitate the behaviors and practices of course leaders (Nash & Sproule, Citation2012). Furthermore, Hay et al. (Citation2012, p. 192) have suggested that the prevalent use of certain assessment modes “such as tests or quizzes provide little, if any, indication of the breadth and integration of necessary coaching competencies.”

Hay and Penney (Citation2009, p. 391) stated that for assessment to work optimally in PE and sport coach education contexts, there should be a prevalence of “authentic assessment concerned with the relationships between learning content and contexts.” Aligned with this view of the need for authentic assessment opportunities are many of the policies that influence PETE and sport coaching education assessment design. For example, assessment requirements prescribed in the United Kingdom’s Subject Benchmark Statements (Quality Assurance Agency, Citation2016, p. 14) support “assessment of live practice or simulations”; in Australia, the Australian Institute for Teaching and School Leadership (a national agency committed to the quality assurance of all accredited initial teacher education programs) has stated that their standards and procedures are not intended to “constrain the ability of initial teacher education providers to be innovative” (Australian Institute of Teaching & School Leadership, Citation2016, p. 3); and in the United States, Standard 5a of the SHAPE America Initial Physical Education Teacher Education Standards (SHAPE America – Society of Health and Physical Educators, Citation2017) refers to preservice candidates being able to select or create authentic, formal assessments that measure student attainment of short- and long-term objectives. Arguably, the development of such a skill dictates that preservice candidates be exposed to a range of “authentic, formal assessments” themselves to help develop their craft.

In addition to changes in educational policy relating to the structure and identity of PETE and sport coach education programs, there have been widespread changes in areas such as practice expectations, employment agendas and instructional discourse (Olson et al., Citation2017). For example, Kirk et al. (Citation1997) raised the issue of PETE programs having at their core a focus on studies in biophysical sciences (e.g., exercise physiology, biomechanics, human anatomy). Indeed, PETE programs were (and still are) seen as a “professional or vocational application of human movement studies” (Kirk et al., Citation1997, p. 295). In recent years, however, Cliff (Citation2012) and Olson et al. (Citation2017) have observed a decrease in the number of sport science units in undergraduate PETE degrees and an increased number of social science– and pedagogy-focused units. One of the legacies of this earlier focus on sports science, however, is the continued use of assessment modes typically associated with the study of biophysical sciences (e.g., written reports, multiple-choice exams). That is not to say that innovative modes of assessment in PETE and sport coaching education programs are not being used. For example, Meldrum (Citation2011) described the use of scenario presentations (e.g., role-plays, school meetings and interviews), Pedersen et al. (Citation2014) highlighted the use of responses to a case study vignette and annotated bibliographies, Araya et al. (Citation2015) articulated how they used authentic scenarios (such as inheriting a new team) and observation reports, and Calderón et al. (Citation2021) explored the use of blogs and debates as modes of assessment within blended learning environments.
However, when coupled with the prevalence of essay writing as a default mode of assessment in higher education (Brown, Citation2011), the positioning of academic writing as being more valued than practical abilities (Johnson, Citation2013), and the traditional alignment of written assessment modes with specific PE subject areas (Larsson et al., Citation2018), the range of authentic assessment modes used in PETE and sport coaching programs appears limited. Lorente-Catalán and Kirk (Citation2013, p. 79) offered a reminder of why exposure to innovative, authentic modes of assessment should be more commonplace in PETE and sport coaching programs: “If PE teachers are to be competent users of alternative assessment in schools, then it is of crucial importance that they are able to experience such practices as part of their professional preparation.”

Practically Assessed Structured Scenarios

Adapted from objective structured clinical examinations used in medical and veterinary education (Kirton & Kravitz, Citation2011), PASS is predominantly a spoken word mode of assessment, authentic in its use of practical scenarios to test practical knowledge. When students engage in a PASS assessment, they move through a series of pre-prepared assessment scenarios, either individually or in a group. Scenarios are designed to examine understanding of learned elements throughout a program, from curriculum content and teaching and learning context to the application of health and safety protocols. A typical PASS assessment may include the following elements:

  1. Individual responses/group discussion offered in response to pedagogical or content focused questions

  2. Group responses to a range of verbal, written or visual cues designed to assess teaching and/or coaching proficiency

  3. Individual verbal response outlining key observations made of practical performance, still or video images relating to specific teaching and learning scenarios.

A Dynamic Assessment. PASS is largely based on principles of dynamic assessment—an alternative assessment that attempts to measure learning potential and is often used in conjunction with standardized tests (Caffrey et al., Citation2008). Alony and Kozulin (Citation2007) noted that dynamic assessment lends itself to assessment of “fluid” abilities (i.e., those in a state of change or varying in the way they are used and applied), rather than assessing “crystallized” abilities that represent an outcome of learning or acquisition. Such an approach is arguably well suited to assessing the skills, abilities and understanding that closely align with PETE and sport coach education programs.

A Development-Focused Assessment. The aim of PASS is to assess understanding and application as well as potential for further development, rather than a static level of achievement, and “it does this by prompting, cueing or mediating within the assessment, and evaluating the enhanced performance” (Barjesteh & Niknezhad, Citation2013, p. 527). Therefore, in alignment with Vygotsky’s (Citation1978) concept of the zone of proximal development, PASS can capture evidence of higher levels of development than that which would be determined by an entirely independent performance and, in so doing, provides both students and staff with a basis for optimizing growth as well as planning for further development.

Consequently, the assessment process itself, rather than being an “end point,” provides further learning opportunities for students, thus reflecting an assessment for learning approach. Here, the assessment process is designed to identify strengths as well as next steps as opposed to exposing deficiencies in learning. This dynamic and collaborative assessment process enables students to demonstrate higher levels of understanding, being guided by the assessor to make links and draw inferences that they might not otherwise do. Moreover, when PASS involves assessment in groups, the collaborative nature of the experience provides opportunities for students to “pool” their expertise and therefore achieve better marks than they might otherwise have done alone. They have opportunities to check their understanding with each other and build on or extend what their peers can contribute. For some students this is a welcome positive alongside other types of group assessment (e.g., oral presentations), in which they often perceive that the performance of others has negatively affected their own (group) mark.

An Adaptable Assessment. Table 1 provides an example of how a PASS session might be organized. In the example provided, four 60-min assessment sessions are organized for four separate cohorts (1a, 1b, 2a, 2b). Each cohort contains three separate groups of up to eight students. Groups rotate around three separate assessment stations, with the fourth assessment station visited by all students simultaneously. This arrangement allows for two staff members to conduct an entire cohort’s assessment session (e.g., one staff member supervising the group demo on one-half of a basketball court, with the other staff member supervising both the group discussion and individual written task). The PASS session outlined in Table 1 was designed for a first-year undergraduate games and activities module with content having both a theoretical component (e.g., classification of games, safeguarding of the learning environment) and a practical/applied component (e.g., modifying elements of a game, adapting instructional pedagogies to improve performance). By catering to both individual and group assessment needs, PASS can be adapted to assess a range of practice- and knowledge-based learning outcomes.

Table 1. Example Organization of a PASS Assessment Session

A Collective Desire to Develop Our Craft

Considering the importance of both authentic assessment in teacher education programs and our professional development responsibilities as a global faculty of teacher educators, the authors of this article engaged in a 6-month asynchronous professional conversation exploring the utility of PASS usage in their respective PETE and/or sport coaching education programs. A professional conversations model was used as a guiding framework for conversation within which a series of predetermined prompt questions were set to steer discussion (Leonard, Citation2012). For a detailed overview of the participatory research method used to capture authors’ experiences and perceptions of PASS, see Jarrett et al. (Citation2021).

In essence, authors’ involvement in this professional conversation could be attributed to their collective desire to develop their craft as sport pedagogues and teacher/coach educators. All conversation participants had at least 12 years of university teaching experience, with all currently delivering a range of undergraduate and postgraduate PETE and sport coaching programs. Two authors had previous experience of implementing PASS and two had yet to trial PASS in any form. All participants had previously either worked at the same institution or been involved in research collaborations together. When the professional conversation began, the participants were each affiliated with a university in one of four separate countries: Australia, the United States, the United Kingdom and Catalonia (Spain). This spread of participants across four different countries was by design, because seeking out international voices helps us to develop our glocal identities as educators, defined as the mixing of global and local values to enhance cultural efficacies, creativity and diversity of thought (Dvir et al., Citation2019; Jarrett, Citation2021). Engaging in conversation with educators based abroad also provides a deeper understanding of practice beyond local learning environments.

Benefits and Challenges of Using PASS

The 6-month professional conversation undertaken by participants included the discussion of a range of experienced and perceived benefits and challenges related to PASS implementation. At the end of the conversation period, a review of the conversation transcript was completed by all participants, with key comments relating to the benefits or challenges of PASS usage collectively identified. A key comment was determined when all four authors identified (e.g., highlighted) the comment during their review of the transcript. These key comments are outlined in Tables 2 and 3. In relation to each key comment identified, supplementary commentary from the conversation transcript was included to engage readers in a form of silent conversation, whereby the reader takes on a silent but active role in reflecting on and responding to commentary (e.g., in a verbal or written manner) to bring into focus their own responses to each articulated PASS benefit or challenge. This supplementary commentary was then used to inform the development of key questions designed to further scaffold practitioners’ reflections on PASS-related decision making, planning and implementation.

Table 2. Benefits of Using PASS

Table 3. Challenges of Using PASS

Discussion

Perhaps predictably, the majority of statements categorized as a benefit came from participants with prior experience of PASS use. Accordingly, those without experience of PASS commented more on the organizational and resource barriers they envision if a change to PASS were made. These organizational and resource barriers should be viewed as contributing factors to the call for assessment capacity building that Dinan-Thompson and Penney (Citation2015), Hay and Penney (Citation2009, Citation2013) and Lorente-Catalán and Kirk (Citation2016) have previously described.

Of note within Table 3 were participants’ views on the reluctance of fellow staff to embrace new modes of assessment. Comments such as “[there] was the need to manage expectations of both colleagues and students new to PASS assessment” and “some of my colleagues do not like the variability and less tightly defined nature of the PASS criteria” allude to the need for relevant institutional leaders to support colleagues and their use of innovative assessment practices such as PASS. Offering such support will no doubt become easier with the commissioning and publication of relevant empirical research and colleagues’ involvement in PASS-related scholarship of teaching activity. It is also important to point out that encouraging and supporting colleagues to experiment with new modes of assessment falls well within the PETE assessment and innovation standards previously detailed.

Anecdotally, the list of benefits presented in Table 2 provides some level of justification for PASS to be explored as an alternative mode of assessment that promotes engagement in and assessment of what Ward et al. (Citation2020) described as the core practices of teaching PE and coaching sport. Through recognition of the reported value of authentic assessment practices such as PASS, the collegial practices that are such a large part of core “everyday” PE teaching and sport coaching can potentially be given their deserved attention within PETE and sport coaching programs. What is also important is that any value associated with the use of PASS must be recognized by students as well. The comment that “I do not have ANY group-based grades in ANY of my classes due to the historical strain of dealing with student complaints” indicates an assessment culture in contrast to the opportunities for learning that assessment modes such as PASS can offer.

Conclusion

The professional conversations detailed within this study highlight a number of benefits and challenges relating to PASS integrity and feasibility. With a collective desire to see more authentic modes of assessment used within preservice PE teacher and sport coaching education programs, the authors of this study considered the use of PASS as a viable and authentic method of assessment. Organizational and conceptual challenges to PASS implementation were raised, and it is in response to these challenges that further research into PASS development and implementation can and should be focused.

Disclosure Statement

We confirm that the research commented on in this article received ethical clearance and has not been submitted for review to any other journal/publication. No external funding was received to complete the research.

Additional information

Notes on contributors

Kendall Jarrett

Kendall Jarrett ([email protected]) was an associate professor in learning & teaching at the University of Greenwich in London, UK, during the authorship of this article. He is now a senior lecturer in educational leadership at the University of Chester, UK.

Belinda Cooke

Belinda Cooke is a head of subject in physical education and outdoor education at Leeds Beckett University in Leeds, UK.

Stephen Harvey

Stephen Harvey is professor in sport pedagogy at Ohio University in Athens, OH.

Victor Lopez-Ros

Víctor López-Ros is an associate professor and chair of sport and physical education within the Faculty of Education and Psychology at the University of Girona in Girona, Spain.

References

  • Alony, S., & Kozulin, A. (2007). Dynamic assessment of receptive language in children with Down syndrome. Advances in Speech Language Pathology, 9(4), 323–331. https://doi.org/10.1080/14417040701291415
  • Araya, J., Bennie, A., & O’Connor, D. (2015). Understanding performance coach development: perceptions about a postgraduate coach education program. International Sport Coaching Journal, 2(1), 3–14. https://doi.org/10.1123/iscj.2013-0036
  • Australian Institute of Teaching and School Leadership. (2016). Guidelines for the accreditation of initial teacher education programs in Australia. Melbourne: AITSL. https://www.aitsl.edu.au/docs/default-source/default-document-library/guidance-for-the-accreditation-of-initial-teacher-education-in-australia.pdf?sfvrsn=caf1ec3c_0
  • Barjesteh, H., & Niknezhad, F. (2013). A paradigm shift toward a new philosophy of assessment: Dynamic assessment from a critical perspective. Indian Journal of Fundamental and Applied Life Sciences, 3(3), 526–535.
  • Brown, S. (2011, July 6). Authentic assessment at Masters level [Paper presentation]. Keynote speech at the Assessment in Higher Education Conference, University of Cumbria, Carlisle, UK.
  • Caffrey, E., Fuchs, D., & Fuchs, L. (2008). The predictive validity of dynamic assessment: A review. The Journal of Special Education, 41(4), 254–270. https://doi.org/10.1177/0022466907310366
  • Calderón, A., Scanlon, D., MacPhail, A., & Moody, B. (2021). An integrated blended learning approach for physical education teacher education programmes: teacher educators’ and pre-service teachers’ experiences. Physical Education and Sport Pedagogy, 26(6), 562–577. https://doi.org/10.1080/17408989.2020.1823961
  • Capel, S., Hayes, S., Katene, W., & Velija, P. (2009). The development of knowledge for teaching physical education in secondary schools over the course of PGCE year. European Journal of Teacher Education, 32(1), 51–62. https://doi.org/10.1080/02619760802457216
  • Cliff, K. (2012). A sociocultural perspective as a curriculum change in health and physical education. Sport, Education and Society, 17(3), 293–311. https://doi.org/10.1080/13573322.2011.608935
  • Dinan-Thompson, M. (2013). Claiming ‘educative outcomes’ in HPE: The potential for ‘pedagogic action’. Asia Pacific Journal of Health, Sport and Physical Education, 4(2), 127–142.
  • Dinan-Thompson, M., & Penney, D. (2015). Assessment literacy in primary physical education. European Physical Education Review, 21(4), 485–503. https://doi.org/10.1177/1356336X15584087
  • Dvir, Y., Maxwell, C., & Yemini, M. (2019). ‘Glocalisation’ doctrine in the Israeli Public Education System: A contextual analysis of a policy-making process. Education Policy Analysis Archives, 27(124). https://doi.org/10.14507/epaa.27.4274
  • Hay, P., Dickens, S., Crudgington, B., & Engstrom, C. (2012). Exploring the potential of assessment efficacy in sports coaching. International Journal of Sports Science & Coaching, 7(2), 187–198. https://doi.org/10.1260/1747-9541.7.2.187
  • Hay, P., & Penney, D. (2009). Proposing conditions for assessment efficacy in physical education. European Physical Education Review, 15(3), 389–405. https://doi.org/10.1177/1356336X09364294
  • Hay, P., & Penney, D. (2013). Assessment in physical education: A sociocultural perspective. Routledge.
  • Hurley, K. (2018). Assessment competence through In Situ practice for preservice educators. Journal of Physical Education, Recreation & Dance, 89(3), 24–28. https://doi.org/10.1080/07303084.2017.1417927
  • Jarrett, K. (2021). Leadership for impact: the importance of the global and of the local. Professional Development Today, 22(2). https://www.teachingtimes.com/leadership-for-impact-the-importance-of-the-global-and-of-the-local/
  • Jarrett, K., Cooke, B., Harvey, S., & López-Ros, V. (2021). Using professional conversations as a participatory research method within the discipline of sport pedagogy-related teacher and coach education. Journal of Participatory Research Methods, 2(1). https://doi.org/10.35844/001c.18249
  • Johnson, T. (2013). The value of performance in physical education teacher education. Quest, 65(4), 485–497. https://doi.org/10.1080/00336297.2013.811093
  • Kirk, D., Macdonald, D., & Tinning, R. (1997). The social construction of pedagogic discourse in physical education teacher education in Australia. The Curriculum Journal, 8(2), 271–298. https://doi.org/10.1080/0958517970080206
  • Kirton, S. K., & Kravitz, L. (2011). Objective Structured Clinical Examinations (OSCEs) compared with traditional assessment methods. American Journal of Pharmaceutical Education, 75(6), 111. https://doi.org/10.5688/ajpe756111
  • Larsson, L., Linnér, S., & Schenker, K. (2018). The doxa of physical education teacher education – set in stone? European Physical Education Review, 24(1), 114–130. https://doi.org/10.1177/1356336X16668545
  • Leonard, S. (2012). Professional conversations: Mentor teachers’ theories-in-use using the Australian National Professional Standards for Teachers. Australian Journal of Teacher Education, 37(12), 46–62. https://doi.org/10.14221/ajte.2012v37n12.7
  • Lorente-Catalán, E., & Kirk, D. (2013). Alternative democratic assessment in PETE: an action-research study exploring risks, challenges and solutions. Sport, Education and Society, 18(1), 77–96. https://doi.org/10.1080/13573322.2012.713859
  • Lorente-Catalán, E., & Kirk, D. (2014). Making the case for democratic assessment practices within a critical pedagogy of physical education teacher education. European Physical Education Review, 20(1), 104–119. https://doi.org/10.1177/1356336X13496004
  • Lorente-Catalán, E., & Kirk, D. (2016). Student teachers’ understanding and application of assessment for learning during a physical education teacher education course. European Physical Education Review, 22(1), 65–81. https://doi.org/10.1177/1356336X15590352
  • MacPhail, A., & Halbert, J. (2010). ‘We had to do intelligent thinking during recent PE’: Students’ and teachers’ experiences of assessment for learning in post-primary physical education. Assessment in Education: Principles, Policy & Practice, 17(1), 23–39. https://doi.org/10.1080/09695940903565412
  • Meldrum, K. (2011). Preparing pre-service physical education teachers for uncertain future(s): a scenario-based learning case study from Australia. Physical Education & Sport Pedagogy, 16(2), 133–144. https://doi.org/10.1080/17408981003712828
  • Nash, C., & Sproule, J. (2012). Coaches’ perceptions of their coach education experiences. International Journal of Sport Psychology, 43, 33–52.
  • Olson, R., Laidlaw, P., & Steel, K. (2017). ‘No one wants to be taught from a textbook!’: Pre-service health and physical education teachers’ reflections on skill acquisition and a new curriculum. European Physical Education Review, 23(4), 499–516. https://doi.org/10.1177/1356336X16658222
  • Pedersen, S. J., Cooley, P. D., & Hernandez, K. (2014). Are Australian pre-service physical education teachers prepared to teach inclusive physical education? Australian Journal of Teacher Education, 39(8), 53–62. https://doi.org/10.14221/ajte.2014v39n8.4
  • Quality Assurance Agency. (2016). Subject benchmark statements: Events, hospitality, leisure, sport and tourism. Retrieved from https://dera.ioe.ac.uk/27845/1/SBS-Events-Hospitality-Leisure-Sport-Tourism-16.pdf
  • Semiz, K., & Ince, M. (2012). Pre-service physical education teachers’ technological pedagogical content knowledge, technology integration self-efficacy and instructional technology outcome expectations. Australasian Journal of Educational Technology, 28(7), 1248–1265. https://doi.org/10.14742/ajet.800
  • SHAPE America - Society of Health and Physical Educators (2017). National standards for initial physical education teacher education.
  • Starck, J. R. (2018). Assessment practices in physical education [PhD thesis, University of Alabama]. University of Alabama Institutional Repository. https://ir.ua.edu/bitstream/handle/123456789/5190/file_1.pdf?sequence=1&isAllowed=y
  • Vygotsky, L. (1978). Mind in society: The development of higher psychological processes. Harvard University Press.
  • Ward, P., & Cho, K. (2020). Five trends in physical education teacher education. Journal of Physical Education, Recreation & Dance, 91(6), 16–20. https://doi.org/10.1080/07303084.2020.1768182
  • Ward, P., Higginson, K., & Cho, K. (2020). Core practices for preservice teachers in physical education teacher education. Journal of Physical Education, Recreation & Dance, 91(5), 37–42. https://doi.org/10.1080/07303084.2020.17345