Original Article

Feasibility and practicality of a simulated placement: an exploratory pilot of a novel training method for postgraduate psychology students in the wake of COVID-19

Received 27 Jun 2023, Accepted 08 Mar 2024, Published online: 27 Mar 2024

ABSTRACT

Objective

In 2020, COVID-19 caused a drastic interruption to face-to-face postgraduate psychology placements in Australia, prompting the development of a simulated placement. This pilot project represents a preliminary evaluation of the program by exploring the feasibility of a simulated placement as a novel training modality for competence development in postgraduate psychology training.

Method

Students enrolled in a simulated postgraduate psychology placement in 2021 completed an online survey at the end of the placement. Acceptability, perceived competency and experience of the placement were assessed. Chi-square goodness of fit tests were used to evaluate the difference between rates of completion and the length of time it took to complete placement requirements between students enrolled in face-to-face placements in 2019 and 2020, and the simulated placement in 2021.

Results

A higher proportion of students completed the simulated placement than previous face-to-face placements, and students were able to complete training and meet competence requirements in shorter time frames. Students reported increased psychological competency and confidence to practice at the completion of the simulated placement.

Conclusions

A simulated psychology placement is a novel teaching modality that may be feasible in the development of perceived psychological competence and confidence in postgraduate psychology students.

Key Points

What is already known about this topic:

  1. Simulation-based learning has been increasingly utilised in health disciplines to support student development of clinical competencies.

  2. Simulation-based learning appears to offer a number of advantages to both students and training institutions, including exposure to low prevalence, high-risk presentations, increased training fidelity, and the ability to monitor and assess student progress.

  3. There is limited evidence for the use of simulated placements for competency development in postgraduate psychology students.

What this study adds:

  1. A sample of postgraduate psychology students enrolled in a fully simulated placement reported increased psychological competency and confidence to practice, and experienced both benefits and challenges associated with the placement, including increased flexibility, opportunity for “safe” skill development, and workload fatigue.

  2. The simulated placement was associated with higher rates of placement completion and significantly lower rates of extensions of time required to complete the placement compared to face-to-face placements.

  3. A simulated placement was found to be a feasible training modality in professional psychology through the demonstration of practicality, acceptability of the placement, and perceived confidence and competency development.

The onset of COVID-19 caused a drastic interruption to face-to-face postgraduate psychology placements in Australia, and many universities were faced with large cohorts of students unable to secure or complete supervised placements (Paparo et al., Citation2021). Training institutions were required to make significant decisions regarding student placements in a relatively short amount of time, attempting to support students to continue to progress through their postgraduate training while maintaining the safety of students and minimising risks (Rice, Murray, et al., Citation2022). Due to COVID-19, many places that previously accepted students for placement paused the acceptance of new students, and many workplaces moved to work from home and telehealth services in a bid to maintain safety of staff (Cosh et al., Citation2021). These changes significantly reduced the availability of placements on offer that were able to meet training requirements (Cosh et al., Citation2021). In order to support students’ continued progression through their postgraduate psychology training, one regional university in Australia developed a fully simulated professional placement that met placement requirements under the Australian Psychology Accreditation Council (APAC) Accreditation Standards (Citation2019), and offered students access to continued skill development and supervision in what was otherwise a tumultuous and interrupted year for students. APAC is responsible for the development and review of accreditation standards for psychology programs in Australia. The 2019 revision of the APAC Accreditation Standards was the first version to allow simulation-based learning at Level 3 (Professional Psychology programs) for a standalone placement. While simulation-based learning has become increasingly common in many health sciences, including medicine, nursing, and occupational therapy (Herrera-Aliaga & Estrada, Citation2022; Imms et al., Citation2018; James & Chapman, Citation2010; Nehyba et al., Citation2017), there is a lack of evidence for the use of simulated learning activities and simulated placements in psychology (Rice, Murray, et al., Citation2022).

Simulation-based learning (SBL) is most commonly defined as “a technique that creates a situation or environment to allow persons to experience a representation of a real event for the purpose of practice, learning, evaluation, testing, or to gain understanding of systems or human actions” (Lioce et al., Citation2020, p. 44). Specific simulated learning activities (SLA) can vary; however, primary approaches include role plays, objective structured clinical examinations (OSCE), simulated clients, and actors (Paparo et al., Citation2021). SLAs have the potential to provide a vast array of benefits to both students and training programs, as well as addressing some common challenges associated with traditional learning methods (Paparo et al., Citation2021). Such benefits include standardised exposure to a wide range of common mental health issues, experience with high-risk, low prevalence presentations students may not encounter in traditional face-to-face placements, fewer interruptions to placement progress, increased accessibility for regional and remote students, and the ability to observe and assess student competency more closely (Lateef, Citation2010; Phillips et al., Citation2017; Roberts et al., Citation2017).

The utility of SLAs to provide standardised exposure to a wide range of common mental health issues, as well as the opportunity to gain experience with high-risk, low prevalence presentations, addresses a common challenge identified in traditional postgraduate psychology placements (Paparo et al., Citation2021). Not only does this ensure students are gaining the exposure that they require and a low-risk opportunity to apply their knowledge and skills to these challenging but likely inevitable scenarios, it also provides an important layer of protection to vulnerable clients (Lucas, Citation2014). This aspect of SLAs aligns with Standard 1: Public Safety of APAC’s accreditation requirements for psychology programs, which asserts that public safety must be assured (APAC, Citation2019). Further, SLAs provide the opportunity for repeated exposures to these client presentations or challenges, thus, enhancing experience, skill development, and application (Lateef, Citation2010). Cybulski et al. (Citation2010) suggested this also has the flow-on effect of improved client outcomes and student confidence in dealing with challenging scenarios, such as patients presenting with suicide risk. Cybulski et al. (Citation2010) also suggested that SLAs provide the scaffold between the safety of tertiary learning and applying theoretical skills in real-world settings, increasing both confidence and competence in students moving into the next stage of their careers. Many of these benefits have also been reported by postgraduate psychology students who have experienced some form of SLA (Oxlad et al., Citation2022). In a mixed methods analysis of student perspectives on SLAs, Oxlad et al. (Citation2022) found that students reported that SLAs provided a low-risk environment to develop their skills, prepared them for future clinical work, provided opportunities for skill development, and increased their confidence for future clinical practice. In terms of challenges, students reported concerns about the validity of skills assessment within SLAs and the authenticity of skills practice and development with simulated clients. In the field of nursing, Liaw et al. (Citation2010) found that nurses who engaged in a simulated problem-based learning activity performed significantly better at clinical crisis management across two different crisis management tasks than their counterparts who engaged in a standard problem-based learning activity. In another study, postgraduate psychology students who engaged in an OSCE task that utilised simulated clients reported increased confidence for future work, with mixed attitudes towards the authenticity of utilising simulated clients (Roberts et al., Citation2017). SLAs have also been widely used to provide students with exposure to common ethical dilemmas, again in a protected, low-risk setting (Lateef, Citation2010). SLAs have also demonstrated utility for assessment of student competency across a range of domains (Scalese et al., Citation2008).

The increased interest in SLAs in the wake of COVID-19 prompted the development of guidelines for simulation-based learning in postgraduate psychology programs (Paparo et al., Citation2021). The guidelines provided nine criteria for appropriate delivery of SLAs within a postgraduate psychology program. These guidelines included the need for competency targeted activities, supervision, reflection, and avenues for stakeholder feedback. The development of these guidelines indicates both the impetus and interest in researching innovative methods for psychology training programs, given the current challenges in this field. Paparo et al. (Citation2021) also identified the need for further investigation into the utility of SLAs as a teaching modality, given the potential benefits. Rice, Murray, et al. (Citation2022) extended these simulation guidelines by applying them to simulated placements in postgraduate psychology programs. A fully simulated placement may provide additional benefits over and above both partial SLA training and face-to-face training methods in being able to achieve standardisation of training for all students, as well as ensuring students receive exposure to a wide scope of practice, such as practice across the lifespan (Rice, Murray, et al., Citation2022). Differences in placement experience are of primary concern, as students’ access to a broad scope of practice cannot be guaranteed and may vary widely based on placement sites (Paparo et al., Citation2021); thus, the ability to control these factors highlights a significant strength of a fully simulated placement.

There are a number of barriers to both securing and successfully completing professional psychology placements. Shortages in appropriate, supervised, in-vivo placements are a common concern for university placement coordinators tasked with securing placements that meet a range of requirements designed to ensure sufficient psychological competency development (Shires et al., Citation2017). Timeframes to complete placements also appear to vary widely depending on the ability to accrue sufficient client contact hours and meet all of the requirements of the program (Rodolfa et al., Citation2007). In addition, interruptions to placement, such as cancelled appointments or supervisor illness, can cause significant delays to overall course progression. Student experiences also reflect these concerns. In an evaluation of student experiences of postgraduate clinical psychology placements, students reported concerns regarding a lack of appropriate placement options, extended timeframes to source and arrange placements, and significant challenges in accruing sufficient client hours (Nedeljkovic et al., Citation2014). While extended placement completion timeframes are a recognised concern for Australian students, specific data on the rates of extended timeframes could not be sourced. This is, however, an internationally recognised issue. In a review of student placement completion times in a US sample, Rodolfa et al. (Citation2007) found that students could take between 1.1 and 4.7 years to achieve the requirements for a 12-month placement. Helmes and Pachana (Citation2011) also found students reported concerns regarding lack of diversity in client exposure, and the heavy workload associated with their program of study. These issues were exacerbated by the onset of COVID-19, with significant challenges in securing and completing placements observed broadly across Australian universities (Paparo et al., Citation2021).

Simulated placements may improve training accessibility for students in regional and remote locations. Presently, there is a substantial lack of skilled practitioners in rural and remote areas, with psychologists five times less prevalent in rural areas than urban areas of Australia (Legislative Council, Citation2022). This shortage is despite increasing need for psychologists in regional, rural, and remote areas where rates of psychological distress and suicide are higher than in metropolitan areas (Legislative Council, Citation2022). Capacity of service providers to offer placement opportunities, and availability of supervisors in rural and remote locations are primary barriers to students from regional areas being able to secure regional placements (Eklom, Citation2019). This increases mental and financial strain on students who may be required to relocate in order to meet the training requirements of their degree, and potentially takes away skilled practitioners from these areas (Eklom, Citation2019). Rural background is a primary predictor of future rural practice (Kwan et al., Citation2017), thus, strategies that increase accessibility of study, such as simulated placements, may offer the opportunity for regional and remote students to remain in their communities while completing their required placements (Rice, Murray, et al., Citation2022).

Despite the potential for this medium demonstrated in related fields, there is currently a lack of empirical evidence regarding the efficacy of SLAs in Australian postgraduate psychology training programs (Paparo et al., Citation2021). Furthermore, while evidence for this medium is lacking in psychology programs, there is even less research regarding the effectiveness of entirely simulated placements (Rice, Murray, et al., Citation2022), and there is no clear evidence indicating the ideal proportion of SLAs within a postgraduate psychology program. One recent study of postgraduate psychology students’ experiences of any type of SLA found that those students who had engaged in extended simulated learning were generally supportive of a greater integration of SLAs into psychology training (Oxlad et al., Citation2022). Participants also identified a number of potential benefits of extended simulation-based learning, including greater preparation for future clinical work, and increased opportunity for standardised training experiences (Oxlad et al., Citation2022). The question then is whether simulated placements for psychology students can offer the benefits highlighted above while also providing quality of learning, reflected in student competency development. A randomised controlled trial comparing the effectiveness of a simulated placement versus a traditional placement for Australian occupational therapy students found equivalent outcomes for students across groups (Imms et al., Citation2018). Students in both groups demonstrated an improvement in competency as rated by supervisors, and students in both types of placements also reported equivalent improvements in confidence to respond flexibly to challenges at the completion of their placement (Imms et al., Citation2018). While this study was conducted within another health discipline, the result shows promise for simulated placements to potentially address some issues with traditional placement methods whilst achieving equivalent learning outcomes; however, this needs investigation, particularly in psychology.

Given the novelty of this teaching practice, and the potential to address numerous challenges associated with traditional placement methods, it is necessary to explore the utility of simulated placements in psychology training programs. Therefore, the present study aimed to evaluate the feasibility of a novel simulated placement as a training modality for postgraduate psychology students. Consistent with Bowen et al.’s (Citation2009) recommendations, three components of feasibility were evaluated: practicality, acceptability of the placement, and perceived confidence and competency development.

Methods

Participants

Psychology students who completed a simulated placement as part of their enrolment in an accredited 5th year postgraduate psychology program (a one-year postgraduate training program that allows eligibility for a supervised internship, which leads to eligibility for full registration as a psychologist) at a regional university in Australia were invited to participate in this study. This was the first placement in the professional psychology program for all students. Given that this study was a feasibility study with the primary aim of determining whether a larger scale study is warranted, the sample size was based on students who volunteered to participate (N = 11). In a review of feasibility studies, sample sizes of N = 11 were deemed appropriate dependent on study context (Billingham et al., Citation2013), and other related studies have utilised similar sample sizes to determine feasibility (e.g., Clay et al., Citation2021).

Measures

Practicality

The practicality of the placement, specifically, whether it was achievable, was chosen as a measure of feasibility (Bowen et al., Citation2009). This was assessed based on the proportion of students who enrolled in the placement that went on to successfully complete their placement. In addition, it is common in face-to-face placements for students to struggle to accrue sufficient client hours to meet placement requirements within the set timeframe (Rodolfa et al., Citation2007), thus, the provision of extensions of time in order for students to meet these requirements is a standard practice. This can be due to many factors, including variations between placement settings impacting the ability to accrue client hours or build a full client caseload (Rodolfa et al., Citation2007). Consequently, the present study also analysed the proportion of students that required extensions of time in order to meet placement requirements, as a measure of practicality. This data was also compared to data from face-to-face placements that took place in 2019 and 2020 to examine whether students in the simulated placement were completing the unit at similar rates and in similar time frames compared to students enrolled in traditional face-to-face placements. Both 2019 and 2020 data were examined, given the consideration that COVID-19 may have affected the completion and extension rates for students in 2020.

Perceived confidence and competency development

Competencies of Professional Psychology Rating Scale – Self-Report Version (COPPR-S) (Rice, Schutte, et al., Citation2022)

The COPPR is a measure of APAC Standards Level 3 professional competencies across all domains of practice. This measure is intended to be used in professional psychology training to evaluate competence development, as well as supervisee and registered psychologist performance on core psychological competencies. The measure is clustered into 11 core domains of professional competence (Scientist-Practitioner, Cultural Responsiveness, Working Across the Lifespan, Counselling Micro-skills, Professional Communication and Liaison, Clinical Interviewing, Formulation and Diagnosis, Assessment, Intervention, Ethics, and Self-Reflective Practice). The measure provides both subscale scores, calculated using mean item scores, and a mean total score, with higher scores indicating greater competence. The measure has both a self-report version and an observer version of performance across the domains (Rice, Schutte, et al., Citation2022). Initial validation of the COPPR-S has been conducted and the measure has demonstrated content, convergent, and divergent validity (Rice, Schutte, et al., Citation2022). The present study used the self-report version (COPPR-S) to explore students’ perceived competence across all domains of practice. Figure 1 displays the 7-point Likert scale with behavioural anchors by which participants scored themselves on a range of competencies.

Figure 1. 7-point Likert scale with behavioural anchors for COPPR-S (Rice, Schutte, et al., Citation2022). Scale instruction: “Please use the following scale to rate level of competence for each item. Competence is defined as the level expected of a registered psychologist (Rating = 4).”

The Psychology and Counselling Self-efficacy Scale (PCES) (Watt et al., Citation2019)

The PCES is a self-report questionnaire measuring an individual’s self-efficacy regarding their attainment of psychological competencies. The PCES utilises a five-point Likert scale across 31 items that cover five factors of competency: research, ethics, legal matters, assessment and measurement, and intervention. Items list various competencies, and respondents rate their confidence in demonstrating these competencies from “not at all” to “extremely” confident. The PCES generates both a total score and five subscale scores, with higher scores indicating higher psychological self-efficacy. The PCES was evaluated through confirmatory factor analysis, demonstrating construct validity (Watt et al., Citation2019).
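To make the scoring logic concrete, the Python sketch below illustrates how mean subscale and total scores could be computed for a Likert-type measure such as the PCES or the COPPR-S. It is illustrative only: the item-to-subscale assignments are hypothetical placeholders rather than the published PCES scoring key, and summed scores would work analogously.

# Illustrative scoring sketch only; the item-to-subscale mapping is a
# hypothetical placeholder, not the published PCES key. It demonstrates the
# mean-item-score logic described for the PCES (31 items, five factors)
# and the COPPR-S.
import numpy as np

SUBSCALES = {                       # hypothetical item indices per factor
    "research": range(0, 6),
    "ethics": range(6, 12),
    "legal_matters": range(12, 17),
    "assessment_measurement": range(17, 24),
    "intervention": range(24, 31),
}

def score_likert_measure(responses):
    """responses: sequence of 31 item ratings on a 1-5 scale."""
    responses = np.asarray(responses, dtype=float)
    subscale_scores = {
        name: responses[list(idx)].mean() for name, idx in SUBSCALES.items()
    }
    total_score = responses.mean()  # higher scores = higher self-efficacy
    return total_score, subscale_scores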

Acceptability of the simulated placement

Likert-scale and open-ended questions were utilised to collect ratings of acceptability, experience of the placement, and perceived skill development. At the end of the simulated placement, students were asked to respond on a five-point Likert scale, with 1 = strongly disagree to 5 = strongly agree, to statements such as “A simulated placement provided a good learning experience”. Open-ended questions were included with the aim of capturing students’ experiences of competency development in the placement, as well as benefits and challenges they may have experienced in completing a simulated placement. This included questions such as “What aspects of a simulated placement enhanced your learning and development of competence?” and “What aspects of a simulated placement were the most challenging or made it harder to learn or develop competence?” Both the Likert and open-ended questions were modelled on questions used in a similar evaluation of a previous postgraduate psychology placement (Cosh et al., Citation2021).

Procedure

The study was approved by the University of New England Human Research Ethics Committee, and students were invited to participate via an advertisement on the online learning site. Students were eligible to participate if they had completed the simulated placement and were enrolled in an accredited 5th year postgraduate psychology training program. Participation was voluntary, and students were advised that participation in the study would have no bearing on their grades or participation in the placement. At the completion of placement, students chose to participate by selecting the link to the Qualtrics (Provo, UT) survey, and provided informed consent prior to commencing.

The simulated placement

Consistent with APAC accreditation requirements, the simulated placement consisted of 300 hours of learning activities delivered fully online, completed over one trimester, including the examination period. Students were provided with individual and group supervision by Psychology Board of Australia-approved supervisors. Supervision included direct observation, clinical skill demonstrations, and review of student documentation, including session notes, case conceptualisations, and risk assessments. Supervisors were provided with training to supervise in the simulated placement, alongside supervision materials and resources, and ongoing supervision throughout the placement. Supervisors were also able to access support from the academic and placement team if they required assistance at any time during the placement. Students engaged in a range of activities including observation of clinical sessions, video simulations, role plays with peers, and direct observation by supervisors. Video simulations were demonstrations of psychological sessions conducted by experts in the field. Peer observations involved students observing other students’ skills demonstrations for the purpose of learning and reflection. SLAs created for student role plays were developed by academic staff in the postgraduate psychology program in consultation with industry experts, and mapped to the APAC accreditation standards. SLAs were piloted and modified accordingly. Students were able to share recordings of practice sessions with supervisors and discuss these in supervision, and they completed reflective exercises alongside case-related activities. Objective structured clinical examinations were conducted, and student competencies were assessed by supervisors using a range of ratings at multiple points throughout the trimester. Students also completed case reports and participated in mid- and end-placement review processes.

In line with accreditation standards (APAC, Citation2019), the simulated placement was designed with learning and task requirements equivalent to a face-to-face placement. Students were required to accrue the same client, supervision, and overall hours as would be expected in a traditional placement setting. Relevant teaching content and resources (e.g., on reflective practice) were provided and applied throughout case-based scenarios. Scenarios covered common mental health disorders (e.g., major depressive disorder, generalised anxiety disorder, obsessive-compulsive disorder) and end-to-end simulation of actual practice. Various scenarios were included for learning and assessment tasks to enable students to have exposure to a range of presentations across the lifespan. Each case and related scenario was designed and scaffolded to build core foundational skills followed by more advanced competencies. Skills development was designed to simulate practice, including initial assessments and the development and implementation of treatment plans, along with associated tasks (e.g., documentation of clinical notes). As per the APAC requirement, cultural diversity and practitioner cultural responsiveness were embedded within cases, reflective activities, and assessment processes.

Core competencies of psychological assessment, diagnosis and case conceptualisation, interviewing, ethical practice, and professional communication skills were developed and assessed throughout the placement, with a range of client presentations across the lifespan.

Data analysis

Quantitative data were analysed with IBM SPSS Statistics for Windows, Version 28 (IBM Corp, Citation2021). Descriptive statistics summarised student ratings of satisfaction and acceptability of the training provided. Chi-square goodness of fit tests were used to evaluate whether rates of placement completion and extensions of time to complete the placement differed significantly between the simulated placement in 2021 and the face-to-face placements in 2019 and 2020.
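As an illustration only, and not the authors’ SPSS analysis, the following Python sketch shows one plausible way a chi-square goodness-of-fit comparison of completion counts across the three cohorts could be run, with Cohen’s w as the effect size. The enrolment and completion counts are hypothetical placeholders, and the construction of the expected counts (a pooled completion rate applied to each cohort) is an assumption, as the expected-count specification is not reported in the article.

# Illustrative only; counts are hypothetical placeholders, not study data.
import numpy as np
from scipy.stats import chisquare

enrolled = np.array([70, 70, 55])    # hypothetical enrolments: 2019, 2020, 2021
completed = np.array([52, 50, 53])   # hypothetical completions per cohort

# Expected completions if one pooled completion rate applied to every cohort
pooled_rate = completed.sum() / enrolled.sum()
expected = enrolled * pooled_rate    # sums to completed.sum() by construction

stat, p = chisquare(f_obs=completed, f_exp=expected)
w = np.sqrt(stat / completed.sum())  # Cohen's w = sqrt(chi2 / N)

print(f"chi2({len(completed) - 1}, N = {completed.sum()}) = {stat:.2f}, "
      f"p = {p:.3f}, w = {w:.2f}")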

Content analysis (White & Marsh, Citation2006) was utilised to explore responses to open-ended questions regarding student acceptability and experience of the simulated placement. An inductive approach to analysis was undertaken, whereby researchers sought to understand the experiences of students completing the placement, and this formed the starting point for meaning making within the data, as opposed to coding to a predetermined theoretical framework (Braun & Clarke, Citation2021). In the initial stages, the data were read and reread before being classified into codes. Responses were analysed for codes across the data as opposed to analysing responses to each question separately, to effectively capture any patterns present across the data (Braun et al., Citation2021). The first author completed initial coding of the data, which was then reviewed in conjunction with a second coder. The coded data was then reviewed and grouped into clusters of shared meaning, from which overarching themes were identified. Codes and themes were then collaboratively reviewed by the first and second coder and refined to ensure the identified codes fit with the data and were captured effectively by the overarching themes. The resultant coding and description of themes was reviewed by secondary authors to ensure reliability of the analysis, and that it accurately reflected participant experiences following the format of Franklin et al. (Citation2010).

Results

Eleven provisional psychologists who completed the simulated placement completed the survey. This was the first placement in the postgraduate degree for all participants. Due to the small sample size, and in order to protect participant confidentiality and avoid providing details that might make participants identifiable, no further demographics are reported. Not all participants completed the full survey; thus, sample sizes for some measures vary. Sample sizes for all data are reported as part of the results. Practicality of the simulated placement was also assessed by comparing the proportion of students who enrolled in a placement between 2019 and 2021 who successfully completed their placement (N = 195), and the proportion of students who completed a placement between 2019 and 2021 who required extensions of time to meet placement requirements (N = 145).

Practicality of a simulated placement

Practicality was assessed by comparing placement completion rates and the proportion of students who required extensions of time to accrue required placement hours in the simulated placement with data from face-to-face placements that took place in 2019 and 2020. Table 1 displays the proportion of students who enrolled in a placement that successfully completed the placement, and the proportion of students who completed a placement that required an extension of time in order to meet the placement requirements.

Table 1. Placement completion rates and extensions of time 2019–2021.

A Pearson’s chi-square goodness of fit test was used to evaluate whether placement completion rates or extensions of time to complete the placement differed across placement type. All three years were compared individually. The chi-square test examining completion rates was significant, χ2(2, N = 195) = 8.84, p = .012, with a small-to-medium effect size based on Cohen’s w (Cohen, Citation2013), w = .21, indicating that there was a significant difference between completion rates based on placement, with higher completion rates for students enrolled in the simulated placement. The chi-square test evaluating whether extensions of time to complete differed by placement was also significant, χ2(2, N = 145) = 133.22, p < .001, with a very large effect size based on Cohen’s w (Cohen, Citation2013), w = .95, with rates of extensions to complete the placement significantly lower for students enrolled in the simulated placement.
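For reference, Cohen’s w for a goodness-of-fit test is conventionally computed as the square root of the chi-square statistic divided by the total sample size, w = √(χ2/N); for the completion-rate test this gives √(8.84/195) ≈ .21, consistent with the reported effect size.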

Development of competency and confidence and acceptability of the simulated placement

There was general agreement that the simulated placement provided a good learning experience, with 90% of students either somewhat or strongly agreeing with this statement. A total of 100% of students either somewhat or strongly agreed that the simulated placement allowed them to develop their skills as a psychologist, and 90% agreed that the simulated placement prepared them for future training and work in psychology. Students also reported strong agreement that the simulated placement allowed them to improve their competence as psychologists, with 100% either somewhat or strongly endorsing this statement. Mean scores and standard deviations for these statements are provided in Table 2. Table 3 provides the number and percentage scores for each anchor of the simulated placement acceptability Likert scale.

Table 2. Student ratings of simulated placement acceptability (N = 10).

Table 3. N and percentage (%) for each anchor of the simulated placement acceptability Likert scale (N = 10).

Results from the COPPR-S indicate that at the end of the simulated placement participants perceived their overall competency to be at the standard of a registered psychologist across a range of competencies (Mean item score = 4.85, SD = .73). In the domain of cultural responsiveness, participants reported their competency to be consistent with an advanced beginner. In the domains of counselling micro-skills, clinical interviewing, ethics, and self-reflective practice, on average, participants perceived their skill level as proficient, demonstrating above standard competence in line with a registered psychologist (Table 4). Scores for all remaining competencies were reported to be competent to the standard of a registered psychologist. Means across the five PCES subscales all placed above the midpoint, indicating students reported moderate psychological self-efficacy across all competencies at the end of their placement (see Table 5).

Table 4. Outcomes for the Competencies of Professional Psychology Rating Scale – Self-Report Version (COPPR-S) total and subscale scores (N = 9).

Table 5. Outcome for the psychologist and counsellor self-efficacy scale (PCES) total and subscale scores (N = 9).

Qualitative exploration of students’ experience

Three primary themes were developed from the open-ended responses: “Increased Perceived Psychological Competence and Confidence for Future Clinical Work”, “The Simulated Placement Provided a Conducive Environment for Competency Development”, and “Elements of the Simulated Placement Diminished the Value of My Learning”.

Increased perceived psychological competence and confidence for future clinical work

When asked to reflect on the experience of the simulated placement, students reported that both their psychological competence and their confidence to practice in the future had increased as a result of the simulated placement.

The simulated placement builds my confidence and skills implementing treatment plans and conducting assessments for a range of case presentations.

Students reported that their competence and their confidence to complete specific tasks expected of a psychologist, such as letter writing, case conceptualisation, and delivery of evidence-based interventions had improved, as illustrated below:

Conducting sessions enhanced my learning and development as well as writing notes, writing referrer letters, and learning from my supervisor.

This increased sense of competence and confidence was also reflected in student reports of feeling more prepared for future clinical placement and practice as a psychologist, as a result of the simulated placement.

I feel like it has prepared me well for my internship year next year.

The simulated placement provided a conducive environment for competency development

Students reported that their developing psychological competency was facilitated by an environment that was safe, supportive, and flexible, thus, conducive to the acquisition of new skills. There was a sense that a number of factors regarding the way in which the placement was structured and delivered provided an environment that facilitated students’ learning in ways that felt supportive. These factors included practicing clinical skills with peers, the provision of supervision, and having flexibility in completing required tasks in their chosen time, which are outlined, in turn, below.

Students appreciated the ability to apply burgeoning skills with peers, which again created a safe environment to practice new skills.

Very thankful for the experience of practicing with my peers before actually seeing real clients.

The ability to practice with peers and receive supervision and feedback in a low-risk setting and in a manner that supported ongoing development throughout the placement, was also highlighted as a strength of the placement that facilitated competency development.

My supervisor, [supervisor’s name] was fantastic. The time she put into reviewing my work was so appreciated and she always provided really clear instructions on how to improve and helped me to generate goals to improve every week.

This low-risk setting in which students were able to apply and develop burgeoning skills in a progressive manner across the placement acted as a “bridge” between the classroom-based theoretical knowledge and clinical practice.

I liked how you phased out examples of case notes … This way we had to work off our own knowledge.

Students also reported deriving value from the flexible format of the placement, allowing them to balance the placement more effectively with other life commitments and approach tasks in their own time.

In balancing current work commitments, a simulated placement is more convenient

Elements of the simulated placement diminished the value of my learning

When asked to reflect on challenges experienced in the simulated placement, some students reported that certain aspects of the placement interrupted or diminished the potential for learning, and the perceived value of different learning tasks. These aspects included concerns about the value of skills practice with “fake” clients, high workload and repetitive activities.

The first challenge reported by some students was concern about the perceived disparate value of skills practice with “real” vs. not “real” clients.

I think it is tricky because it will always be different with real clients compared to role-playing with peers.

Students questioned the value of skills practice with simulated clients, and there was a concern that the competencies developed through simulated learning activities would not transfer well to future practice with “real” clients.

I am quite nervous about next year [internship] as have never seen a real client before.

Additionally, some students reported that the fidelity of their role plays and reflective exercises was impacted by waning student effort. Some students reported that clinical skills practice was repetitive, resulting in diminished adherence to effortful practice, thus limiting the value they were able to derive from the placement.

It was also tiring being the client and the therapist for the same session content, which at times decreased engagement from both myself and the other student, making it hard to do our best work and learn

Some students also reported that the workload involved in the placement diminished the potential for learning. There appeared to be a sense that the practice would have been more meaningful if tasks were more novel or less frequent.

I wish there was either less content or more time to complete it so that I could actually absorb all of the information.

Discussion

Results from the present study indicate that simulated placements may potentially represent a feasible training modality for the development of psychological competence in postgraduate psychology programs. Feasibility was evaluated by analysing practicality, acceptability, confidence, and competency development through the simulated placement, and was supported by the findings of improved completion rates, perceived confidence and competence post placement, and students’ positive experiences of the placement. Results of the study also highlight potential areas for improvement in future iterations of a simulated placement, as well as research into the use of simulated learning activities in postgraduate psychology programs.

In terms of completion rates, a higher proportion of students completed the simulated placement than was observed in face-to-face placements completed during COVID-19, or placements completed prior to the onset of COVID-19. In addition, students were able to progress through the simulated placement at a faster pace, as evidenced by the reduced number of extensions of time granted in order for students to meet placement requirements compared to previous face-to-face placement offerings. Almost all students in the simulated placement were able to achieve all placement requirements in the set timeframe, in contrast to previous face-to-face placements. This was in part due to the fact that students were able to continue their placement despite border closures and repeated lockdowns, both of which were repeated strategies in minimising the spread of COVID-19 in Australia in 2021. However, given that completion rates for the simulated placement were higher, and required extensions of time lower, than even pre-COVID-19 rates, this suggests that not only does a simulated placement offer protections against the negative impacts of COVID-19 or other such future events and disruptions, but that it may also be more accessible and practical than standard face-to-face placements. This increased accessibility and practicality may be related to reduced pressure to find face-to-face placements for students, as well as minimising possible interruptions to placements, such as illness or changes in supervisors, or cancelled client appointments (Paparo et al., Citation2021).

These higher completion rates and shorter completion time-frames may also be due to increased flexibility of the placement. Students were largely able to complete placement tasks at any time they chose, with the freedom to complete extra work if there had been interruptions at another time. This freedom would not likely be available in a traditional placement setting where hours are set, thus, limiting the option for “catching up” if interruptions occur. Ability to accrue client hours in the simulated placement was also not hampered by placement settings or the necessary time it takes to build a full case load that can delay placement progress within traditional placement settings (Rodolfa et al., Citation2007). The value of increased flexibility also appears to be supported by student responses, indicating that participants perceived online delivery as flexible, potentially contributing to the practicality of the placement. Given the ongoing difficulty with securing sufficient placement options for postgraduate students (Eklom, Citation2019), even prior to COVID-19, the preliminary evidence for the practicality of a simulated placement is promising. While not specifically targeted in the present study, simulated placements may also increase accessibility for prospective students in regional and remote areas with the added benefit of keeping skilled practitioners in rural and remote areas, where shortages are a pressing issue, with five times fewer psychologists in these areas in comparison with metropolitan areas (Parliament of NSW, Citation2022; Rice, Murray, et al., Citation2022). While the results of this pilot suggest that simulated placements may be a practical and accessible training modality, this must not be conflated with the capacity for the placement to develop psychological competence.

Acceptability of the placement was evaluated by analysing participants’ attitudes and experiences of the placement, which highlighted that students identified both strengths and challenges associated with the placement. Some participants reported concerns that skill development with simulated patients may not translate to “real” clients. There was a perspective that skill development in this context may lack validity. While concern regarding the authenticity of skill development with simulated clients has been observed previously (Roberts et al., Citation2017), the effects of this do not appear to be supported in the literature. Students who engage in simulated learning activities have demonstrated increased competence as rated by supervisors (Imms et al., Citation2018), and increased confidence for future clinical fieldwork (Roberts et al., Citation2020). SLAs also provide increased opportunity for exposure to standardised clients across a broad scope and exposure to a range of presentations with the ability to control and target specific competencies (Lateef, Citation2010; McGaghie et al., Citation2010). Indeed, while students reported this concern regarding authenticity, they also endorsed that they had developed skills, developed their competence in a range of specific psychological tasks such as assessment, intervention, report writing, and communication with stakeholders, and felt increased confidence for future work. Further investigations into the effectiveness of simulated placements for competency development in psychology would benefit from evaluating any potential differences in competency development between simulated and face-to-face placements and evaluating the potential long-term outcomes of completing a simulated placement early in the training journey.

Another challenge identified by participants was the sense that the value of skills practice was diminished by the repetitive nature of the simulated activities. Participants reported the perception that repetitive skills practice increased practice fatigue leading to less effortful practice and decreased practice fidelity. Interestingly, it is this repetitive nature that is argued to be a benefit of simulated practice. It is through repetitive practice with high-risk, low prevalence clients that students develop important crisis intervention and risk management skills that they may not get exposure to in face-to-face placements (McGaghie et al., Citation2010). Students also have the opportunity to build and develop these skills safely through repeated attempts where the risk to themselves or clients is significantly lower (Lateef, Citation2010). Given this contrast between the benefits of repetitive practice and students’ perceptions of such, it may serve future iterations of the placement to provide a rationale for the value of repetitive practice prior to the commencement of the placement. Students also reported that the high workload associated with required tasks of the placement was a challenge. High workload has also been identified as a challenge in traditional placement settings (Helmes & Pachana, Citation2011), and it must be noted that the requirements of the simulated placement were exactly the same as a standard face-to-face placement, as set out by the accreditation standards.

A primary benefit of SLAs is the opportunity to scaffold learning from tertiary spaces to the application of skills to real-world settings, with a resulting benefit being increased student confidence and competence to practice (Cybulski et al., Citation2010; Oxlad et al., Citation2022). Results from the present study appear to support this argument and further support use of simulation as a placement method. At the completion of placement, students reported a sense of safety, and appreciation of testing their skills with peers prior to working with real clients. Furthermore, students reported a sense of growing confidence and readiness to apply their skills with real clients as the placement went on. The answers to open-ended and Likert-type questions indicated that students felt that their skills and readiness to practice had increased. Students’ scores on the PCES indicated that they felt moderate confidence across a range of psychological competencies. Results on the COPPR also indicated students perceived their skills to be at the level of an advanced beginner to proficient compared to a registered psychologist, across a range of competencies, at the completion of the placement. These results suggest that at completion of the simulated placement students felt confidence in their psychological competency for future practice. It is pertinent to add that the present mean item score at the end of placement was similar to self-rated COPPR scores found in a sample of provisional psychologists (Rice, Schutte, et al., Citation2022). PCES scores for these students were also similar to scores collected from a group of postgraduate psychology students who were also at the end of their first placement, conducted via telehealth (Cosh et al., Citation2022), and to scores reported by Watt et al. (Citation2019) in the initial PCES development paper. These similarities may suggest that the placement resulted in levels of competence and confidence comparable to those found in other postgraduate student samples. However, the endorsed scores do not represent actual competence level, as they are based on self-reported estimates; objective measures were not available. It is worth noting that students who have only completed one placement are unlikely to be equivalent in skills to a registered psychologist. Overestimation of capability is a recognised issue in student and early career professionals’ self-reporting of competence in healthcare fields. For example, student nurses were found to overestimate their competence in comparison to objective measures (Forsman et al., Citation2019), and Kuittinen et al. (Citation2014) found that newly registered psychologists are more likely to rate their competency higher than psychologists who have been registered for a longer time. These self-estimations of ability may also be considered appropriate when viewing students within a conscious competence framework, where early-stage learners are generally considered unable to correctly rate their own competence (Burch, 1970, as cited in Kjærgaard & Sorensen, Citation2014). Instead, these results suggest that the students who participated experienced a sense of confidence and preparedness for future work after completion of the simulated placement.

It is important to note that all participants were provisional psychologists who completed the simulated placement as their first placement, so this was their first experience with intensive skills practice. Public safety is at the forefront of the push for competency-based approaches to learning (Donovan & Ponce, Citation2009), and teaching methods that may increase the opportunity for students to apply their burgeoning skills in a context that minimises risk for both the student and client are needed. This approach also satisfies Standard 1: Public Safety, of the APAC (Citation2019) accreditation requirements for psychology programs. As such, simulated placements may be a valuable complement to face-to-face training methods, allowing students the first exposure to practical experience where they can explore and practice their developing skills, prior to seeing actual clients. Simulated placements provide an environment where there is repeated exposure to standardised and comprehensive presentations within which a broad range of skills can be practiced, with a level of control that cannot be guaranteed in a face-to-face placement (Maran & Glavin, Citation2003). A simulated placement allows for the application of theoretical knowledge to the early stages of practical skills acquisition in a safe and controlled environment, where competency development can be more closely assessed and supervised (Lateef, Citation2010).

Limitations and directions for future research

There are a number of limitations in the current study. Unfortunately, paired pre- and post-placement measures could not be collected. Collecting such information would be a valuable focus for future research as it would allow the tracking of competency development over the placement, and provide further evidence for the validity of the simulated placement as a teaching modality for psychological competency development. The present study also relied on self-reported competency; thus, objective ratings in the form of supervisor evaluation of competency development are needed in future research. The small sample size of the present study also limited the inferences that could be made from the data, as well as the types of analysis that could be utilised, and caution should be taken when reviewing these findings. Further, as this was a voluntary study, there is also the potential for selection bias to impact the data. Comparisons of competency development between the simulated placement and an equivalent face-to-face placement were also not possible within the present study. A comparison of face-to-face and simulated placements in future research is warranted, to enable evaluation of whether simulated placements have the potential to provide outcomes equivalent or complementary to those of traditional face-to-face placements. Of particular interest in this comparison should be the development of competency, exposure to a broad range of presentations and opportunity for diverse skill development, and accessibility of the placement. A goal of implementation of a simulated placement is unlikely to be the replacement of all face-to-face placements, so instead the question may be whether the strengths of a simulated placement can complement standard placements in supporting students to successfully move through their postgraduate training program, not only in ways that build competency, but also in ensuring students have access to a wide range of standard presentations with improved accessibility. These outcomes may not be obvious at the completion of the placement, so future research may benefit from comparing differences in competency development between students who have completed a simulated placement and those who only completed face-to-face placements, at the end of their provisional psychology training. Future research regarding simulated placements may also benefit from evaluating accessibility for regional and remote students. Improving access to skilled practitioners is a primary challenge in regional and remote areas of Australia as well as other countries, and results from the present study indicate that a simulated placement may increase accessibility for students in non-metropolitan areas.

Conclusions and implications

The present study provides preliminary evidence that simulated placements may have the potential to offer a feasible, practical, and acceptable training modality for the development of competency in postgraduate psychology students. Simulated placements may provide benefits over traditional face-to-face placements, including fewer potential interruptions, the ability to gain client contact hours more regularly, thus reducing the timeframes required to complete placements, improved completion rates, and student and client safety during early skill development. Simulated placements represent a novel training method that may benefit postgraduate psychology programs by complementing face-to-face placements.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

The data that support the findings of this study may be available on reasonable request from the corresponding author. The data are not publicly available due to ethical restrictions to preserve the anonymity of the participants.

Additional information

Funding

The first author acknowledges the receipt of a Commonwealth Government Research Training Program scholarship in support of her studies.
