Research Article

What counts? A Delphi consensus-based approach to interpreting accreditation standards for Direct Client Activities in clinical psychology and clinical neuropsychology programs

Pages 457-465 | Received 30 Jan 2023, Accepted 06 Jun 2023, Published online: 02 Jul 2023

ABSTRACT

Objective

Despite operating from the same professional accreditation standards, discrepancies in what is recorded as Direct Client Activities (DCA) in postgraduate psychology trainee logbooks have been noted across training programs in Australia. The aim of this study was to create a consensus-based list of DCA to guide the completion of trainee logbooks while undertaking practicums in clinical psychology or clinical neuropsychology.

Method

A modified Delphi approach was used to gather data across three rounds. Two expert panels, representing the two areas of practice endorsement (AoPEs), rated the degree to which they agreed or disagreed with a range of activities being recorded as DCA in trainee logbooks. Activities with over 80% agreement or disagreement during any round were accepted or rejected from the final list, respectively.

Results

Sixteen activities for clinical psychology and 30 activities for clinical neuropsychology were endorsed by the expert panels. Only nine activities across the two panels did not reach consensus over the three rounds.

Conclusions

This study has created a list of consensus activities within these two AoPEs which will facilitate benchmarking activities, and reduce confusion and anxiety for trainees, supervisors, and placement coordinators. Discrepancies do, however, remain across the AoPEs, warranting further clarification and definition refinement.

Key Points

What is already known about this topic:

  1. Logbooks are a key strategy in ensuring individual trainees and postgraduate training programs are meeting minimum standards set by accrediting bodies.

  2. Although required by the Australian Psychology Accreditation Council (APAC) to be recorded in trainee logbooks, there is a lack of consensus across Australian postgraduate psychology program providers as to what constitutes “Direct Client Activities”.

  3. Inconsistent logging practices prevent benchmarking and create confusion and anxiety for trainees, supervisors, and placement coordinators.

What this topic adds:

  1. This paper was the first, to our knowledge, to obtain expert consensus on trainee Direct Client Activities for logbook recording within the clinical psychology and clinical neuropsychology areas of practice endorsement.

  2. The list provides greater certainty for “what counts” as Direct Client Activities for logbook recording within these two areas of practice endorsement, with 16 clinical psychology and 30 clinical neuropsychology activities endorsed.

  3. There is a need to better clarify the rationale and principles for recording Direct Client Activities across areas of practice endorsement, to inform ongoing practices in these areas.

As in various countries, training to become a registered psychologist in Australia requires that provisional psychologists attend a number of placements that provide opportunities to gain experience across the lifespan and complete a defined amount of client-facing hours. Although there has been a move towards competency-based approaches to clinical training and assessment in Australian postgraduate psychology education (Stevens et al., 2017), there remains an ongoing need to ensure that trainees obtain sufficient client-facing activities to build skills and competence during their training. Typically, logbooks are used to record the amount and types of placement activities, including client-facing and supervision activities, during postgraduate psychology training. Logbooks are standard practice across a broad range of undergraduate and postgraduate health professional training programs for recording clinical training experiences. They serve a range of functions that ensure trainees are meeting minimum training standards, such as monitoring the consistency and quality of placement experiences, recording reflection and competency assessments, and easing communication regarding trainee experiences across diverse placement settings within a health profession (Gooi et al., 2021; Schüttpelz-Brauns et al., 2016).

Use of logbooks in Australian area of practice endorsement training

In Australia, postgraduate psychology logbooks serve a range of purposes, but primarily provide a formal means of ensuring both individual trainees and training programs meet the current minimum standards set by their accrediting body. The Australian Psychology Accreditation Council (APAC) provides accreditation standards and guidelines for evidence that Australian education providers must meet in order to initially attain, and later retain, the accreditation status of their psychology programs (Australian Psychology Accreditation Council [APAC], 2019a, 2019b). Different standards apply to general registration as a psychologist versus registration with an area of practice endorsement (AoPE). An AoPE indicates the psychologist has demonstrated advanced specialised knowledge and skills in an area such as clinical psychology, clinical neuropsychology, or health psychology, among others. There are currently nine AoPEs in Australia. To be eligible to apply for registration as an Australian psychologist with an AoPE, trainees must first complete an APAC approved postgraduate training course. While engaged in the course, trainees must demonstrate a minimum number of Client-Related and Direct Client Activities (DCA) and are expected to maintain and submit to the education provider, as evidence of these activities, a logbook specifying the activity type and hours completed. The logbook must be reviewed regularly and signed by their practicum supervisor to capture a formal record of the trainee’s placement experience. APAC provides a definition of DCA that is to be applied across all areas of practice endorsement. Specifically, DCA include “ … phone-calls with clients, face-to-face contact with clients (including e-health), meetings where the student reports to the team or organisation, and work with clients and others relevant to their care, including families, employers, supervisors, teachers, health providers or legal guardians” (APAC, 2019a, p. 28).

Discrepancies in logbook practices

Despite common professional accreditation standards, discrepancies in what is recorded in psychology trainee logbooks occur across fieldwork settings, training providers, and areas of practice endorsement, both in Australia and overseas. For example, varying levels of agreement concerning the recording of activities were noted among 195 doctoral psychology students in the United States (Hatcher et al., 2011), where 95% of programs agreed that direct service to individual clients, couples, groups, and organisations could be recorded, but only 37% agreed with recording consultation with professionals and systems external to the practicum site, including schools, courts, physicians, and other providers. Likewise, among 393 psychology training directors in the US, there was virtually universal recognition of clinical intervention and assessment hours as legitimate practicum activities, whereas community consultation was considered an acceptable activity by only 53% of the sample (n = 211; Kaslow et al., 2005). These findings indicate that while US trainees and placement providers may record some activities with a sense of certainty, the recording of other activities may be a source of contention.

This uncertainty about “what counts” as psychology placement hours, and differences in what is recorded in logbooks within and across education providers, have also been reflected in a recent Australian study. Quinlan et al. (2022) found that trainees and academic staff from a range of postgraduate programs noted a lack of understanding of what constituted DCA for the purpose of logbook recording, despite the definition provided by APAC. This finding is consistent with anecdotal reports of differences across courses in interpreting the DCA definition within an AoPE, with some courses applying strict interpretations of DCA, and others using more generous definitions. It is also consistent with anecdotal reports of differences across AoPE programs, with activities that may be deemed suitable to be recorded as DCA in one AoPE deemed unsuitable in another. For example, conducting a service evaluation may be deemed acceptable to record as DCA during an organisational psychology placement but not during a clinical neuropsychology placement. Such discrepancies may see an organisation hosting two trainees from the same AoPE but different universities, or from the same education provider but different AoPEs, given conflicting instructions regarding how to categorise and thus log DCA. This is not only confusing but may create anxiety for education providers needing to meet the APAC standards, for trainees wanting to ensure they have met the minimum requirements in their placements, and for supervisors wanting to ensure they are providing sufficient DCA work for their trainee psychologists. With such discrepancies in opinion, understanding and practice, some of the value of recording placement activities is likely diminished and lost. Furthermore, with increased uncertainty about what minimum DCA experiences trainees graduating from these programs have, there is potential for loss of confidence in or reputation of the psychology profession.

Aims of this study

This study sought to create an expert consensus-based list of DCAs to guide the completion of trainee logbooks while undertaking a postgraduate degree in an AoPE. It forms part of the work of the Australian Psychology Placement Alliance (APPA) Working Group, of which all authors on this paper are members, looking to develop recommendations around the definition of DCA in the context of postgraduate psychology training. This study focussed on the clinical psychology and clinical neuropsychology AoPEs in particular, given that some other AoPEs have already developed their own AoPE-specific guidelines or lists for this purpose, and clinical psychology and clinical neuropsychology represent the largest and equal second largest number of postgraduate AoPE courses (APAC, 2023), and the largest and fourth largest number of practice endorsements in Australia, respectively (Psychology Board of Australia, 2022). It is anticipated that the two lists of accepted DCA will address the current differences in opinion and interpretation of the current APAC requirements and, in turn, decrease the confusion and anxiety currently experienced by placement coordinators, placement supervisors, and trainees when completing placement logbooks during their degree. In addition, the outcome of this study will add to the considerations of the APPA Working Group around national recommendations for DCA definitions.

Method

Design

A modified Delphi study was used to gather data across three rounds. This method involves repeatedly presenting a series of statements to an expert panel who independently rate their agreement or disagreement with each statement until a consensus on “what counts” has been achieved. While a standard Delphi approach involves an initial open round of questions, the modified Delphi approach involves presenting structured statements in the first round, and is commonly used to save time and effort for the panellists (Keeney et al., 2011; Rowe et al., 1991). Given many placement coordinators had their own lists of approved activities within their educational institutions, the modified approach was deemed most appropriate for this study.

Participants

Two expert panels were formed by contacting placement coordinators and AoPE supervisors representing clinical psychology (N = 24) and clinical neuropsychology (N = 15). Placement coordinators and AoPE supervisors were sought due to their role in overseeing placement standards and student progression, and oversight of the student’s day-to-day practice, respectively. Both may be involved with decision making concerned with logbook recording practices. All participants except one in each panel held Psychology Board of Australia AoPE supervisor status. For Delphi designs using a homogeneous panel, a minimum of 10–15 participants generally allows for results to be deemed generalisable and representative of the larger population (Keeney et al., 2011), with over-recruiting allowing for potential dropouts between rounds. At the time of writing, there are approximately 40 clinical psychology program placement coordinators, spanning all states, in Australia. Of these, 23 placement coordinators and one psychology clinic director, who completed the survey on behalf of their postgraduate program, responded to the invitation and agreed to participate. There are only five clinical neuropsychology programs in Australia and their placement coordinators and program/course directors were identified and invited to participate. They were also asked to forward the recruitment materials to their accredited AoPE placement supervisors. This resulted in eight placement coordinators or program directors and seven AoPE supervisors forming the clinical neuropsychology panel. The locations and experience of panel members are summarised in Table 1.

Table 1. Location and experience of expert panel participants.

Materials and procedure

The research was approved by the Curtin University Human Research Ethics Committee (HRE2022-0234). The initial Delphi survey statements for round one were collated from two sources. First, an email was sent to the APPA listserv requesting that clinical psychology and clinical neuropsychology placement coordinators submit any pre-existing DCA definitions and lists used in their training programs. Second, university placement coordinators from the remaining AoPEs were contacted to see if they held any AoPE-specific definitions and lists. This yielded three additional guidance documents for health psychology health promotion placements, organisational psychology, and forensic psychology. These three documents were reviewed for possible additional items not yet captured. Items were collated by the first author [RA] and reviewed by each member of the research team to ensure ambiguous content was clarified, and repetition limited. This process generated 56 items for the round one survey, with both panels receiving the same initial survey items.

Placement coordinators across Australia were contacted and invited to participate in the study via the APPA listserv. The clinical neuropsychology placement coordinators were also asked to identify additional recent practicum supervisors to partake in the survey rounds. For survey round one, participants were emailed the information and consent form and a link to the survey via Qualtrics Survey Software. Participants were asked to review each statement in the survey and indicate their level of agreement that the listed activity should be considered DCA for the purpose of recording in trainee logbooks. Participants selected from the following responses: 1 = Strongly Agree, 2 = Agree, 3 = Don’t know/Depends, 4 = Disagree, 5 = Strongly Disagree. In the first round, participants could add further items to be considered by the research team and expert panel. They could also clarify their response to “Don’t know/Depends” via a free text box. Statements from round one with at least 80% of the panel selecting agreement/strong agreement were considered endorsed by the expert panel and removed from the future rounds. Items with at least 80% disagreement/strong disagreement were considered rejected by the expert panel and removed from future rounds. Items below the 80% threshold plus any new/revised items generated from round one were presented for rating again in round two. The research team reviewed the free text submissions associated with any “Don’t know/Depends” responses and reworded and split some items, as needed, for clarification (see ).

Table 2. Clinical psychology endorsed items by round.

The remaining items for each AoPE were then sent to the two expert panels for a second round of rating via Qualtrics. For round two, items reaching 80% agreement/strong agreement were considered endorsed by the expert panel and items reaching 80% disagreement/strong disagreement were considered rejected. Items reaching endorsement or rejection in round two were removed from the list and not presented again for round three. All remaining items were presented again in round three, where the 80% threshold was again used to determine acceptance or rejection to the final consensus list of activities. Any remaining items after round three were marked as “Non-consensus items”. Participants were given two weeks to complete each round, with up to three reminder emails sent to participants yet to complete the round.

To assist with their decision making, for all items being re-rated in rounds two and three, participants were informed as to what the level of agreement/disagreement was, expressed as a percentage, in the prior round. For rounds two and three, the option to suggest new items was removed. The option to state “Don’t know/Depends” was also removed to approximate the real-world conditions where placement coordinators/supervisors must make dichotomous decisions on these matters when requested.
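The 80% threshold rule applied in each round can be sketched as follows. This is a minimal illustration, not the authors’ analysis code; the function name and the hypothetical ratings are our own, and only the response scale and thresholds come from the procedure described above:

```python
def classify_item(ratings, threshold=0.8):
    """Apply the 80% consensus rule to one item's panel ratings.

    Ratings use the survey scale: 1 = Strongly Agree, 2 = Agree,
    3 = Don't know/Depends, 4 = Disagree, 5 = Strongly Disagree.
    """
    n = len(ratings)
    agree = sum(1 for r in ratings if r in (1, 2)) / n
    disagree = sum(1 for r in ratings if r in (4, 5)) / n
    if agree >= threshold:
        return "endorsed"       # removed from later rounds, added to final list
    if disagree >= threshold:
        return "rejected"       # removed from later rounds
    return "carry forward"      # re-rated in the next round

# Hypothetical panel of 10 raters: 9/10 agree (>= 80%), so the item is endorsed.
print(classify_item([1, 1, 2, 2, 1, 2, 1, 1, 2, 4]))  # endorsed
```

Items returning “carry forward” after round three would be the “Non-consensus items” reported in the Results.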

Results

Round one

The clinical psychology expert panel endorsed 13 items, rejected 20 items, and generated one new item. The new item was reviewed by the research team but deemed to be a duplicate of an existing item, so was not included in round two. Consensus was not achieved on 23 items. The clinical neuropsychology expert panel endorsed 23 items, rejected 14 items, and did not suggest any new items. Consensus was not achieved on 19 items.

Open text submissions related to non-consensus items for both panels were reviewed. One item on the clinical psychology survey and nine items on the clinical neuropsychology survey were adjusted or split to clarify the level of input by the trainee (e.g., active participation versus passive observation of an activity), whether the client was in attendance, and the recipient of the activity (i.e., client versus non-client stakeholders). The open text submissions for 13 clinical psychology and three clinical neuropsychology items also revealed that many participants believed there should be a cap on the number of hours for some listed activities. As this was outside the scope of this project, an instruction was set for rounds two and three to rate the item with the knowledge that a cap could be set for any of these activities.

Rounds two and three

Twenty-three of the original 24 clinical psychology expert panel members participated in round two. They endorsed two further items and rejected 13 items. This left nine items to be re-rated in round three. There were 21 participants in round three. They endorsed one final item and rejected two. The final list of endorsed items across all rounds is available in Table 2. Six items did not reach consensus by the end of the three rounds (see Table 4).

Table 3. Clinical neuropsychology endorsed items by round.

Table 4. Non-consensus items following three rounds.

Twelve of the original 15 clinical neuropsychology expert panel members participated in round two. They endorsed five further items and rejected nine items. This left nine items to be re-rated in round three. There were 10 participants in round three. They endorsed two final items and rejected four. The final list of endorsed items across all rounds is available in Table 3. Three items did not reach consensus by the end of the three rounds (see Table 4). A list of rejected items for each panel is available in the supplementary materials. The impact of dropout was explored by comparing participants’ last rating prior to dropping out with the final group consensus. We found no incidents of participants disagreeing with an item that was subsequently accepted in a later round, strengthening the confidence in the final retained items.

Discussion

The aims of this study were partially met, in that we were able to obtain consensus on what does and does not count as DCA for most of the suggested activities within each AoPE. For clinical psychology and clinical neuropsychology there are now 16 and 30 activities, respectively, that placement coordinators can recommend be recorded in trainee placement logbooks as DCA. These activity lists may serve as a basis for standardised logbook recording practices within each AoPE across Australian placement settings, allowing higher education settings to record and benchmark activities with greater certainty than previously.

Of note, several activities within each AoPE did not reach consensus to either reject or accept. It is concerning that, after three rounds of deliberation, those tasked with making decisions about whether these activities may be recorded in trainee logbooks were not able to agree. It should be noted that in the open text of the round one responses, there were 12 comments made by clinical psychology and four comments made by clinical neuropsychology panel members regarding the need for a cap on hours for certain activities. It may be that there would be further agreement and allowance for a broader range of activities if explicit caps were introduced for some specific activities.

Discrepancies also remain across the two AoPEs. All but one of the clinical psychology endorsed items (i.e., delivering a training, workshop, or presentation to a placement entity’s clients or other stakeholders regarding psychologically-related material) were endorsed by the clinical neuropsychology panel. The clinical neuropsychology panel did not reach consensus on this activity but 70% of respondents disagreed that it should be recorded as DCA. Furthermore, the clinical neuropsychology panel endorsed the recording of approximately twice as many activities as the clinical psychology panel. At a content level, the clinical neuropsychology panel endorsed the recording of time spent in client-focussed group supervision, reviewing and interpreting case materials, and preparing written work (i.e., seven additional endorsed items), activities that were outright rejected by the clinical psychology panel. The clinical neuropsychology panel also endorsed a greater range of both active and passive observational client related activities, so long as there was debrief and/or reflection of these activities with an appropriate practitioner or supervisor (i.e., seven additional items). The same or similar content items were not outright rejected but failed to reach consensus across the three rounds for the clinical psychology panel, although there was generally more agreement than disagreement that they should be included.

There are several plausible explanations for these differences. There have been historical differences in the requirements for different AoPEs. The 2019 APAC standards (APAC, 2019a, 2019b) were the first from APAC to specify a minimum requirement for 400 Direct Client Hours for all two-year AoPE courses. The 400 hours requirement appears to be based on the standard for face-to-face client contact hours required for clinical psychology programs in the Australian Psychological Society College Course Approval Guidelines for Postgraduate Specialist Courses (Australian Psychological Society, 2010). These, now retired, 2010 Guidelines did not set minimum Direct Client Hours for clinical neuropsychology, community, educational and developmental, forensic, health or organisational psychology programs, and allowed a less stringent 300 hours of “client related work” for the counselling psychology specialised programs, although it should be noted that there is very little evidence to suggest how many hours are required to reach competence for the different AoPEs. It is possible that raising the requirement for all AoPEs to meet the 400 Direct Client Hours minimum has made for an unachievable standard for some AoPEs, leading to more abstract, conceptual and innovative interpretations of what can be recorded as Direct Client Hours. It is also conceivable that fundamental differences in what is valued in developing core competencies across various AoPEs have impacted on what respondents across the different expert panels viewed as appropriate to record. This would be consistent with the advanced training each APAC level 4 program provides, highlighting the diversity and niche set of skills developed at that level across each AoPE. Finally, it may be that the inclusion of different respondents in the panels influenced the outcomes. The clinical psychology panel consisted of only placement coordinators, whose role involves demonstrating that their course meets the APAC standards. The clinical neuropsychology panel, who endorsed more items as appropriate to be recorded as DCA in logbooks, consisted of placement coordinators and recent practicum supervisors. It is possible that the recent supervisors were more generous in their interpretations of the standards as they viewed the broader activities as core to client work, had less awareness of or concerns about meeting the accreditation standards, or were keen to not limit the number of activities that could be recorded in their settings to ensure the viability of hosting trainees on an ongoing basis. Future research may wish to explore differences, if any, between placement coordinators’ and practicum supervisors’ definitions of DCA.

Conclusion

The Delphi approach was well-suited to answer the question of “what counts?”, with the three anonymous rounds of responding designed to reduce bias in gaining consensus by removing peer pressure while also encouraging convergence of ideas (Chalmers & Armour, 2019). There was excellent retention of participants across the three rounds and no observed impacts of dropout on the final retained items, strengthening the validity of these results. From a practical perspective, this study provides the first consensus-based lists of endorsed DCA and represents a starting point for agreement and thus benchmarking of activities within clinical psychology and clinical neuropsychology. There were, however, notable discrepancies across AoPEs, indicating that there are unresolved differences in how each AoPE is interpreting and thus defining DCA.

The findings from this study contribute to the processes of the APPA Working Group that is currently developing national guidelines for the definition of DCA in postgraduate psychology training. Various issues remain, including the need to clarify whether the 400 hours of DCA set by APAC is achievable and relevant for demonstrating competence for all AoPEs, whether there can or should be consensus across AoPEs on how these DCA are demonstrated, and whether clarification and consensus can be achieved as to the underlying principles of recording DCA in AoPE postgraduate programs. It is feasible that future iterations of these current accepted and rejected AoPE-specific DCA will contract, expand or even disappear as the underlying principles of DCA are established, and in response to wider psychology profession and healthcare system reforms that may change the role of psychologists in the future. It is hoped that the findings and recommendations of the APPA Working Group will address some of these issues and provide helpful guidance for placement coordinators in making decisions around the categorisation of placement activities.

Data availability statement

The data that support the findings of this study are available on reasonable request from the corresponding author, RA.

CRediT author statement

All authors: Conceptualisation, Methodology, Resources, Writing – review and editing. Rebecca A. Anderson: Data collection, curation and analysis, Writing – original draft.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Supplementary material

Supplemental data for this article can be accessed at https://doi.org/10.1080/00050067.2023.2225694

References

  • Australian Psychological Society. (2010). College course approval guidelines for postgraduate specialist courses.
  • Australian Psychology Accreditation Council. (2019a). Accreditation standards for psychology programs version 1.2. The Australian Psychological Society Limited.
  • Australian Psychology Accreditation Council. (2019b). Evidence-guide Version 1.2. The Australian Psychological Society Limited.
  • Australian Psychology Accreditation Council. (2023, May). Search for an Accredited Psychology Program. https://apac.au/accredited-programs/search-a-program/
  • Chalmers, J., & Armour, M. (2019). The Delphi Technique. In P. Liamputtong (Ed.), Handbook of research methods in health social sciences (pp. 715–735). Springer Singapore. https://doi.org/10.1007/978-981-10-5251-4_99
  • Gooi, C. H., Quinlan, E., Paparo, J., & Deane, F. P. (2021). Balancing the books: A benchmarking study into Australian postgraduate psychology practicum logbooks. Australian Psychologist, 56(4), 335–343. https://doi.org/10.1080/00050067.2021.1913970
  • Hatcher, R. L., Grus, C. L., & Wise, E. H. (2011). Administering practicum training: A survey of graduate programs’ policies and procedures. Training and Education in Professional Psychology, 5(4), 244–252. https://doi.org/10.1037/a0025088
  • Kaslow, N. J., Pate, W. E., & Thorn, B. (2005). Academic and internship directors’ perspectives on practicum experiences: Implications for training. Professional Psychology: Research and Practice, 36(3), 307–317. https://doi.org/10.1037/0735-7028.36.3.307
  • Keeney, S., McKenna, H., & Hasson, F. (2011). The Delphi technique in nursing and health research. Wiley-Blackwell. https://doi.org/10.1002/9781444392029
  • Psychology Board of Australia. (2022, September). Registrant data. Reporting period: 01 July 2022 to 30 September 2022. https://www.psychologyboard.gov.au/About/Statistics.aspx
  • Quinlan, E., Paparo, J., Gooi, C. H., & Deane, F. P. (2022). “It’s just a table of numbers”: The search for pedagogical meaning in psychology practicum logbooks. Clinical Psychologist, 26(3), 309–318. https://doi.org/10.1080/13284207.2022.2073210
  • Rowe, G., Wright, G., & Bolger, F. (1991). Delphi: A reevaluation of research and theory. Technological Forecasting and Social Change, 39(3), 235–251. https://doi.org/10.1016/0040-1625(91)90039-I
  • Schüttpelz-Brauns, K., Narciss, E., Schneyinck, C., Böhme, K., Brüstle, P., Mau-Holzmann, U., Lammerding-Koeppel, M., & Obertacke, U. (2016). Twelve tips for successfully implementing logbooks in clinical training. Medical Teacher, 38(6), 564–569. https://doi.org/10.3109/0142159X.2015.1132830
  • Stevens, B., Hyde, J., Knight, R., Shires, A., & Alexander, R. (2017). Competency‐based training and assessment in Australian postgraduate clinical psychology education. Clinical Psychologist, 21(3), 174–185. https://doi.org/10.1111/cp.12061