ORIGINAL RESEARCH

Exploring Endoscopic Competence in Gastroenterology Training: A Simulation-Based Comparative Analysis of GAGES, DOPS, and ACE Assessment Tools

Pages 75-84 | Received 19 Jul 2023, Accepted 09 Jan 2024, Published online: 31 Jan 2024
 

Abstract

Purpose

Accurate and convenient evaluation tools are essential to document endoscopic competence in gastroenterology training programs. The Direct Observation of Procedural Skills (DOPS), Global Assessment of Gastrointestinal Endoscopic Skills (GAGES), and Assessment of Endoscopic Competency (ACE) are widely used, validated competency assessment tools for gastrointestinal endoscopy. However, studies comparing these three tools are lacking, leading to a lack of standardization in this assessment. Through simulation, this study seeks to determine the most reliable, comprehensive, and user-friendly tool for standardizing endoscopy competency assessment.

Methods

A mixed-methods quantitative-qualitative approach with a sequential deductive design was used. All nine trainees in a gastroenterology training program were assessed on endoscopic procedural competence using the Simbionix Gi-bronch-mentor high-fidelity simulator, with two faculty raters independently completing the three assessment forms (DOPS, GAGES, and ACE). Psychometric analysis was used to evaluate the reliability of the tools. Additionally, faculty trainers participated in a focus group discussion (FGD) to explore their experience of using the tools.
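The abstract does not state which statistic was used for inter-rater reliability; the sketch below is a minimal, self-contained illustration of one common choice, the intraclass correlation coefficient ICC(2,1) (two-way random effects, absolute agreement, single rater), applied to a trainees × raters matrix of total scores. The function name `icc_2_1` and the example ratings are illustrative assumptions, not data from the study.

```python
import numpy as np

def icc_2_1(scores: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `scores` is an (n_subjects, k_raters) matrix of ratings.
    """
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)
    col_means = scores.mean(axis=0)

    # Mean squares from the two-way (subjects x raters) layout
    ms_rows = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between subjects
    ms_cols = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between raters
    resid = scores - row_means[:, None] - col_means[None, :] + grand
    ms_err = np.sum(resid ** 2) / ((n - 1) * (k - 1))          # residual

    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Hypothetical example: 9 trainees scored by 2 raters on one tool (not the study data)
ratings = np.array([
    [18, 17], [22, 21], [15, 16], [25, 24], [19, 20],
    [21, 22], [17, 15], [23, 23], [20, 19],
], dtype=float)
print(f"ICC(2,1) = {icc_2_1(ratings):.2f}")
```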

Results

For upper GI endoscopy, Cronbach’s alpha values for internal consistency were 0.53, 0.80, and 0.87 for ACE, DOPS, and GAGES, respectively. Inter-rater reliability (IRR) scores were 0.79 (0.43–0.92) for ACE, 0.75 (−0.13–0.82) for DOPS, and 0.59 (−0.90–0.84) for GAGES. For colonoscopy, Cronbach’s alpha values for internal consistency were 0.53, 0.82, and 0.85 for ACE, DOPS, and GAGES, respectively. IRR scores were 0.72 (0.39–0.96) for ACE, 0.78 (−0.12–0.86) for DOPS, and 0.53 (−0.91–0.78) for GAGES. The FGD yielded three key themes: the ideal tool should be scientifically sound, comprehensive, and user-friendly.
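For the internal-consistency values reported above, Cronbach’s alpha follows the standard formula α = k/(k−1) · (1 − Σ item variances / variance of the total score). The short sketch below assumes the per-item ratings for one tool are arranged as a trainees × items matrix; the example data are simulated placeholders, not the study’s ratings.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_trainees, k_items) matrix of item scores."""
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1)        # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)    # variance of the summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 9 trainees rated on 5 items of one tool (not the study data)
rng = np.random.default_rng(0)
base = rng.integers(2, 5, size=(9, 1))
items = np.clip(base + rng.integers(-1, 2, size=(9, 5)), 1, 5).astype(float)
print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")
```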

Conclusion

The DOPS tool performed favourably in both the psychometric evaluation and the qualitative assessment, making it the most balanced of the three assessment tools. We propose that the DOPS tool be used for endoscopic skill assessment in gastroenterology training programs. However, gastroenterology training programs need to match their learning outcomes with the available assessment tools to determine the most appropriate one in their context.

Disclosure

All authors report no conflicts of interest in this work.