Research Article

Objective assessment for open surgical suturing training by finger tracking can discriminate novices from experts

Article: 2198818 | Received 22 Dec 2022, Accepted 30 Mar 2023, Published online: 04 Apr 2023

ABSTRACT

It is difficult, time consuming and expensive to assess manual skills in open surgery. The aim of this study is to investigate the construct validity of a low-cost, easily accessible tracking technique for basic open suturing tasks. Medical master's students, surgical residents and surgeons at the Radboud University Medical Center were recruited between September 2020 and September 2021. The participants were divided, according to experience, into a novice group (≤10 sutures performed) and an expert group (>50 sutures performed). For objective tracking, a tablet with SurgTrac software was used, which tracked a blue and a red tag placed on the left and right index finger, respectively. The participants executed four basic tasks on a suturing model: 1) knot tying by hand, 2) transcutaneous suturing with an instrument knot, 3) ‘Donati’ (vertical mattress suture) with an instrument knot and 4) continuous intracutaneous suturing without a knot. In total, 76 participants were included: 57 novices and 19 experts. All four tasks showed significant differences between the novice and expert groups for the parameters time (p<0.001), distance (p<0.001 for Tasks 1, 2 and 3 and p=0.034 for Task 4) and smoothness (p<0.001). Additionally, Task 3 showed a significant difference for the parameter handedness (p=0.006) and Task 4 for speed (p=0.033). Tracking index finger movements using SurgTrac software on a tablet while executing basic open suturing skills on a simulator shows excellent construct validity for time, distance and motion smoothness in all four suturing tasks.

Introduction

It is important to have effective surgical training devices that can be used regularly to maintain surgical skills. Tools to train surgical skills are frequently investigated and developed [Citation1–9]. Most of these training tools are used to train minimally invasive skills [Citation1–4,Citation6,Citation7]. For example, box trainers and virtual reality simulators have been developed that can track laparoscopic instruments, and some even provide the trainee with haptic feedback [Citation1,Citation10,Citation11]. New and more advanced techniques to practice minimally invasive surgical skills continue to be developed, while devices to adequately train and assess open surgical skills fall behind. Several tools to enable the practice of open surgical procedures do exist, ranging from basic suturing pads to three-dimensional, layered models and even (live) animals [Citation9,Citation12–16]. For all these tools, an expert is necessary to provide the trainee with feedback, which makes their use less flexible and this approach unsuitable for self-directed continuous training. Furthermore, those experts use observer-based tools, such as OSATS [Citation17], GERT [Citation18] or UWOMSA [Citation19], which are always more or less subjective. For adequate assessment, a training tool with objective parameters is desired. An available technique with objective parameters that overcomes the need for expert observation uses electromagnetic motion tracking to track hand movements during an open surgical procedure [Citation8,Citation20]. Unfortunately, this system is very expensive and consequently not suited for broad implementation in continuous training.

In previous research, we showed that a low-cost tracking tool has potential to be used for tracking finger movements in simulated open surgical tasks [Citation21]. This technique has the potential to overcome the need for expert observation while remaining cost-effective. Therefore, this assessment tool could be useful in the training of open surgical skills. The tracking software has been used in MIS training and has been shown to be able to discriminate between expertise levels (construct validity); however, it has not been evaluated for open surgical training. Before using this system as a training device, it is important to demonstrate good construct validity. The aim of this study is to investigate the ability of this low-cost, broadly accessible finger-tracking technique to discriminate between expertise levels in simple open surgical suturing tasks.

Methods

Participants

Surgeons, surgical residents and medical students at the Radboud University Medical Center, Nijmegen, The Netherlands, were recruited to participate in this study. Participants were included from September 2020 until September 2021. The medical students were included during the first week of their surgical rotation (internship) and assumed to be novices in surgical suturing skills, because they had had no previous surgical rotation. The students had been taught basic suturing skills prior to their surgical internship and were included after this training but before any clinical exposure. Therefore, they knew how to perform the sutures needed for this study and did not need additional guidance during the study. Surgical residents and surgeons were regarded as suturing experts for the selected tasks. First, the participants completed an informed consent form and a short questionnaire regarding their suturing experience. Novices were only included if they had no experience prior to their suturing education, and experts were required to have performed fifty or more sutures. This was based on previous studies stating that expertise should be reached after 50 repetitions for most basic surgical procedures, such as laparoscopic cholecystectomy [Citation22–24]. Because the experts had performed at least 50 repetitions of each of the suturing skills evaluated in this study, we regarded them as experienced.

Participants agreed to anonymous processing of the collected data. A waiver for medical ethical approval was provided, because this study did not involve a medical intervention.

Equipment

To objectively track the participants’ finger movements, a tablet with tracking software and an open surgical simulator were used. Figure 1 shows the setup of the materials used. SurgTrac software (eoSurgical Ltd., Edinburgh, United Kingdom) was used as a tablet application to track the participants’ finger movements [Citation25]. The tablet was placed in a stand above the right shoulder of the participant. The distance between the camera and the simulator was set at sixty centimeters to provide an adequate overview. SurgTrac software was developed to track minimally invasive surgical instruments, tagged with a blue and a red sticker, in a simulated setting. The software recorded at 30 frames per second for accurate continuous motion tracking [Citation7]. Recent research has shown that it is possible to use SurgTrac software for finger tracking in open surgical simulation [Citation21]. Participants wore white surgical gloves with the right index finger tagged by a red balloon-tube and the left index finger tagged by a blue balloon-tube. A model by PediatrickBoxx (Nijmegen, The Netherlands) [Citation26] was used for a standardized simulation task of open surgery. The model consists of a wooden cast placed at a forty-five degree angle to the table, with a suturing pad by EduStitch [Citation27]. This provides the camera with an adequate view of the task execution.
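
The SurgTrac application itself is proprietary, so its exact tracking implementation is not public. As a rough illustration of the general principle of tracking a coloured tag in camera frames, the minimal sketch below uses OpenCV to threshold each frame in HSV space and take the centroid of the largest blob; the HSV range for the red tag is a hypothetical example, not the vendor's values.

```python
# Minimal sketch of colour-tag tracking with OpenCV (not the SurgTrac implementation).
import cv2
import numpy as np

# Hypothetical HSV range for a red tag; real red wraps around hue 0/180.
RED_LO, RED_HI = np.array([0, 120, 70]), np.array([10, 255, 255])


def tag_centroid(frame_bgr, lo, hi):
    """Return the (x, y) pixel centroid of the largest blob within the colour range, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lo, hi)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 API
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]


cap = cv2.VideoCapture(0)   # tablet camera or webcam stream, roughly 30 frames per second
positions = []              # one (x, y) centroid (or None) per frame for the red tag
while True:
    ok, frame = cap.read()
    if not ok:
        break
    positions.append(tag_centroid(frame, RED_LO, RED_HI))
cap.release()
```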

Figure 1 (a and b). Research setup with a Lenovo P10 tablet in a stand and a simulator by PediatrickBoxx.

Tasks

All participants executed four suturing tasks on the given equipment in the following order:

  1. Knot tying by hand (Figure 2a): participants tied a reef knot consisting of an underhand and an overhand throw.

    Figure 2. a) Task 1: Knot tying by hand. b) Task 2: Transcutaneous suture. c) Task 3: ‘Donati’ suture. d) Task 4: Intracutaneous suture.

  2. Transcutaneous suturing and knot tying with instruments (Figure 2b): participants executed one transcutaneous suture on the incision of their pad and tied the suture using their instruments.

  3. Vertical mattress suturing (‘Donati’ suture) and knot tying with instruments (Figure 2c): participants executed one vertical mattress suture on the incision of their pad and tied the suture using their instruments.

  4. Continuous intracutaneous suturing without knot tying (Figure 2d): an intracutaneous knot was tied in advance by the researcher at the upper side of the incision. Participants made an intracutaneous suture through the entire incision in their pad (four centimeters in length). No intracutaneous or extracutaneous knot was tied at the end of the suture.

Outcomes

The SurgTrac software on the tablet tracked the red and blue tags on the index fingers of the participants during the execution of the suturing tasks. The parameters that the SurgTrac software measures are time to execute a task (in seconds), distance traveled by the left and right tags (in meters), distance between hands (average distance between the red and blue tag, in centimeters), hands off-screen (as a percentage of time), speed (mean speed of the right or left hand, in millimeters/second), acceleration (mean acceleration of the right or left hand, in millimeters/second²), smoothness (mean smoothness of motion of the right or left hand, in millimeters/second³) and handedness (percentage of right- and left-hand usage). These parameters, as measured by SurgTrac, have been frequently validated for training of minimally invasive skills [Citation7,Citation28,Citation29], especially time and distance. Because no other data on open surgery and finger tracking are available for this software, the parameters already proven valid for MIS skills, namely time and distance, serve as the primary outcome parameters for the validation of finger tracking in this study of open surgical suturing tasks.

The other parameters provided by the software serve as secondary outcome parameters. All parameters were included in the construct validation process, except for the parameter hands off-screen, because it lacks clinical relevance in open surgery (no screen is used).

The parameters are measured separately for the red and blue tags by SurgTrac. The combined total score for the parameter distance consisted of the total distance travelled by the participant’s right and left hands. The combined total scores for the parameters speed, acceleration and smoothness were analysed as the mean of the right- and left-hand scores. The handedness score is described as the difference between the percentage of right-hand usage and the percentage of left-hand usage.
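
SurgTrac's exact formulas for these parameters are not published; the following is a minimal sketch, assuming the common convention of deriving speed, acceleration and a jerk-based smoothness measure from frame-to-frame tag displacements at 30 frames per second, and handedness from each hand's share of the total path length. The function names and the handedness definition by path length are illustrative assumptions.

```python
# Sketch of per-hand motion metrics from tracked 2D tag positions (assumed conventions,
# not the vendor's formulas).
import numpy as np

FPS = 30  # frames per second, as reported for the recordings


def motion_metrics(xy: np.ndarray) -> dict:
    """xy: array of shape (n_frames, 2) with tag positions in millimetres."""
    dt = 1.0 / FPS
    step = np.linalg.norm(np.diff(xy, axis=0), axis=1)  # mm moved per frame
    speed = step / dt                                    # mm/s
    accel = np.diff(speed) / dt                          # mm/s^2
    jerk = np.diff(accel) / dt                           # mm/s^3, used here as a smoothness proxy
    return {
        "distance_m": step.sum() / 1000.0,
        "mean_speed": speed.mean(),
        "mean_accel": np.abs(accel).mean(),
        "smoothness": np.abs(jerk).mean(),
    }


def handedness(left_xy: np.ndarray, right_xy: np.ndarray) -> float:
    """Difference between right- and left-hand usage, expressed as % of total path length."""
    d_left = motion_metrics(left_xy)["distance_m"]
    d_right = motion_metrics(right_xy)["distance_m"]
    total = d_left + d_right
    return 100.0 * (d_right - d_left) / total if total else 0.0
```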

Statistical analyses

Data were analyzed using IBM Statistical Package for Social Sciences (SPSS), version 25. A p-value of <0.05 was considered statistically significant. To compare outcomes of the novice and expert groups, Mann-Whitney U tests were used for data with a non-normal distribution. All other data were presented as descriptive statistics. For a desired power of 0.80 at a significance level of 0.05, a sample size of 19 participants was required. This was based on the expected and clinically relevant difference in time to complete the task; for basic open suturing tasks, we assumed 45 seconds to be a clinically relevant difference. However, a large variation in results was expected in the novice group; therefore, more novices were included to account for this and strengthen the data.
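
Although the analysis was performed in SPSS, the group comparison can be illustrated with an equivalent Mann-Whitney U test in Python using SciPy; the values below are placeholders, not study data.

```python
# Illustrative novice-vs-expert comparison on one parameter with a Mann-Whitney U test.
from scipy.stats import mannwhitneyu

novice_time = [112.0, 95.4, 140.2, 88.7, 103.9]  # task time in seconds (hypothetical values)
expert_time = [51.3, 47.8, 60.1, 44.9]            # task time in seconds (hypothetical values)

stat, p_value = mannwhitneyu(novice_time, expert_time, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")  # p < 0.05 taken as statistically significant
```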

Results

Demographics

In total, 76 participants were included: 57 in the novice group and 19 in the expert group. The novice group consisted of medical students without any suturing experience prior to their basic training, which took place just before this study. The expert group comprised six residents and thirteen surgeons, with varying surgical experience of at least three years. All the experts had executed fifty or more sutures in their medical career.

The mean age was 23.8 years (SD 3.0) in the novice group, 43.9 years (SD 9.9) in the expert group and 28.9 years (SD 10.3) overall. The novice group consisted of fourteen male and forty-three female participants, whereas the expert group consisted of fifteen male and four female participants. No significant differences in outcome parameters were found for gender within either group.

Construct validity

Table 1 shows the primary outcome parameters of the four executed tasks in both groups. Novices and experts differed significantly for the parameters time (p < 0.001) and distance (p ranging from <0.001 to 0.041) in all four tasks. The differences between novices and experts for the primary outcome parameters are visualized in Figure 3. The secondary outcome parameters are shown in Table 2. Of those, smoothness differed significantly (p < 0.001) for all four tasks. Furthermore, for Tasks 2 and 3 there was a significant difference between novices and experts in handedness (p ranging from 0.001 to 0.046), while Task 4 showed a significant difference in the distance between the hands (p = 0.015), in addition to the significant differences in time, distance and smoothness.

Figure 3 (a and b). Bar graphs of the primary outcome parameters time and distance.

Table 1. Primary outcome parameters of novices and experts (Mann-Whitney U tests). A p-value < 0.05 was considered statistically significant.

Table 2. Secondary outcome parameters of novices and experts (Mann-Whitney U tests). A p-value < 0.05 was considered statistically significant.

Discussion

The SurgTrac software, used in this study for tracking index finger movements, was initially validated for tracking laparoscopic instruments during the execution of minimally invasive surgical tasks [Citation4,Citation25]. In a prior experimental study (accepted manuscript), the feasibility of tracking index finger movements using SurgTrac was confirmed. Therefore, this study focuses on the construct validation of this assessment method for open surgical training, that is, on its ability to discriminate between novices and experts.

All executed basic suturing tasks in this study showed a significant difference in the outcome parameters time, distance and smoothness when comparing novices to experts. For these parameters, construct validity is established. For the other parameters measured by SurgTrac, such as handedness and distance between hands, construct validity could not be established. However, these latter two do not seem to have clinical relevance in the assessment of a trainee’s skill level: an unequal distribution of right- and left-hand use, or the distance between both hands, does not in itself affect the quality of a suture. Smoothness, in contrast, is an indicator of a participant’s instrument handling and a relevant indicator of the performer’s level of basic suturing skills. Because the parameter smoothness is calculated using the parameters time and distance, it is debatable whether this parameter as such is best suited to differentiate between trainees’ skill levels. Nevertheless, this calculated parameter did show a significant difference between novices and experts, which means it can potentially be used by a trainee to monitor their own skill level. The current simulation and assessment setting with tracking of finger movements can adequately discriminate between novice and expert levels for basic suturing skills. Tracking finger movements using SurgTrac while performing basic suturing skills on a simulator is therefore a promising setting to train and objectively assess basic open suturing skills. The aim of this study was to evaluate whether, and which, parameters of the SurgTrac application are able to discriminate novices from experts. This has been done in numerous studies as a form of construct validity testing [Citation30–37]. We acknowledge that this is a basic form of validity testing; however, it is a first step towards assessing the true potential and capabilities of this tracking software for open surgical training. The next step is a concurrent validity study, which we are currently conducting based on the results of this study. After that, a pass-fail cut-off should be established, to be able to truly assess and inform trainees whether they are proficient or not.

The previously mentioned simulation setting is, unlike previously investigated settings such as sensory gloves with electromagnetic tracking [Citation8,Citation20], easily accessible and relatively affordable, because of the use of simple materials like gloves and balloon-tubes. Consequently, this setting is accessible for nearly every potential trainee in virtually every country, because the SurgTrac application can be downloaded to any smartphone or tablet. This improves the possibilities for medical students, residents and doctors to train and assess their skills independently. Furthermore, this tool is very compact, which makes it usable in any desired place, such as home or work. Another advantage of this tool is that no expert is needed for assessment, allowing a trainee to train at any suitable time.

A simulator for basic open suturing tasks combined with finger motion tracking provides an easily accessible simulation setting that is available for everyone to train anytime. This is valuable not only for learning new surgical skills, but also for maintaining an optimal skill level, especially when there is no consistent exposure in the clinical setting, because continuous training is necessary to maintain surgical skills, as shown previously in MIS studies on this subject [Citation29,Citation38–41].

This study showed that time, distance and smoothness are valid parameters to assess technical aspects of performing a suture in open surgery. Yet these parameters cannot be extrapolated to assess quality aspects, such as the firmness of the knot, the distance between sutures or the distance between the wound edges. The clinical outcome of any surgical task is the ultimate outcome parameter; therefore, further research should focus on correlating clinically relevant parameters with the quality of technical performance.

Limitations

No information was collected about the right- or left-handedness of participants. Analyses were performed using the combined total scores of the trainees’ right and left hands, in which hand dominance had no influence on the outcome parameters. Nevertheless, it would be interesting to further explore the differences between novices and experts taking hand dominance into account and to analyse their dominant and non-dominant hands separately. Although not all parameters provided by the SurgTrac software seemed relevant for open surgery, all were included in the study to avoid assumption bias. Due to the position of the tablet, and thus the camera, with the tracking app, the fingers were sometimes not clearly in view, which could affect the outcome data. This was visible in the off-screen data; when this percentage was high, the data could be less reliable. However, this did not seem to affect our results. When further validation studies have been performed, such as a concurrent validity study in which the results of this assessment method are compared with expert assessment, the true potential of this method will become evident. Based on such studies, a pass-fail score could be established, which trainees can use to evaluate their skills independently. However, such studies could also show that additional parameters, not accounted for in this method, are needed for a truly relevant assessment.

Conclusion

Tracking index finger movements using SurgTrac software on a tablet, while executing basic suturing skills on a low-cost surgical suturing model, shows excellent construct validity for time, distance and motion smoothness in all four suturing tasks. This new open surgical assessment method can be implemented in any training setting, because it is easily set up as an application on any mobile device, making it a potent objective assessment tool in open surgical training.

Availability of data

The datasets used and analysed during the current study are available from the corresponding author on reasonable request.

Acknowledgments

The authors have no further acknowledgements to report.

Disclosure statement

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors. The authors have no conflicts of interest to disclose, other than the fact that Dr. S. Botden is a co-founder of PediatrickBoxx, a small not-for-profit company that developed the wooden casing used in this study. The minimal profit from selling the models is used for the development and production of new models.

Additional information

Funding

The author(s) reported there is no funding associated with the work featured in this article.

References

  • Bökkerink GM, Joosten M, Leijte E, et al. Validation of low-cost models for minimal invasive surgery training of congenital diaphragmatic hernia and esophageal atresia. J Pediatr Surg. 2020;56(3):465–8.
  • Hiyoshi Y, Miyamoto Y, Akiyama T, et al. Time trial of dry box laparoscopic surgical training improves laparoscopic surgical skills and surgical outcomes. Asian J Endosc Surg. 2021;14(3):373–378. DOI:10.1111/ases.12871
  • Alaker M, Wynn GR, Arulampalam T. Virtual reality training in laparoscopic surgery: a systematic review & meta-analysis. Int J Surg. 2016;29:85–94.
  • Arts EEA, Leijte E, Witteman BPL, et al. Face, content, and construct validity of the take-home eosim augmented reality laparoscopy simulator for basic laparoscopic tasks. J Laparoendosc Adv Surg Tech A. 2019;29(11):1419–1426.
  • Genovese B, Yin S, Sareh S, et al. Surgical hand tracking in open surgery using a versatile motion sensing system: are we there yet? Am Surg. 2016;82(10):872–875. DOI:10.1177/000313481608201002
  • Higuchi M, Abe T, Hotta K, et al. Development and validation of a porcine organ model for training in essential laparoscopic surgical skills. Int J Urol. 2020;27(10):929–938. DOI:10.1111/iju.14315
  • Keni S, Ilin R, Partridge R, et al. Using automated continuous instrument tracking to benchmark simulated laparoscopic performance and personalize training. J Surg Educ. 2021;78(3):998–1006.
  • Datta V, Mackay S, Mandalia M, et al. The use of electromagnetic motion tracking analysis to objectively measure open surgical skill in the laboratory-based model. J Am Coll Surg. 2001;193(5):479–485.
  • Joosten M, Bökkerink GMJ, Levitt MA, et al. The use of an inanimate simulation model for the correction of an anorectal malformation in the training of colorectal pediatric surgery. Eur J Pediatr Surg. 2021;32(03):287–293. DOI:10.1055/s-0041-1723035
  • Jokinen E, Mikkola TS, Härkki P. Simulator training and residents’ first laparoscopic hysterectomy: a randomized controlled trial. Surg Endosc. 2020;34(11):4874–4882.
  • Kuroki T, Fujioka H. Training for laparoscopic pancreaticoduodenectomy. Surg Today. 2019;49(2):103–107.
  • Ueda K, Kino H, Katayama M, et al. Simulation surgery using 3D 3-layer models for congenital anomaly. Plast Reconstr Surg Glob Open. 2020;8(8):e3072.
  • Gonzalez-Navarro AR, Quiroga-Garza A, Acosta-Luna AS, et al. Comparison of suturing models: the effect on perception of basic surgical skills. BMC Med Educ. 2021;21(1):250. DOI:10.1186/s12909-021-02692-x
  • Pérez-Escamirosa F, Montoya-Alvarez S, Ordorica-Flores RM, et al. Design of a dynamic force measurement system for training and evaluation of suture surgical skills. J Med Syst. 2020;44(10):174. DOI:10.1007/s10916-020-01642-2
  • DeMasi SC, Katsuta E, Takabe K. Live animals for preclinical medical student surgical training. Edorium J Surg. 2016;3(2):24–31.
  • Kite AC, Yacoe M, Rhodes JL. The use of a novel local flap trainer in plastic surgery education. Plast Reconstr Surg Glob Open. 2018;6(6):e1786.
  • Martin JA, Regehr G, Reznick R, et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997;84(2):273–278. DOI:10.1046/j.1365-2168.1997.02502.x
  • Bonrath EM, Zevin B, Dedy NJ, et al. Error rating tool to identify and analyse technical errors and events in laparoscopic surgery. Br J Surg. 2013;100(8):1080–1088.
  • Temple CLF, Ross DC. A new, validated instrument to evaluate competency in microsurgery: the University of Western Ontario Microsurgical Skills Acquisition/Assessment instrument [outcomes article]. Plast Reconstr Surg. 2011;127(1):215–222.
  • Saggio G, Lazzaro A, Sbernini L, et al. Objective surgical skill assessment: an initial experience by means of a sensory glove paving the way to open surgery simulation? J Surg Educ. 2015;72(5):910–917. DOI:10.1016/j.jsurg.2015.04.023
  • Hillemans V, Verhoeven B, Botden S. Feasibility of tracking in open surgical simulation. Int J Healthc Simul. 2022;1–10. DOI:10.54531/juvj5939
  • Kowalewski KF, Hendrie JD, Schmidt MW, et al. Development and validation of a sensor- and expert model-based training system for laparoscopic surgery: the iSurgeon. Surg Endosc. 2017;31(5):2155–2165. DOI:10.1007/s00464-016-5213-2
  • Nickel F, Kowalewski K-F, Rehberger F, et al. Face validity of the pulsatile organ perfusion trainer for laparoscopic cholecystectomy. Surg Endosc. 2017;31(2):714–722. DOI:10.1007/s00464-016-5025-4
  • Ayodeji ID, Schijven M, Jakimowicz J, et al. Face validation of the Simbionix LAP Mentor virtual reality training module and its applicability in the surgical curriculum. Surg Endosc. 2007;21(9):1641–1649.
  • Ferns J. An app to make a surgeon. BMJ. 2013;346:f3361.
  • Bökkerink GM. PediatrickBoxx [Website]. Available from: https://www.pediatrickboxx.com/.
  • EduStitch [Website]. Available from: https://www.edustitch.com/.
  • Mansoor SM, Våpenstad C, Mårvik R, et al. Construct validity of eoSim - a low-cost and portable laparoscopic simulator. Minim Invasive Ther Allied Technol. 2020;29(5):261–268.
  • Joosten M, Bökkerink GMJ, Stals JJM, et al. The effect of an interval training on skill retention of high-complex low-volume minimal invasive pediatric surgery skills: a pilot study. J Laparoendosc Adv Surg Tech A. 2021;31(7):820–828.
  • Elarbi MM, Ragle CA, Fransson BA, et al. Face, construct, and concurrent validity of a simulation model for laparoscopic ovariectomy in standing horses. J Am Vet Med Assoc. 2018;253(1):92–100.
  • Rueda Esteban RJ, López-McCormick JS, Rodríguez-Bermeo AS, et al. Face, content, and construct validity evaluation of simulation models in general surgery laparoscopic training and education: a systematic review. Surg Innov. 2022;15533506221123704. DOI:10.1177/15533506221123704
  • Suebnukarn S, Chaisombat M, Kongpunwijit T, et al. Construct validity and expert benchmarking of the haptic virtual reality dental simulator. J Dent Educ. 2014;78(10):1442–1450.
  • Koch AD, Buzink SN, Heemskerk J, et al. Expert and construct validity of the Simbionix GI Mentor II endoscopy simulator for colonoscopy. Surg Endosc. 2008;22(1):158–162.
  • van der Wiel SE, Koch AD, Bruno MJ. Face and construct validity of a novel mechanical ERCP simulator. Endosc Int Open. 2018;6(6):E758–E765.
  • Alshuaibi M, Perrenot C, Hubert J, et al. Concurrent, face, content, and construct validity of the RobotiX Mentor simulator for robotic basic skills. Int J Med Robot. 2020;16(3):e2100.
  • Leijte E, Arts E, Witteman B, et al. Construct, content and face validity of the eoSim laparoscopic simulator on advanced suturing tasks. Surg Endosc. 2019;33(11):3635–3643.
  • Sinceri S, Berchiolli R, Marconi M, et al. Face, content, and construct validity of a simulator for training in endovascular procedures. Minim Invasive Ther Allied Technol. 2018;27(6):315–320.
  • Bekele A, Wondimu S, Firdu N, et al. Trends in retention and decay of basic surgical skills: evidence from Addis Ababa University, Ethiopia: a prospective case-control cohort study. World J Surg. 2019;43(1):9–15.
  • Varley M, Choi R, Kuan K, et al. Prospective randomized assessment of acquisition and retention of SILS skills after simulation training. Surg Endosc. 2015;29(1):113–118.
  • Scerbo MW, Britt RC, Montano M, et al. Effects of a retention interval and refresher session on intracorporeal suturing and knot tying skill and mental workload. Surgery. 2017;161(5):1209–1214.
  • Joosten M, Hillemans V, van Capelleveen M, et al. The effect of continuous at-home training of minimally invasive surgical skills on skill retention. Surg Endosc. 2022;1–9.