Innovation in Biomedical Science and Engineering

Tracking-by-detection of surgical instruments in minimally invasive surgery via the convolutional neural network deep learning-based method

References

  • Stoyanov D. Surgical vision. Ann Biomed Eng. 2012;40:332–345.
  • Yang W, Hu C, Meng M, et al. A 6D magnetic localization algorithm for a rectangular magnet objective based on a particle swarm optimizer. IEEE Trans Magn. 2009;45:3092–3099.
  • Joskowicz L, Milgrom C, Simkin A, et al. FRACAS: a system for computer-aided image-guided long bone fracture surgery. Comput Aided Surg. 1998;3:271–288.
  • Wei G, Arbter K, Hirzinger G. Automatic tracking of laparoscopic instruments by color coding. In: Joint Conference Computer Vision, Virtual Reality and Robotics in Medicine and Medical Robotics and Computer-Assisted Surgery. Heidelberg (BER): Springer; 1997. p. 357–366.
  • Krupa A, Gangloff J, Doignon C, et al. Autonomous 3-d positioning of surgical instruments in robotized laparoscopic surgery using visual servoing. IEEE Trans Robot Automat. 2003;19:842–853.
  • Bouarfa L, Akman O, Schneider A, et al. In-vivo real-time tracking of surgical instruments in endoscopic video. Minim Invasive Ther Allied Technol. 2012;21:129–134.
  • Zhao Z. Real-time 3D visual tracking of laparoscopic instruments for robotized endoscope holder. Biomed Mater Eng. 2014;24:2665–2672.
  • Wolf R, Duchateau J, Cinquin P, et al. 3D tracking of laparoscopic instruments using statistical and geometric modeling. In: 14th International Conference on Medical Image Computing and Computer-Assisted Intervention-MICCAI. Heidelberg (BER): Springer; 2011. p. 203–210.
  • Agustinos A, Voros S. 2D/3D real-time tracking of surgical instruments based on endoscopic image processing. In: 2nd International Workshop on Computer Assisted and Robotic Endoscopy-CARE. Heidelberg (BER): Springer; 2015. p. 90–100.
  • Dockter R, Sweet R, Kowalewski T. A fast, low-cost, computer vision approach for tracking surgical tools. In: 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems-IROS. Chicago (IL); 2014. p. 1984–1989.
  • Pezzementi Z, Voros S, Hager GD. Articulated object tracking by rendering consistent appearance parts. In: 2009 IEEE International Conference on Robotics and Automation-ICRA. Kobe; 2009. p. 3940–3947.
  • Reiter A, Allen PK. An online learning approach to in-vivo tracking using synergistic features. In: 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems-IROS. Taipei; 2010. p. 3441–3446.
  • Li Y, Chen C, Huang X, et al. Instrument tracking via online learning in retinal microsurgery. In: 17th International Conference on Medical Image Computing and Computer-Assisted Intervention-MICCAI. Heidelberg (BER): Springer; 2014. p. 464–471.
  • Reiter A, Allen PK, Zhao T. Feature classification for tracking articulated surgical tools. In: 15th International Conference on Medical Image Computing and Computer-Assisted Intervention-MICCAI. Heidelberg (BER): Springer; 2012. p. 592–600.
  • Allan M, Thompson S, Clarkson MJ, et al. 2D-3D pose tracking of rigid instruments in minimally invasive surgery. In: 5th International Conference on Information Processing in Computer-Assisted Interventions-IPCAI. Heidelberg (BER): Springer; 2014. p. 1–10.
  • Du X, Allan M, Dore A, et al. Combined 2D and 3D tracking of surgical instruments for minimally invasive and robotic-assisted surgery. Int J CARS. 2016;11:1–11.
  • Wesierski D, Wojdyga G, Jezierska A. Instrument tracking with rigid part mixtures model. In: 2nd International Workshop on Computer Assisted and Robotic Endoscopy-CARE. Heidelberg (BER): Springer; 2015. p. 22–34.
  • Sznitman R, Richa R, Taylor RH, et al. Unified detection and tracking of instruments during retinal microsurgery. IEEE Trans Pattern Anal Mach Intell. 2013;35:1263–1272.
  • Richa R, Balicki M, Sznitman R, et al. Vision-based proximity detection in retinal surgery. IEEE Trans Biomed Eng. 2012;59:2291–2301.
  • Bourdev L, Malik J. Poselets: body part detectors trained using 3D human pose annotations. In: 2009 International Conference on Computer Vision-ICCV2009. Kyoto; 2009. p. 1365–1372.
  • Fischler MA, Bolles RC. Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Commun ACM. 1981;24:381–395.
  • Fukushima K. Analysis of the process of visual pattern recognition by the neocognitron. Neural Netw. 1989;2:413–420.
  • Doignon C, Nageotte F, de Mathelin M. The role of insertion points in the detection and positioning of instruments in laparoscopy for robotic tasks. In: 9th International Conference on Medical Image Computing and Computer-Assisted Intervention-MICCAI. Heidelberg (BER): Springer; 2006. p. 527–534.
  • Clarke JC, Carlsson S, Zisserman A. Detecting and tracking linear features efficiently. In: 7th British Machine Vision Conference. Edinburgh; 1996. p. 415–424.
  • Krizhevsky A, Sutskever I, Hinton GE. ImageNet classification with deep convolutional neural networks. Adv Neural Inf Process Syst. 2012;25:1097–1105.
  • Zhang Z. A flexible new technique for camera calibration. IEEE Trans Pattern Anal Mach Intell. 2000;22:1330–1334.
  • Wang W, Zhang P, Shi Y, et al. Design and compatibility evaluation of magnetic resonance imaging-guided needle insertion system. J Med Imaging Health Inform. 2015;5:1963–1967.
  • Wang W, Shi Y, Goldenberg A, et al. Experimental analysis of robot-assisted needle insertion into porcine liver. Biomed Mater Eng. 2015;26:S375–S380.