King's College London

Research portal

Real-time segmentation of non-rigid surgical tools based on deep learning and tracking

Research output: Chapter in Book/Report/Conference proceeding › Conference paper

Luis C. García-Peraza-Herrera, Wenqi Li, Caspar Gruijthuijsen, Alain Devreker, George Attilakos, Jan Deprest, Emmanuel Vander Poorten, Danail Stoyanov, Tom Vercauteren, Sébastien Ourselin

Original language: English
Title of host publication: Computer-Assisted and Robotic Endoscopy - 3rd International Workshop, CARE 2016, Held in Conjunction with MICCAI 2016, Revised Selected Papers
Publisher: Springer Verlag
Pages: 84-95
Number of pages: 12
Volume: 10170 LNCS
ISBN (Print): 9783319540566
DOIs
Publication status: E-pub ahead of print - 22 Feb 2017
Event: 3rd International Workshop on Computer-Assisted and Robotic Endoscopy, CARE 2016, held in conjunction with the 19th International Conference on Medical Image Computing and Computer Assisted Intervention, MICCAI 2016 - Athens, Greece
Duration: 17 Oct 2016 - 17 Oct 2016

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 10170 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 3rd International Workshop on Computer-Assisted and Robotic Endoscopy, CARE 2016, held in conjunction with the 19th International Conference on Medical Image Computing and Computer Assisted Intervention, MICCAI 2016
Country: Greece
City: Athens
Period: 17/10/2016 - 17/10/2016

Abstract

Real-time tool segmentation is an essential component in computer-assisted surgical systems. We propose a novel real-time automatic method based on Fully Convolutional Networks (FCN) and optical flow tracking. Our method exploits the ability of deep neural networks to produce accurate segmentations of highly deformable parts along with the high speed of optical flow. Furthermore, the pre-trained FCN can be fine-tuned on a small number of medical images without the need to hand-craft features. We validated our method using existing and new benchmark datasets, covering both ex vivo and in vivo real clinical cases where different surgical instruments are employed. Two versions of the method are presented: non-real-time and real-time. The former, using only deep learning, achieves a balanced accuracy of 89.6% on a real clinical dataset, outperforming the (non-real-time) state of the art by 3.8 percentage points. The latter, a combination of deep learning with optical flow tracking, yields an average balanced accuracy of 78.2% across all the validated datasets.
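
As an illustration only (not the authors' implementation), the sketch below shows one way such a hybrid scheme could be wired together in Python with OpenCV: a hypothetical FCN segmenter (segment_with_fcn, a placeholder) is run on periodic keyframes, and the resulting binary tool mask is propagated to intermediate frames with Farneback dense optical flow. The function names, the keyframe interval and the backward-warping approximation are assumptions made for this sketch, not details taken from the paper.

import cv2
import numpy as np


def segment_with_fcn(frame_bgr):
    """Hypothetical placeholder for a fine-tuned FCN segmenter.

    Should return a binary tool mask (H x W, uint8, values in {0, 1})."""
    raise NotImplementedError("Plug in an FCN-based segmenter here.")


def propagate_mask(prev_gray, curr_gray, prev_mask):
    """Propagate the previous frame's mask to the current frame using
    Farneback dense optical flow (OpenCV). The backward warp below is an
    approximation that assumes the flow field varies smoothly."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x - flow[..., 0]).astype(np.float32)
    map_y = (grid_y - flow[..., 1]).astype(np.float32)
    return cv2.remap(prev_mask, map_x, map_y, cv2.INTER_NEAREST)


def segment_video(frames, fcn_every=10):
    """Yield one mask per frame: run the FCN on every `fcn_every`-th frame,
    track the mask with optical flow on the frames in between."""
    prev_gray, mask = None, None
    for i, frame in enumerate(frames):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if mask is None or i % fcn_every == 0:
            mask = segment_with_fcn(frame)                # accurate but slow
        else:
            mask = propagate_mask(prev_gray, gray, mask)  # fast propagation
        prev_gray = gray
        yield mask

The keyframe interval trades accuracy for speed: a smaller fcn_every keeps the mask closer to the FCN output, while a larger one leans more heavily on the cheaper optical flow tracking.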
